Artificial Intelligence

5 ways cybercriminals are weaponising artificial intelligence

October 30, 2023

Recent years have seen artificial intelligence (AI) surge in popularity among both businesses and individuals. Applications of this technology are widespread, but some of the most common include computer vision solutions, natural language processing systems, and predictive and prescriptive analytics engines.

Although AI technology can certainly offer benefits in the realm of cyber-security, it also has the potential to be weaponised by cybercriminals. As such, it’s crucial for businesses to understand the cyber-risks associated with this technology and implement strategies to minimise these concerns.

Here are five ways cybercriminals are leveraging AI technology, along with tips to help businesses safeguard themselves against its weaponisation.


1. Creating and distributing malware

In the past, only the most sophisticated cybercriminals were capable of writing harmful code and deploying malware attacks. Now, however, AI chatbots can generate illicit code in a matter of seconds, allowing cybercriminals with varying levels of technical expertise to launch malware attacks with ease. Beyond writing harmful code, some AI tools can generate deceptive videos posing as tutorials for downloading popular software; when targets view this content, malware is delivered to their devices.


2. Cracking credentials

Many cybercriminals rely on brute-force techniques to reveal targets’ passwords and steal their credentials so they can then use those accounts for fraudulent purposes. However, these techniques vary in effectiveness and efficiency. By leveraging AI technology, cybercriminals can bolster their password-cracking success rates, uncovering targets’ credentials at record speed.


3. Deploying social engineering scams

Social engineering involves cybercriminals using fraudulent forms of communication (e.g. emails, texts and phone calls) to trick targets into unknowingly sharing sensitive information or downloading harmful software. Unfortunately, AI technology could make these scams increasingly common by giving cybercriminals the ability to formulate persuasive phishing messages with minimal effort. It could also clean up errors in human-produced copy, making scam messages appear more convincing.


4. Identifying digital vulnerabilities

When hacking into targets’ networks or systems, cybercriminals usually look for software vulnerabilities they can exploit, such as unpatched code or outdated security programs. While various tools can help identify these vulnerabilities, AI technology could allow cybercriminals to detect a wider range of software flaws, providing additional avenues and entry points for launching attacks.


5. Reviewing stolen data

After stealing sensitive information and confidential records from targets, cybercriminals generally have to sift through this data to determine their next steps – whether that’s selling the information on the dark web, posting it publicly or demanding a ransom payment in exchange for its restoration. This can be a tedious process, especially with larger databases. With AI technology, cybercriminals can analyse this data far faster, allowing them to make quick decisions and reduce the total time it takes to execute their attacks. In turn, targets have less time to identify and defend against those attacks.


Protecting against weaponised AI technology

Looking ahead, AI technology will likely contribute to rising cyber-attack frequency and severity. By staying informed on the latest AI-related developments and taking steps to protect against its weaponisation, businesses can maintain secure operations and reduce associated cyber-threats. Key safeguards for businesses to consider include adopting workplace policies that promote proper cyber-hygiene, implementing automated threat detection technology to engage in continuous network monitoring, creating detailed cyber-incident response plans and purchasing ample cyber-cover.

“Like it or not, AI is here to stay, and it’s not just the good guys that will be making the most of it. The ability to create cyber mayhem just accelerated 10-fold. You may not have the time or inclination to keep up to speed with it all, but you might be able to sleep a bit easier knowing that you have a robust Cyber Insurance package in your war chest.”

James Bishop, Client Manager – Commercial & Technology


Talk to our cyber-insurance experts today and protect your business against the worst happening.

