- Anthropic's threat intelligence report outlines the acceleration of AI-enabled attacks
- AI is now being used across every stage of the cyberattack process
- One such attack has been dubbed 'vibe hacking'
One of the world's largest AI companies, Anthropic, has warned that its chatbot has been "weaponized" by threat actors to "commit large-scale theft and extortion of personal data". Anthropic's threat intelligence report describes the ways in which the technology is being used to carry out sophisticated cyberattacks.
Weaponized AI makes hackers faster, more aggressive and more successful, and the report notes that ransomware attacks that would previously have required years of training to pull off can now be designed with very little technical skill.
These cyberattacks are lucrative for hackers. AI is now being used for fraudulent activity such as stealing credit card information and identity theft, and attackers are even using AI to analyze the data they steal.
“Vibe Hacking”
Defenders have long warned that AI lowers the barriers to cybercrime, giving low-skilled hackers the ability to carry out complex attacks, but LLMs are now assisting criminals at every stage of the attack process.
The report describes one particular threat, dubbed 'vibe hacking', which refers to a campaign in which Claude was used to scale up and build out a data extortion operation. The name is a reference to 'vibe coding', a method of software development that relies heavily on AI to generate code and build applications.
Claude's code execution environment was used to 'automate reconnaissance, credential harvesting and network penetration at scale, potentially affecting at least 17 distinct organizations in just the last month across government, healthcare, emergency services and religious institutions.'
Anthropic's research found that cybercriminals targeted a range of sectors, focusing on data theft and extortion. These attacks resulted in the 'compromise of personal records, including healthcare data, financial information, government credentials and other sensitive information, with direct ransom demands occasionally exceeding $500,000.'



