Google warns that criminals are building and selling illegal AI tools – and the market is growing


  • AI tools are being custom-built for criminals, new GTIG report shows
  • These tools sidestep AI guardrails designed for safety
  • ‘Just-in-time’ AI malware shows how criminals are developing their techniques

Google’s Threat Intelligence Group (GTIG) has identified a worrying shift in AI trends: AI is no longer being used only to make criminals more productive, but is now being purpose-built for active operations.

Its research found that large language models (LLMs) are increasingly being built into malware itself. One example of this ‘just-in-time’ AI is PROMPTFLUX, which is written in VBScript and works with Gemini’s API to request ‘specific VBScript obfuscation and evasion techniques to facilitate “just-in-time” self-modification’ that is likely to evade static signature-based detection.
