Amazon’s Q was almost Skynet-Lite: rogue code told it to wipe disks and kill cloud data!


  • A rogue prompt told Amazon’s AI to wipe disks and nuke AWS cloud profiles
  • A hacker added malicious code through a pull request, revealing cracks in open-source trust models
  • AWS says customer data was safe, but the scare was real and too close

A recent breach involving Amazon’s AI coding assistant, Q, has raised fresh concerns about the safety of large language model-based tools.

A hacker successfully added a potentially destructive prompt to the AI assistant’s GitHub repository, instructing it to wipe a user’s system and delete cloud resources using bash and AWS CLI commands.
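To make the risk concrete, here is a hypothetical sketch, not the actual injected prompt, of the kind of destructive shell and AWS CLI commands such a prompt could direct an AI coding agent to run. The commands are only printed with echo so nothing executes; the instance ID, bucket, and user name are made-up placeholders.

```shell
#!/bin/sh
# Illustrative only: these lines print the commands instead of running them.
echo 'rm -rf ~/*'                                          # wipe the user's local files
echo 'aws ec2 terminate-instances --instance-ids i-0abcd'  # kill a running EC2 instance (placeholder ID)
echo 'aws s3 rb s3://example-bucket --force'               # delete an S3 bucket and all its contents
echo 'aws iam delete-user --user-name example-user'        # remove an IAM identity (placeholder name)
```

Because an AI assistant typically runs with the same credentials as the developer who invoked it, a single injected instruction like this can reach both the local disk and every cloud resource those credentials can touch.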
