Minors in Tennessee are suing Musk’s xAI, alleging that Grok generated explicit images of them

xAI and Grok logos are seen in this illustration taken on February 16, 2025. — Reuters

Three plaintiffs from Tennessee, including two minors, sued Elon Musk’s xAI on Monday, alleging that it knowingly designed its Grok image generator to let people create sexually explicit content using real images of others.

The lawsuit, filed in federal court in San Jose, California, seeks class action status for individuals in the United States who were “reasonably identifiable” in sexualized images or videos generated by Grok based on real images of themselves.

The artificial intelligence company did not immediately respond to a Reuters request for comment.

After an outcry over sexually explicit content generated by the chatbot, xAI said in January that it had blocked all users from editing images of “real people in revealing clothing” and from generating images of people in revealing clothing in “jurisdictions where it is illegal.”

Governments and regulators around the world have also since launched investigations, imposed bans and required safeguards in a growing push to curb illegal and offensive material.

The lawsuit alleges that xAI failed to install safeguards to prevent its systems from generating sexual content involving minors. All three plaintiffs were minors at the time the images were generated.

The plaintiffs allege that their real photos were digitally altered into explicit content and then shared on online platforms, causing them emotional distress and creating a public nuisance.

They are seeking unspecified damages, attorneys’ fees and an injunction requiring xAI to cease the alleged practices.

“These are children whose school photos and family photos were turned into child sexual abuse material,” plaintiffs’ attorney Annika Martin of Lieff Cabraser Heimann & Bernstein said in a statement.

“Elon Musk and xAI deliberately designed Grok to produce explicit sexual content for financial gain, without regard for the children and adults who would be harmed.”
