Nearly 1 in 5 users ages 13 to 15 told Meta they saw “nudity or sexual images on Instagram” that they didn’t want to see, according to a lawsuit.
The document, released Friday as part of a federal lawsuit in California and reviewed by Reuters, includes parts of a March 2025 statement by Instagram head Adam Mosseri.
Mosseri said the company does not share survey results “generally,” adding that self-reported surveys are “notoriously problematic,” according to the filing.
Meta, which owns Facebook and Instagram, faces accusations worldwide that its products are harming young users.
In the US, thousands of lawsuits in federal and state courts accuse the company of designing addictive products and fueling a mental health crisis among minors.
The statistics on explicit images came from a survey asking Instagram users about their experiences on the platform, not from a review of the posts themselves, Meta spokesman Andy Stone said.
The company said by the end of 2025 it would remove images and videos “containing nudity or explicit sexual activity, including when generated by AI,” with exceptions being considered for medical and educational content.
About 8% of users in the 13 to 15 age group also said they had “seen someone harm themselves or threaten to do so on Instagram,” according to the filing.
Most of the sexually explicit images were sent via private messages between users, Mosseri said in his statement, and Meta must consider users’ privacy when reviewing them.
“A lot of people don’t want us to read their messages,” he said.