Meta survey found 19% of teens on Instagram saw unwanted nude images

Meta facing allegations from global leaders that company’s products harm young users

By Reuters
Instagram app icon is seen on a smartphone in this illustration taken October 27, 2025. — Reuters

Nearly 1 in 5 users aged 13 to 15 told Meta that they saw “nudity or sexual images on Instagram” that they didn’t want to view, according to a court filing.

The document, made public on Friday as part of a federal lawsuit in California and reviewed by Reuters, includes portions of a March 2025 deposition of Instagram head Adam Mosseri.

Mosseri said the company does not share survey results “in general,” adding that self-reported surveys are “notoriously problematic,” according to the deposition.

Meta, which owns Facebook and Instagram, faces allegations from leaders around the world that its products harm young users.

In the US, thousands of lawsuits in federal and state court accuse the company of designing addictive products and fueling a mental-health crisis for minors.

The statistic on explicit images came from a survey of Instagram users about their experiences on the platform, Meta spokesperson Andy Stone said, and not a review of posts themselves.

In late 2025, the company said it would remove images and videos “containing nudity or explicit sexual activity, including when generated by AI,” with exceptions considered for medical and educational content.

About 8% of users in the 13 to 15 age group also said they had “seen someone harm themselves or threaten to do so on Instagram,” according to the deposition.

Most sexually explicit images were sent via private messages between users, Mosseri said in his deposition, adding that Meta must weigh users’ privacy when reviewing them.

“A lot of people don’t want us reading their messages,” he said.