To note, The Journal and Edelson conducted the same test for TikTok and Snapchat and found that neither platform recommended sexual videos to the teen accounts they created. The accounts never saw recommendations for age-inappropriate videos, even after actively searching for them and following creators that produce them.
The Journal says that Meta's employees identified similar problems in the past, based on undisclosed documents it saw detailing internal research on harmful experiences on Instagram for young teenagers. Meta's safety staff previously conducted the same test and came up with similar results, the publication reports. Company spokesperson Andy Stone shrugged off the report, however, telling The Journal: "This was an artificial experiment that doesn’t match the reality of how teens use Instagram." He added that the company "established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months."
Back in January, Meta introduced significant privacy updates related to teen user protection and automatically placed teen users into its most restrictive control settings, which they can't opt out of. The Journal's tests were conducted after those updates rolled out, and the publication was even able to replicate the results as recently as June. Meta released the updates shortly after The Journal published the results of a previous experiment, which found that Instagram’s Reels would serve "risqué footage of children as well as overtly sexual adult videos" to test accounts that exclusively followed teen and preteen influencers.