Instagram algorithms promote ‘vast pedophile network’ on the platform: Report
Instagram's recommendation algorithms are actively promoting pedophilia, according to a joint investigation from The Wall Street Journal and academics at Stanford University and the University of Massachusetts Amherst.
The report notes that Instagram's algorithms connect and promote a "vast network" of pedophiles interested in "underage sex content." The investigation reveals the terrifying extent to which Instagram "connects pedophiles and guides them to content sellers."

Child-Sex Abuse on Instagram
The researchers found that the Meta-owned social media platform allowed users to search hashtags related to child-sex abuse, including graphic terms such as #pedowhore, #preteensex, #pedobait and #mnsfw (minors not safe for work).
The platform hosts "menus" of content that users can purchase or sell, including videos and imagery of self-harm and bestiality. When the researchers set up a test account and viewed such explicit content on the network, the account was immediately recommended more accounts to follow.
The Dark Side of Instagram's Recommendations
The report found: "Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children." The WSJ report also noted that some accounts allowed buyers to "commission specific acts" or arrange "meet ups."
"Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn't merely host these activities. Its algorithms promote them," the report noted.
This means Instagram goes beyond regular search results and actively recommends illicit content related to children to users. According to the WSJ report, "Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests."
Instagram's Efforts to Combat Pedophilia
Meta acknowledged problems within its enforcement operations, and a spokesperson for the company said it would be setting up an internal task force to investigate child-sex abuse on the platform. The spokesperson said, "Child exploitation is a horrific crime," and that the company was investigating ways to actively defend against it.
Meta also noted that it took down 490,000 accounts that violated its child safety policies in January alone, and that it had removed 27 pedophile networks in the last two years. Instagram has also blocked thousands of hashtags associated with the sexualization of children and restricted such terms from appearing in users' search suggestions.
But Are Those Efforts Enough?
However, Alex Stamos, head of Stanford's Internet Observatory and a former chief security officer at Meta, told The Wall Street Journal that Meta's efforts were not enough to tackle the issue of child-sex abuse on the platform.
Stamos noted that alarm bells should be ringing at Instagram, given that a team of just three academics with limited access was able to uncover such a vast network. Stamos also expressed little confidence in the platform's algorithms to tackle the issue on their own, saying, "I hope the company reinvests in human investigators."
The WSJ also recounted incidents where Instagram users reported posts and accounts suspected of child-sex abuse, only for them to be cleared by the platform's review team. Meta, however, claimed that it was unable to act on these reports due to their high volume. You can check out the full WSJ report here.

