With more than two billion monthly active users, Instagram is one of the top-performing social media apps in the world, and researchers now allege a dangerous web of predators has been lurking in its shadows.
That chilling allegation — that Instagram’s recommendation algorithms have promoted a “vast pedophile network” advertising the sale of child sexual abuse material (CSAM) — came from researchers at Stanford University and the University of Massachusetts Amherst who spoke with the Wall Street Journal.
Instagram, owned by Facebook’s parent company, Meta, reportedly allowed users to seek out illicit and nefarious content using CSAM-linked hashtags like #pedowhore, #preteensex, #pedobait, and #mnsfw, an acronym for “minors not safe for work.”
“Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the researchers found.
Those iniquitous accounts often claimed to be run by the children whose abuse was posted to them. The accounts — found via the aforementioned hashtags — offered to sell users pedophilic content through “menus” of materials, including photos and videos of children harming themselves, enduring sexual abuse, and performing sexual acts with animals.
Researchers with the Stanford Internet Observatory even discovered certain accounts that purportedly invited buyers to commission content depicting specific illicit acts. And, for a price, some children were allegedly made available for in-person “meet-ups,” the analysis uncovered.
In a statement to CBN’s Faithwire, Haley McNamara, vice president of the National Center on Sexual Exploitation (NCOSE), condemned the Meta-owned platform for having “enabled predators to reach children for far too long.”
“NCOSE brought the existence of grooming and pedophile networks to Instagram’s attention several years ago and continues to do so, which is why Instagram is on the 2023 Dirty Dozen List of mainstream contributors to sexual exploitation,” said McNamara. “We are glad to see Instagram take some action to remove pedophile networks, but the social media giant must continue to work proactively to prevent this from continuing to happen on a mass scale — without waiting for bad press.”
The Dirty Dozen List is an annually released catalog of 12 mainstream entities NCOSE believes to “facilitate, enable, and even profit from sexual abuse and exploitation.” The goal of the list, McNamara previously told CBN News, is to bring the issue to consumers’ attention in hopes it will motivate them to reach out to corporations, government agencies, and other organizations, urging them to change their policies and practices to better protect children from sexual abuse.
In a statement to the WSJ, a Meta spokesperson vowed to take action following the bombshell report.
“Child exploitation is a horrific crime,” read the statement from Meta. “We’re continuously investigating ways to actively defend against this behavior.”
Meta further stated it has taken down 27 pedophile networks in the last two years and is working to remove more. Additionally, the company said it actively removes accounts linked to users who buy and sell CSAM. In January alone, Meta removed 490,000 accounts violating child safety policies.
Prior to the WSJ’s reporting on this issue, Instagram allowed users to see content its own moderation systems had flagged as associated with CSAM. When users engaged with such content, a notice popped up, warning them the material they were about to view “may contain images of child sexual abuse” and could cause “extreme harm” to children. It then offered users two choices: “get resources” or “see results anyway.” The latter option has been disabled in the wake of the WSJ report.
The social media company stated it has established an internal task force to further address the matter.