A recent investigative report by the Wall Street Journal and the Stanford Internet Observatory has uncovered a disturbing finding about Instagram, which is owned by Meta: the platform harbors a vast network of organized pedophiles.
What sets this case apart is the role played by Instagram’s own algorithms, which were found to promote pedophilic content to users with similar interests. The pedophiles themselves used coded emojis, such as a map or a slice of cheese pizza, to communicate covertly.
The investigation revealed that Instagram’s recommendation systems effectively connect pedophiles and guide them to content sellers. These recommendation systems excel at linking users who share niche interests, including those involved in illegal activities.
Researchers found that pedophilic accounts on Instagram exhibit a mix of audacity and attempts to conceal their activities. Certain emojis serve as coded messages: a map stands for “MAP,” short for “minor-attracted person,” while “cheese pizza” shares its initials with “child pornography.” Many of these accounts describe themselves as “lovers of the little things in life,” according to Professor Levine of UMass, as quoted in the Wall Street Journal report.
The researchers also discovered that Instagram allowed pedophiles to search for explicit content using hashtags like #pedowhore and #preteensex, which then connected them to accounts advertising child-sex material for sale. Some users operated under names like “little slut for you.” Disturbingly, sellers often conveyed the purported age of the child through coded phrases like “on chapter 14” or “age 31” followed by a reverse-arrow emoji.
This investigation has shed light on the alarming presence of pedophile networks on Instagram and highlights the platform’s role in facilitating and enabling such activities.
BREAKING: Instagram algorithm exposed promoting pedophile networks in massive investigation, video sales, ‘preteensex’ menus, in-person meetups with underage boys and girls, using emojis such as a map and cheese pizza – WSJ
— Jack Poso 🇺🇸 (@JackPosobiec) June 7, 2023
Meta, the parent company of Instagram, says it has taken down 27 pedophile networks in the past two years and intends to continue removing such networks.
Alex Stamos, the former chief security officer at Meta who now heads the Stanford Internet Observatory, expressed concern over the company’s approach. Stamos emphasized that Meta has more effective tools than outside researchers to identify and address pedophile networks, and he urged the company to reinvest in human investigators.
During their investigation, researchers created test accounts within the pedophile network and immediately received “suggested for you” recommendations featuring child-sex content. They also encountered accounts linking to external trading platforms.
The pedophile community on Instagram extends beyond creators and buyers of underage sexual content. There are accounts that aggregate pro-pedophilia memes and engage in discussions about accessing children. According to current and former Meta employees involved in child-safety initiatives, the number of accounts primarily dedicated to following such content is estimated to be in the hundreds of thousands or even millions.
Brian Levine, director of the UMass Rescue Lab, stated that Instagram serves as a gateway to explicit child sexual abuse content on the internet. Levine authored a 2022 report on internet child exploitation for the National Institute of Justice.
Furthermore, the report reveals that Meta accounted for 85% of the child pornography reports filed with the National Center for Missing & Exploited Children. Stanford’s findings indicate that Meta struggles with the problem more than other platforms do, owing to weak enforcement and design features that make both legal and illicit content easy to discover.
Today the WSJ and Stanford Internet Observatory released important research showing pedophiles use Instagram to share material.
We're happy that Stanford and WSJ followed our steps in working on this critical issue. As an independent researcher with my team, we showed how…
— Andrea Stroppa 🐺 Claudius Nero's Legion 🐺 (@andst7) June 7, 2023
According to David Thiel, chief technologist at the Stanford Internet Observatory, Instagram’s problems stem from its content-discovery features, its recommendations, and its reliance on search and account links. Thiel said such features require guardrails to keep them even nominally safe, guardrails Instagram currently lacks.
Sarah Adams, a Canadian mother of two, has developed a following on Instagram, where she discusses child exploitation and the risks of oversharing on social media. Her followers occasionally send her disturbing content they have encountered on the platform; in February, she received a message about an account labeled “incest toddlers.”
Adams briefly accessed the account, which had over 10,000 followers and featured pro-incest memes, to report it to Instagram. However, in the following days, she discovered that parents who viewed her Instagram profile were being recommended “incest toddlers” due to her contact with the account.
A Meta spokesperson acknowledged that “incest toddlers” violated the company’s rules and admitted to an enforcement error. Meta plans to address inappropriate recommendations as part of its newly formed child safety task force.
Meta also admitted to receiving numerous reports of child sexual exploitation and failing to act on them, blaming a software glitch that prevented a significant portion of user reports from being processed.
Meanwhile, while Meta continues to allow pedophiles on its platforms, the website ZeroHedge remains banned.