Instagram Connects Vast Pedophile Network: WSJ
Instagram, owned by Meta, helps connect and promote a vast network of accounts dedicated to the dissemination of underage-sex content, according to an investigation by the Wall Street Journal and academic researchers. The platform, for its part, claims to be “improving internal controls.”
Instagram recommendation algorithms promote pedophile networks
An investigation by the Wall Street Journal and researchers at Stanford University and the University of Massachusetts Amherst found that Instagram’s recommendation algorithms promote pedophile networks. Accounts on the image-sharing network commission and sell child sexual abuse content directly on the platform.
“Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests,” the Journal and the academic researchers found.
Criminals’ accounts are hidden from the general public
Such Instagram accounts are hidden from most users of the platform, but they exist. The researchers found that Instagram lets people search for explicit hashtags, and those searches link them to accounts that use the same terms to advertise child sexual abuse material for sale.
Instagram accounts offering to sell illegal sex material usually do not post it openly; instead, they post a “menu” of content, and each account operates in its own way. Beyond selling content that harms children, some accounts also offer children for face-to-face encounters.
Meta acknowledges the problem
After its investigation, the WSJ put its questions to Meta. The company acknowledged problems within its enforcement operations and said it had set up an internal task force to address the issues raised. “Child exploitation is a horrific crime,” the company said, adding, “We’re continuously investigating ways to actively defend against this behavior.”
Meta says it is working on a solution to the problem
The company said it has taken down 27 pedophile networks in the past two years and plans further removals. After the Journal’s inquiries, the platform said it blocked thousands of hashtags that sexualize children. In addition, Meta has barred its systems from recommending search terms associated with sexual abuse, and the company said it is also working to ensure its systems discourage potentially pedophilic adults from connecting with one another or interacting with one another’s content.
Accessing pedophilic content on Instagram is very easy
“That a team of three academics with limited access could find such a huge network should set off alarms at Meta,” said Alex Stamos, the head of the Stanford Internet Observatory and Meta’s chief security officer until 2018, noting that the company has far more effective tools to map pedophile networks on its platform than outsiders do. “I hope the company reinvests in human investigators,” he added.
Instagram actively recommends child-sex content
The researchers created test accounts. After they viewed just one such account, they were immediately served “suggested for you” recommendations of alleged sellers and buyers of child sex content, as well as accounts linking to off-platform content-trading sites.
Searching hashtags related to sex with minors, the Stanford Internet Observatory found 405 sellers. According to data collected through Maltego, a network-mapping tool, 112 of these seller accounts collectively had 22,000 unique followers.
Instagram said its internal statistics show that users encounter child exploitation in fewer than one in 10,000 posts viewed. It is worth noting, however, that Meta accounts for 85% of the child pornography reports filed to the National Center for Missing & Exploited Children, including some 5 million from Instagram.
Twitter is much more active in combating child sexual abuse content
The Stanford team found 128 accounts offering to sell child sexual abuse material on Twitter, less than a third of the number they found on Instagram, even though Instagram has a much larger overall user base. Twitter did not recommend such accounts to the same extent as Instagram did, and the platform, recently acquired by Elon Musk, also deleted them much faster.
David Thiel, the chief technologist at the Stanford Internet Observatory, said, “Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts.” Thiel, who previously worked at Meta on security and safety issues, added, “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.”
Instagram allowed searches it knew could surface illegal material
In many cases, Instagram permitted users to search for terms that its own algorithms knew might be associated with illegal material. In those cases, a pop-up warned users that “These results may contain images of child sexual abuse” and noted that the production and consumption of such material cause “extreme harm” to children. The screen offered two options: “Get resources” and “See results anyway.” Meta declined to say why it had offered the latter option.