A human rights campaign group claims TikTok’s algorithm pushes pornography and sexualised videos to minors. Researchers created fake child accounts, turned on safety settings, and still received explicit search suggestions. These led to clips of simulated masturbation and explicit pornography. TikTok says it acted immediately after being alerted and insists it remains committed to safe experiences for young users.
Fake child profiles uncover risky content
In July and August, Global Witness researchers created four TikTok accounts, posing as 13-year-olds with false birth dates. The platform did not request additional identification. Investigators enabled TikTok’s “restricted mode”, which the company promotes as a filter for sexual or mature material. Despite this, the accounts received sexualised search prompts in the “you may like” section, leading to videos of women flashing underwear, exposing breasts, and simulating masturbation. At the most extreme, explicit pornography appeared hidden inside ordinary-looking clips to evade moderation.
Campaign group raises alarm
Ava Lee from Global Witness called the findings a “huge shock”. She argued TikTok not only fails to protect children but actively recommends harmful material. Global Witness usually investigates how large tech companies affect democracy, climate issues, and human rights. The organisation first discovered explicit content on TikTok during unrelated research in April.
TikTok defends safety measures
Global Witness reported its findings to TikTok earlier this year. TikTok said it removed the flagged content and made fixes. But when the group repeated the test in late July, sexual videos appeared again. TikTok says it offers more than 50 safety tools for minors and claims nine out of ten violating clips are deleted before anyone views them. After the report, the company said it improved its search functions and removed additional harmful material.
Children’s Codes increase platform responsibility
On 25 July, the Children’s Codes within the Online Safety Act came into force. Platforms must enforce strict age checks and prevent children from accessing pornography. Algorithms must also block content linked to self-harm, suicide, or eating disorders. Global Witness conducted its second study after the codes took effect. Ava Lee urged regulators to step in and enforce the new rules on children’s online safety.
Users question search recommendations
During the investigation, researchers observed TikTok users’ reactions. Some expressed confusion at sexualised search suggestions. One wrote: “can someone explain to me what is up with my search recs pls?” Another asked: “what’s wrong with this app?”
