TikTok Reportedly Leads Children's Profiles to Explicit Material In Just a Few Taps

According to a new study, TikTok steers minors' accounts toward explicit material in just a few taps.

How the Study Was Conducted

Global Witness created fake accounts using a minor's date of birth and enabled the platform's content restriction feature, which is designed to limit exposure to adult-oriented content.

Investigators found that TikTok recommended inappropriate, adult-themed search terms to multiple test accounts set up on new devices with no prior activity.

Alarming Recommendation Features

Search phrases offered under the "suggested searches" feature included "very very rude skimpy outfits" and "inappropriate female imagery", later escalating to terms such as "hardcore pawn [sic] clips".

For three of the accounts, the adult-oriented recommendations appeared immediately.

Fast Track to Adult Material

After minimal interaction, the research team encountered explicit material ranging from women flashing to graphic sexual acts.

Global Witness reported that the content attempted to evade detection, typically by embedding the footage within an otherwise innocuous image or video.

For one account, the process took just two clicks after logging in: one on the search bar and a second on the suggested term.

Regulatory Context

Global Witness, which investigates technology companies' impact on public safety, said it carried out several rounds of testing.

The initial tests took place before child safety rules under the United Kingdom's Online Safety Act came into force on July 25, with further tests conducted after the rules took effect.

Serious Findings

Researchers noted that two of the videos appeared to show someone under the age of consent; those videos were reported to the online safety organisation that handles harmful material involving minors.

Global Witness said TikTok was in breach of the Online Safety Act, which requires tech companies to prevent children from viewing harmful content such as pornography.

Regulator's Position

A spokesperson for Ofcom, the UK communications regulator responsible for enforcing the law, said: "We appreciate the effort behind this investigation and will examine its conclusions."

Ofcom's codes of practice under the act state that digital platforms posing a medium or high risk of displaying dangerous material must "configure their algorithms to block dangerous material from young users' timelines".

TikTok's own rules prohibit adult content.

Platform Response

The video platform said that after being contacted by the research group, it took down the violating content and made changes to its search suggestion feature.

"Upon learning of these assertions, we acted promptly to examine the issue, take down videos that violated our policies, and implement enhancements to our search suggestion feature," commented a company representative.

Felicia Armstrong

A digital strategist and content creator passionate about storytelling and emerging media trends.