TikTok accused of pushing 13-year-old users toward pornographic content, watchdog report reveals

Times in Pakistan

A teenage girl using a smartphone with the TikTok app open, highlighting growing concerns about child safety and exposure to explicit content on social media platforms.

A new investigation by UK watchdog Global Witness has found that TikTok’s algorithm may be directing teenage users toward sexually explicit content, even when accounts are set up under strict content-restriction settings. The findings have reignited global debate over social media safety for minors and over how tech giants are held accountable for verifying users’ ages.

The investigation, published on October 3, revealed that Global Witness researchers created seven new TikTok accounts in the UK, all registered as belonging to 13-year-old users—the minimum age allowed on the platform. Each account was set up on a factory-reset phone with no prior search history to ensure unbiased testing conditions.

Despite activating TikTok’s “restricted mode”—a feature meant to limit exposure to inappropriate or “sexually suggestive” material—researchers found that the platform’s search suggestions were “highly sexualized” from the very first interactions.

Sexualized Content Just “Clicks Away”

According to Global Witness, pornographic and adult-themed content appeared “just a few clicks after account setup.” Three of the test accounts were shown explicit search-term suggestions the very first time researchers tapped on TikTok’s search bar.

“Our concern isn’t just that TikTok hosts explicit material—it’s that the platform’s search algorithm actively pushes minors toward it,” Global Witness said in its report.

The organization emphasized that such exposure is especially troubling given the platform’s popularity among teens, raising questions about TikTok’s content moderation policies and age-gate enforcement.

TikTok Responds: “We Took Immediate Action”

In response to the findings, a TikTok spokesperson told CNN the company was investigating the claims and had already taken steps to address the issues.

“As soon as we were made aware of these claims, we took immediate action to investigate, remove content that violated our policies, and strengthen our search suggestion features,” the company said.

TikTok added that it has “more than 50 safety tools and settings” designed to protect teen users and ensure age-appropriate experiences. The platform also highlighted that it removes nine out of ten policy-violating videos before they’re ever viewed.

Millions of Underage Accounts Removed Monthly

TikTok insists it proactively removes around six million underage accounts globally every month through a mix of automated detection technology and human moderation.

The company’s transparency report for January–March 2025 showed that 30% of all removed content violated policies against “sensitive and mature themes.” Moderation teams, TikTok said, are trained to detect signals suggesting an account may belong to someone under the age of 13.

Despite these efforts, the Global Witness findings suggest that TikTok’s systems are still exposing minors to harmful content, either due to algorithmic flaws or inadequate enforcement.

Legal Implications Under the UK’s Online Safety Act

The revelations come shortly after the UK’s Online Safety Act introduced new child-protection rules that took effect in late July. The act compels tech companies to verify user ages and prevent minors from accessing harmful content, including pornography and self-harm material.

Media lawyer Mark Stephens told Global Witness that the report’s findings could amount to a “clear breach” of the new legislation.

TikTok did not directly respond to Stephens’ comments when approached for clarification.

Under the Online Safety Act 2023, companies that fail to meet the new safety standards can face hefty fines and increased scrutiny from Ofcom, the UK communications regulator responsible for enforcing the act.

Although the act also applies to platforms based outside the UK, digital rights advocates like the Electronic Frontier Foundation (EFF) have criticized the law for potentially threatening user privacy through strict age-verification demands.

Global Witness: Testing Before and After the Law Took Effect

Global Witness confirmed it ran its initial tests before the Online Safety Act’s child-protection provisions were fully enforced and conducted additional trials afterward to assess whether TikTok’s performance had improved.

However, the results were similar across both phases, raising doubts about the effectiveness of TikTok’s compliance efforts.

TikTok’s Safety Efforts and “Teen-Focused” Features

TikTok says it is taking its compliance “seriously,” claiming to have implemented robust safeguards aligned with Ofcom’s regulatory framework since 2020.

In recent years, the platform has rolled out several teen-safety features, including:

  • A “guided meditation” mode designed to help young users manage screen time.

  • The disabling of late-night notifications for teens.

  • Stricter privacy settings for users under 16.

Still, critics argue that TikTok’s focus on algorithmic engagement often undermines its own safety commitments, allowing harmful content to slip through moderation gaps.

Broader Pressure on Tech Giants

TikTok is far from the only tech company facing scrutiny over child safety online. Platforms like YouTube and Instagram have also introduced new AI-driven age-verification tools and privacy protections for teen accounts.

In August, YouTube launched a system that uses artificial intelligence to estimate user age, automatically enabling stricter controls for younger audiences. Similarly, Instagram made all new teen accounts private by default last year to prevent unsolicited contact and adult interactions.

Still, regulators and advocacy groups say self-regulation isn’t enough, urging governments to impose stricter accountability for algorithmic harm and data collection practices involving minors.

A Growing Global Concern

The Global Witness report underscores a growing international concern: social media algorithms may be exposing children to adult content faster than traditional moderation systems can respond.

As one of the world’s most downloaded apps, TikTok wields immense influence over youth culture and digital behavior. Experts warn that if platforms like TikTok don’t address these issues swiftly, they could face legal and reputational consequences under evolving digital safety laws.

While TikTok has pledged to strengthen its protections, Global Witness argues that real progress will only come with independent oversight and transparency around how recommendation algorithms function.

For now, as policymakers and tech firms grapple with the tension between user engagement and child protection, one thing is clear: keeping young users safe online has never been more urgent—or more complex.
