TikTok releases report on teen safety as platforms try to preempt further regulatory scrutiny

The news: TikTok has released a report with Praesidio Safeguarding, an independent agency, examining the popularity of viral self-harm-related content and dangerous "challenges" on the app and their impact on teens and children, per Social Media Today.

  • The report is the latest move by TikTok to signal to users and advertisers that it’s taking steps to protect children and avoid the kinds of issues that have plagued Instagram and Facebook this year. Those issues have brought increased regulatory scrutiny to competing platforms as well.

More on this: The Praesidio report, based on a survey of over 10,000 teens, parents, and teachers across 10 countries, including the US and UK, found that while awareness of viral online “challenges” was high across all three groups, only 21% of teens actually took part in a challenge of any kind.

  • In fact, most teens felt neutral or positive about challenges. Only 2% felt that a challenge they participated in was dangerous, and just 11% reported a negative impact. Meanwhile, 54% felt the impact was neutral, and 34% felt it was positive. Among teens who identified a positive impact from participating, 64% said it helped improve their friendships and relationships.
  • Still, the reasons why teens participated in these challenges show how susceptible they can be to dangerous ones. Fifty percent of teens cited “getting views, comments, and likes” as a main reason for taking part in a challenge, and another 46% ranked “impressing others” in their top three reasons.
  • TikTok isn’t the only platform where potentially dangerous trends can go viral, but it’s had its share of challenges reach the mainstream and create a moral panic. The report also found that 46% of teens wanted more information about the risks of challenges. In a blog post published Wednesday, TikTok said it is working to provide those resources.

Why this matters: After a report this fall found that Instagram had harmful effects on teens and children, social platforms have been racing to introduce safety features for underage users and to get the word out to regulators and advertisers that their products are safe for children.

  • Last month, Snap, TikTok, and YouTube were called to testify before the US Congress about each platform’s effects on children. At the hearing, TikTok said it would support regulatory changes that would prevent the collection of data from minors or make it more difficult.
  • TikTok recently added features that include redirecting users to resources when they search for terms related to eating disorders and self-harm.

"Behind the Numbers" Podcast