Teens can be exposed to potentially harmful content related to suicide and eating disorders just minutes after creating a TikTok account, new research finds, likely heightening scrutiny of the impact of this app on its younger users.
The Center for Countering Digital Hate’s recent report on TikTok revealed a chilling reality: in under eight minutes, new users can be served content related to suicide and eating disorders.
As social media platforms continue to expand their reach, parents and guardians should remain vigilant for content of concern targeting younger audiences.
For the study, researchers created eight new TikTok accounts across four countries to examine how the platform serves content about body image and mental health. Each account was registered at age 13, the platform’s minimum age requirement, and briefly interacted with related content.
Analyzing activity over a 30-minute period, the CCDH found that the app recommended videos about body image and mental health at an alarming rate: roughly one every 39 seconds.
Lawmakers are searching for ways to protect teens from the privacy and security risks posed by the popular app, and state and federal agencies have opened investigations into whether it is an appropriate platform for young people.
Over the past year, executives of social media platforms have had to answer questions from Congress about how their sites might affect young users, particularly adolescent girls, and whether they might be contributing to negative mental health outcomes.
The hearings underscored a pressing need for closer scrutiny of how platforms decide what content appears on young users’ screens.
After intense public hearings and alarming revelations from Facebook whistleblower Frances Haugen concerning Instagram’s detrimental effects on teens, the companies promised reform.
However, recent findings by the CCDH suggest there is still more work to be done to make all digital platforms safer for young users.
“The results are every parent’s nightmare: young people’s feeds are bombarded with harmful, harrowing content that can have a significant cumulative impact on their understanding of the world around them, and their physical and mental health,” Imran Ahmed, CEO of the CCDH, stated.
TikTok disputed the study, citing its small sample size, its short 30-minute analysis window, and the accounts’ scrolling through topics unrelated to the stated research focus. The company argued that these limitations undermine any conclusions drawn about the experience of real users on its platform.
“This activity and resulting experience does not reflect genuine behavior or viewing experiences of real people,” a TikTok spokesperson told CNN.
“We regularly consult with health experts, remove violations of our policies, and provide access to supportive resources for anyone in need.”
The spokesperson added: “We’re mindful that triggering content is unique to each individual and remain focused on fostering a safe and comfortable space for everyone, including people who choose to share their recovery journeys or educate others on these important topics.”
The CCDH notes that stories about eating disorders are not uniformly harmful: many people share recovery stories that can empower those struggling with the condition and strengthen their hope for better days ahead.
TikTok said it is dedicated to keeping users safe and continues to add protective features, including recent filters designed to screen out potentially inappropriate content for a more age-appropriate viewing experience.
In July, TikTok introduced three new safety features. First came the “maturity score,” which helps users decide whether they are comfortable viewing content with complex or mature themes.
It also added tools that let people set time limits on viewing, schedule regular screen breaks, and track their app usage through a dashboard.
Furthermore, TikTok provides parents with various tools to ensure their children have safe and enriching social networking experiences.
US Senator Richard Blumenthal’s staff ran a similar experiment on social media algorithms, creating a fake Instagram account for a 13-year-old girl to test how the platform polices pro-eating-disorder accounts. The exercise revealed how easily those preventative measures could be circumvented, underscoring the need for closer monitoring by platforms like Instagram.
The account was soon recommended to follow accounts related to extreme dieting, the senator told CNN.
TikTok has committed to promoting safety and well-being among its users, banning content that promotes or could lead to self-harm or suicide.
Between April and June of this year, the company said, it removed 93.4% of videos violating its suicide and self-harm policies within 24 hours and before they received any views, and took down 97.1% before any user reported them.
To protect those at risk of self-harm, the spokesperson told CNN, searches for harmful words or phrases such as #selfharm return no results; viewers are instead directed to local support resources offering guidance and help when they need it most.
Despite some progress in regulating TikTok content, the CCDH believes further legislation is required to ensure children are safeguarded from any inappropriate material.
“This report underscores the urgent need for reform of online spaces,” the CCDH’s Ahmed said.
“Without oversight, TikTok’s opaque platform will continue to profit by serving its users – children as young as 13, remember – increasingly intense and distressing content without checks, resources or support.”