Newly unredacted internal documents show that TikTok fostered an environment that prioritized its public image and user engagement over effective policies to protect teen mental health.
The documents, part of an ongoing lawsuit against TikTok, were included in the Kentucky Attorney General's filing, with swaths of the internal materials digitally redacted. Despite the redactions, Kentucky Public Radio was able to read the documents (which NPR later reviewed) before they were resealed under court order.
TikTok is being sued by 14 attorneys general across the nation, who allege, across individually filed lawsuits, that the platform falsely advertised itself despite its addictive algorithm, endangering children.
The lawsuits focus on several allegedly harmful aspects of the social media platform, including beauty filters, the For You Page (FYP), and TikTok Live. According to TikTok’s own internal research, users need to watch only 260 videos before they can become addicted to the app. That same research found that “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety.” It also acknowledged that the algorithm had “better engagement” with young people.
Remedial measures, such as screen time alerts and limits, were implemented and promoted in the app even though internal research suggested they would have no measurable effect on users. The screen time limit tool reduced usage by only 1.5 minutes, according to the documents.
The app’s negative impact on body image was well documented, too: the platform allegedly prioritized more conventionally attractive users in the FYP algorithm, and executives ignored suggestions to add informative banners or awareness campaigns to popular videos and beauty filters. Executives were also aware that young users were often exposed to videos featuring suicidal ideation and eating disorder content, which slipped through moderation and into algorithmic “bubbles.”
In a statement to NPR, a TikTok spokesperson said, “Unfortunately, this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety. We have robust safeguards, which include proactively removing suspected underage users, and we have voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”
TikTok’s internal memos mirror similar communications among Meta leaders, including CEO Mark Zuckerberg, who declined suggestions to address bullying and mental health. Those internal communications were unsealed in a Massachusetts lawsuit accusing Meta of being a major player in the youth mental health crisis. Meta, TikTok, and several other social media platforms have been mired in lawsuits brought by states, schools, and parents on behalf of young users.