
TikTok is reportedly aware of its negative effects on young users

TikTok executives and employees are well aware that its features foster compulsive use of the app, along with the corresponding negative mental health effects, according to NPR. The broadcaster reviewed unredacted documents from the lawsuit filed by the Kentucky Attorney General’s Office, as published by Kentucky Public Radio. More than a dozen states sued TikTok in the past few days, accusing it of falsely claiming that it is safe for young people. Kentucky Attorney General Russell Coleman said the app was “specifically designed to be an addiction machine, targeting children who are still in the process of developing appropriate self-control.”

Most of the documents filed in the lawsuits were redacted, but Kentucky’s redactions were faulty. TikTok’s own research apparently found that “compulsive use is associated with a number of negative mental health effects such as loss of analytical skills, memory formation, contextual thinking, depth of conversation, empathy and increased anxiety.” TikTok executives also knew that compulsive use can interfere with sleep, work and school obligations, and even “connecting with loved ones.”

They reportedly knew, too, that the app’s time management tool does little to keep young users off the app. While the tool sets the default limit for app use to 60 minutes per day, teens still spent 107 minutes on the app even with the tool switched on. That’s only 1.5 minutes less than the average of 108.5 minutes per day before the tool was launched. Based on the internal documents, TikTok measured the tool’s success by how it was “improving public trust in the TikTok platform via media coverage.” The company knew the tool wasn’t going to be effective: one document says that “[m]inors do not have executive function to control their screen time, while young adults do.” Another document reportedly said that “across most engagement metrics, the younger the user, the better the performance.”

In addition, TikTok is reportedly aware that “filter bubbles” exist and understands how dangerous they can be. Employees conducted internal studies, according to the documents, in which they found themselves drawn into negative filter bubbles shortly after following certain accounts, such as those focused on painful (“painhub”) and sad (“sadnotes”) content. They are also aware of content and accounts that promote “thinspiration,” which is associated with disordered eating. Because of the way TikTok’s algorithm works, its researchers found that users can be placed into filter bubbles after 30 minutes of use in one sitting.

TikTok is struggling with moderation, too, according to the filing. An internal investigation found that underage girls on the app were receiving “gifts” and “coins” for going live. And company higher-ups reportedly instructed moderators not to remove users reported to be underage unless their accounts explicitly state that they are under 13. NPR says TikTok also admitted that a substantial amount of content that violates its rules gets past its moderation techniques, including videos that normalize pedophilia and glorify sexual assault of minors and physical abuse.

TikTok spokesman Alex Haurek defended the company, telling NPR that the Kentucky AG’s complaint “cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to community safety.” He also said that TikTok “has robust safeguards, which include proactively removing suspected underage users,” and that it has “voluntarily launched safety features such as default screentime limits, family pairing, and privacy by default for minors under 16.”

