
Is TikTok Bad for Your Mental Health?

April 28, 2023

TikTok changed the internet forever, but has it changed our brains too? The impact of TikTok is undeniable, with over 1 billion users worldwide. Its explosive growth had other social media platforms scrambling to match its format, with “Reels” quickly added to Facebook and Instagram. The platform hasn’t just captured the attention of its competitors; the government and mental health professionals are keeping a close eye on it as well.

“Digital fentanyl,” US congressman Mike Gallagher calls the app. “It’s highly addictive and destructive, and we’re seeing troubling data about the corrosive impact of constant social media use, particularly on young men and women here in America.”

To be clear, the government is more concerned about the security risks of this Chinese-owned app. But there may be truth in Gallagher’s claim: according to a Pew Research Center survey of American teenagers, about 86% of teen TikTok users are on the platform daily, with 25% saying they are on the app constantly.

Its addictive nature and widespread use have mental health experts concerned, which raises the question: is TikTok bad for your mental health?

Effects of Social Media on Mental Health

Concern about how social media might be harmful is nothing new – lawmakers have been scrutinizing social media platforms for their potential influence on teens for years.

In a 2021 US congressional hearing, Meta (formerly Facebook) was sharply questioned about how Instagram’s services hurt young people. The hearing was called after The Wall Street Journal reported internal findings showing that Meta had known about its platform’s negative impact on teens. According to Meta’s internal research, one in three teen girls reported that Instagram made their body image issues worse. Moreover, among teens who reported suicidal thoughts, 13% of British users and 6% of American users traced those thoughts to Instagram.

In the same hearing, Meta defended itself, stating that many teens reported positive experiences on Instagram, including times the app had helped with their mental health. But the introduction of image-enhancing filters and measurable “like” counts called these findings into question, with Texas Senator Ted Cruz accusing the company of “cherry-picking” favorable parts of its research.

Social media platforms across the board have been linked to poor psychological health. A study published in JAMA Psychiatry showed that adolescents who spend three hours or more on social media daily face increased risks of mental health issues such as depression and suicidal thoughts.

Startling as that is, social media has also been shown to have positive effects on younger users. According to Pew Research Center, 80% of teens say social media helps them feel more connected to what’s happening in their friends’ lives, 71% say it gives them a place to show their creativity, and 67% say it makes them feel like they have people who can support them through difficult times.

And thankfully, social media companies are beginning to take responsibility for protecting the wellbeing of their users, especially teens. Meta has zeroed in on creating safe, age-appropriate experiences for teens on its platforms. From offering the option to hide like counts on Instagram to preventing advertisers from targeting teen Facebook and Instagram users, Meta has been updating its policies to protect this population.

TikTok’s For You Page Algorithm

TikTok might be a new beast when it comes to its possible impact on its users, and this is specifically due to its algorithmic feed. It’s no secret that its For You Page has one of the most sophisticated algorithms on the market. It ushered in the era of suggestion-based feeds, replacing the chronological-style format of other platforms.

When algorithms push suggested content, extremist views tend to be promoted because of their attention-grabbing nature. A 2021 report found that about 70% of the extremist content users encountered on YouTube had been recommended by its algorithm. TikTok is no exception: the non-profit Center for Countering Digital Hate published a report claiming it can take less than 5 minutes after signing up to see content promoting suicide and eating disorders on your feed.

TikTok stands firm in its policy of not allowing content that promotes or normalizes suicide and self-harm, reporting that from April to June 2022 it removed 93.4% of this kind of content at zero views, 91.5% within 24 hours, and 97.1% before any user reports. They also ban certain hashtags such as #selfharm and #eatingdisorder, displaying resources and hotline numbers instead when these terms are searched.

But even if this kind of content is suppressed, there is no denying that TikTok’s algorithm can still be overwhelming. The For You Page has no pacing; whatever piques the most interest gets pushed. While scrolling, users can go from cute animal videos to reports of the latest geopolitical crisis within minutes. And for those who engage most with harrowing content, the algorithm will keep serving up more of it. Constantly being fed upsetting information can lead not only to burnout but to depression as well.

Marginalized users have also felt that the TikTok algorithm itself has replicated the inequalities they already face in real life. Black content creators have reported that their content has been suppressed or “shadowbanned” on the platform. And this isn’t a stretch to believe, as in 2019 TikTok admitted to censoring content from users who identified as disabled, fat, or LGBTQ+ in a well-intentioned but ignorant attempt at minimizing bullying.

 

Attention Spans and “TikTok Brain”

TikTok doesn’t optimize for clicks and engagement like other platforms – it prioritizes view time. And it does this well: according to Sensor Tower, users spend a whopping average of an hour and a half per day on the app, far surpassing any other social media app. This is especially concerning considering that most TikToks run between 15 seconds and 3 minutes.

This binging of short videos has led many to believe that teens’ attention spans are at risk. Researchers studied the neurological effects of Douyin – TikTok’s Chinese counterpart, also owned by ByteDance – on college students. The results showed that algorithm-selected videos activated the reward centers of the brain far more than random videos.

Hit after hit of dopamine delivered by short videos personalized to each user may lead to addictive responses in the brain – and, ultimately, “TikTok brain”. Many worry that “TikTok brain” will lead to problems with focus, short-term memory, and attention.

TikTok Romanticizing Mental Illness

TikTok’s algorithm is so fast at detecting interest, it’s almost scary. Users have memed its uncanny ability to anticipate their desires, which has led to many sexuality and gender realizations and, more concerningly, the rise of self-diagnosed mental illness.

While it is laudable that many are using the app to promote mental health awareness, not all information pushed on the platform is accurate. Diagnoses can only be made by mental healthcare professionals, but because of TikTok, many teens and young adults are self-diagnosing and self-treating conditions such as ADHD, OCD, autism, and Tourette’s syndrome.

On top of this, the line between TikTok promoting mental health awareness and TikTok romanticizing mental illness is blurring. The app is well-known for its intimate nature, where many feel comfortable revealing deeply personal information about themselves and their relationships. Trauma-dumping on the app has become increasingly common.

The support and comfort one might be looking for from online communities can be helpful, but trauma-dumping can unintentionally normalize and feed into problematic thoughts and behaviors. For example, “thinspo” (thin-inspiration) content has trended, with online communities bonding over their wish to be skinnier and posting images glorifying thinness. It might be comforting for some to know they are not alone in these thoughts; for others, it promotes and glorifies eating disorders.

One of the biggest appeals of the app is the unfiltered nature of the content it promotes. But this extreme culture of “candidness” has led to a lack of tact and awareness of one’s impact on others.

 

How to Protect Your and Your Teen’s Mental Health on TikTok

TikTok is looking for ways to better support its community, and has developed safeguards and a well-being guide in the process. The guide outlines what to share – and how to share it – when telling your own mental health story in a safe and respectful way, with the caveat that such stories should not be taken as diagnosis, treatment, or advice. It also links to mental health resources and explains what to do if a user comes across a distressed community member.

On top of this, the app has built-in well-being features. A daily screen time feature lets users limit how much time they can spend on the app before a passcode is required to keep using it. A screen time dashboard summarizes time spent on TikTok, including cumulative daily usage and the number of times the app was opened each day.

When users scroll, they often see screen time break ads and sleep reminders: a video pops up notifying the user that they’ve been scrolling uninterrupted for some time, or that it’s late at night and time for bed. Users can also mute push notifications for specific periods; for users aged 13 to 17, notifications are muted by default from 9 PM or 10 PM (depending on age) until 8 AM, and this cannot be changed.

One of the best well-being features on the app is Family Pairing, which lets parents link their own account to their teen’s and set parental controls. Parental controls include:

  • Setting daily screen time limits, which for teens aged 13 to 17 default to 1 hour
  • A screen time dashboard, where you can get a summary of your teen’s TikTok screen time
  • Scheduling additional mute times for push notifications
  • Content restrictions
  • Ability to turn off searching functions
  • Deciding if your teen’s account is private or public, and if it can be recommended to others
  • Restricting who your teen can send messages to, or turning off direct messaging completely
  • Deciding who can view your teen’s liked videos, and who can comment

If you are concerned about your teen’s mental health and substance abuse, Hillcrest Adolescent Treatment Center provides a cellphone-less rehabilitation haven for teens to be able to focus on their recovery. We create personalized treatment plans with your teen’s unique goals and needs in mind, with family therapy encouraged if possible. We believe family involvement is instrumental in the recovery process. For more information, contact our admissions team today.