TikTok Is Becoming Deadly, and It’s All for the Content


Trigger Warning: This article contains graphic details about accidental deaths and suicide.

According to The Drum, TikTok has now surpassed two billion all-time global downloads and saw 315 million installs during the pandemic.

While the majority of the content on the platform is engaging and fun — and has even had a positive impact on those using it while locked indoors — the app has a more sinister side, with users being injured or, in some cases, losing their lives, all in the name of content.

On October 9, it was reported that Areline Martinez, a 20-year-old woman from Mexico, had been killed in a tragic accident while filming a fake kidnapping for her account.

According to Mexico News Daily, the prank went horribly wrong when she was shot in the head with a .45-caliber bullet.

The outlet also reported that at least 10 people were present at the scene, where police found Martinez’s hands and feet bound. They don’t know why the gun, which was meant to be a prop, was loaded.

Video of the “kidnapping”, which took place moments before her death, has been circulating on social media and shows the mother-of-one sitting on a chair pretending to struggle.

As serious and as shocking as this is, it’s not the first time that something of this nature has occurred — and it likely won’t be the last.

Earlier this year, parents in the UK were urging their children not to take part in the ‘Jump Trip Challenge’, also known as the ‘Skullbreaker’, after two teens died attempting it.

The challenge involves tricking an unsuspecting victim into jumping up before others kick their feet out from under them, causing them to land on their spine and head.

A TikTok spokesperson told the Mirror back in February that the “safety and well-being of our users is a top priority at TikTok.

“We do not allow content that encourages, promotes, or glorifies dangerous challenges that might lead to injury and will remove any such reported content,” they said.

Prior to that, a 17-year-old, Karim Sheikh, died of suffocation in India after an attempt to make a TikTok went horribly wrong. Sheikh tied himself to an electricity pole with his face covered by a plastic bag while his friends — all under the age of 18 — tried to help him escape. The teen was struggling to breathe, but his friends thought he was playing it up for the cameras. Ten minutes later, he was dead.

Sadly, deaths involving the app have also occurred due to teenagers taking their own lives on camera for the world to see.

According to The Sun, 14-year-old Molly Russell took her own life after viewing methods of self-harm on the app. Speaking to media, her father Ian said: “Molly’s suicide smashed like a wrecking ball into my family life… I have no doubt that social media helped kill my daughter.”

More recently, a 33-year-old man tragically took his own life on a live stream with TikTok unable to stop it from being reshared.

While the video originated on Facebook, the death of Ronnie McNutt, a resident of Mississippi in the US, was widely circulated on the app in clip form. McNutt had been suffering from mental health issues and PTSD after serving in the Iraq War.

At the time, a Facebook spokesperson said:

“We removed the original video from Facebook last month on the day it was streamed and have used automation technology to remove copies and uploads since that time.”

So, is it the responsibility of TikTok (and other social media platforms) to ban and monitor this content and highlight its dangers, or is it the responsibility of the community who use the apps and create content with them?

The debate is still ongoing; however, it’s up to us to help end the cycle of dangerous behavior, all in the name of content.

If you or someone you know needs help, please contact BeyondBlue on 1300 224 636 or Lifeline on 13 11 14.

Read more stories from TheLatch — and follow us on Facebook.