A disturbing video purporting to show a suicide is reportedly doing the rounds on the popular short video app TikTok, reigniting debate about what social media platforms are doing to limit circulation of troubling material.
According to media reports, the video first showed up on Facebook in late August but has been re-uploaded and shared across Instagram and TikTok — reportedly sometimes cut with seemingly harmless content such as cat videos.
TikTok users have warned others to swipe away quickly if they see a video pop up showing a man with long hair and a beard.
In a statement quoted in media reports, TikTok said:
Our systems have been automatically detecting and flagging these clips for violating our policies against content that displays, praises, glorifies, or promotes suicide.
We are banning accounts that repeatedly try to upload clips, and we appreciate our community members who’ve reported content and warned others against watching, engaging, or sharing such videos on any platform out of respect for the person and their family.
Schools and child safety advocates have warned parents to be alert to the possibility that their child may see, or may have already seen, the video if they use TikTok or Instagram.
Is anyone else’s FYP full of videos warning people on TikTok of a graphic video going around? Said video seems to have gone around 4chan a few days ago and I guess has made its way to TikTok. I haven’t seen it on TikTok, but I’m seeing a flood of these types of warning videos.
— julia alexander (@loudmouthjulia) September 7, 2020
The sad reality is that users will continue to post disturbing content, and it is impossible for platforms to review every upload before it goes live. And once a video is live, it does not take long for the content to migrate to other platforms.
Pointing the finger at individual platforms such as TikTok won’t solve the problem. What’s needed is a coordinated approach in which the major social media companies work together.