TikTok has announced changes aimed at protecting young people from potentially harmful 'challenges' and hoaxes that continue to pop up on the widely used platform.
Recent research by the social media site into how teenage users view dangerous content has revealed that high-risk challenges and hoaxes have a negative impact on mental health - regardless of whether a young person chooses to participate in them.
TikTok's Head of Safety Public Policy, Alexandra Evans, told 9Honey Parenting that although the research showed few young people were participating in dangerous challenges, there were other reasons to be concerned.
"Kids are not participating in online challenges in great numbers and they're not participating in really risky challenges, which is obviously really hugely encouraging, but definitely not an excuse to pack up and go home", she said.
Surprisingly, the research found the damage being done by the dangerous challenges or hoaxes was often caused by well-intentioned warnings published on the platform.
Young people reported that while warnings about hoaxes and challenges had the intention of protecting users from the content they were about to consume, seeing the warning itself could cause harm.
"When we talk about the negative impact of being exposed to a hoax, a lot of that is to do with that secondary content because that takes up so much more space across all platforms now," Ms Evans said.
"So the decision has been taken that if someone is warning against a hoax, it may be well-intentioned, but it is essentially promoting something that we know is not to be true."
TikTok has decided it will now remove such warnings, along with posts about the original challenge or hoax. It will continue to allow conversations that promote accurate information and dispel panic.
"We know that being exposed to a warning is a real credible threat. Teens are likely to experience harm, and that harm is to their mental health," Ms Evans said.
While many of the challenges that emerge on TikTok are fun and harmless, there have been a number of concerning ones that have encouraged young people to take part in risky activities.
One of the most recent, and biggest, was the Milk Crate Challenge, which encouraged users to run across a pyramid of stacked milk crates to see how far they could get before falling to the ground.
The trend emerged during the peak of the pandemic in 2021 and led to a number of physical injuries among people who attempted to make it across the pyramid, all for a video.
As part of TikTok's safety features, once the challenge was identified as being dangerous any videos linked to the challenge and its associated hashtags were removed by the platform.
"Our community guidelines are absolutely crystal clear that if it's going to cause serious injury, or potentially life-changing, or even death injuries, then it's absolutely not allowed on our platform. We will remove that content", Evans confirmed.
The new research also found 46 per cent of teens wanted more information to help them understand the risks. So, with the help of leading youth safety experts, the platform has implemented a number of new features, including a prompt encouraging community members to visit the Safety Centre when searching for a challenge or hoax, and displaying additional resources.
Ms Evans is calling it a 'teachable moment' and an opportunity for users to learn a bit more about risk and danger.
The platform has also redeveloped the technology that alerts its safety teams to a sudden increase in violating content linked to hashtags, in order to capture potentially dangerous behaviour.
However, Ms Evans admits the platform does have a hard time keeping up with dangerous challenges and removing the content quickly.
"They [challenges] kind of morph and they move from different platforms. And you can also have a challenge and then somebody tries to replicate it, but they do it in a slightly different way. And it's within that additional twist that you can see danger," she explains.
Part of the changes includes new information for both users and caregivers in TikTok's Safety Centre dedicated to challenges and hoaxes. This comes after 37 per cent of caregivers who were surveyed admitted they find it difficult to talk about hoaxes with their teenagers without prompting interest in them.
In October the Federal Government released draft legislation, an Online Privacy Code, that recommends new laws and regulations designed to protect children using social media platforms.
TikTok is one of the platforms that may need to adhere to the legislation, which includes rules requiring parental permission for users under 16 to sign up, and which holds social media giants accountable for the way they treat children's personal information.