TikTok Announces Launch of Hashtag Content Filters and New Safety Tools
TikTok has announced new content moderation tools aimed at making the short-form video platform safer for younger users, along with new keyword and hashtag filters for its For You and Following feeds.
The first of these, a system called "Content Levels" that restricts certain types of content from being viewed by teens, is due to launch in the coming weeks.
While adult content is banned on the platform, TikTok says some content in the app may contain "mature or complex themes that may reflect personal experiences or real-world events that are intended for older audiences." Content Levels is designed to classify such content and assign it a maturity score, which will keep it from being shown to users aged between 13 and 17.
Initially, TikTok says Trust and Safety managers will assign the scores to videos that are gaining popularity or that have been reported by users in the app, and the system will be expanded over time to offer filtering options for the entire community, not just teens. In its finished form, the system will ultimately allow creators to classify their own content, similar to the way movies, TV shows, and video games use age ratings.
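Conceptually, a maturity score acts as a gate between a piece of content and a viewer's age bracket. The sketch below illustrates the idea in Python; the field names, score range, and thresholds are illustrative assumptions, not details TikTok has published.

```python
# Hypothetical sketch of maturity-score gating. The 0-3 score scale,
# field names, and threshold are assumptions for illustration only.
from dataclasses import dataclass


@dataclass
class Video:
    video_id: str
    maturity_score: int  # assumed scale: 0 (all ages) up to 3 (most mature)


def visible_to(video: Video, viewer_age: int, max_teen_score: int = 1) -> bool:
    """Return True if the video may be shown to this viewer."""
    if viewer_age >= 18:
        # Adults can see anything permitted on the platform.
        return True
    # Teens (13-17) only see content at or below the teen threshold.
    return video.maturity_score <= max_teen_score


videos = [Video("a", 0), Video("b", 2)]
teen_feed = [v.video_id for v in videos if visible_to(v, viewer_age=15)]
print(teen_feed)  # only video "a" passes the gate
```

The key design point is that the score is attached to the content once (by moderators now, by creators later), while the gate is evaluated per viewer at serving time.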
In addition to Content Levels, TikTok will soon launch hashtag filters that provide users with a new level of control over what appears in the For You and Following pages. Users will be able to designate specific words or hashtags they don't want to see in their feeds, and the app will automatically filter them out.
The feature is intended to go beyond merely filtering out mature or problematic content, and could also be used to stop TikTok's algorithm from surfacing topics that users are tired of seeing or simply don't care about. TikTok's examples include blocking dairy or meat recipes for users going vegan, or hiding DIY tutorials after a home project is completed.
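The mechanism described above amounts to matching each video's hashtags and caption text against a user-maintained mute list. A minimal sketch in Python, assuming simple case-insensitive whole-word matching; the field names and matching rules are assumptions, not TikTok's actual implementation:

```python
# Illustrative sketch of muted-keyword feed filtering; the video
# dictionary shape and matching logic are hypothetical.
def filter_feed(videos, muted_terms):
    """Drop videos whose hashtags or caption words match a muted term."""
    muted = {t.lower().lstrip("#") for t in muted_terms}
    kept = []
    for v in videos:
        tags = {t.lower().lstrip("#") for t in v.get("hashtags", [])}
        caption_words = set(v.get("caption", "").lower().split())
        if muted & (tags | caption_words):
            continue  # the user has muted one of these terms
        kept.append(v)
    return kept


feed = [
    {"caption": "easy cheese sauce", "hashtags": ["#dairy", "#recipe"]},
    {"caption": "vegan lentil curry", "hashtags": ["#vegan"]},
]
print(filter_feed(feed, ["#dairy"]))  # only the vegan video remains
```

A production system would likely apply this filtering upstream in the recommendation pipeline rather than post-filtering a delivered feed, but the user-facing behavior is the same: muted terms never reach the For You or Following pages.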
The new content moderation features follow a 2021 Congressional inquiry into social apps like TikTok regarding how their algorithmic recommendations could be promoting harmful eating disorder content to younger users. More recently, TikTok was also sued by parents whose children died after attempting dangerous challenges allegedly seen on the platform.