TikTok, which still intends to take down content featuring nudity or harmful messaging, sees this setting as a way for creators to prevent minors from encountering content made specifically for adult audiences.
“Perhaps a comedy routine is better suited for people over age 18,” TikTok said in October. “Or, a host may plan to talk about a difficult life experience and they would feel more comfortable knowing the conversation is limited to adults.”
Building on its original announcement, the safety feature will now include a system TikTok is developing to automatically identify and restrict certain types of content from reaching teens, including sexually explicit, suggestive or borderline content.
“Our aim is to quickly identify and remove violative content from our platform and prevent borderline or suggestive content from being recommended to or searchable by teen accounts,” the company wrote in a recent blog post.
Creators will now also be asked to designate content made for adult audiences.
Last year, TikTok announced Content Levels, a feature that aims to keep content with more mature or complex themes from reaching teen users. The popular social media app says it has prevented teen accounts from viewing over one million sexually suggestive videos in the past month.
TikTok said in a blog post that it will be expanding the feature globally over the next few weeks.