“Most of our community members aim to follow our policies,” wrote Julie de Bailliencourt, TikTok's global head of product policy. “But there is a small minority of people who repeatedly violate our policies and don’t change their behavior.”
The company's updated account-enforcement system is designed to remove this minority of users more efficiently.
Now, if someone posts content that violates one of TikTok's Community Guidelines, they will earn a strike and the content will be removed. Depending on how severe an offense is, one strike may be all it takes to issue a permanent ban from the platform. These offenses include “promoting or threatening violence, showing or facilitating child sexual abuse material, or showing real-world violence or torture.”
However, not all offenses will land a permanent ban. “There may be a stricter threshold for violating our policy against promoting hateful ideologies than for sharing low-harm spam,” the company explains.
Strikes will expire from an account's record after 90 days, but if a user accrues a high enough number of cumulative strikes across policies and features, they will also be permanently banned.
TikTok's strike system, then, appears to be based more on the severity of an infraction and the timing of repeated infractions than on a fixed number of strikes.
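To make the mechanics concrete, the enforcement logic described above (instant bans for severe violations, 90-day strike expiry, and a cumulative threshold) can be sketched roughly as follows. This is a hypothetical illustration only: TikTok has not published its exact thresholds or policy categories, so the `CUMULATIVE_BAN_THRESHOLD` value and the `SEVERE_VIOLATIONS` set here are assumptions for demonstration.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from typing import List

STRIKE_EXPIRY_DAYS = 90  # strikes expire from an account's record after 90 days
# Hypothetical values; TikTok has not published exact thresholds or category names.
CUMULATIVE_BAN_THRESHOLD = 5
SEVERE_VIOLATIONS = {"violence_threat", "csam", "real_world_violence"}

@dataclass
class Strike:
    policy: str
    issued_at: datetime

@dataclass
class Account:
    strikes: List[Strike] = field(default_factory=list)
    banned: bool = False

    def active_strikes(self, now: datetime) -> List[Strike]:
        """Strikes still on the record, i.e. issued within the expiry window."""
        cutoff = now - timedelta(days=STRIKE_EXPIRY_DAYS)
        return [s for s in self.strikes if s.issued_at > cutoff]

    def record_violation(self, policy: str, now: datetime) -> None:
        self.strikes.append(Strike(policy, now))
        # A single severe violation triggers an immediate permanent ban.
        if policy in SEVERE_VIOLATIONS:
            self.banned = True
            return
        # Otherwise, ban only when cumulative active strikes cross the threshold.
        if len(self.active_strikes(now)) >= CUMULATIVE_BAN_THRESHOLD:
            self.banned = True
```

Under this sketch, a single severe violation bans the account outright, repeated low-harm violations accumulate toward a ban, and widely spaced violations expire before they can add up.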
TikTok says that it is introducing this new system in part because creators have found the old system confusing to navigate, as it could “disproportionately impact creators who rarely and unknowingly violate a policy.”
“Repeat violators tend to follow a pattern,” TikTok says. “Our analysis has found that almost 90% violate using the same feature consistently, and over 75% violate the same policy category repeatedly.”
For greater transparency, the company is also rolling out a new Safety Center feature that gives users an overview of their account's standing, including any strikes and penalties, as well as the reports they have filed against other accounts.