TikTok has released its latest Community Guidelines Enforcement Report, detailing how the company enforced its guidelines and removed violating videos and other content in Q2 of this year.
The social-video company removed more than 113 million video clips between April and June for policy violations and other issues, an 11% increase over Q1 that continues a quarter-over-quarter upward trend.
As the app continues to grow, the company is removing more videos. According to TikTok, much of this removal is proactive, happening before users ever see the harmful content.
Internal data shows that proactive removal improved from 83.6% of videos in Q1 to 89.1% in Q2, removal of videos at zero views improved from 60.8% to 74.7%, and removal within 24 hours improved from 71.9% to 83.9%.
Within the “Minor Safety” category, the report identifies “Nudity and Sexual Activity Involving Minors” as the most common reason videos are removed from the app.
Fake accounts make up the majority of account removals: 33 million fake profiles were removed in Q2, a 62% increase from Q1.
In response to the rise in fake accounts and the spread of misinformation, TikTok says it will continue to develop its systems, investing in technology-based flagging as well as moderation and a new proactive fact-checking program.
“We have more than a dozen fact-checking partners around the world that review content in over 30 languages,” TikTok states. “All of our fact-checking partners are accredited by the International Fact-Checking Network as verified signatories of the International Fact-Checking Network’s code of principles.”
Since starting the fact-checking program in Q1 of this year, the company says it has identified 33 new misinformation claims, resulting in the removal of 58,000 videos from the app.