The plan was perhaps inevitable, as every social-media platform eventually becomes a tool for political propaganda, fake news and misinformation.
TikTok has earned a mass following for its viral videos, which tend to be fun and lighthearted, making it especially popular with U.S. teens. Its plan to clamp down on election-related misinformation suggests the platform may be quickly learning from the past mistakes of other social-media companies.
TikTok will rely on the AP, which this year will monitor about 7,000 races nationwide, for help in screening out claims about election outcomes. Considering that TikTok has about 100 million users in the U.S., many of whom share their own videos on the platform, the company faces a daunting task.
“Out of an abundance of caution, if claims can't be verified or fact-checking is inconclusive, we'll limit distribution of the content,” Eric Han, head of safety at TikTok U.S., said in a blog post.
TikTok’s move, which mirrors efforts by social-media companies like Facebook and Twitter, comes as the platform faces a possible U.S. ban over national-security concerns. TikTok is owned by ByteDance, a Chinese company subject to that country’s laws on sharing information about citizens with the government.
TikTok has claimed that it doesn’t share personal information about Americans with the Chinese government, but the U.S. government wants safeguards to ensure that never happens. Unless TikTok finds a new owner that satisfies the U.S. government, the app may not be available in the U.S. after Nov. 12.
Following the intense scrutiny of Facebook in the wake of the 2016 election, which led to a record $5 billion fine to settle a Federal Trade Commission complaint, social-media companies want to avoid accusations of election interference. However, it’s too early to tell whether TikTok’s preemptive move to limit election misinformation will help it avoid the same fate.