Watchdogs: Threads Already Compromised, Needs Content Guardrails ASAP

On Thursday, 24 civil rights, digital justice and pro-democracy organizations urged Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri to establish “robust and equitable” safety and transparency policies for the new messaging platform Threads without delay.

A letter spearheaded by Free Press, Accountable Tech and Media Matters for America points out that because Threads offers Instagram users a seamless onboarding process, allowing them to sign in with their existing Instagram credentials, the same harmful accounts on Instagram now exist on Threads, which has amassed more than 100 million users in record time.

The groups signing onto the letter say that enforcement issues and gaps in Meta’s terms of service are already apparent on Threads. They go on to claim that the tech giant has a long history of “inadequate rules and inconsistent enforcement” across its suite of apps.

“Warning signs are already flashing,” the letter reads. “Since Threads launched, new users have been testing the boundaries of the platform’s moderation and enforcement,” including “neo-Nazi rhetoric, election lies, COVID and climate change denialism… bigoted slurs… targeted harassment of and denial of trans individuals’ existence, misogyny, and more.”

Pointing to Twitter’s recent content-moderation rollbacks under Elon Musk’s ownership, the groups claim that Meta has “purposefully” not extended Instagram’s fact-checking program to Threads, while removing a policy to warn users when they are attempting to follow a “serial misinformer.”

“Without clear guardrails against future incitement of violence, it is unclear if Meta is prepared to protect users from high-profile purveyors of election disinformation who violate the platform’s written policies,” the letter continues.

Meta has not provided researchers with “the most basic tools” to analyze activity on the week-old app, it adds. 

The letter specifically outlines three ways Meta can instill a safer user experience on Threads:

* Immediately implement robust policies to keep incitements to violence and hate off Threads.

* Invest in robust protections against algorithmic manipulation and equitable policy enforcement.

* Prioritize transparency and engagement with civil society.

“Meta must implement basic moderation safeguards on Threads now or the platform will become as toxic as Twitter,” elaborated Nora Benavidez, Free Press senior counsel and director of digital justice and civil rights. “Under Musk, Twitter has thumbed its nose at content moderation and is failing as a business as a result. As 2024 approaches, it’s especially urgent for Meta to take action as we see many prominent social-media networks retreat from the sort of health and safety standards that are essential to slowing the spread of election-related disinformation. Putting protections in place now isn’t just good for democracy; it’s good for the business of social media.” 

1 comment about "Watchdogs: Threads Already Compromised, Needs Content Guardrails ASAP".
  1. Ben B from Retired, July 14, 2023 at 10:19 p.m.

    Twitter was so soft about putting people in Twitter jail that I won an appeal a year ago. I was just being funny tweeting about Love Island USA and didn't mean any harm, so I didn't get the 12-hour ban. I'm off Twitter now; I wouldn't give them my number, and I should've been grandfathered in with email only. I didn't appeal that one since I'd lose, so I take my 1-0 record. You can't put the paste back into the tube when it comes to disinformation/misinformation, which is in the eye of the beholder, by the way. There will always be a dark side to social media, and it's toxic for those who seek it out. I don't see social media ever being toxic-free, in my opinion.

    No matter what these groups want it to be, I don't like Media Matters, since media doesn't matter to them. If it did, they'd go after everyone and not just conservatives; that's my beef with them. Conservative bad, liberal good and can do no wrong.
