On Thursday, 24 civil rights, digital justice and pro-democracy organizations urged Meta CEO Mark Zuckerberg and Instagram head Adam Mosseri to establish “robust and equitable” safety and transparency policies for the new messaging platform Threads without delay.
A letter spearheaded by Free Press, Accountable Tech and Media Matters for America points out that because of the seamless onboarding process for Threads, which allows users to sign in with their Instagram credentials, the same harmful accounts found on Instagram now exist on Threads, which has amassed more than 100 million users in record time.
The groups signing onto the letter say that enforcement issues and gaps in Meta’s terms of service are already apparent on Threads. They go on to claim that the tech giant has a long history of “inadequate rules and inconsistent enforcement” across its suite of apps.
“Warning signs are already flashing,” the letter reads. “Since Threads launched, new users have been testing the boundaries of the platform’s moderation and enforcement,” including “neo-Nazi rhetoric, election lies, COVID and climate change denialism… bigoted slurs… targeted harassment of and denial of trans individuals’ existence, misogyny, and more.”
Pointing to Twitter’s recent content-moderation rollbacks under Elon Musk’s ownership, the groups claim that Meta has “purposefully” not extended Instagram’s fact-checking program to Threads, while removing a policy to warn users when they are attempting to follow a “serial misinformer.”
“Without clear guardrails against future incitement of violence, it is unclear if Meta is prepared to protect users from high-profile purveyors of election disinformation who violate the platform’s written policies,” the letter continues.
Meta has not provided researchers with “the most basic tools” to analyze activity on the week-old app, it adds.
The letter specifically outlines three ways Meta can instill a safer user experience on Threads:
* Immediately implement robust policies to keep incitements to violence and hate off Threads.
* Invest in robust protections against algorithmic manipulation and equitable policy enforcement.
* Prioritize transparency and engagement with civil society.
“Meta must implement basic moderation safeguards on Threads now or the platform will become as toxic as Twitter,” elaborated Nora Benavidez, Free Press senior counsel and director of digital justice and civil rights. “Under Musk, Twitter has thumbed its nose at content moderation and is failing as a business as a result. As 2024 approaches, it’s especially urgent for Meta to take action as we see many prominent social-media networks retreat from the sort of health and safety standards that are essential to slowing the spread of election-related disinformation. Putting protections in place now isn’t just good for democracy; it’s good for the business of social media.”