X Cuts Election Integrity Staff After Promising Expansion

Less than a month after Elon Musk’s X announced that it would resume taking ads from political candidates and parties — and publicly promised to expand staff dedicated to limiting disinformation and other threats to U.S. elections — the social platform has instead laid off more election integrity staff. 

News of the apparent about-face could give advertisers already holding back investment in the platform further reason for concern about brand safety. 

X is cutting about half of the global team dedicated to election integrity, starting with all four staff based in Ireland (including the team’s leader, Aaron Rodericks) and affecting more than half of the North American team, according to The Information, which cited three sources familiar with the situation. 


Musk appeared to confirm the cuts, posting this message on X Wednesday afternoon in response to that report: “Oh you mean the ‘Election Integrity’ Team that was undermining election integrity? Yeah, they’re gone.” 

Earlier on Wednesday, X CEO Linda Yaccarino, whom Musk recruited from her NBCUniversal ad-chief post to win back advertisers, told The Financial Times that X planned to expand its election and safety teams globally. 

X lost about half of its advertising revenue to big-brand defections after Musk acquired it in October 2022 and decimated content moderation and election policy staff as part of mass layoffs. 

The platform had about two dozen election integrity employees worldwide when Musk acquired Twitter last year. He has since eliminated about 80% of the company’s workforce. 

Musk pulled the platform out of the European Union’s 2022 Code of Practice on Disinformation in May, prior to renaming it X. Google, Facebook, Instagram, TikTok and Microsoft have all committed to adhering to the code, which applies across the EU’s 27 member states. 

In its late-August corporate blog post about resuming political ads, X stated that an expanded team would “focus on combating manipulation, surfacing inauthentic accounts and closely monitoring the platform for emerging threats,” and said it planned to add a civic integrity and elections lead focused on combating disinformation. 

X also said it would establish a “global advertising transparency center” to allow users to review paid political posts being promoted on the platform — a capability required under the EU’s Digital Services Act (DSA) — and use “robust screening processes to ensure only eligible groups and campaigns are able to advertise.” 

Also on Wednesday, the EU released its first report on social media platforms’ performance in handling disinformation under the act, concluding that levels of misinformation and disinformation on X are significantly higher than on other platforms. 

X is the platform “with the largest ratio of mis/disinformation posts,” Vera Jourova, vice president of the European Commission, said in a statement. X also ranked highest in discoverability of mis/disinformation. Facebook and Instagram came in second and third. 

X disputed the conclusions in a series of posts, saying the platform disagrees “with the overall framing of this data and believe[s] that the data does not fit the narrative being covered in the media,” adding: “This important debate should take into account the full range of actions taken by platforms & recognize the importance of protecting free expression.” 

But the platform insisted that it is still “committed to complying with the DSA.” 

Platforms that fail to comply with the DSA can be fined up to 6% of their annual global revenue. 

Jourova has said that dropping out of the disinformation code does not excuse X from its obligations under the DSA, and has warned that EU regulators will be watching the platform closely.
