It’s been apparent since at least the 2016 presidential election — and probably well before — that social media spreads and amplifies misinformation and disinformation. Its toxic effect has damaged societies across the globe. Never mind the Arab Spring of 2011; the negative impact of social media far outweighs the benefits.
It’s divided citizens into warring camps as deplorable algorithms feed people what they respond to most, even when that content is wrong and harmful. Social media and search engines like Google supercharge conspiracy theories. The impact on individuals is harmful too, with studies linking heavy use to addiction and depression.
A lot of this is because social media is largely unregulated. Tech companies bear no legal responsibility or liability for content posted by users. There have been some efforts to flag false content and offer context, but they are generally no match for the reinforcing power of the original content.
So when the European Commission sought last year to strengthen its 2018 code of practice on disinformation, there was potential for real change.
Platforms should empower consumers with the information they need to judge the trustworthiness of sources, the European Commission indicated. “Users should have access to tools to understand and flag disinformation and safely navigate in the online environment,” the Commission stated.
So it’s disappointing that this policy might actually be toothless. NewsGuard, a company that researches websites for trustworthiness and then issues ratings based on that research, on Thursday issued a statement faulting the European Commission for not doing enough.
While the revised code does take important steps to demonetize disinformation, and major platforms are committing to improve systems and controls for ad placement, it does not require platforms to commit to its terms.
“We are disappointed to see that only Microsoft among the large platforms committed to this measure,” said NewsGuard Co-CEO Steven Brill. “Meta (Facebook), Google, Twitter, and TikTok refused in their statements of commitment to agree to protect their users by providing information about the trustworthiness of sources, despite the encouragement in the guidance that platforms take this crucial step.”
The code says only that platforms “could” empower their users; it does not make the measure mandatory. “The fact that the major platforms cooperated on revising this self-regulatory instrument, yet exempted themselves from being held to account on the Commission’s recommendation to empower users with context that can help them avoid mis- and disinformation, shows that their participation amounted to little more than making hollow promises for favorable publicity,” Brill added.
The effect, NewsGuard said, is that social-media companies and Google are paying lip service to the concept and “playing word games and emphasizing the word 'could' instead of 'should.'”
Users continue to be bombarded with “content choices made by secret, unaccountable algorithms intended to empower their eyeballs-at-all-costs business model, rather than empower the people they are supposed to be serving,” the NewsGuard statement said. “This key initiative of the code now appears to be dead and buried — until the Commission acts to end its naïve dependence on the willingness of these platforms to act in the public interest voluntarily,” Brill said.
One current example of the need for a stronger code is Russia’s disinformation about its invasion of Ukraine, which the platforms continue to spread. The Kremlin-operated RT, for example, has become the largest source of news on Google’s YouTube, and one Google executive claimed the disinformation service was “authentic” and without “agenda or propaganda.”
“Aggressive Russian disinformation about its invasion of Ukraine has added urgency to the need to give consumers the information they need about who is feeding them the news on the digital platforms,” NewsGuard Co-CEO Gordon Crovitz said. “NewsGuard analysts have so far identified 229 websites publishing Russian disinformation — far more than the two [RT and Sputnik News] so far sanctioned by most digital platforms following European Commission Guidance.”
The European Commission’s code, Crovitz said, will continue to fail to protect users until the platforms provide independent information about the journalistic standards of the news and information distributed through their services, which, he added, often act as useful idiots for propagandists.