Commentary

Anti-Social: Most Platform Users Fear Misinformation Will Affect The 2024 Election

Social media users distrust a great deal of what they see on the big platforms, and many favor legislation to control misinformation and hate speech, judging by a study from Media.com, a new profile-based network.  

Of the social users polled, 70% are at least somewhat concerned that misinformation will affect the 2024 presidential election. 

Moreover, 51% favor increased regulation and 62% urge legal action against platforms that allow misinformation to spread.  

That said, 63% are confident they can spot misinformation on social media. However, 60% admit they have shared information that later turned out to be false.

As to the impact of misinformation, 68% feel it causes confusion, 64% believe it undermines trust, and 60% say it influences public opinion.


It is not clear if this level of distrust is carrying over to legitimate publishers who post on social media.  

Media.com surveyed 1,005 U.S. social media users in January 2024.  

Facebook gets the lowest rating among platforms, with 55% saying it does a poor job of curbing misinformation. TikTok and X are next, with 44% each.  

What should the platforms do? Of those surveyed, 57% say the platforms should fact-check all content, 55% call for identity resolution for all profiles to prevent bots, and 42% call for an automatic ban on spreaders of false information.

"Misinformation and fake profiles are eating away at trust and confidence, which are critical to a functional society," says James Mawhinney, founder and CEO of the Media.com network. "These survey results show there is a very real concern about the impact of misinformation."

Mawhinney adds, "Social networks in their current forms are breeding grounds for misinformation. It is inevitable that they will ultimately be forced to introduce measures to help curb the spread of fake profiles and misinformation." 
