This week saw the publication of two studies, funded by Meta (Facebook’s parent company), that explored whether there is any truth to an oft-cited explanation for our divided nation: that it is all the fault of the echo chamber that is social media. And ironically, even the results of these studies vary, depending on whom you ask.
One study was published in Science, the journal of the American Association for the Advancement of Science, and the other in Nature, an equally highly regarded title.
Each publication has review boards, and science is written into every second sentence of its articles. I am going to trust that the reviewers have aligned behind the methodology, the credentials of the writers/scientists, and the validity of the presented papers.
The first study is called “Asymmetric ideological segregation in exposure to political news on Facebook,” and you can find it here: https://www.science.org/doi/10.1126/science.ade7138. The second study, titled “Like-minded sources on Facebook are prevalent but not polarizing,” is here: https://www.nature.com/articles/s41586-023-06297-w
In the first study, we find that echo chambers do, in fact, echo. And that they do so more for conservatives than for liberals, because “there are far more homogeneously conservative domains and URLs circulating on Facebook.” Democrats tend to believe that the rise of “the right” is largely due to unencumbered exposure to falsehoods, half-truths and hateful content. Republicans tend to believe that social media and the Deep State (“the government”) try to limit or downright prohibit publication of their beliefs, and to limit exposure for favored politicians, news outlets and personalities on the right.
The study’s findings run diametrically counter to the argument that social media are liberally biased or constrained, noting: “We also observe on the right a far larger share of the content labeled as false by Meta’s 3PFC,” its third-party fact-checking program.
The second study tried to understand whether being exposed to political news and viewpoints that align with your own drives you further and deeper into those viewpoints, and conversely, whether being exposed to opposing viewpoints will change your own. Both political camps typically believe that Meta (and every other social media platform) should do more to ensure that their viewpoints are easily accessible and not censored in any way.
The study demonstrates that social media do not work this way. It acknowledges the existence of the echo chamber, in which algorithms work hard to serve you as much content as possible that aligns with what you have engaged with before. Yet when the scientists served a group of “believers” in either camp more content from “the other side,” their beliefs did not change.
These are all important findings. Sadly, they will do nothing to convince those who believe social media are liberally biased or constrained that they are wrong. That is because the echo chambers are so effective at what they do (see the first study). And as we advertisers know: it is really, really hard to change a perception.