In the old media model prevailing before the rise of the Internet, an individual belonging to a group of people with shared interests might happen across a publication or piece of content relevant to those interests and then share it with other members of the group. Now, however, the dynamic is apparently reversed: in the age of social media, content acts as a sorting mechanism that brings together people with common interests in groups.
That’s especially true of content relating to distinct narratives, including conspiracy theories, according to a new study by researchers from Italy and the U.S., which shows how this type of narrative content creates polarized communities of like-minded people, otherwise known as “echo chambers.”
The study, titled “The spreading of misinformation online” and published in the Proceedings of the National Academy of Sciences, used Facebook data to examine how content from two categories, both with narrative characteristics – conspiracy theories and scientific news – spreads across the social network, amplified by growing numbers of users in “cascades.”
The researchers looked at 32 public Facebook pages devoted to conspiracy theories and 35 devoted to scientific news, tracking all posts over a five-year period from 2010 to 2014. They focused on the “sharing trees” through which users propagated the content, as well as the length of time it took for a piece of content to reach its maximum distribution.
The researchers also factored in the number of “likes” users gave the content, indicating agreement or approval, as a measure of polarization. In other words, the more Facebook users gave a particular piece of content “likes,” the more likely they were already predisposed to agree with it, and therefore more likely to form a polarized group.
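The two measures described above can be sketched in a few lines of Python. This is an illustrative assumption about how such quantities might be computed, not the study's actual code or data format: here a cascade is simply a list of share timestamps, and a user's polarization is proxied by the fraction of their likes going to one content category.

```python
from datetime import datetime

def cascade_size_and_lifetime(share_times):
    """Return (number of shares, lifetime in hours) for one sharing tree.

    Lifetime is measured from the first share to the last, a stand-in for
    the time a piece of content takes to reach its maximum distribution.
    """
    times = sorted(share_times)
    lifetime_hours = (times[-1] - times[0]).total_seconds() / 3600
    return len(times), lifetime_hours

def polarization(likes_conspiracy, likes_science):
    """Crude polarization proxy: share of a user's likes on conspiracy pages.

    Values near 0 or 1 suggest a user predisposed toward one category;
    0.5 indicates no leaning (also returned when the user has no likes).
    """
    total = likes_conspiracy + likes_science
    return likes_conspiracy / total if total else 0.5

# Hypothetical cascade: three shares spread over two days.
shares = [datetime(2013, 5, 1, 9), datetime(2013, 5, 1, 12), datetime(2013, 5, 3, 9)]
size, hours = cascade_size_and_lifetime(shares)  # 3 shares over 48 hours
```

Under this framing, a slow-burning conspiracy cascade would show a large lifetime value, while a science-news cascade would peak quickly, matching the difference the study reports.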
The study uncovered some differences in how conspiracy theories and science news spread: for one thing, science news tended to reach maximum distribution more quickly, while conspiracy theories had a longer growth curve, gradually reaching more people over time.
However, there were also some key similarities – namely, the mechanism by which content sharing trees formed “homogenous clusters,” or groups of people who were more likely to share and re-share content from the same sources. Thus, the researchers find, “although viral patterns related to distinct contents differ, homogeneity is clearly the driver of information diffusion,” and “different contents generate different echo chambers, characterized by a high level of homogeneity inside them.”

Turning to the larger implications for society, the authors conclude: “Users tend to aggregate in communities of interest, which causes reinforcement and fosters confirmation bias, segregation, and polarization. This comes at the expense of the quality of the information and leads to proliferation of biased narratives fomented by unsubstantiated rumors, mistrust, and paranoia.”