Commentary

Political Awareness In The Facebook Age

We’ve just finished one presidential convention, heading toward another -- and it’s not my job to discuss the candidates or their positions. Instead, I want to discuss the way we come to conclusions about the candidates and their positions: the quality of our data, the depth of our knowledge, the level of our ability to make informed decisions.

The problem is filter bubbles, and I’ve talked about them before.

When I go on Facebook, if I have a history of reading websites and articles that trend liberal, surprise! My News Feed shows me liberal stories. If my history is conservative, the reverse. (The Wall Street Journal has an excellent demonstration of this if you want to see it in action.)

Surrounded by validation of our existing opinions, it becomes unfathomable that anyone could think differently from us.

We lament, “Who are these idiots voting for the other party? Every single news item just proves our party is right!” But it would be more accurate to say every news item we see proves our party is right -- while every news item the other side sees proves their party is right.

So we become increasingly polarized, increasingly intolerant of anyone whose opinions don’t match our own, increasingly drawn to divisiveness and away from understanding.

And the problem is not that Facebook has a political bias as a company. The problem is that its success metrics cause it to curate a feed expressly biased for each of us.

Facebook lives and dies through our eyeballs. And our eyeballs are obtained through an algorithmic pandering that caters to every one of our human behavioral quirks: that we’re more likely to look at stuff we agree with, that we’re more likely to respond to negative content than to positive content, that we prefer simplistic and hyperbolic headlines to thoughtful and complex investigations.
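To make that pandering concrete, here’s a toy sketch in Python -- with invented story attributes and weights, not Facebook’s actual system, which isn’t public -- of how a feed ranker tuned purely for engagement ends up favoring agreeable, negative, hyperbolic stories:

```python
# Hypothetical illustration only: a feed ranker that optimizes engagement.
# The attributes and weights are invented; the real algorithm is not public.

from dataclasses import dataclass

@dataclass
class Story:
    headline: str
    agrees_with_user: float  # 0..1: match with the user's reading history
    negativity: float        # 0..1: negative content draws more responses
    hyperbole: float         # 0..1: simplistic, hyperbolic framing
    substance: float         # 0..1: depth of reporting

def engagement_score(story: Story) -> float:
    # Each human quirk named above becomes a positive weight.
    # Note what's missing: no weight at all for substance.
    return (
        0.5 * story.agrees_with_user
        + 0.3 * story.negativity
        + 0.2 * story.hyperbole
    )

stories = [
    Story("OUTRAGE: Other party caught in scandal!", 0.9, 0.9, 0.9, 0.2),
    Story("A careful look at both candidates' tax plans", 0.5, 0.2, 0.1, 0.9),
]

# Rank the feed by engagement, highest first.
for story in sorted(stories, key=engagement_score, reverse=True):
    print(f"{engagement_score(story):.2f}  {story.headline}")
```

Sort by that score and the substantive story loses every time -- not because anyone set out to bury it, but because no term in the formula rewards substance.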

Eyeballs. Pageviews. Unique visitors. Time on site. Quarterly results. Nowhere in those metrics is there a category for “a more-informed and aware public.”

Historically, the job of helping us become more informed and aware has fallen to the Fourth Estate: the news media. But the news media can’t get our attention, because we spend all our time on Facebook.

As a society, we need to be exposed to neutral information (to the extent that information can be neutral). We need to be able to see content that is consistently substantive, that we will sometimes agree with and that will sometimes challenge us, depending on our particular biases.

But last month, Facebook changed its algorithm to favor friends-and-family content at the expense of news content. The New York Times reported, “The side effect of those changes… is that content posted by publishers will show up less prominently in news feeds, resulting in significantly less traffic to the hundreds of news media sites that have come to rely on Facebook.”

I know we’re terrified of the government getting involved in free speech. (According to The Daily Dot, “Even in the midst of the ‘trending topics’ scandal, polls showed a meager 11 percent of Americans were comfortable with the government imposing regulations regarding content on social networking sites like Facebook.”) But there is a public good at stake here, and we can find a happy medium.

If your site has more than a certain number of visitors who spend more than a certain amount of time there, and if it can be considered a primary source of information and shown to influence voter behavior, then you should be required to surface a certain amount of content that could be considered neutral, much the way public TV stations are required to make space for public service announcements.

We don’t all need to agree with each other. But when we disagree, we should at least know why.

2 comments about "Political Awareness In The Facebook Age".
  1. Kenneth Hittel from Ken Hittel, July 22, 2016 at 2:44 p.m.

    Let me speak in defense of "filter bubbles." Some of us have spent years of our lives building, refining, revising, and deepening our ideological biases based on experience and knowledge. Whether or not we use Facebook -- I do not -- we indeed, over time, build up "filters." We are aware of them -- because we, generally, are aware -- and we are not, therefore, afraid of them, or limited by them. We are not all Facebooked by our prejudices, because we are well aware of how and why they have been built up as prejudices. As for "neutral" -- sorry, there is rarely such a thing, and it is always, as "even-handed" always is, bullshit.

  2. Anthony Livshen from Centriply, July 25, 2016 at 4:24 p.m.

    The government imposing the standards for what is considered "neutral" will undoubtedly get messy. I'd prefer that Facebook/Twitter add a feature that disables their entire algorithm and allows posts to show up in chronological posting order. It might also be innovative of them to post alternative-viewpoint news articles alongside different news articles. For example, next to Fox News' coverage of a terrorist attack, have a button to the side to see the CNN and ABC stories.

    Still, I don't think this is an area where government regulation could steer the ship.
