Commentary

Facebook Drops "Disputed" Flags, Cites Bias Among Users

Following the 2016 election cycle, Facebook was under harsh scrutiny for the spread of fake news stories that may have tipped the election in Trump’s favor. To counter this, the company formed a fact-checking coalition that would investigate suspicious articles and alert users if a shared article was deemed factually incorrect.

Facebook partnered with outlets like the Poynter Institute and PolitiFact to review news stories via independent fact-checkers.

PolitiFact reports applying a false label to at least 1,722 URLs over the past year. Just this fall, following the Las Vegas shooting, the outlet identified fake articles with titles like “Celine Dion’s 16-year-old Rene-Charles Angelil’s was among the 58 victims who were killed.” There were also fake reports surrounding Hurricanes Harvey, Irma and Maria, along with the recent Alabama special Senate election.


The program was criticized at its start, with some users calling the practice censorship of the internet — a ludicrous claim, considering fake news stories exist and have swayed social-media users with false information. Remember “Pizzagate”?

The criticism didn’t stop Facebook from attempting to weed out such stories from users’ feeds. However, according to the company, a strange trend emerged: flagging some articles as disputed made some users more likely to believe them, rather than less.

Tessa Lyons, a product manager with Facebook, wrote in a statement: “Academic research on correcting misinformation has shown that putting a strong image, like a red flag, next to an article may actually entrench deeply held beliefs — the opposite effect to what we intended. Related Articles, by contrast, are simply designed to give more context, which our research has shown is a more effective way to help people get to the facts. Indeed, we’ve found that when we show Related Articles next to a false news story, it leads to fewer shares than when the Disputed Flag is shown.”

Today, the company announced a new plan for tagging fake news going into the new year.

Under the old system, two fact-checkers had to independently label a story false before a flag appeared — an exorbitant amount of time, during which a dubious piece could make its way around the site. Under the new approach, the platform, working with its fact-checking partners, will surface similar stories from alternative outlets to add context to the news a user is receiving, rather than labeling stories fake without an alternative. Because this requires only one fact-checker to review a story, it cuts down the time it takes to act on one.

The issues surrounding last year’s election were a wake-up call to how the internet and social media can be weaponized by those with dangerous intentions. Facebook, which boasts more than 2 billion users, is grappling with the fallout. Its scrambling to find solutions only reinforces the severity of the issue.

3 comments about "Facebook Drops 'Disputed' Flags, Cites Bias Among Users".
  1. Chuck Lantz from 2007ac.com, 2017ac.com network, December 22, 2017 at 12:37 p.m.


    Maybe this excellent new plan will spread to household product labelling.  Since some people ignore a "poison" warning on labels, the best course is to just remove such labels entirely, and include information about similar non-poisonous products.  Brilliant. 

  2. Paula Lynn from Who Else Unlimited replied, December 22, 2017 at 3:54 p.m.

    It's too technically difficult for fbeast to put on "fake" rather than a red flag. And people keep believing they are winners when they put 3 coins into the machine and get back 2 with big winner signs.

  3. John Grono from GAP Research, December 22, 2017 at 5:22 p.m.

    I think Ray Davies captured it in his song Lola with "a mixed up, muddled up, shook up world".

    There is a chunk of the population who believe (out of convenience) statements or claims, and who, when those claims are later shown to be wrong (e.g. the person that died is still alive), become even more ardent. The POTUS is videoed doing something or making a statement that he later refutes, saying "never happened" — yet a chunk of the population still believes that it didn't occur.
