Last week, Facebook made the controversial announcement that it would drop the
“Disputed” tag from stories fact checkers found to contain false information. Instead, “Related Content” will appear around posts that seem suspicious or contain
outright false information, in an attempt to add context.
What the platform is really saying is that it will no longer take
responsibility for clearly fake stories making the rounds among its more than 2 billion users — or for the damage caused in their wake.
For a company that insists its focus is tech, not media, the power it is able to exert over the news landscape is alarming. Following the 2016
election, when the platform was accused of spreading harmful and false news that swayed the outcome, Mark Zuckerberg, cofounder and CEO, doubled down, stating
once again that Facebook is not a media company and, further, that fake news isn’t harmful, citing that only 1% of the news found on the site is fake.
Small numbers do not negate harm, as several
intelligence agencies have pointed out. In an increasingly polarized political environment, they can cause irreversible damage.
Just a few weeks
after the 2016 election, the site introduced its “Disputed” tag, teaming up with fact checkers across the globe to help debunk dubious articles. Fact checkers will continue to work with
the site, but with a more nuanced approach.
Part of the problem with this new approach, and with the excuses the site has touted, is Facebook’s
relaxed response to bad journalism. In addition to doing less to account for the spread of dubious stories, the social network insists its algorithms are to blame for what people do and don’t
see.
According to the company, therein lies the proof that it’s not a publisher but a distributor of other people’s information. But humans are
the creators of those algorithms, and they are therefore responsible for the information spread.
As more people turn to social media for their
daily news, what is a platform’s responsibility to its audience? How can a company like Facebook, which pays millions a year to produce its own native content, not to mention soaks up over 20% of
the digital ad market, continue to operate under the status quo?
If the public cares about quality journalism and stamping out fake news, regardless of
political affiliation, it’s going to have to put pressure on platforms like Facebook and others. It must demand that these “tech” companies take responsibility for their share in the
damage done by the fake media they circulate.
The future of journalism will be stronger for it. After all, the press has a constitutional charge: It is a watchdog on power.