Yesterday, Facebook’s head of security policy, Nathaniel Gleicher, published an editorial in the Des Moines Register making the case that the company has “made large investments in teams and technologies to better secure our elections” since its widely criticized role in the 2016 U.S. presidential election.
Facebook has, of course, also been widely criticized for its decisions not to attempt to screen for or ban false political ads, or to limit microtargeting, in relation to this year’s presidential election — as well as for insufficient safeguarding of users’ information within and outside of the elections context. All of which is the backdrop as Facebook and the other big tech companies face government antitrust scrutiny and calls from critics for their breakup.
“When Iowans go to caucus [next month for the Democratic presidential nomination], they should do so knowing that Facebook has made wholesale improvements to how we approach election security,” writes Gleicher. “But you should also know that we’re not resting on any progress and continue to find ways to improve. We remain committed to fighting election interference, increasing transparency, and giving more people more information about what they see online.”
“What happened in 2016,” Gleicher argues, “was a [sic] wake up call. Facebook — and our country — was caught off guard by Russia’s attack on our elections using social media, fake accounts, forged documents and other forms of manipulation. It forced wholesale changes in how we, as a company and as a nation, approach these issues.”
Critics have pointed out that Facebook’s screening of ads and content leading up to the 2016 election was irresponsibly inept, including its failure to scrutinize ad orders placed in Russian.
The editorial’s list of Facebook’s changes includes tripling the size of its safety and security staff to 35,000; creating rapid response centers to identify and remove content that violates its behavior policies, which will operate during all caucuses and primaries; and working “in much closer partnership” with law enforcement agencies, state officials and other tech companies to combat potential interference.
Since 2016, the company has helped fight disinformation campaigns in more than 200 elections around the world, according to Gleicher.
In the 2018 U.S. midterm elections, it dismantled more than 100 accounts “likely linked to the Russian-backed Internet Research Agency from Facebook and Instagram,” and removed “more than 45,000 pieces of misleading content aimed at suppressing voting.”
On the political ad front, he points to the company’s increased requirements to verify that those wanting to run ads are actually located in the countries in which they want to run them, and to its new public disclosure of who paid for such ads, where they ran and some information about whom they reached.
The editorial reiterates that Facebook policies prohibiting hate speech, real-world harm, voter suppression, and election interference are enforced in regard to ads, as well as user posts.
Facebook has also partnered with 56 independent fact-checking organizations, in 44 languages, “to give people more information about what they’re seeing,” Gleicher writes.