Facing intense scrutiny for failing to police foreign disinformation agents, Facebook has decided to hire more humans to review ads running on its network.
“We are hiring 1,000 more people to our global ads review teams,” Tom Channick, corporate communications manager at Facebook, said Monday.
Earlier this month, Facebook finally admitted that Russian operatives had placed around 3,000 paid ads on its platform over the past two years.
“In reviewing the ads buys, we have found approximately $100,000 in ad spending from June of 2015 to May of 2017 -- associated with roughly 3,000 ads -- that was connected to about 470 inauthentic accounts and pages in violation of our policies,” Facebook Chief Security Officer Alex Stamos wrote at the time.
In response, the tech titan said it would begin forcing Pages to disclose the source of funding behind political ads.
Facebook is also working with Congressional investigators and special counsel Robert Mueller, as part of their probes of Russia's interference in the 2016 U.S. presidential election. “We support Congress in deciding how to best use this information to inform the public, and we expect the government to publish its findings when their investigation is complete,” Facebook founder and CEO Mark Zuckerberg recently said in a streamed message.
Telegraphing the decision to hire more human ad reviewers, Zuckerberg also recently promised to strengthen Facebook’s ad review process for political ads.
Over the next year, Facebook also plans to increase its investment in security and “election integrity” by adding more than 250 people across related teams, Zuckerberg recently said.
Adding to Facebook’s image problem, it was recently revealed that the company temporarily allowed advertisers to target users based on keyword combinations such as “Jew hater” and “How to burn jews.”
It wasn’t until ProPublica brought the anti-Semitic categories to Facebook’s attention last week that the company removed them from its ad-targeting menu.
In response, Facebook said its ad-buying system is not perfect. “There are times where content is surfaced on our platform that violates our standards,” stated Rob Leathern, product management director at Facebook.
While accepting some blame for carelessly catering to anti-Semites, Facebook took issue with ProPublica attributing the hateful ad categories to an “algorithm.” The categories in question were self-reported, based on how users filled out their profiles, according to the company.
Along with increasing pressure at home, foreign governments are pushing Facebook to clean up its platform. EU regulators say Facebook, Twitter, and other U.S. tech companies have six months to curb hate speech and terrorist-related content on their platforms.