Last week, Facebook made an event of releasing what it called its first quarterly “Widely Viewed Content Report,” stressing that “transparency is an important part of everything we do.”
That report, covering the second quarter, showed that harmless interactions among friends and groups — about “pets, cooking, family and relatable viral content,” not hate speech or COVID misinformation — dominated what U.S. users saw.
However, The New York Times reports that an earlier version of the report, covering Q1, was ready for release but was shelved because Facebook executives worried it would cast the company in a negative light.
While the 20 most-viewed links in the Q1 report also pointed to non-political content like recipe sites, the most-viewed link — seen by nearly 54 million U.S. Facebook accounts — was a news article from the South Florida Sun Sentinel (republished by The Chicago Tribune) with a headline suggesting that the COVID vaccine caused the death of a Florida doctor.
When the medical examiner’s report came out months later, stating that there wasn’t enough evidence to determine whether the vaccine contributed to the physician’s death, “far fewer people on Facebook saw the update,” per The Times.
The 19th-most-viewed Facebook page in Q1 — seen by 81.4 million accounts — was “Trending World,” a page run by The Epoch Times, “an anti-China newspaper” that has promoted the QAnon conspiracy theory and spread misinformation about voter fraud ahead of the 2020 election. (Fox News ranked 18th in the Q1 draft report, with 81.7 million accounts.)
Facebook has barred the Epoch Times from advertising due to repeated violations of its political advertising rules.
In the released report covering Q2, “Trending World”’s subscription link was again among the most viewed, but by 44.2 million accounts — about half the number shown in the Q1 report.
Facebook spokesperson Andy Stone confirmed that the company had considered making the Q1 content report public earlier, “but since we knew the attention it would garner… there were fixes to the system we wanted to make.”
“You can’t trust a report that is curated by a company and designed to combat a press narrative rather than real meaningful transparency,” Brian Boland, a former Facebook vice president, told The Times. “It’s up to regulators and government officials to bring us that transparency.”
In response to heightened criticism from lawmakers and watchdog groups — and President Biden’s (later toned-down) accusation that Facebook is “killing” people by spreading misinformation about the coronavirus vaccines — Facebook has been touting the millions of pieces of COVID-related misinformation it has deleted (18 million since the pandemic began).
Facebook has also gone on the offensive in recent weeks. It accused the Biden administration of scapegoating the company for missing its vaccination goal, and, in response to evidence that Facebook was used to help incite and organize the January 6 Capitol insurrection, it blamed smaller, less well-monitored social platforms.
Earlier this month, Facebook garnered wide criticism for disabling the personal accounts of researchers at New York University who have been studying political ads on the platform.