Facebook Oversight Board To Review Decisions About 'Hate Speech,' 'Dangerous' People

Facebook's new independent “oversight board,” which is tasked with evaluating the company's content moderation decisions, has selected its first six cases for review.

Three of the matters deal with posts that were removed for alleged violations of Facebook's policies against “hate speech,” while two others involve posts that allegedly ran afoul of the company's policies regarding nudity, “dangerous individuals and organizations,” and “violence and incitement.”

The board, which has received 20,000 complaints from users since last month, stated Tuesday that it is giving priority to matters that “have the potential to affect lots of users around the world, are of critical importance to public discourse or raise important questions about Facebook's policies.”

The board provided a brief description of the posts under review, and is inviting public comment on them.

Based on the summaries, the posts raise thorny questions about how Facebook decides to balance its editorial standards with free speech values.

For example, the post that was removed for supposedly violating Facebook's policies on dangerous individuals and organizations included a quote attributed (apparently wrongly) to Nazi Propaganda Minister Joseph Goebbels, about “the need to appeal to emotions and instincts, instead of intellect and on the unimportance of truth,” according to the board's written summary.

“The user indicated in their appeal to the Oversight Board that the quote is important as the user considers the current US presidency to be following a fascist model,” the organization wrote.

Also on Tuesday, the Facebook critics behind the group “The Real Facebook Oversight Board” said they would hold three hearings over decisions they disagree with.

“The Facebook Oversight Board is a toothless body, with too many loopholes to address the massive harms on the site,” founding member Roger McNamee stated.

The group's first case will address Facebook's decision to allow former White House strategist Steve Bannon to remain on the platform, despite Bannon's suggestion in a video post that Dr. Anthony Fauci should be beheaded.

3 comments about "Facebook Oversight Board To Review Decisions About 'Hate Speech,' 'Dangerous' People".
  1. Tony Jarvis from Olympic Media Consultancy, December 2, 2020 at 6:01 p.m.

    As eloquently posited by Charlie Warzel in the NY Times, "Facebook is Too Big for Democracy". https://www.nytimes.com/2020/09/03/opinion/facebook-zuckerberg-2020-election.html?searchResultPosition=10
    Facebook oversight boards, whether the "Real" one or the so-called "independent" one, are merely window dressing and will not stop the "most powerful unelected man in America," per Mr. Warzel.  I suggest, the new Genghis Khan.  Either Fakebook must be made responsible for everything it publishes to the same established levels of moral, ethical and legal decency standards, or higher, as other major media, or it should be closed down.
    Many have asked, who put the "mock" in democracy in the US?  The leading candidate is crystal clear.  As the journalist Max Read stated in 2017, "Facebook has grown too big, and its users too complacent, for democracy".
    When are you going to help save democracy and unsubscribe?

  2. Charles Pierce from Private replied, December 2, 2020 at 11:41 p.m.

    Facebook is not a "publisher" in the true sense of the word.

    It creates a platform whose primary purpose is that people or groups can create and share their messages. The phone companies are not responsible for conversations you and I create and share using their technology.

    Facebook does have some editorial or publisher-like content in that it has chosen certain content to post from some news organizations. And, it also accepts and displays advertising to its viewers. So, you could say when it publishes content it pays for or shows ads, then that part of its actions could be subject to traditional legal frameworks.

    The most important point, though, is what should Facebook do about people who create and share their own content? Legally, private corporations can run their platforms independent of First Amendment considerations, so they can choose their form of censorship on their platforms. I doubt phone companies could monitor our conversations and censor them, though, as they depend on common carrier access and are regulated by the government, so censorship by those companies could be subject to the Constitution since the government regulates them.

    Ultimately, if people want to lie or promote falsehoods with their own content, and if other people blithely believe everything they hear is true, then it is the people and not the medium that bear the most responsibility.

  3. Ed Papazian from Media Dynamics Inc, December 3, 2020 at 9:14 a.m.

    Charles, you make the point that telephone companies are not responsible for private conversations---which is perfectly valid. However, those private conversations can't be "followed" or circulated by tens of thousands of people. That's the key difference. If FB only allowed one person to communicate with another specific individual---or even particular sets of pre-determined individuals---but nobody else, then it would, indeed, operate like a telephone company. But this is not the case. Instead, FB allows people to create their own networks where incendiary information or "news" as well as less harmful stuff can "go viral" and reach hosts of people who may not be well informed about the subject matter, yet be influenced by it. A lot of people are thinking that some sort of control over such situations is needed.
