Commentary

Digital Platforms Cautiously Begin Content Moderation

Facebook, YouTube, Google, and GoDaddy have all moved cautiously to limit or remove hate speech, white supremacists, and neo-Nazis from their platforms -- while Reddit and 4chan have so far remained on the sidelines as the platforms of choice for both Trump supporters and outspoken extremist groups.

FACEBOOK: Facebook removed several groups and individuals from its service and Instagram for what it calls violations of terms banning hate speech. “It's a disgrace that we still need to say that neo-Nazis and white supremacists are wrong — as if this is somehow not obvious,” Mark Zuckerberg wrote in a blog post.

Groups removed included Red Winged Knight, Awakening Red Pill, Genuine Donald Trump, White Nationalists United, Right Wing Death Squad, Awakened Masses, Vanguard America and Physical Removal.

GODADDY: GoDaddy has banned The Daily Stormer after it mocked the victim of a deadly car attack during the protests in Charlottesville.

GOOGLE: Google has banned The Daily Stormer as well. Gab, the social network that has become popular among members of the alt-right, was removed from the Google Play Store this week for violating Google’s hate speech policy.

YOUTUBE: YouTube has removed The Daily Stormer YouTube channel. 

TWITTER: A Twitter account claiming to be The Daily Stormer was suspended.

PAYPAL: In a blog post, the company said it "strives to navigate the balance between freedom of expression and open dialogue -- and the limiting and closing of sites that accept payments or raise funds to promote hate, violence and intolerance."

PayPal said it will no longer let anyone use its payment services to sell paraphernalia associated with far-right hate groups. This includes "organizations that advocate racist views, such as the KKK, white supremacist groups or Nazi groups," said PayPal SVP of corporate affairs and communication Franz Paasche. Apple Pay has adopted the same policy.

REDDIT: Meanwhile Alexis Ohanian, co-founder and executive chairman of Reddit, has remained almost entirely out of the conversation -- even as his site sits at the center of the alt-right movement. Instead, he chose to tweet about his fiancée Serena Williams's pregnancy cravings: “So cute...night cravings for Serena ...artichoke,” he posted on Aug. 18.

His only direct comment on Charlottesville came on Aug. 12, when he tweeted “Disgusted by white nationalists at my alma mater. Greatest gen. gave their lives so 'Blut und Boden' would NOT be chanted proudly in America.” (“Blut und Boden” is German for “blood and soil.”)

Apparently he is disgusted by the alt-right on his former campus, but not on his current technology platform. 

4chan, the other site that hasn’t responded to the ongoing issue of hate speech, was founded by Christopher “Moot” Poole in 2003. In 2015 Poole sold the site to Hiroyuki Nishimura. Even as the site has grown in importance and traffic, its ownership and business practices have remained murky.

Who is Hiroyuki Nishimura? The owner of 4chan has a mixed reputation in Japan: an entrepreneur, a free speech activist, a celebrity, and some say a genius.

But an official at the National Police Agency, speaking on background to The Daily Beast, said that sites like 2channel and 4chan could be wonderful tools for organized crime -- if they fell into the wrong hands.

According to the 4chan website, “Since its creation in 2003, 4chan has grown to become one of the world's largest forums, serving approximately 680,000,000 page impressions to over 22,000,000 unique visitors per month with 11 million in the U.S.”

We’ve reached out to 4chan and Nishimura to ask them about what actions, if any, they’re implementing to limit or remove hate speech and Neo-Nazi posts on the site. So far, no response. 

So why does it matter that platforms are acting to limit hate speech? And what are the implications of Ohanian’s Reddit and Nishimura’s 4chan remaining on the sidelines? 

Writing in Quartz, Tim Squirrell, a researcher with the Alt-Right Open Intelligence Initiative, had some thoughtful things to say about the collection of groups known as the “alt-right”: “The alt-right isn’t one group. They don’t have one coherent identity. Rather, they’re a loose collection of people from disparate backgrounds who would never normally interact: bored teenagers, gamers, men’s rights activists, conspiracy theorists and, yes, white nationalists and neo-Nazis. But thanks to the internet, they’re beginning to form a cohesive group identity.”

Squirrell’s data show a clear path of growth within the alt-right community, driven in large part by Ohanian’s willingness to let all but the most pernicious content -- child porn, rape, and violent images and postings -- remain on Reddit.

Says Squirrell, “r/The_Donald is a Reddit community with over 450,000 subscribers. They are forming a coherent group identity, represented in the language they have begun to speak, which coalesces around their common hatred of liberalism and their love of Donald Trump.”

1 comment about "Digital Platforms Cautiously Begin Content Moderation".
  1. Michael Margolies from Michael Margolies Photography & Design, August 23, 2017 at 4:54 p.m.

    I am glad to see these platforms taking down these sites and blocking accounts that endorse hate and racism. So when are they going to start doing so for the alt-left sites calling for murder of the President, killing of conservatives, lying and creating propaganda content to scare off ordinary moderates whom they wish to silence? When will you start banning leftist groups who routinely shout hate speech, threats and violence aimed at conservatives?

    Oh, silence, why am I not surprised.
