Last week, both YouTube and Facebook gave the web video channel InfoWars and its founder, Alex Jones, a long-overdue time-out, but it's important to put those actions in perspective.
Back in March of 2017, Paul Joseph Watson — then InfoWars editor — tweeted, "I'm not sure the left understand the monumental ass-whupping being dished out to them on YouTube."
Writing on Vice at the time, Tom Whyman agreed, saying: "On YouTube...left-wing voices are seemingly non-existent – apart from that communist child – while right-wing voices dominate. But it's not just InfoWars – there is a whole ecosystem of right-wing YouTubers, a vast horde of unloved grown-up boys droning on and on about why we need to halt immigration; why we need to ban toilets; why we need to kill all girl-children at birth."
“Where YouTube is effective is, it provides a vehicle for this pseudo-intellectualized bigotry,” Brian Levin, director of the Center for the Study of Hate and Extremism at California State University, told The Guardian. “It’s marketed to a young white male audience that feels disenfranchised and feels a sense of unease at things going on at universities and workplaces.”
Central to the criticism of YouTube, and to some extent Facebook, is recommended content: the algorithmic suggestions that tend to amplify extremism within a right-wing filter bubble.
“It drags you further and further,” Keegan Hankes, research analyst with the Southern Poverty Law Center, told The Guardian. “The more you watch it, the more indoctrinated you get.”
“The reach that a platform like YouTube gives you is phenomenal compared to how these groups used to recruit, which is in their local towns,” said Hany Farid, senior adviser to the Counter Extremism Project.
Media Matters, a media watchdog and progressive standard-bearer, released a study last week examining 463 prominent Facebook pages that regularly posted political content. One key finding was that images (memes) posted by right-leaning pages were the best-performing content. On average, right-leaning meme pages had almost twice the interaction rate of other right-leaning and left-leaning pages, and more than four times the interaction rate of politically unaligned pages.
How can Democrats respond to the rise of a meme war? “Democrats, progressives, never-Trumpers, and others should post more aggressive ‘anti-Trump’ memes,” said Alan Rosenblatt, social media researcher and strategist at Lake Research Partners and turner4D.
But it’s not that simple. Rosenblatt explains: “A mix of anti-Trump memes and memes that speak to the damage caused by his policies (health care, income inequality, economic uncertainty, human rights abuses at the border, immigrant labor, etc.), along with positive memes about how to fix his damage, are warranted."
So, is the suspension of InfoWars the start of a new, more aggressive platform attempt to dial down dangerous and provably false hate speech and vitriol?
Facebook suspended InfoWars founder Alex Jones but did not suspend the actual InfoWars Page, or any of the pages where the InfoWars videos that violated policy were posted. Wait, what? Jones continues to maintain that the Sandy Hook shooting tragedy was a hoax, and while he's got a 30-day time-out, those theories remain posted.
Recode, trying to figure out exactly what the current Facebook policy is, and how it works, posted: "The reason Jones was suspended, but his Pages are still up, is that Jones posted the same bad content to multiple pages, drawing multiple strikes against his record. So if Jones shared three bad videos to three different pages, for example, he would receive nine total strikes, whereas each Page would receive just three."
So, it appears, nine strikes, you're out (for 30 days).
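Recode's description amounts to a simple counting rule: every violating post counts against the user who made it, while each Page is docked only for the posts that appeared on it. A minimal sketch of that arithmetic (a hypothetical model for illustration, not Facebook's actual enforcement system):

```python
def tally_strikes(shares):
    """Hypothetical model of the strike accounting Recode describes.

    shares: list of (video_id, page_id) pairs, one per violating post.
    Returns (total strikes against the user, strikes per Page).
    """
    user_strikes = 0
    page_strikes = {}
    for video, page in shares:
        user_strikes += 1  # every posting counts against the user who shared it
        # ...but each Page is only charged for posts that appeared on it
        page_strikes[page] = page_strikes.get(page, 0) + 1
    return user_strikes, page_strikes

# Three bad videos, each cross-posted to three different Pages:
shares = [(v, p) for v in ("vid1", "vid2", "vid3")
                 for p in ("PageA", "PageB", "PageC")]
user, pages = tally_strikes(shares)
# user is 9 (nine strikes, you're out); each Page sits at 3
```

Under this model the user hits the suspension threshold three times faster than any individual Page, which is consistent with Jones being banned while his Pages stayed up.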
Back in the good old days of human editors, people made judgments about what should and what shouldn't end up on-air, or in the pages of print newspapers. But the speed and volume of content on the internet swamps human review. And so technology-driven companies like Twitter are trying to outrun bots and trolls with smarter algorithms.
“In May, we started using behavioral signals and machine learning to reduce people’s ability to detract from healthy public conversation on Twitter,” explained Kayvon Beykpour, Twitter's product lead. “This approach looks at account behavior and interactions with other accounts that violate our rules.”
But this machine-learning approach, shockingly, downgraded some "prominent conservative names" by not auto-filling their names in the Twitter search bar. Conservatives said they'd been "shadow banned" after Vice reported their digital demotion. But Twitter responded that the downgrade was not related to the conservatives' political views, but instead was based on “behavioral signals” used to downgrade accounts that were not “healthy” for the public discourse.
And yet, tweets from President Trump attacking fellow Republicans, Democrats, Mexicans, LGBTQ people, reporters, #MeToo advocates — the list goes on and on — somehow don't trigger Twitter's "healthy" algorithm? It certainly calls the objectivity of the platform's technology into question.
It's easy to simply dismiss extreme right-wing social media as "amateurish" and driven largely by the unexamined prejudices and anxieties of angry white men. But responding in kind isn't the answer. The solo rant to camera is an inherently autocratic voice.
Still, the future of the web is VIDEO. And reasonable progressive voices need to find a way to use it — and fast.