Facebook Finally Moves To Squelch QAnon, As Danger Is Magnified

In one of its broadest policy changes to date, Facebook is banning all accounts of the dangerous conspiracy group QAnon from its platforms, and has reportedly taken down thousands of accounts, Pages and Groups the movement was using to organize and encourage violence, particularly in the run-up to the election.

As of yesterday, Facebook began removing Facebook Pages, Groups and Instagram accounts that represent QAnon, the company announced on its blog. “Our Dangerous Organizations Operations team will continue to enforce this policy and proactively detect content for removal instead of relying on user reports.”

Facebook also now prohibits running ads that “praise, support or represent militarized social movements and QAnon.”

“We’re starting to enforce this updated policy today and are removing content accordingly, but this work will take time and will continue in the coming days and weeks,” Facebook acknowledged.

Facebook said its actions represent updates on measures it started on August 19 to “disrupt the ability of QAnon Militarized Social Movements to operate and organize on our platform.”

In the first month, it removed over 1,500 QAnon Pages and Groups containing discussions of potential violence, and over 6,500 Pages and Groups tied to more than 300 Militarized Social Movements, the platform reported, “but we believe these efforts need to be strengthened when addressing QAnon.”

Now, it will remove any Facebook Pages, Groups and Instagram accounts representing QAnon even if they contain no violent content. It is not targeting individual posts, however.

“While we’ve removed QAnon content that celebrates and supports violence, we’ve seen other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups, which diverted attention of local officials from fighting the fires and protecting the public,” Facebook’s post noted. “Additionally, QAnon messaging changes very quickly and we see networks of supporters build an audience with one message and then quickly pivot to another. We aim to combat this more effectively with this update that strengthens and expands our enforcement against the conspiracy theory movement.”

Facebook also said its new policies were in part a response to “evidence that QAnon adherents are increasingly using the issue of child safety and hashtags like #savethechildren to recruit and organize.”

The platforms will now “direct people to credible child safety resources when they search for certain child safety hashtags. In addition, content about QAnon and child safety is eligible for fact checking through our third-party fact-checking program. Content that is debunked will be reduced in News Feed and filtered from Explore and hashtags on Instagram, will receive a label (so that people who see it, try to share it or already have, will see more context), and it will be rejected as an ad.”

QAnon followers buy into a deranged theory that “posits that high-profile Democrats and Hollywood celebrities are members of a child-eating cabal that is being secretly taken down by President Donald Trump, and that members of this fictitious cabal will soon be marched to their execution,” in NBC News’ description. “The conspiracy theory relies on posts from Q, an anonymous user of the extremist message board 8kun, which was formerly called 8chan, who has been wrongly predicting the roundup of prominent Democrats since October 2017.”

QAnon has been a major force in disseminating false information about COVID-19, vaccinations and Democratic presidential candidate Joe Biden (including the false rumor that he used an earpiece during last week’s debate), among other subjects.

Given QAnon’s power for spreading dangerous misinformation and encouraging violence, “It is imperative that Facebook dismantle [QAnon’s] infrastructure,” Joan Donovan, research director of the Shorenstein Center on Media, Politics and Public Policy at the Harvard Kennedy School, told NBC News. “Without Facebook, they are not rendered inert, but it will make it more difficult to quickly spread disinformation.”

“Of course, this all could have been done sooner, before Q factions aligned with militia groups and anti-vaxxers, to curtail the spread of medical misinformation and the mobilization of vigilante groups,” she added.
