Google Lifts Political Ad Blackout; YouTube Finally Bans Election Fraud Claims

Alphabet's two platforms, Google and YouTube, appear to be moving in opposite directions on preventing the spread of false election content, leading some to wonder whether the company is performing a balancing act amid the country's divisiveness and Congressional calls to break up the tech giants.

On Wednesday, Google informed advertisers that it is lifting its month-long ban on political advertising as of today.

Although the search giant said it imposed the ban out of concern that ads could spread misinformation during the election period, it is lifting the ban even as the runup to two crucial Georgia Senate runoffs in January gets underway.

Early voting begins later this month for those elections, which will take place on January 5 against a volatile backdrop of false claims of election victory and pressure on Georgia election officials from outgoing President Trump and his supporters.

Trump held a rally in Georgia last week at which he urged Republicans to turn out and vote for the Republican Senate candidates, while repeating false claims about election fraud and Georgia officials’ supposed participation in it.

Lifting the advertising ban could, among other things, clear the way for Google ads from One America News Network, the right-wing outlet that has won Trump’s favor by repeating and spreading his false claims, CNBC noted.

In its email to advertisers, obtained and reported on first by The Wall Street Journal, Google said it no longer considers “the postelection [sic] period to be a sensitive event,” but “will still rigorously enforce our ads policies, which strictly prohibit demonstrably false information that could significantly undermine trust in elections or the democratic process, among other forms of abuse.”

Meanwhile, more than a month after Joe Biden won the presidency, Alphabet’s other big platform, YouTube, said in a blog post yesterday that it will begin taking down videos that “mislead people by alleging that widespread fraud or errors changed the outcome of the 2020 U.S. Presidential election, in line with our approach towards historical U.S. Presidential elections.”

The reasons cited: The safe harbor deadline for the U.S. Presidential election was yesterday, and “enough states have certified their election results to determine a President-elect.” (The post mentioned neither Biden’s nor Trump’s name.)

“However, the policy will still likely leave up videos that were uploaded in the past weeks that push false claims,” according to CNBC, which points out that YouTube “has been repeatedly criticized for allowing election misinformation, taking a more hands-off approach than its social media peers,” Facebook and Twitter.

In its post, YouTube said that, since September, it has “terminated” more than 8,000 channels and “thousands of harmful and misleading elections-related videos” for violating its existing policies — and that more than 77% of those removed were taken down before they had 100 views. 
