Brand, Ad Safety Issues Spread From Google To Social Media Platforms

The advertising boycott related to extremist content on YouTube will cost Alphabet $750 million, one analyst predicted Monday, and another, Pivotal analyst Brian Wieser, doesn't think Google is working quickly enough to solve the problem.

"CMOs were genuinely unaware of the problem and related risks given the range of issues they are charged with managing," Wieser wrote in a research note published Monday. He estimates about a 1% revenue impact this year and next, assuming that the problem is settled soon.

The issue is quickly spreading from Google to Facebook, according to Eric Feinberg, founder of the Global Intellectual Property Enforcement Center (GIPEC). The former Madison Avenue executive turned tech geek set out several years ago to find ad-supported content linked to terror and hate groups. He co-developed a patent, issued in December, that relies on deep Web integration to find keywords and coding linked to terrorism and hate speech on sites.

"We're not going after one tree at a time," Feinberg said. "We understand how the root structure and the soil interact to make the whole forest work. That's how the patent was designed."

In the patent's abstract, the inventors describe it as a "computerized system and method for detecting fraudulent or malicious enterprises."

Feinberg said this patent will only work if it is allowed to integrate with Google's and Facebook's advertising and content technology. "We don't say certain words and phrases that ISIS uses, so we built a database of key communication strands and certain keywords used by hate groups and extremists and certain characteristics of the videos to identify them," he said.
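The patented system itself is not public, but the general approach Feinberg describes, matching a curated database of phrases, hashtags and video characteristics against content metadata, can be illustrated with a minimal sketch. Everything below is hypothetical: the watchlist entries, weights, threshold and the `flag_content` helper are illustrative placeholders, not drawn from the patent or from any platform's actual system.

```python
# Minimal, hypothetical sketch of keyword/phrase-based content flagging.
# The phrase list, weights and threshold are illustrative only -- they are
# not taken from Feinberg's patent or from any platform's real technology.

from dataclasses import dataclass

# Curated watchlist: phrase -> weight (higher = stronger signal).
WATCHLIST = {
    "example extremist slogan": 3,   # placeholder strings, not real data
    "#exampleviolenthashtag": 2,
    "example coded phrase": 1,
}

FLAG_THRESHOLD = 3  # illustrative cutoff for flagging an item

@dataclass
class ContentItem:
    url: str
    title: str
    description: str
    hashtags: list[str]

def score_content(item: ContentItem) -> int:
    """Sum watchlist weights for every phrase found in the item's text fields."""
    text = " ".join([item.title, item.description, *item.hashtags]).lower()
    return sum(weight for phrase, weight in WATCHLIST.items() if phrase in text)

def flag_content(item: ContentItem) -> bool:
    """Return True if the item scores at or above the flagging threshold."""
    return score_content(item) >= FLAG_THRESHOLD

if __name__ == "__main__":
    video = ContentItem(
        url="https://example.com/watch?v=123",
        title="Example title with an example extremist slogan",
        description="Example description",
        hashtags=["#exampleviolenthashtag"],
    )
    print(flag_content(video))  # True: two phrases match, score 5 >= 3
```

A real system would go well beyond simple substring matching, which is the point of Feinberg's argument for deeper integration with the platforms' own advertising and content technology.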

Advertising agencies have recently acknowledged that Google's DoubleClick platform includes technology to block ads from serving on specific sites, but Feinberg, a co-inventor of the patent, said that's not enough.
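The kind of blocking the agencies describe works, at its simplest, like a placement exclusion list: before an ad serves, the placement's domain is checked against an advertiser-maintained blocklist. The sketch below is a generic illustration of that idea, not DoubleClick's actual implementation; the `EXCLUDED_SITES` set and `can_serve_ad` function are hypothetical.

```python
# Generic illustration of a placement (site) exclusion list check.
# Names and the domain list are hypothetical; this is not DoubleClick's API.

from urllib.parse import urlparse

EXCLUDED_SITES = {
    "example-extremist-blog.com",
    "example-hate-forum.net",
}

def can_serve_ad(placement_url: str) -> bool:
    """Allow the ad only if the placement's domain is not on the exclusion list."""
    domain = urlparse(placement_url).netloc.lower()
    # Strip a leading "www." so www.example.com matches example.com.
    if domain.startswith("www."):
        domain = domain[4:]
    return domain not in EXCLUDED_SITES

print(can_serve_ad("https://www.example-hate-forum.net/thread/42"))  # False
print(can_serve_ad("https://news.example.org/article"))              # True
```

Feinberg's objection is that this kind of site-level blocking only works against domains an advertiser already knows about, whereas the problematic content he tracks surfaces across mainstream platforms themselves.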

Feinberg recently connected with Dallas Police Department Sergeant Demetrick Pennie, president of the Dallas Fallen Officer Foundation, to further the cause.

"Facebook created a phenomena in streaming media the world is not ready for," Pennie said. The content also indexes and serves up in Google search queries based on keywords and hashtags.

Google has published several blog posts detailing changes to its policies, and Facebook says it interrupts live streams as quickly as possible when viewers report that a broadcast violates its Community Standards.

Facebook has given people a way to report violations during a live broadcast, and it also monitors live videos once they hit a certain threshold of popularity. The company's review team works around the clock to review content and urges people to contact law enforcement directly if they become aware of a situation in which authorities can help.

But Pennie said all of this is not enough. On YouTube, Facebook and other social media sites, he said, certain hashtags tie content about killing police to advertisements from automotive manufacturers such as Ford and Dodge. "Some are rap videos," he said. "I'm just trying to get someone to pay attention."

Citing his personal connection to the 2016 Dallas ambush in which several officers were killed, Pennie brought two lawsuits against Google, Facebook and Twitter, the latest in January. He attributed the murders to hate content on those sites and each company's refusal to take it down.

In the past year, more than a half-dozen suicides and a couple of attacks have been broadcast live on Facebook, Pennie said, with brands effectively subsidizing extremist and disturbing content.
