Whatever the result of the legal proceedings against four Google employees in Italy, the trial will be a watershed moment for video search.
Let's go back to the facts. The events began
in 2006, with a horrible episode of Italian children bullying an autistic schoolmate that was captured on video and posted to YouTube. "The video was totally reprehensible," writes Matt Sucherman,
Google vice president and deputy general counsel, Europe, Middle East and Africa, in a post on Google's blog,
"and we took it down within hours of being notified by the Italian police."
Nevertheless, last week the Italian courts held Google executives criminally responsible for allowing the
autistic child's privacy to be violated. Although the executives were absolved of defamation charges, three of them (one of whom no longer works for Google) were convicted of privacy
violations. An appeal is in the works.
The Web almost immediately filled with support for Google. On the logistical level, Google's supporters argue, there's no way for executives of a site
filled with years of constantly updated user-generated content to stand guard over every upload the site receives. And on the legal level, they argue that there
shouldn't be a difference between Google's level of culpability and, say, the Postal Service's culpability for hate mail that gets delivered through its systems, or a phone company's for offensive
calls. The U.K.'s former Information Commissioner has gone so far as to say that the Italian case gives privacy laws a "bad name."
But for all the legal discussion, this is ultimately a search problem -- specifically, a problem for video search. SEO experts have long known that the engines are far better at understanding text
than at understanding video. Often, an engine's understanding of a video clip comes from the text surrounding it, not from the clip itself. That makes it hard for engines to
build an automated flagging and takedown system that knows when a video upload isn't OK. Instead, YouTube relies heavily on its users to notify Google of objectionable content. As a result,
it can take "hours" for an offensive clip to be removed -- as the Italian courts will readily tell you.
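To make the indexing problem concrete, here's a minimal Python sketch of a text-only video index: a query can match a clip's title, description, or comments, but the frames themselves never enter the picture. The class and field names are hypothetical illustrations, not anything from YouTube's actual pipeline.

    from collections import defaultdict

    class TextOnlyVideoIndex:
        """Indexes videos purely by the words that surround them."""

        def __init__(self):
            self.postings = defaultdict(set)  # term -> set of video ids

        def add_video(self, video_id, title, description, comments):
            # Only the surrounding text is indexed; the pixels never are.
            for text in (title, description, *comments):
                for term in text.lower().split():
                    self.postings[term].add(video_id)

        def search(self, query):
            # A video matches only if its *text* contains every query term.
            results = None
            for term in query.lower().split():
                hits = self.postings.get(term, set())
                results = hits if results is None else results & hits
            return results or set()

    index = TextOnlyVideoIndex()
    index.add_video("v1", "schoolyard clip", "filmed at recess", ["awful"])
    print(index.search("schoolyard clip"))  # {'v1'}: found via words alone
    print(index.search("bullying"))         # set(): invisible if no one typed the word

An index like this can only act on what someone has already described in words, which is exactly why the notify-and-takedown loop depends on human reporters.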
Because Google's legal troubles are search troubles, I see a silver lining in all
this. The legal threats may be the excuse Google was looking for to work harder on improving its video search capabilities. That work will save Google from future legal trouble; it will
also help as Google looks to stay ahead of Hulu and Co. and remain the leader in online video content.
This wouldn't be the first time YouTube's legal challenges became long-term
opportunities for improving video content identification in highly monetizable ways. Consider YouTube's growth from its early days as a copyright-suit target, to its emerging status as a sought-after
partner among movie studios and music companies. As I've written before, much of that
transformation is due to Google's development of digital rights "fingerprinting" technology, which automatically identifies pirated clips within YouTube and gives rights owners the option of either
removing or monetizing the content they own. This technology didn't exist when Google first acquired YouTube.
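For a rough sense of how that kind of matching can work, here's a toy Python sketch in the spirit of fingerprinting, emphatically not Google's actual system: each frame is boiled down to a compact signature, and an upload is flagged when enough of its signatures overlap a rights holder's reference print. The hash, the threshold, and the policy values are all assumptions for illustration.

    def frame_hash(frame):
        # `frame` is a 2-D grid of grayscale values; comparing each pixel to
        # the frame's mean yields a small bit-signature (an average-hash idea).
        pixels = [p for row in frame for p in row]
        mean = sum(pixels) / len(pixels)
        return tuple(1 if p > mean else 0 for p in pixels)

    def fingerprint(frames):
        # A clip's fingerprint is the set of its frame signatures.
        return {frame_hash(f) for f in frames}

    def rights_decision(upload_frames, reference_print, policy="monetize", threshold=0.8):
        # Flag the upload if most of its signatures appear in the reference,
        # then apply whichever option the rights owner chose.
        upload_print = fingerprint(upload_frames)
        overlap = len(upload_print & reference_print)
        if overlap / max(len(upload_print), 1) >= threshold:
            return policy  # "remove" or "monetize", per the rights owner
        return "leave-up"

    movie = [[[10, 200], [30, 220]], [[5, 5], [250, 250]]]  # two tiny frames
    print(rights_decision(movie, fingerprint(movie)))       # 'monetize'

Note what makes this tractable: the reference print is known in advance, so the match is a mechanical lookup rather than a judgment call.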
Of course, as difficult as locating infringing video content may be, it's
still a search for clearly defined information, pulled from a clearly organized database. Using automation to find content that's generally offensive is a much fuzzier -- and therefore much more
difficult -- problem to solve. But that isn't to say the groundwork for a content-flagging system isn't beginning to take shape. Google
Goggles can already identify an object and search for it, based on the picture the user takes through a mobile device; clearly, Google is developing mastery at identifying an object without the
assistance of surrounding text. Meanwhile, parts of the security industry already have automated systems in place that recognize what suspicious activity looks like, so automated moving-image
flagging does exist.
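If those pieces ever converged, a flagging pipeline might look, in rough outline, like the Python sketch below: a frame-level classifier (stubbed out here) produces an offensiveness score, user reports nudge it upward, and borderline clips are queued for human review. The scoring function, weights, and thresholds are invented for illustration, not anything Google has described.

    AUTO_REMOVE, HUMAN_REVIEW, LEAVE_UP = "auto-remove", "human-review", "leave-up"

    def classifier_score(frames):
        # Stand-in for a real vision model that estimates how likely the
        # footage is to be abusive (0.0 = clearly benign, 1.0 = certain).
        raise NotImplementedError("plug in a real model here")

    def triage(frames, user_reports, score_fn=classifier_score):
        score = score_fn(frames)
        # User reports raise the effective score, so a heavily reported clip
        # reaches a human sooner than the model alone would send it.
        effective = min(1.0, score + 0.05 * user_reports)
        if effective >= 0.9:
            return AUTO_REMOVE
        if effective >= 0.5:
            return HUMAN_REVIEW
        return LEAVE_UP

    # e.g. triage(frames, user_reports=3, score_fn=lambda f: 0.6)
    # -> 'human-review': ambiguous footage plus a few reports gets human eyes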
Of course, the kinds of developments I'm thinking of are years away. But when they finally arrive, they'll make YouTube a safer place --
and video search a far more useful feature -- in ways that serve both users and Google's monetization engine.
Which is why a better video search experience just may be the silver lining to Google's legal woes.