YouTube Aims To Crack Down On Facial Recognition

Google's YouTube has changed its terms of service to make clear that it doesn't want images on the platform to end up in facial-recognition databases.

The new terms, which took effect Wednesday, state that users are prohibited from collecting or harvesting data that could be used to identify people, such as “faces” and “usernames,” without the subjects' permission.

YouTube's prior terms of service also prohibited users from collecting data that could be used to identify people, but didn't explicitly list “faces” as an example. 

The move comes around nine months after it came to light that the startup Clearview AI had scraped billions of images from YouTube (as well as Facebook and other social media services) in order to build a facial recognition database. Clearview then reportedly sold access to the database to police departments, government agencies and private companies around the country.

YouTube and the other web companies demanded that Clearview stop gathering data and delete any images it had already harvested.

It's not clear how YouTube will attempt to enforce the facial-recognition ban. The company says it can terminate the accounts of people who violate its terms, but that won't necessarily stop people from gathering data.

For its part, Clearview, which is facing lawsuits in federal and state courts, says its activities are shielded by free speech principles.

“Clearview’s collection and use of publicly-available photographs are protected under the First Amendment,” lawyers for the company wrote last month in a motion asking Cook County Circuit Court Judge Pamela Meyerson to throw out a lawsuit brought by the ACLU, which argues that Clearview is violating an Illinois biometric privacy law.

“Central to this case is the indisputable proposition that all information of potential relevance to this case is and has been publicly available,” Clearview argues. “Courts have repeatedly held that individuals have no right to privacy in materials they post on the Internet.”
