Commentary

Google To Scan Files In Its Cloud Services For Illegal, Harmful Content

A policy Google announced last week allows the company to scan users' files for illegal and harmful content.

The policy focuses on Google's Drive cloud-storage service, restricting access to files that violate company guidelines and terms of use.

Using algorithms that scan file content, Google will keep a closer eye on harmful material, ranging from cybercrime offenses to copyright infringement and child sexual abuse.

When Google identifies a file that violates the company’s Terms of Service or program policies, the owner will see a flag next to the file name and will lose the option to share the file. The file will no longer be publicly accessible, even to people who have the link.

The owner of the file will receive an email notifying them of the action and explaining how to request a review if they believe the flag is a mistake.


For items in shared drives, the shared drive manager will receive the notification.

Google said it also may remove or lock content that violates applicable local laws. 

On January 5, 2022, Google will update its Terms of Service to ensure that those using its services understand what to expect from Google, and what Google expects from them.

The list of don’ts also includes “misleading content”, unauthorized pictures of minors, and graphic depictions of blood and violence, as well as propaganda from violent organizations and movements. Copyrighted content may not be shared without permission, nor may links to websites where such material can be downloaded illegally.

Google’s decision to scan files in its cloud service follows a similar move by Apple.

In August, Apple said it would roll out technology to detect and report known child sexual abuse material to law enforcement, in a way it said would preserve user privacy.

A related feature is designed to block potentially sexually explicit photos sent and received through a child’s iMessage account.

Another feature intervenes when a user tries to search for terms related to Child Sexual Abuse Material (CSAM) through Siri and Search.
