Google engineers on Thursday released their vision of how the company plans to develop open privacy standards that are more in line with consumer expectations. With the help of the web community, Google will develop standards that, among other things, restrict fingerprinting on the web and improve the way browser cookies are classified.
“Over the last couple of weeks, we’ve started sharing our preliminary ideas for a Privacy Sandbox, a secure environment for personalization that also protects user privacy,” wrote Justin Schuh, director of Chrome engineering, in a blog post.
The new approach, dubbed Privacy Sandbox, aims to ensure that ads remain relevant for users who are willing to share their data with websites and advertisers.
The standards will anonymize aggregate user data and keep much more of it on the user's device rather than storing it in the cloud, reducing the risk of the data being compromised or stolen.
Alongside that vision, Google released a summary of each of the proposals that make up the Privacy Sandbox.
For starters, Google engineers are exploring how to deliver ads to large groups of like-minded people without allowing individually identifying data to leave the user’s browser.
This builds on differential privacy techniques, which Chrome has used for nearly five years to collect anonymous telemetry. The approach shows that a browser can report useful aggregate data without revealing that a particular person is, for example, a member of a group that likes dogs.
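The idea is easier to see with a toy example. The sketch below uses the classic randomized-response form of differential privacy, in which each browser flips its answer with some probability before reporting, so no single report is trustworthy but the aggregate rate can still be recovered. The group label and the flip probability here are illustrative assumptions, not Chrome's actual telemetry protocol.

```python
import random

def randomized_response(is_member: bool, flip_prob: float = 0.25) -> bool:
    """Report group membership, but lie with probability flip_prob,
    so no individual report reveals the true answer."""
    if random.random() < flip_prob:
        return not is_member
    return is_member

def estimate_true_rate(reports: list[bool], flip_prob: float = 0.25) -> float:
    """Recover the aggregate membership rate from noisy reports:
    observed = true*(1 - p) + (1 - true)*p, so true = (observed - p) / (1 - 2p)."""
    observed = sum(reports) / len(reports)
    return (observed - flip_prob) / (1 - 2 * flip_prob)

# Simulate 100,000 browsers where 30% of users actually belong to the group.
reports = [randomized_response(random.random() < 0.30) for _ in range(100_000)]
print(f"Estimated share of group members: {estimate_true_rate(reports):.3f}")
```

Run over a large population, the estimate lands close to the true 30% share, even though any single browser's report may well be a lie.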
The group will also work on conversion metrics -- a step that will help publishers and advertisers determine whether the ads they serve actually lead to more business. Google and Apple have published proposals for how some of these use cases can be addressed, but the thinking remains at an early stage.
“The proposals are a first step in exploring how to address the measurement needs of the advertiser without letting the advertiser track a specific user across sites,” Schuh wrote.
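One way to meet that measurement need without per-user tracking is to report only coarse, aggregated counts and suppress groups that are too small to hide an individual. The sketch below illustrates that general pattern; the field names and the threshold of 100 are assumptions for illustration, not any published Google or Apple API.

```python
from collections import Counter

MIN_GROUP_SIZE = 100  # assumed threshold: suppress counts below this size

def aggregate_conversions(reports: list[dict]) -> dict[str, int]:
    """Count conversions per campaign, releasing only counts large enough
    that no individual user can be singled out."""
    counts = Counter(r["campaign_id"] for r in reports if r.get("converted"))
    return {cid: n for cid, n in counts.items() if n >= MIN_GROUP_SIZE}

# Each report carries a campaign ID and a conversion flag -- nothing that
# identifies the user or the sites they visited.
reports = [{"campaign_id": "spring_sale", "converted": True}] * 250 \
        + [{"campaign_id": "niche_promo", "converted": True}] * 3
print(aggregate_conversions(reports))  # {'spring_sale': 250}
```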
Proposals also include fraud protection and protecting sandbox boundaries. “Removing certain capabilities from the web causes developers to find workarounds to keep their current systems working rather than going down the well-lit path,” he wrote.
“We’ve seen this recently in response to the actions that other browsers have taken to block cookies -- new techniques are emerging that are not transparent to the user, such as fingerprinting.”
Developers use such workarounds to collect “tiny bits of information that vary between users, such as what device they have or what fonts they have installed.” By combining several of these small data points, they can generate unique identifiers that can then be used to match a user across websites.
And unlike cookies, a fingerprint cannot be deleted, so users have no way to stop themselves from being identified.
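To make the mechanism concrete, the sketch below shows how a handful of individually weak traits can be combined and hashed into a stable identifier that survives clearing cookies. The attribute names and values are illustrative assumptions, not any particular tracker's code.

```python
import hashlib

def fingerprint(attributes: dict[str, str]) -> str:
    """Hash a canonical, sorted view of the browser's observable traits."""
    canonical = "|".join(f"{k}={v}" for k, v in sorted(attributes.items()))
    return hashlib.sha256(canonical.encode()).hexdigest()[:16]

# Hypothetical traits a page could observe without setting any cookie.
browser_traits = {
    "user_agent": "Mozilla/5.0 (X11; Linux x86_64) Chrome/76.0",
    "screen": "2560x1440",
    "timezone": "UTC-5",
    "installed_fonts": "Arial,Roboto,Noto Sans",
    "gpu_renderer": "ANGLE (Intel HD 620)",
}

# The same traits yield the same ID on every site the user visits,
# whether or not cookies are cleared.
print(fingerprint(browser_traits))
```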
“This subversion of user choice is wrong,” wrote Schuh -- something that has led Google to develop the Privacy Sandbox.