Google, Facebook, Twitter and other large tech companies would have to offer people the ability to opt out of receiving some forms of personalized content, under a proposed bipartisan bill unveiled Thursday.
The Filter Bubble Transparency Act, introduced by Sen. John Thune (R-South Dakota) and co-sponsored by Sens. Mark Warner (D-Virginia), Marsha Blackburn (R-Tennessee), Jerry Moran (R-Kansas) and Richard Blumenthal (D-Connecticut), would require large tech companies to disclose whether they display material to particular users based on personal data collected from them, including their web-browsing and search history.
The bill would also require the tech companies to offer a version of their services that doesn't return results based on personal factors, like web-browsing history.
The Federal Trade Commission would be tasked with enforcing the law. The measure would apply only to companies that collect data from more than 1 million users and make more than $50 million per year.
The proposed bill distinguishes between what it terms “opaque” algorithms and “input-transparent” algorithms.
“Opaque” algorithms, under the bill's definition, draw on data that people don't provide to use the service -- including their “history of web searches and browsing, geographical locations, physical activity, device interaction, and financial transactions.”
“Input-transparent” algorithms, by contrast, draw on material users provide in order to use the service -- such as the terms they are currently typing into a search box, speech patterns, saved preferences, and current geolocation data.
The proposed bill would require platforms to notify users when they employ an “opaque” algorithm, and to make available an “input-transparent” version of the service.
Earlier this year, a group of lawmakers introduced a separate bill that also aimed to address how tech companies use algorithms. That measure, the Algorithmic Accountability Act, introduced by Sens. Ron Wyden (D-Oregon), Cory Booker (D-N.J.) and Rep. Yvette D. Clarke (D-N.Y.), would require companies to study whether their algorithms pose risks to privacy, as well as whether they may result in inaccurate, unfair or discriminatory decisions.