Social Giants Manipulate Users Into Giving Up Privacy: Norwegian Study

Facebook and Google use design, “dark patterns,” and threats to manipulate consumers into accepting loss of privacy, putting the firms at possible odds with GDPR and U.S. law, according to a study by the Norwegian Consumer Council, a group funded by the Norwegian government.  

These tactics serve to “nudge users of Facebook and Google, and to a lesser degree Windows 10, toward the least privacy-friendly options to a degree that we consider unethical,” the study notes.

It was not clear at deadline whether these findings would lead to punitive action. The study notes that they could apply to any number of digital giants.

On May 25, the day GDPR took effect, the Austria-based group None Of Your Business filed complaints against Facebook, Google, WhatsApp and Instagram, arguing that “Tons of ‘consent boxes’ popped up online or in applications, often combined with a threat, that the service can no longer be used if users do not consent.” In addition, a French consumer group is challenging the idea of “forced consent.” It is the second European organization to do so in the wake of GDPR.

Overall, the Norwegian report does not paint a positive picture.

“The popups from Facebook, Google and Windows 10 have design, symbols and wording that nudge users away from the privacy friendly choices,” it states. “Choices are worded to compel users to make certain choices, while key information is omitted or downplayed.”

In addition, “none of them lets the user freely postpone decisions. Also, Facebook and Google threaten users with loss of functionality or deletion of the user account if the user does not choose the privacy intrusive option.”

Among these practices are “privacy intrusive default settings, misleading wording, giving users an illusion of control, hiding away privacy-friendly choices, take-it-or-leave-it choices, and choice architectures where choosing the privacy friendly option requires more effort for the users.”

Dark patterns are defined as “features of interface design crafted to trick users into doing things that they might not want to do, but which benefit the business in question.”

To start with, the social media giants benefit from the fact that most users never change, or even look at, their default settings. Those settings default to the “least privacy friendly option,” the study says.

But it adds that “personal data that is processed for purposes outside of the core functions of the service should not be mandatory or enabled by default.”

Case in point: Facebook’s GDPR popup “requires users to go into ‘Manage data settings’ to turn off ads based on data from third parties. If the user simply clicks ‘Accept and continue’, the setting is automatically turned on. This is not privacy by default.”

Similarly, “Google’s settings for ads personalisation and for sharing web & app activity requires the user to actively go into the privacy dashboard in order to disable them,” the study continues.

In contrast, Google’s settings to “store Location History, Device Information, and Voice & Audio Activity are turned off until the user actively enables them,” it notes.

In the case of face recognition, Facebook “effectively hides privacy-intrusive default settings from the user.” Consumers who want to keep that feature turned off “have to go into the settings and actively select off.”

However, Windows 10 requires users to click on every choice to proceed. This “is an example of giving users an explicit choice, rather than preselecting an option that is preferred from the service provider’s side.”

In addition, Windows 10 does not threaten the user with loss of data, or force the person to make an immediate decision.

The study notes, though, that “Facebook and Google provide their services free of charge, and monetize user data. Microsoft’s Windows 10 is not dependent on the same level of user data monetization.”

Another questionable practice is manipulative design. For instance, “the contrast of blue buttons for accepting, and dull grey to adjust settings away from the default, is an example of design intended to nudge users by making the ‘intended’ choice more salient.”

“From an ethical point of view,” the study adds, “we think that service providers should let users choose how personal data is used to serve tailored ads or experiences.”

This report was written with funding from the Norwegian Research Council and the Norwegian Ministry of Children and Equality, and with input from academics from the ALerT research project, BEUC, and Privacy International.
