Companies are increasingly turning to “dark patterns,” such as pre-checked boxes or fine-print disclosures, to dupe consumers into making purchases or parting with their data, according to a new staff report by the Federal Trade Commission.
The report, which the agency released Thursday, discusses several types of so-called dark patterns -- broadly defined as manipulative design practices that trick consumers.
The staff report focused on practices involving deceptive advertisements, confusing cancellation policies, and privacy interfaces that bury options to reject data sharing.
“Our report shows how more and more companies are using digital dark patterns to trick people into buying products and giving away their personal information,” Samuel Levine, head of the consumer protection bureau, stated. “This report -- and our cases -- send a clear message that these traps will not be tolerated.”
While all commissioners voted to release the report, Noah Phillips expressed reservations about the phrase “dark patterns.”
“I just don't think the term dark patterns is very helpful,” he said, adding that the phrase “reduces rather than adds clarity.”
Phillips also said some of the practices flagged in the report aren't in themselves “unfair” or “deceptive.”
He called particular attention to the report's section on privacy, which included a statement that companies should both minimize data collection and “avoid subverting consumers’ privacy choices.”
Phillips said that while he realizes some people don't like targeted advertising, ad personalization doesn't in itself run afoul of the FTC Act's prohibition on unfair or deceptive trade practices.
Among other privacy-related recommendations, the staff report urged businesses to “avoid default settings that lead to the collection, use, or disclosure of consumers’ information in a way that they did not expect,” and to “make consumer choices easy to access and understand.”
The report also says companies that collect sensitive information should give consumers “information that they need to make an informed decision,” such as whether that data will be shared with others.
Several states, including California, Colorado and Connecticut, have recently prohibited companies from using “dark patterns” to obtain people's consent to the use of their data.
In California, privacy regulations specifically state that businesses can't present consumers with double-negatives, such as “don't not sell my personal information.”
At the federal level, lawmakers introduced a bill last year that would have explicitly prohibited large web companies from designing interfaces in ways that thwart people's “autonomy” or “decision-making” regarding privacy options.
The self-regulatory group Network Advertising Initiative recently said it was concerned “about the use of dark patterns that have led to increased attention and regulation.” In May, the organization issued a best practices guide that included a recommendation for companies to make opt-in and opt-out options equally accessible, and to give them a similar look.