FaceApp's Broad Privacy Policy Draws Scrutiny

FaceApp, an app that first surfaced in 2017, saw a renewed surge of popularity this week as celebrities like the Jonas Brothers, Carrie Underwood and Kevin Hart posted photos depicting how they may look in the future.

The app, which was developed in Russia, uses artificial intelligence to digitally edit photos of faces.

This week, FaceApp's old age filter -- which shows how a face could appear decades in the future -- was especially popular.

But that's not the only filter. The app also allows people to edit photos in a variety of ways, including making faces appear more masculine or feminine, or changing subjects' skin color.

Like many apps, FaceApp asks users for access to more information than it apparently needs. Specifically, FaceApp requests access to users' entire photo libraries, although the app only needs access to whatever selfie people want to alter.

Despite its broad demands for data, FaceApp only appears to upload files that users have selected for revision, according to researcher Will Strafach.

FaceApp's privacy policy also appears to allow the company to retain photos in perpetuity -- although the company says most images are deleted within 48 hours.

Still, the app's connections to Russia, combined with a lack of written safeguards for users, have stirred concerns on Capitol Hill.

On Wednesday, Senate Minority Leader Chuck Schumer (D-NY) asked the Federal Trade Commission and the FBI to investigate the app.

“I have serious concerns regarding both the protection of the data that is being aggregated as well as whether users are aware of who may have access to it,” Schumer wrote.

The lawmaker highlights FaceApp's connection to Russia, writing it would be “deeply troubling” if citizens' sensitive personal information “was provided to a hostile foreign power actively engaged in cyber hostilities against the United States.”

FaceApp's CEO counters that no user data is being misused and that no data is “transferred” to Russia.

Schumer raises another issue that FaceApp may not be able to answer so easily. The lawmaker suggests the app's privacy policy relies on “dark patterns” -- meaning the company pushes users to agree to intrusive privacy terms.

“In order to operate the application, users must provide the company full and irrevocable access to their personal photos and data,” Schumer writes. “In practice providing this level of access to a user's data could mean that any photos taken with the application could be used publicly or privately in the future without a user's consent. Furthermore, it is unclear how long FaceApp retains a user's data or how a user may ensure their data is deleted.”

The senator continues: “These forms of 'dark patterns,' which manifest in opaque disclosures and broader user authorizations, can be misleading to consumers and may even constitute a deceptive trade practice.”

FaceApp is hardly the only web company to be accused of using dark patterns to get users to agree to intrusive privacy terms. On the contrary, last June, the Norwegian Consumer Council accused tech companies of duping people into accepting questionable privacy settings.

The organization's report, “Deceived by Design: How tech companies use dark patterns to discourage us from exercising our rights to privacy,” called out Silicon Valley giants Google and Facebook, arguing that those companies manipulate users through “intrusive” default settings, and by requiring people to navigate through numerous screens in order to opt out.

Earlier this year, Sens. Mark R. Warner (D-Virginia) and Deb Fischer (R-Nebraska) introduced a bill that would prohibit companies with more than 100 million active users from using dark patterns in privacy policy interfaces.

Whether the FTC will attempt to tackle “dark patterns” remains unknown. But any action by the FTC regarding privacy policies is likely to have repercussions that extend far beyond a viral photo-editing app.
