Companies that intend to use "facial recognition technology" to identify people based on their images should consider the privacy implications before moving forward. That's according to the Federal Trade Commission's new staff report, "Facing Facts: Best Practices for Common Uses of Facial Recognition Technologies."
In the report, the FTC discusses various potential uses of facial recognition technologies, and proposes standards for implementing "privacy by design" in services. Specifically, the agency recommends that social networks shouldn't deploy technology that enables automatic tagging -- or matching names and faces in photos -- without first explaining how the technology works and allowing people to opt out.
The FTC also says that companies should obtain opt-in consent in some circumstances -- such as when technology would allow someone to identify specific individuals based on their images. For instance, the report considers a hypothetical scenario in which a developer creates an app that lets people upload photos of strangers and then learn their identities.
"If such an app were to exist, a stranger could surreptitiously use the camera on his mobile phone to take a photo of an individual who is walking to work or meeting a friend for a drink and learn that individual’s identity -- and possibly more information, such as her address -- without the individual even being aware that her photo was taken," the report says. "Given the significant privacy and safety risks that such an app would raise, only consumers who have affirmatively chosen to participate in such a system should be identified."
What about the "Minority Report" scenario -- that is, the use of facial recognition software in digital signs used for marketing? The FTC recommends a notice-and-choice approach on a "sliding scale," based on how much information is gathered. For instance, if a marketer gathers only age and gender -- and doesn't store images -- a "walk-away choice" might suffice. That option would involve posting a notice before consumers encounter the signs, so that people can avoid them by literally walking away.
But if a company stores images, or tracks consumers across more than one sign, "the privacy risks become much greater, and therefore the companies should provide consumers with more robust transparency and choices," the report says.
One commissioner, J. Thomas Rosch, dissented from the report, saying it "goes too far, too soon." He writes: "I do not believe that such far-reaching conclusions and recommendations can be justified at this time."