Facebook Face-Recognition Tool Under Fire
Guess we were wrong to assume that Facebook had grown too powerful for privacy-related criticism. No, the social giant is now being forced to apologize for the way it rolled out its face-recognition system. "A group of privacy watchdogs drawn from the EU's 27 nations will study the measure for possible rules violations," Bloomberg reports.
"Tags of people on pictures should only happen based on people's prior consent and it can't be activated by default," the group tells Bloomberg, before promising to "clarify to Facebook that this can't happen like this."
In case you didn't know, "Facebook has quietly expanded the availability of technology to automatically identify people in photos," writes Reuters -- thus "renewing concerns about the privacy practices of the world's top social networking service."
Now, "The social network said that it should have done more to notify members about the global launch," BBC News writes. Along with apologizing, however, a Facebook spokesperson justified its technology to BBC News, saying that there had been "misconceptions" about what it does.
"Facebook must have been tripping if it thought it could enable automatic photo face-tagging without also automatically tripping the interest of European privacy regulators," writes paidContent.
Under the headline, "Europe belatedly freaks out about Facebook facial recognition feature," Gawker writes: "You remember when Facebook rolled out its creepy facial recognition technology last December in the U.S. ... Now it's available worldwide. Let's point and laugh as Europe freaks out about the same thing we did six months ago."
Along with privacy watchdogs, however, "Facebook users are expressing concern over its new facial recognition technology," DailyTech writes. "The new facial recognition technology, which was announced in December but only introduced to a small test group, is basically Facebook's way of creating a huge, photo-searchable database of its users," PCWorld notes. "And yes, it's terrifying."