UK May Order Snap To Pull Chatbot Over Teen Privacy

Privacy authorities in the United Kingdom could soon order Snapchat to halt processing data collected through its new artificial intelligence chatbot, My AI.

The nation's Information Commissioner's Office on Friday issued a “provisional” finding that the social platform didn't sufficiently evaluate potential risks to users under 18 before rolling out My AI earlier this year.

“If a final enforcement notice were to be adopted, Snap may be required to stop processing data in connection with ‘My AI,’” the agency wrote. “This means not offering the ‘My AI’ product to UK users pending Snap carrying out an adequate risk assessment.”

Information Commissioner John Edwards stated that the provisional findings “suggest a worrying failure by Snap to adequately identify and assess the privacy risks to children and other users before launching ‘My AI’.”

The agency added that the findings are preliminary and do not mean Snap has violated any laws.

When the company debuted My AI, it said the tool could “recommend birthday gift ideas for your BFF, plan a hiking trip for a long weekend, suggest a recipe for dinner, or even write a haiku about cheese for your cheddar-obsessed pal.”

Snap also warned users at the time not to share “secrets” with the chatbot or to rely on it for advice.

A 2021 UK law requires tech companies to follow sweeping privacy regulations when handling the data of users under 18 and to design services with minors' “best interests” in mind.

California recently passed a comparable law, the Age-Appropriate Design Code, which aims to regulate how online companies display content to minors and how they collect data from those users.

Last month, a federal judge in California blocked enforcement of that law, ruling that the statute appears to violate the First Amendment.
