Google Urges FTC To Weaken Children's Privacy Rules

Google is asking the Federal Trade Commission to revise children's privacy rules in ways that would make it easier for YouTube to collect data from people who watch videos aimed at young children.

The FTC currently presumes that people watching child-oriented videos are themselves children. That presumption prevents companies like YouTube from collecting personal data from those viewers, including cookie-based data used to serve targeted ads.

Google argued this week that adults also watch material aimed at children, and that the company shouldn't be required to treat those viewers differently than other users over the age of 12.

“Adults watch favorite cartoons from their childhood or teachers look for content to share with their students,” the company wrote in a post on the YouTube Creator blog. “We support allowing platforms to treat adults as adults if there are measures in place to help confirm that the user is an adult viewing kids content.”

Earlier this year, Google's YouTube agreed to pay $170 million to settle allegations that it violated the Children's Online Privacy Protection Act, which prohibits website operators from knowingly collecting data used for behavioral advertising from children younger than 13 without their parents' consent.

YouTube promised at the time to limit data collection on videos made for children, even if the people watching the videos are older than 12. The company also promised to stop serving behaviorally targeted ads on videos made for children, and to turn off comments and notifications on those videos.

YouTube's comments to the FTC this week come in response to the agency's call for input about whether to revise the regulations that implement the Children's Online Privacy Protection Act. Among other questions, the FTC sought public comment about a potential rule change that would allow platforms like YouTube to collect data from users ages 13 and older who watch videos aimed at younger children.

Specifically, the agency asked whether platforms “that identify and police child-directed content” should “be able to rebut the presumption that all users of the child-directed third-party content are children thereby allowing the platform to treat under and over age 13 users differently.”

Andrew Smith, the head of the FTC's consumer protection bureau, said in September that he anticipated the agency would revise at least some regulations. He suggested the agency was considering making it easier for platforms to serve behaviorally targeted ads to users over age 12 who view videos that appear to be directed at children.

At the time, Smith elaborated to MediaPost on possible ways platforms like YouTube might be able to distinguish users likely to be younger than 13 from older ones. Among other possibilities, YouTube could turn off commenting for users who don't have Gmail accounts, and then collect data only from users who leave comments. (Google requires Gmail users to be at least 13 years old.)

Four U.S. Senators recently asked the FTC not to weaken the regulations. “Now is not the time to pull back,” Sens. Ed Markey (D-Mass.), Richard Blumenthal (D-Conn.), Josh Hawley (R-Mo.), and Marsha Blackburn (R-Tenn.) wrote. “As children's use of technology continues to increase, so too does the appetite by tech giants for children's personal information.”

1 comment about "Google Urges FTC To Weaken Children's Privacy Rules".
  1. John Grono from GAP Research, December 11, 2019 at 4:34 p.m.

    I am amazed. Surely of all demographic groups, children should be the ones whose privacy is protected.

    Shame Google, shame.
