Senate To Mark Up Teens Online Safety Bill

The Senate Commerce Committee is scheduled to mark up a controversial bill Thursday that would regulate how online platforms display content and ads to users under 17.

The Kids Online Safety Act, reintroduced in May by Senators Richard Blumenthal (D-Connecticut) and Marsha Blackburn (R-Tennessee), would require platforms to take “reasonable measures” to prevent and mitigate potential harms associated with social media use -- including depression, eating disorders, and online bullying -- when displaying material to users the platforms know or should know are 16 or younger.

The proposed law also would require platforms to use the most privacy-protective default settings for teens. Other provisions would require platforms that serve ads to minors to label all ads, say why minors are being targeted with particular ads, and disclose all commercial endorsements.

While the law would not require teens to have their parents' permission to use social media, it would require platforms to inform parents that their children have accounts.

More than 200 organizations -- including Fairplay, the American Academy of Pediatrics and the Center for Digital Democracy -- recently urged lawmakers to advance the legislation.

“The Kids Online Safety Act seeks to hold social media companies accountable after their repeated failures to protect children and adolescents from the practices that make their platforms more harmful,” the groups said in a letter sent to lawmakers Tuesday. “The enormity of the youth mental health crisis needs to be addressed as the very real harms of social media are impacting our children today.”

A previous version of the bill advanced out of the Senate Commerce Committee last July. 

The current bill differs in a few ways from last year's. One change is that this year's bill only applies when platforms know or “reasonably should know” that users are under 17. Another is that this year's version specifically says platforms can allow minors to search for and request content. The current bill also says claims regarding harmful content should be backed up with “evidence-informed” medical information.

The measure tasks the Federal Trade Commission and state attorneys general with enforcement.

Critics say the revised bill is problematic for several reasons, including that it could prevent minors from accessing material protected by free speech principles. The First Amendment generally protects a wide range of content that might be harmful -- such as photos associated with eating disorders.

Ari Cohn, free speech counsel at the think tank TechFreedom, wrote in May that even though the new bill says platforms can allow minors to seek out content, other provisions could subject platforms to liability for displaying potentially harmful material -- even if requested by users.

“The most ‘reasonable’ and risk-averse course remains to block minors from accessing any content related to disfavored subjects, ultimately to the detriment of our nation’s youth,” Cohn wrote.

The digital rights group Electronic Frontier Foundation said the bill “unreasonably buckets all young people into a single category,” and “requires surveillance of minors by parents.”

The group added that even though the bill discusses design issues, it would actually regulate content.

“The designs of a platform are not causing eating disorders,” the organization wrote in May. “As a result, [the bill] would make platforms liable for the content they show minors, full stop. It will be based on vague requirements that any Attorney General could, more or less, make up.”

The Children and Teens’ Online Privacy Protection Act is also scheduled for a markup on Thursday. That bill would prohibit websites and apps from collecting a broad range of data -- including device identifiers, biometric information and geolocation -- from users between the ages of 13 and 15 without their express consent.
