
Six U.S. Senators on Wednesday urged the Republican leadership of the House of Representatives to bring the controversial Kids Online Safety and Privacy Act to a vote by the end of the year.
The bill, which would impose sweeping restrictions on tech companies' ability to serve content to minors and handle their data, was approved by the Senate in July but has stalled in the House.
The senators contend the measure will help protect children “from the harms caused by social media and other online platforms.”
“While the internet and digital tools have helped kids connect with others and the world around them, these benefits have come at a profound cost -- our children are experiencing emotional, mental, and physical harm from their use of digital platforms,” Senators Maria Cantwell (D-Washington), Ted Cruz (R-Texas), Ed Markey (D-Massachusetts), Bill Cassidy (R-Louisiana), Richard Blumenthal (D-Connecticut) and Marsha Blackburn (R-Tennessee) say in a letter to House Speaker Mike Johnson (R-Louisiana) and Majority Leader Steve Scalise (R-Louisiana).
The online safety portions of the bill aim to tackle potential harms associated with social media use, including depression, eating disorders, and online bullying. Those safety provisions would require tech platforms to use “reasonable care” to avoid harming minors via design features such as personalized recommendations, notifications and appearance-altering filters. The measure tasks the Federal Trade Commission with issuing guidance to platforms regarding online safety.
The bill's privacy provisions would prohibit websites and apps from collecting personal information -- including data stored on cookies, device identifiers and other pseudonymous information used for ad targeting -- from teens between the ages of 13 and 15 without their explicit consent. The bill would also ban targeted marketing to children and teens, and would require tech companies to let users under the age of 17 delete their personal information.
“Studies show the more time youth spend on social media, the greater the risk they will suffer from poor mental health, disordered eating, and diminished sleep quality,” the senators argue in their new letter. “The use of targeted advertising results in kids being shown ads for alcohol, tobacco, diet pills, or gambling sites. These risks are multiplied because social media platforms are designed to be addictive, so that kids will spend more time online.”
The bill's online safety provisions have proven particularly controversial. While some youth advocates support those provisions, a broad array of opponents -- including civil liberties groups and tech industry organizations -- say the bill would empower government officials to effectively censor content they deem inappropriate for teens.
The digital rights group Electronic Frontier Foundation warned that the online safety provisions would enable the FTC “to decide what kind of content ‘harms’ minors, then investigate or file lawsuits against websites that host that content.”
Interactive Advertising Bureau executive vice president Lartease Tiffith likewise stated this summer that the organization “is particularly worried about provisions that may limit free speech, effectively placing government oversight on platform content decisions.”
Some opponents have expressed specific concerns that the bill would allow government officials to target platforms that let teens access LGBTQ content, on the theory that such material is harmful.
Conservatives also have voiced fears that the bill will encourage platforms to aggressively filter any controversial speech, including posts that carry right-wing messages. The anti-abortion group Students for Life Action tweeted this summer that the bill threatens “pro-life free speech.”
Some states have passed their own versions of kids online safety bills, but it's not clear those efforts will survive constitutional challenges.
In California, for instance, judges temporarily blocked key portions of a law that would have required online companies likely to be accessed by users under 18 to evaluate whether the design of their services could expose minors to “potentially harmful” content, and to mitigate that potential harm.