Senators on Tuesday confronted executives from TikTok, Snap and YouTube with horror stories about young people online, attributing a range of tragedies to the companies' platforms.
The lawmakers told of young users who were harmed after participating in “blackout challenges” that involved choking themselves, or developed anorexia after spending time on social media, or were injured in car crashes after using Snapchat's “speed filter” to document how fast they were driving.
Whether it's fair to blame teens' car accidents, eating disorders or other self-destructive behavior on social media is up for debate. After all, many young people struggled with eating disorders -- not to mention drove recklessly -- long before the internet came along.
Fair or not, politicians are loudly accusing social media platforms of causing harm to young users.
“We need stronger rules to protect children online,” Senator Richard Blumenthal (D-Connecticut), chairman of the Commerce Committee's subcommittee on consumer safety, said Tuesday at the start of the hearing.
“Big Tech must stop putting profits before people,” he added on Twitter.
Several other lawmakers piled on, relating stories about constituents' children who suffered harm -- physical as well as psychological -- after spending time on social media.
Blumenthal -- who has loudly criticized Facebook for allegedly promoting problematic content -- made it clear that he also has TikTok, Snap and YouTube in his sights.
"Being different from Facebook is not a defense. That bar is in the gutter,” he said. “What we want is not a race to the bottom, but really a race to the top."
Despite the tough talk, any attempts to regulate content on social media will face obvious and significant First Amendment hurdles.
But privacy laws that regulate how companies can collect and use data for commercial purposes might stand up in court against a challenge to their constitutionality. (Last year, a federal judge in Maine rejected broadband carriers' request to immediately block a privacy law requiring them to obtain users' opt-in consent before drawing on their web activity to serve them ads.)
At Tuesday's hearing, Senator Ed Markey (D-Massachusetts), who has long proposed new limits on data collection, argued that privacy laws could address the perceived harms of social media.
“The problem is clear: big tech preys on children and teens to make money,” he said. “Now is the time for legislative solutions to these problems, and that starts with privacy.”
Markey added: “Today, a 13-year-old girl on these apps has no privacy rights. She has no ability to say no -- no, you can't gobble up data about me; no, you can't use that data to power algorithms that push toxic content towards me; no, you can't profile me to manipulate me and keep me glued to your apps.”
He noted that his proposed Children and Teens’ Online Privacy Protection Act would prohibit websites and apps from collecting personal information -- including IP addresses, device identifiers and other data that can be connected to specific devices -- from teens between the ages of 13 and 15 without first obtaining their explicit consent.
That bill would also prohibit online companies from using behavioral-advertising techniques -- such as serving ads based on web-browsing activity -- on users the companies have reason to believe are under 13.
Current law requires website and app operators to obtain parental consent before collecting personal data, including data used for ad targeting, from children known to be under 13.
Markey asked the company representatives at the hearing whether they would support his proposed bill.
TikTok's Michael Beckerman said the company supported an update to the current children's online privacy law and liked Markey's approach, but wanted to see changes to the bill's age-verification provisions.
Snap's Jennifer Stout, vice president of global public policy, said the company agreed there should be “additional protections” for young people, but stopped short of endorsing Markey's proposal.