Lawmakers throughout the country, as well as parents and teenagers, have accused social media platforms of addicting young people by serving them personalized content, sending notifications and automatically playing video after video.
Now, the Federal Trade Commission appears to be gearing up to weigh in on social media use by teens.
On Thursday, the agency said it plans to hold a virtual workshop to study the “use of design features on digital platforms aimed at keeping kids, including teens, online longer and coming back more frequently.”
The FTC elaborated that the workshop will delve into whether design features result in “more engagement or time spent on digital platforms,” the effects of design features on minors' physical and psychological health, and what measures “might be effective, feasible, and consistent with the current legal landscape.”
News of the workshop comes amid increasing efforts to regulate how social platforms display material to teens.
For instance, California Governor Gavin Newsom late last week signed a bill requiring social media platforms to display content to known minors in reverse chronological order, as opposed to algorithmically curating their feeds, unless parents consent to the curation.
Earlier this year, New York enacted similar legislation.
Both state laws also restrict platforms from sending notifications to minors during certain hours.
In California and New York, the statutes were touted as measures that would help combat social media addiction.
But critics say the laws unconstitutionally interfere with platforms' First Amendment right to wield editorial control. Opponents also argue that algorithms benefit users by weeding out problematic posts -- such as bullying comments -- that would surface if feeds were displayed chronologically.
Neither one of those bills has been challenged in court -- at least not yet.
But a federal appellate court recently blocked key portions of a different California bill -- the Age Appropriate Design Code -- that also would have regulated social media companies' ability to display content to minors.
That statute requires online companies likely to be accessed by users under 18 to evaluate whether the design of their services could expose minors to “potentially harmful” content, and to mitigate that potential harm. The law also includes privacy provisions, such as a requirement to configure default settings in a privacy-protective way, unless the business can show a “compelling reason that a different setting is in the best interests of children.”
The FTC workshop will take place on February 25, 2025.