During a five-hour session on Thursday, Democratic and Republican members of the House Energy & Commerce Committee alike hammered the CEOs of Facebook, Google and Twitter about disinformation, privacy and children’s safety issues — and made it clear that the era of self-regulation is over.
“This panel has done something truly rare in Washington these days: It has united Democrats and Republicans,” declared Rep. Angie Craig (D-Minn.). “Your industry cannot be trusted to regulate itself.”
“This hearing marks a new relationship between all of us today — there will be accountability,” concurred Rep. Bill Johnson (R-Ohio).
The tech giants were criticized for failing to control extremist, violence-inspiring content including hate speech and conspiracy theories, as well as misinformation (some of it now shown to be spread by Russia) about vaccines and COVID-19.
Some members on both sides of the aisle stressed the harmful effects on society and democracy of rampant disinformation — no surprise, given that the hearing’s official topic was: “Disinformation Nation: Social Media's Role in Promoting Extremism and Disinformation.”
Twitter CEO Jack Dorsey acknowledged that Twitter bore some responsibility in enabling the insurrectionist attack on the U.S. Capitol on Jan. 6, although he added that there was a broader context. “It’s not just about the technological systems that we use,” he said.
Facebook CEO Mark Zuckerberg at first denied that his platforms were in part responsible — instead blaming the current divisive political climate, former President Trump and the rioters — but later backpedaled, saying: “Certainly there was content on our services and, from that perspective, I think that there is further work we need to do.”
Google CEO Sundar Pichai sidestepped the question, saying that those at Google “always feel a deep sense of responsibility” but had “worked hard this election.”
The CEOs repeatedly defended their efforts to curb dangerous content, while admitting that such efforts will likely never be 100% successful.
Zuckerberg, for example, maintained that while extreme, hate-baiting content might drive more clicks in the short term, “it's not good for our business or our product or our community for this content to be there… It's not what people want, and we run the company for the long term.”
Committee Chairman Rep. Frank Pallone (D-NJ) slammed the CEOs. “You definitely give the impression that you don’t think that you’re actively in any way promoting this misinformation and extremism, and I totally disagree with that," he said. “You're not passive bystanders — when you spread misinformation, actively promoting and amplifying it, you do it because you make more money.”
While some Republicans continued to assert that the platforms discriminate against conservatives in their content moderation and policies — a claim that’s been punctured by multiple studies — a bipartisan emphasis on protecting children, in particular, seemed to take precedence this time.
Facebook’s plan to launch a version of Instagram for those under 13 took heavy flak, for example.
"This committee is ready to legislate to protect our children from your ambition," said Rep. Lori Trahan (D-Mass.).
"Big Tech is handing kids a lit cigarette, and hoping they get addicted for life," asserted Rep. Johnson.
Various members plan to propose bills that would attempt to address a host of complex issues, including banning “surveillance advertising” (see MediaPost’s separate coverage); modifying or eliminating Section 230, the law that shields platforms from liability for user-generated content and moderation decisions; and banning harmful content targeting children.
The hearing had a broader backdrop, of course: antitrust suits brought by state attorneys general and the Federal Trade Commission.
In addition, even as the hearing was underway on Thursday, the FTC announced that it is poised to begin issuing binding regulations on unfair methods of competition, rather than merely making recommendations, in order to deter “novel harms of the digital economy.”