Commentary

The 'Big Tobacco' Moment For Social Media

Be honest: how many times have you discussed "dwell time" and "session frequency" with your agency or marketing team? We’d look at a chart showing a user spiraling down a three-hour rabbit hole of auto-playing videos and infinite scrolls and think the targeting was super-efficient.

But this week, a jury in Los Angeles looked at those same charts and called what they showed something else: a defective product.

For the first time, a U.S. jury has found Meta and Google (YouTube) liable for platform addiction. Hot on its heels, a New Mexico jury slapped Meta with a $375 million verdict over child safety and misleading "safety" claims.

This isn’t just another "Big Tech is mean" news cycle. Many have called this the Big Tobacco moment for the social media industry. And if you’re a CMO, your media plan may have just become a legal and ethical liability.

For years, platforms hid behind a legal framework called Section 230, a “shield” that says they aren't responsible for what people post. But these recent rulings did something different: They ignored the content and went after the conduct.

The courtroom arguments weren’t about a specific bad video or objectionable meme; they were about the design of the platform itself. The infinite scroll? That’s not a feature; it’s a hook. The aggressive autoplay? That’s a dopamine trap. The juries found that these platforms are addictive by design, and that the companies knew the harm they were causing.

For decades, the platforms claimed, in their own guidance, to be neutral utilities. We assumed that if our "brand safety" filters kept our pre-roll ads away from extremist content or the "wrong" target audience, we were the good guys.

But here’s how these verdicts reframe the advertiser’s position: we’ve been funding the R&D of addiction. Does that make us complicit?

Let’s look at this through the lens of your 2026/27 strategy. If Meta and Google are forced to dismantle the very features that drive their "efficiency" (if they have to kill the infinite scroll or opt everyone out of algorithmic "hooks"), your reach is going to diminish. Your "cost per minute" is going to skyrocket, because those minutes will suddenly become much harder to manufacture. And it will become much harder to identify and single out specific groups of users based on their interests or behaviors.

The "efficiency" we’ve been bragging about to our boards was built on a foundation that a jury just declared negligent. That’s a structural collapse of the very medium.

So what should you do? Well, you don’t need to panic and delete your accounts today (there will be appeals), but you do need to stop being a passive advertiser.

Ask your media agency for a breakdown of how much of your budget goes toward "forced" or "hook-based" engagement versus intentional, lean-forward viewing. If your ROI depends entirely on a user being "trapped" in a feed, your strategy needs a rethink.

Traditional brand safety is about adjacency: What am I next to? Start asking whether the platforms you support comply with these new standards of design safety.

And if the largest part of your budget sits with the duopoly, you’re overleveraged in a damaged model. Move toward environments where the user is there by choice, not by "loop."

As it turns out, the algorithm might just be a legal liability.

1 comment about "The 'Big Tobacco' Moment For Social Media".
  1. Gordon Hotchkiss from Out of My Gord Consulting, March 27, 2026 at 1:22 p.m.

    Maarten - thanks for doing this. You beat me to the punch. The timing is interesting - my last post was suggesting looking at alternative destinations with smaller but more trusting communities.
