Commentary

Social Media Has No Incentive To Protect Children

Let’s you and me go back in time.

The year is 2012, and I want you to imagine you’re the CEO of a company called YouTube.

Your company is doing pretty well -- you were bought by Google for $1.65 billion -- but you’re still not particularly mainstream. You don’t compete against the broadcast networks or even really cable TV. You’re best known for a video of young David coming home from the dentist, high as a kite.

You decide you want to do better. So you set a goal. The goal is to reach a billion hours of videos viewed per day.

Now, by everything you’ve been taught at business school, this is an awesome goal. It’s specific, it’s measurable, it’s attainable yet hairy and audacious, set up so everyone knows if they’re winning or losing.

And it works. It galvanizes the team. They start working toward the goal and they start optimizing the algorithm. And what you learn is that, if you want to drive video views, the more shocking the content, the better.


By 2014, you’re a third of the way there. And then a senior VP of something-or-other says, “Hey, boss, you know how we’ve been optimizing for shocking content because it gets more video views? Well, it looks like our algorithm might be working a little too well. We’re actually recommending content that is unethical. Immoral. Possibly illegal. Violence. Pornography. Bestiality. Sexualized images of children. And this content isn’t just showing up for people who search for it explicitly. It’s showing up for people who started out looking at animated clips for kids. What do you want us to do?”

What would you do? Would you try to change the algorithm to favor videos that are a little less aggressive, or would you ignore it?

Perhaps you imagine that you would obviously change the algorithm. But consider how powerful the incentives are to look the other way:

-- Your performance is measured by how much money the company makes -- and you’re making heaps.

-- Your performance is monitored by the board of directors -- and yours is thrilled.

-- Your performance might be impacted by legal constraints -- but you didn’t create the content, and you’re not liable for it.

And remember, your “billion hours of video views” goal is WORKING.

So what would you do?

I’ll tell you what YouTube did: they ignored it.

They ignored it because 100% of their incentives were aligned for them to ignore it.

But that’s just one company, with one old story. Surely it doesn’t embody all of social media, right?

Last month, NPR reported on a number of internal documents that came out as part of an investigation into TikTok. These documents showed that an average user is likely to become addicted to the platform in just 35 minutes.

TikTok’s own research found that, “compulsive usage correlates with a slew of negative mental health effects like loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety… [C]ompulsive usage also interferes with essential personal responsibilities like sufficient sleep, work/school responsibilities, and connecting with loved ones.”

One document said, “As expected, across most engagement metrics, the younger the user, the better the performance.” And those young users are getting exposed to a whole heap of dodgy stuff. Despite TikTok’s policies banning certain types of content, internal documents acknowledge “leakage rates” -- the share of violating content that slips past moderation: 35.71% for “Normalization of Pedophilia,” 33.33% for “Minor Sexual Solicitation,” and 100% for “Fetishizing Minors.”

Like YouTube’s, TikTok’s incentives are aligned for it to ignore harm done to kids. Referring to a tool theoretically designed to reduce time spent on the platform, a project manager said, “Our goal is not to reduce the time spent,” while another employee said the goal is to “contribute to DAU [daily active users] and retention” of users.

With little incentive for social media platforms to regulate themselves, and huge rewards for exploiting vulnerable young people, Australia’s plan to ban social media for children under 16 makes complete sense.

If we want to take care of our kids, we have to recognize that social media CEOs aren’t rewarded for protecting young people: not by their boards or by the market or even by their users. It takes external policies, regulation, frameworks, or even bans.

Let’s go, Australia. And more like that, please.

3 comments about "Social Media Has No Incentive To Protect Children".
  1. Steve Rosenbaum from SustainableMedia.Center, November 15, 2024 at 2:55 p.m.

    Kaila - so many things to say here. "their incentives were aligned for them to ignore it" - we're seeing the absolute result of misaligned incentives, with no penalty for the pain and damage that the current "hate-for-profit" model embraces. Certainly, it's possible that if parents who've lost children to the power of addiction algorithms could get in front of a jury, they might start to inflict economic pain that would be material. However, Section 230, which gives the platforms absolute indemnity, seems unlikely to be struck down. Though who knows; there may be bipartisan support. We shall see.

  2. Ben B from Retired, November 15, 2024 at 11:49 p.m.

    It's personal responsibility, in my opinion, and the parents' job if they want their teens on social media, not the government's; that's a form of censorship, which is wrong. I believe that Section 230 will be reformed and that social media will be sued, but it will just be pocket change to the companies that own the platforms, along with a fine, and that will be it, in my opinion. There does seem to be bipartisan support: there are bills, the Senate has passed one, and the House may put it up for a vote. Either Biden or Trump will sign it into law.

  3. Kaila Colbin from Boma replied, November 21, 2024 at 11:18 p.m.

    I can see how it seems like personal responsibility. At the same time, pulling kids off social media when all the other kids are on it is like pulling them out of school or not letting them play sport. If everyone is on it, it's a massive challenge to take one kid off. When kids are surveyed, most of them say they'd love to abandon social media but they can't because that's where their friends are.
