This law, he argued, was why “early adopters are always fleeing online social networks [e.g., LinkedIn], only to join a new one [e.g., Facebook]. They’re fleeing the spam.”
Up until then, this law had held true. They had fled Friendster. They had fled GeoCities. They were shortly to flee MySpace -- to the newfangled site Hugh had mentioned, Facebook.
For a while, it seemed Facebook might fall prey to the same phenomenon. Certainly, we all got sick of requests to visit our friends’ farms. By 2010, “South Park” had aired an episode called “You Have 0 Friends.”
But something shifted. This online social network flirted with being a swampy mush of spam, but never went all the way. According to Wikipedia, by 2011, “Facebook was removing about 20,000 profiles daily for violations such as spam, graphic content and underage use, as part of its efforts to boost cyber security.”
Zuckerberg and his team were smart enough to realize that spam was driving people away -- and quick enough to do something about it. And they were smart enough to realize that reducing revenue in the short term was well worth it if it meant continuing to grow the user base.
So instead of focusing on pushing more ads to their existing user base -- thereby alienating the very eyeballs they needed to sell -- they started focusing on “quality.”
I put “quality” in scare quotes because what’s really meant by that is “what people want to see.” And I put “what people want to see” in scare quotes because what that really means is “what makes people click and stick.”
There’s a critical difference between “quality” and “what makes people click and stick.” It’s like the difference between eating healthfully and eating junk food. Junk food is easier to sell. We don’t have to work for it. It speaks to our lizard brains, not our executive brains.
Likewise, we’re much more likely to stick around for online content that doesn’t make us work. If things get uncomfortable, we tend to run -- and Facebook didn’t want you running.
So instead, they started showing you stuff that validated you. Stuff that agreed with your pre-existing ideas. Stuff that made you feel smart, switched-on, seen.
And if you didn’t agree with me? No problem. You got to see stuff that validated you. Stuff that agreed with your pre-existing ideas. Stuff that made you feel smart, switched-on, seen.
This maybe doesn’t seem so bad. But it turns out that just validating us isn’t enough, either. To keep us clicking, the ante needs to be continually raised; the dopamine hits need to keep coming.
These feedback loops fed themselves: The more they showed, the more you clicked; the more you clicked, the more they showed.
And while I’ve written before that Hugh’s Law may finally be coming true for Facebook, the more I think about it, the more I realize we need a revision of Hugh’s Law.
So here’s Colbin’s Law: “All social networks drive towards extremes.”
The job of an automated system is to find what works and amplify it. To create positive feedback loops. Those feedback loops self-reinforce and become stronger, and it becomes harder and harder to achieve balance or moderation.
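To see how quickly such a loop runs away, here’s a minimal sketch in Python, under assumptions that are entirely mine: ten hypothetical posts, a made-up click model in which more provocative (“hotter”) posts get clicked slightly more often, and a ranker that shows posts in proportion to a score that each click multiplies. None of this is Facebook’s actual system; it’s just the bare mechanics of “find what works and amplify it.”

```python
import random

random.seed(0)  # reproducible toy run

# Toy feed: ten posts, where "heat" measures how provocative a post is
# (0.0 = mild, 1.0 = extreme). All numbers here are invented for illustration.
posts = [{"id": i, "heat": i / 9, "score": 1.0} for i in range(10)]

def pick(posts):
    # The "find what works" step: show posts in proportion to their score.
    return random.choices(posts, weights=[p["score"] for p in posts])[0]

def clicked(post):
    # Assumed user model: a 10% baseline click rate, plus a modest bump
    # for provocation. This is my assumption, not measured behavior.
    return random.random() < 0.1 + 0.4 * post["heat"]

for _ in range(10_000):
    shown = pick(posts)
    if clicked(shown):
        shown["score"] *= 1.05  # the "amplify it" step: clicks buy future exposure

# The loop concentrates exposure on the hottest handful of posts.
total = sum(p["score"] for p in posts)
for p in sorted(posts, key=lambda p: p["score"], reverse=True)[:3]:
    print(f"post {p['id']}: heat={p['heat']:.2f}, share of feed={p['score']/total:.0%}")
```

Even with only a modest click bonus for provocation, the compounding does the rest: after a few thousand impressions, the mildest posts barely get shown at all. That’s the drive toward extremes in miniature.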
But if we want to have any hope of a constructive, inclusive future that works for everyone, we’ll need to find a way toward balance.
In real estate, as we’ve all heard, it’s all about “location, location, location.”
In social media, it’s all about “moderation, moderation, moderation.”
But since moderation costs money and cuts into profits, it ain’t gonna happen. It’s really that simple.