Yes, I’m belatedly jumping on the #AlexfromTarget bandwagon, but it’s in service of a greater truth that I’m trying to illustrate. Last column, I discussed the "Unintended Consequences of Technology." I think this qualifies.
Furthermore, this brings us full circle to Kaila Colbin’s original point, which started this whole prolonged discussion: "It is up to us to decide what is important, to create meaning and purpose. And, personally, I think we could do a better job than we’re doing now."
So, why did the entire world go ga-ga over a grocery bagger from Texas? What could possibly be important about this?
Well, nothing -- and that’s the point. Thinking about important things is hard work. Damned hard work -- if it’s really important. Important things are complex. They make our brains hurt. It’s difficult to pin them down long enough to plant some hooks of understanding in them. They’re like eating broccoli, or doing push-ups. These things may be good for us, but that doesn’t make them any more fun.
Remember the Yir Yoront from my last column, the tribal society that was thrown into a tailspin by the introduction of steel axes? The intended consequence of that introduction was to make the Yir Yoront more productive. The axes did let the tribe finish their essential tasks more quickly -- but the result was that the Yir Yoront spent the time saved sleeping.
Here’s the thing about technology. It allows us to be more human -- and by that I mean the mixed bag of good and bad that defines humanity. It extends our natural instincts. It’s natural to sleep if you don’t have to worry about survival. And it’s also natural for young girls to gossip about adorable young boys. These are hard-wired traits. Deep philosophical thought is not a hard-wired trait. Humans can do it, but it takes conscious effort.
Here’s where the normal distribution curve comes in. Any genetically determined trait will have a normal distribution over the population. How we apply new technologies will be no different. The vast majority of the population will cluster around the mean. But here’s the other thing: that “mean” is a moving target. As our brains “re-wire” and adapt to new technologies, the mean that defines typical behavior will move over time. We adapt strategies to incorporate our new technology-aided abilities. This creates a new societal standard -- and it is also human to follow the unwritten rules of society. The result is a cause-and-effect cycle.
So technologies enable new behaviors that are built on top of the foundations of human instinct; society then determines whether these new behaviors are acceptable -- and if they are, they become the new “mean” of our behavioral bell curve. We bounce new behaviors off the backboard of society.
So, much as we may scoff at the fan girls who gave “Alex” insta-fame -- ultimately it’s not the girls' fault, or technology’s. The blame lies with us. It also lies with Ellen DeGeneres, the New York Times, and the other barometers of societal acceptance that endorsed the phenomenon.
It’s human to be distracted by the titillating and trivial. It’s also human to gossip about it. There’s nothing new here. It’s just that these behaviors used to remain trapped within the limited confines of our own social networks. Now, however, they’re amplified through technology. It’s difficult to determine what the long-term consequences of this might be. Was Nicholas Carr, in his book "The Shallows," right? Is technology leading us down the garden path to imbecility, forever distracted by bright, shiny objects? Or is our finest moment yet to come?