Meeting of the Minds
From more articulated typing fingers to an information sense, to a group mind — are there plausible evolutions technology is bringing about in man?
What do you know?
I don’t mean like, “Hey, look at that!” I mean actually what do you know right now, as in: of what things are you aware? Chances are, there are many occurrences and activities of which you are aware in places all over the globe that you wouldn’t have been 10 or even five years ago. Many of these may be mundane: what your old college roommate in Denver had for breakfast, or how a former colleague set a personal best in a 10K halfway across the country. But the fact remains: you know them.
It’s a step in the evolution of human consciousness, even if for the moment it mostly has to do with things like “OH: drunk white girl at bar, talking to Indian man in turban: ‘you’re so ethnic, I fucking love it.’ #iweepforthefuture.” Consciousness, of a type, has been raised.
In terms of the grand scale of evolution and human development, it’s these areas that may be the most likely to change and adapt along with technology.
And, as usual, we look to MEDIA magazine’s favorite futurist for an example. Think of the precogs in Philip K. Dick’s Minority Report. They are depicted not so much as having supernatural powers as being a notch up evolutionarily, able to biologically tap into the ether of data all around us. They’ve developed a sort of “information sense.”
At present, with our multiple washes and feeds of information, all we are capable of are neat parlor tricks. My father called me all excited one recent night. He was, he said, at Lincoln Center on the way in to see the play War Horse and, he told me, I would never believe who he had just met. My brain flipped through the available information, and, surprising even myself, I blurted, “R. A. Dickey.”
My dad’s excitement of moments ago noticeably deflated, but a sort of wonderment took its place. “How’d you know that?” he asked. I was, after all, thousands of miles away in California. But when he had said to guess who he had just met, my mind immediately and naturally went to the most recent information I’d gotten about the place in New York he’d just mentioned. I happen to follow Mets knuckleballer R. A. Dickey on Twitter, and just a few minutes before had seen him checking in to the same show with: “About to see War Horse. Heard good things,” and made the correct guess that my father had run into him.
While this doesn’t rise to the level of precognition, imagine if we all always had direct access to all the information out there (not just what we happened to see), directly from our brains. No typing. No googling. No RSS feeds. No following. Just a direct connection.
“Changes in the immediate environment or human occupation [or activity] don’t have consequences for evolution unless these changes affect reproduction and DNA transmission. That’s an extreme situation,” says Margaret J. King, Ph.D., of the Center for Cultural Studies and Analysis in Philadelphia. “We have basically the same body and brain that Cro-Magnon humans had. It’s how we put our inherent faculties to use that counts.”
So chances are that we aren’t going to develop specially articulated typing fingers or light-sensitive eyes for screen reading any time soon. Interface designers are essentially designing for users little more advanced than Neanderthals mashing away at keyboards and touchscreens.
“Will our hands adapt to a more suitable spacing for keyboard use? The answer is a resounding ‘no,’ ” says Christopher Reynolds, a business analyst (and sci-fi buff with a degree in biology for good measure) and marketer with EPS Software. “While there might be an aesthetic and ergonomic advantage to a more comfortable ‘keyboard hand,’ it plays little into survival and mate selection.” In essence, we’re working with the bodies we have, not the ones we wish we had.
Our technological innovation will likely far outpace any macroevolution. However, short-term microevolution, or phenotypic evolution as it is known (natural selection would be the classic example), can occur rapidly due to external changes (though, in these cases, there needs to be a survival advantage or immediate impact on mate selection). Case in point: the evolution of the peppered moth in England, often pointed to because of the rapidity with which moths can evolve (they produce a new generation every year) and the clarity of the example. About 200 years ago the peppered moths in England were predominantly light-colored, a camouflage they’d developed that hid them in the trees and lichens on which they rested. During the Industrial Revolution soot discolored and darkened the trees and killed the lichen, and hence, the light-colored peppered moths became easy marks for predators. The minority, darker-colored moths, however, survived and became the dominant form.
We are not moths — so, barring some extinction-level event where only those who can type 120 words per minute survive, the way technology will affect human development will mostly come down to adaptation. And then there is always the possibility of technology-assisted human development and augmentation — for example, the apparatus that keeps Dick Cheney alive.
And we do adapt. “Our extreme adaptability is what makes us survivable as a species, which means that we do different things than our ancestors, but our genome stays the same,” says King.
Right now, our bodies might be adapting to their environments a little too readily. Technology, however crude, can affect human appearance (think Mayans’ sloping foreheads). When asked what sort of adaptations might be on the way, Dan Nainan, a professional comedian and former senior engineer at Intel, points to the possibility of “short, stubby legs, because we won’t have to walk anywhere.” Then he cites a recently circulated (and frightening) statistic: “We’re practically there now. Did you know that by 2030, half of Americans will be obese?” And the trend is worldwide. “Increased supply of cheap, tasty, energy-dense food, improved food distribution and marketing, and the strong economic forces driving consumption and growth are the key drivers of the obesity epidemic,” wrote Boyd Swinburn and Gary Sacks of the WHO Collaborating Centre for Obesity Prevention at Deakin University in Melbourne, Australia, in a research paper.
Another environmental adaptation is an increase in a sort of forced ADHD. Some have suggested that this is a coping mechanism and reaction to the onslaught of information with which we are presented daily.
“Due to the many new channels of communication — text, IM, email, social networks, phone, etc. — people are forced to handle communication and multitasking at a higher level. They are forced not only to accept the communications and respond to it, but forced to prioritize,” says Rob Hecht, adjunct marketing professor at New York’s Baruch College and “digital innovations” strategist. “People at work today have minds that have become accustomed to this massive multitasking all day. They are connected to the matrix through all sorts of media,” says Hecht, “But when they withdraw from that speed and multitasking they have become so good at — for example, go on a vacation — they feel panic, and they feel depressed.”
A recent study by MTV Networks in conjunction with research consultancy Latitude found that 83 percent of smartphone users polled felt they were “addicted to apps.” One even went so far as to call apps “Xanax in a phone.” To further explore this, researchers asked a handful of study participants to spend three days without using an app. At the end of the three days they were asked what would happen if apps were withheld for two weeks. One young woman responded, “I don’t think you’d find me alive after the second week.”
Tiffany Shlain, founder of the Webby Awards and a Henry Crown Fellow of the Aspen Institute, tackled her own creeping dependence on technology in her new documentary Connected: An Autoblogography about Love, Death & Technology. As Shlain tells it in the film, her hard look in the mirror began when she found herself sneaking into a bathroom stall when she couldn’t fight the urge to check her messages. How seriously would she rate a smartphone addiction? “I used to be a smoker,” Shlain says. “I get that same twitchy jonesing feeling [when not checking messages]. But there are plenty of studies where when we get a ‘hit’ of a text or email, oxytocin flows — the ‘bonding’ hormone — so clearly, it’s not as bad as smoking.”
Oxytocin, the so-called bonding hormone to which Shlain refers, is produced in large amounts by mothers during childbirth and when breastfeeding — presumably nature’s way of ensuring we care for and nurture newborns. The fact that social media and digital interactions produce this same hormone raises one important question: are we bonding with our devices or the people at the other end — or even just the surge of connection?
Could it be, then, that nature is pushing us toward an ever-increasing connectivity?
All of our data and information — whether music libraries or check-ins at restaurants — is increasingly uploaded to the cloud. So, might there be some biological way to access this? The easy answer is, probably not without some help. Whether this help consists of chips implanted in our heads turning us into always-on iHumans, or some sort of virus that can be injected into our bodies giving us this ability, the potential does exist.
William McEwan, a postdoctoral researcher at the MRC Laboratory of Molecular Biology in Cambridge, UK, points to the (admittedly far-fetched) applications of the sort of domain fusion viruses he creates (mostly for the very serious purpose of stanching the spread of AIDS) as electrical components. If organic electrical components could become a part of our bodies, even artificially, the possibility of a brain connected to the cloud does seem tantalizingly close. These naturally replicating viruses could then be passed on and, while not a part of the human genome, become a part of our development. There’s always the possibility that such a virus or mutation may occur naturally and not need to be engineered by the likes of McEwan.
Much of what separates us from apes and monkeys is our communication abilities. According to Asif A. Ghazanfar of Princeton University’s Neuroscience Institute, whose lab studies the neurobiology and behavior of monkeys as a way of understanding the evolution and function of the human brain, humans receive both audio and visual communication from each other simultaneously, whereas in non-human primates it’s one or the other (e.g., a lip smack or a howl).
According to Ghazanfar’s research, humans did not merely evolve to simultaneously process audio and visual cues in communication; as he writes in his essay in Future Science: Essays from the Cutting Edge (Vintage, 2011), “Evolution apparently selected for vocalizations structured in such a way as to match the ongoing neural rhythms: the 3–8 Hz theta rhythm.” Both visual cues (lip movements, facial tics, etc.) and sound are processed at this same rhythm by the human brain, according to the research. If it was the development of this added skill that produced an evolutionary leap, could there not be a greater jump up the next evolutionary rung in store? Say, an ability to process ever-increasing amounts of data? Again, the “information sense.”
Such a sense could present an incongruity to the human mind, though. How ready are we to join our brains together through the cloud?
Joshua Knobe, an assistant professor in the philosophy department and the Program in Cognitive Science at Yale University, is a founding member of the “experimental philosophy” movement, and as such has asked some questions that could lead the way toward an answer about how our brains work in this regard, and what entities we may view as having a mind.
“There are different aspects to the mind,” explains Knobe. “We can plan and engage in decision-making and try to work out mathematical problems, and so forth. On the other hand, people are capable of emotions and feelings — you can experience love, or joy or pain. What we find is that there’s a really big difference between those two aspects of the mind, and there is a big difference in terms of how people answer questions about these different aspects. And we seem wholly willing to ascribe certain attributes of the mind to things that have no bodies. For instance, people might say, ‘Facebook wants me to be online more,’ or such and such a corporation ‘believes its profit margins may soon increase.’ But we wouldn’t say ‘Facebook is getting depressed,’ or a corporation has ‘experienced great joy.’”
So, if we are all connected through some biological “cloud-connection,” will we view this group consciousness as having its own mind? We use the term hive mind already. Do we mean it?
“We definitely think of you as having a mind. We definitely think your doorstop doesn’t. Then there are these things in between,” says Knobe. “Does a corporation have a mind? Well, people seem to think about it having a mind to some degree. There’s some way in which an ordinary company like Intel, or Apple, or Microsoft could be planning something, deciding things; that it, itself, has certain kinds of intentions; it knows things or believes things. As we become more and more closely connected and as the actions of these kinds of corporate entities become ever more disconnected from just some individual within the corporation, then it starts to become more and more plausible to start to ascribe certain kinds of mind traits to these entities themselves.”
In such a scenario, where people share a collective consciousness — be it akin to Google results or our Facebook or Twitter feeds, or some combination — how do we view the minds of the individuals? “Studies are showing that [in the view of those surveyed] there’s a kind of trade-off between the mind of the individual and the mind of the group,” says Knobe. “The more you think of the group itself as having a mind, the less you think of the individual people as having any kind of mind at all.” To hear Knobe tell it, the brain is certainly wired to at least conceive of a group mind, “in so far as we’re working together, it’s not just that the individual can be constrained in certain ways by the group, it’s that we think less of certain individuals wanting and needing things and think more of the group as having certain kinds of goals, or wants or needs, or beliefs.”
And research has shown, time and again, surprising results: “The really striking thing that is coming out of research in neuroscience is that there is a distinctive group of brain regions that we use to think about other minds. So if you’re thinking that someone is blond versus you’re thinking that that person wants to go to California, you’ll use different brain regions. What we’re finding is that these same brain regions — for example, the medial prefrontal cortex — are the brain regions we use to think about the mind of a group. If you look at the brain regions you’ll use to think about what your friend intends to do next, you’ll see exactly those same brain regions involved in, say, when you’re thinking about what Microsoft is going to do next.” As long as this capacity to apply understanding of individuals to groups exists in the mind, it opens up new possibilities for the acceptance of collective thought.
“The more we become connected with each other, so that we can work together to achieve things, the more people can see us [and themselves], not just as separate individuals but as a group that has its own mind.”