What if you didn’t need a mobile phone to share content? What if you could share your literal point of view? What if “always on” didn’t just mean an Internet connection that was constantly active, but referred to a device you wore that was always on you, ready to share your thoughts practically as soon as they came to mind?
That’s the idea behind Project Glass, Google’s latest foray into hardware. It’s a different twist for Google. Instead of competing with Apple, Microsoft, and Amazon, Google will soon be competing with LensCrafters. Its new product is a form of wearable computer, with a tiny camera and transparent screen mounted on the front rim of glasses; the lenses themselves aren’t needed.
As of now, it’s a curiosity. It’s not yet on the market, and it’s not clear if anyone will want it. I’m sure I’m not the only person who watched the demo video and thought, “If I wanted to wear headgear, I wouldn’t have gotten Lasik.”
Then came the Charlie Rose interview with Sebastian Thrun, a Google Fellow and former director of the Stanford Artificial Intelligence Laboratory. It was the first public interview with anyone at Google wearing the prototype, and I was mesmerized. It was a sneak peek into a future that’s both beautiful and horrifying. I can’t remember ever thinking simultaneously, “I want that now” and “I hope to never go anywhere near that.” I’m neither a fan nor detractor of Charlie Rose, but I give him credit for staying focused on the interview. It must have been hard taking Thrun seriously, despite his credentials.
Thrun’s discussion with Rose about Project Glass was brief; most of it focused on the demo video itself. It shows someone using the hardware to check the weather, save reminders about events, get transportation updates, check into a coffee truck, and use video chat, among other activities. With so many possibilities, what will be the proverbial killer app? Thrun told Rose, “The compelling use case for us is the sharing experience. Other people can now see through my eyes.”
It was a beautiful moment. Thrun downplayed many of the more sophisticated features, such as using augmented reality for image recognition. The one that excites me most is face recognition, even though I already have a wonderful accessory to handle that: my wife. Thrun’s take was simple and elegant. Project Glass is a hands-free way to share your actual perspective. Used well, it’s bound to trigger waves of empathy from those seeing that perspective. How much more powerful will it be to share your point of view when you’re doing so literally?
And then, what happens when you can always share your point of view? Will you blink and share whatever you see? Will you be even more inclined to tweet that you’re checking into the food truck you just instagrammed on Tumblr? Will you ever see the world around you? These are concerns already, but Project Glass raises the possibility that we’ll always be looking at the world through glass and not through our eyes.
Project Glass could thus further diminish our ability to be mindful, where we process the sensory information in our immediate surroundings instead of thinking about anything else. It’s vividly described in the book “Your Brain at Work” by David Rock. He writes, “When you sit on that jetty and stop to pay attention to the warmth of the sun on your skin, you soon notice the breeze, too. Activating the direct-experience network increases the richness of other incoming data, which helps you perceive more information around you. Noticing more information lets you see more options, which helps you make better choices, which makes you more effective at work.”
Rock describes how to be mindful and urges the reader to try it for ten seconds. The first time I tried it, I could feel the mental strain. Then I attempted it while walking to the subway one morning, and I was amazed at how I could hear the footsteps of everyone walking by me and the phone conversations of people on the street. I understood Rock’s quote from mindfulness researcher John Teasdale: “Mindfulness isn’t difficult. What’s difficult is to remember to be mindful.” With a screen constantly hovering over your face, is mindfulness even possible?
Project Glass shows the liberation of a hands-free device, but such liberation is misleading. When we bought cordless phones and then cellphones and then smartphones, we were liberated from physical wires only to find ourselves more tethered to the handsets. If Project Glass is successful, we’ll free up our hands at the risk of losing our minds.
I saw the Charlie Rose interview, too. It was an interesting perspective, but this post accurately captured my feeling that reality might be missed while attaching one’s face to the web.
Eventually, we can lose our minds completely and just do what the glasses tell us to do. That will be nice.
David, as usual a very thoughtful - and mindful - post, but I think you forgot the obligatory 'Minority Report' reference!
If I could just use these glasses to remember what I was supposed to do this...