While the car of the near future may not drive itself, it is likely to provide a greatly enhanced driver experience.
At CES, Nuance Automotive gave me a demonstration of what this experience may look like, in a test vehicle on the show floor.
Using its new Dragon Drive platform, the vehicle demonstrated in-car voice interactions similar to those being adopted by numerous automakers. The platform not only recognizes speech, Nuance’s historical strength, but can also interpret tone of voice, eye and head movements, and emotional state.
“The module learns your preferences as you drive,” Eric Montague, senior director strategy and product marketing at Nuance, told me at the show. “We’re driving home the ‘humanizing the experience’ theme.”
In the test vehicle, the driver simply looked at the open passenger-side door and said "close that door," and the door closed. He then looked at one of the car windows and said "open that window," and that window opened. You get the idea.
The voice and gaze tracking also extends beyond the vehicle itself. During a simulated drive, the driver looked at a building we were passing and asked, "What’s that building?" Details about the building then appeared on a transparent display built into the windshield.
One of the most useful aspects of the Dragon Drive system is its human-like conversational ability. For example, a driver could say "open that window halfway," then follow up with "a little more," without having to repeat the initial command.
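The "a little more" follow-up works because the system keeps the previous command in conversational context. As a rough illustration only (this is a hypothetical sketch, not Nuance's actual API or architecture), a dialogue manager might carry the last action forward like this:

```python
# Hypothetical sketch: a dialogue manager that remembers the last command
# so a follow-up like "a little more" reuses it without repetition.
# All class names, commands, and the 0.15 increment are illustrative assumptions.

class WindowController:
    def __init__(self):
        self.position = 0.0  # 0.0 = fully closed, 1.0 = fully open

    def set(self, position):
        # Clamp to the physical range of the window.
        self.position = max(0.0, min(1.0, position))


class DialogueManager:
    STEP = 0.15  # assumed increment for "a little more"

    def __init__(self, window):
        self.window = window
        self.last_action = None  # conversational context

    def handle(self, utterance):
        if utterance == "open that window halfway":
            self.window.set(0.5)
            self.last_action = "open_window"
        elif utterance == "a little more" and self.last_action == "open_window":
            # Follow-up: no need to restate which window or what action.
            self.window.set(self.window.position + self.STEP)


window = WindowController()
dm = DialogueManager(window)
dm.handle("open that window halfway")  # window opens halfway
dm.handle("a little more")             # nudges it further, context preserved
```

The key design point is that the second utterance carries no target or verb of its own; the manager resolves it entirely from stored context.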
Eye tracking can also be used to place a hands-free phone call. The driver could say "call Bob," and if four different contacts named Bob appear on the windshield, the driver simply looks at the correct name, says "that one," and the system places the call.
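Combining the two input channels this way is a disambiguation pattern: speech narrows the candidates, and gaze breaks the tie. A minimal sketch of the idea, with entirely hypothetical names and data (not Nuance's implementation):

```python
# Hypothetical sketch of gaze-based disambiguation: when a spoken name
# matches several contacts, the entry the driver is looking at wins.
# The contact list and function names are illustrative assumptions.

CONTACTS = [
    {"name": "Bob Adams", "number": "555-0101"},
    {"name": "Bob Chen",  "number": "555-0102"},
    {"name": "Bob Diaz",  "number": "555-0103"},
    {"name": "Bob Evans", "number": "555-0104"},
]

def matches(query):
    """Speech step: find every contact whose name contains the query."""
    return [c for c in CONTACTS if query.lower() in c["name"].lower()]

def resolve(query, gaze_index=None):
    """Return a single contact, using gaze to break ties among candidates."""
    candidates = matches(query)
    if len(candidates) == 1:
        return candidates[0]
    if gaze_index is not None and 0 <= gaze_index < len(candidates):
        # "That one": pick the on-screen entry the driver is fixating on.
        return candidates[gaze_index]
    return None  # still ambiguous; a real system would re-prompt

# Driver says "call Bob" -> four candidates shown on the windshield.
# The eye tracker reports a fixation on the second entry.
chosen = resolve("Bob", gaze_index=1)
```

The attraction of this pattern is that neither channel alone has to be precise: the name can be ambiguous and the gaze target coarse, yet together they yield one unambiguous selection.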
These capabilities aren't way down the road; they're coming to cars just around the corner.