There are smartwatches, other smart wearables and now an artificially intelligent wearable that is so smart it can tell how a conversation is going.
The intelligent wearable system can
predict if a conversation is happy, sad or neutral, based on a person’s speech patterns and vitals.
The wearable AI system was introduced this week by researchers at MIT’s Computer
Science and Artificial Intelligence Laboratory and Institute of Medical Engineering and Science.
As the wearer of the wrist device tells a story, the system analyzes audio, text transcriptions
and physiological signals to determine the overall tone of the story with 83% accuracy, according to the researchers.
As part of the research, subjects wore a Samsung Simband, a device that
can measure movement, blood pressure, heart rate, blood flow and temperature. The system also captured audio data and text transcripts to analyze the speaker's tone, pitch, energy and vocabulary.
After
capturing more than 30 different conversations, the team created algorithms based on the data.
Long pauses and monotonous vocal tones were associated with sadder stories, while more energetic,
varied speech patterns were associated with happier stories, according to MIT. Sadder stories were also strongly associated with increased fidgeting and cardiovascular activity, both of which were
measured by the device.
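To make the idea concrete, here is a toy sketch of how features like those described above might feed a simple tone classifier. The feature names, thresholds and scoring rules are entirely hypothetical, for illustration only; MIT's actual system uses learned models over audio, text and physiological data, not hand-set rules.

```python
# Toy sketch: score a story's tone from a few conversation features.
# All names and thresholds are hypothetical, not MIT's actual model.

def classify_tone(avg_pause_sec, pitch_variance, fidget_rate):
    """Return 'happy', 'sad' or 'neutral' from simple hand-set rules.

    avg_pause_sec  -- mean silence between utterances, in seconds
    pitch_variance -- variance of vocal pitch (higher = more varied speech)
    fidget_rate    -- movement events per minute from the wrist sensor
    """
    score = 0
    if avg_pause_sec > 1.5:      # long pauses: associated with sadder stories
        score -= 1
    if pitch_variance < 100.0:   # monotonous vocal tone: sadder
        score -= 1
    if fidget_rate > 10.0:       # increased fidgeting: sadder
        score -= 1
    if pitch_variance > 300.0:   # energetic, varied speech: happier
        score += 2
    if score >= 1:
        return "happy"
    if score <= -2:
        return "sad"
    return "neutral"

print(classify_tone(0.4, 350.0, 3.0))    # varied, energetic speech
print(classify_tone(2.0, 60.0, 15.0))    # long pauses, monotone, fidgeting
```

A real system would replace these rules with a model trained on labeled conversations, but the inputs and outputs would look much the same.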
The researchers concluded that it is possible to classify the emotional tone of conversations in real time.
Next on the agenda is to collect such data on a larger
scale, perhaps by using commercial devices like the Apple Watch, which would allow the system to be deployed in real-world environments.
More capabilities have continually been
added to smartwatches and fitness trackers, as I wrote about here just yesterday (Wearable, Smartwatch Capabilities Converging).
Various features have been
migrating from one wearable to another.
Voice interaction already is becoming core to the Internet of Things. Wearables soon may be deciphering the actual tone of those voices.