Have you ever stopped to think how much emotion we convey in everyday conversation? Conversations you think went well can be interpreted very differently by other people, especially those with Asperger's or anxiety conditions that impair their ability to read emotions.
A team of researchers from MIT's Computer Science and Artificial Intelligence Laboratory say they're closer to a solution for people who have trouble reading emotions on their own: an artificially intelligent wearable that's said to predict the tone of a conversation as happy, sad, or neutral. It does this by analyzing a person's speech patterns and vital signs.
“Imagine if, at the end of a conversation, you could rewind it and see the moments when the people around you felt the most anxious. Our work is a step in this direction, suggesting that we may not be that far away from a world where people can have an AI social coach right in their pocket.”
That's graduate student Tuka Alhanai's explanation of why something like this could be useful. The system analyzes audio and text transcriptions to determine the overall tone of a person's story with 83% accuracy. Thanks to the deep learning algorithms used to train the system, the researchers say it can reveal the emotional undertones of a conversation in real time.
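To give a rough sense of the pipeline the article describes, here is a minimal sketch of classifying a conversation segment's tone from a few features. This is purely illustrative: the feature names, values, and the simple nearest-centroid rule are assumptions, not the MIT system, which uses deep learning over raw audio and transcriptions.

```python
# Toy tone classifier: assign a segment to the nearest tone "centroid".
# All feature names and centroid values are invented for illustration.

from dataclasses import dataclass
from math import sqrt


@dataclass
class Segment:
    pitch_var: float   # variability of vocal pitch (arbitrary units)
    energy: float      # loudness of the audio segment
    sentiment: float   # sentiment score of the transcript, in [-1, 1]


# Hand-picked prototypes for each tone class (hypothetical values).
CENTROIDS = {
    "happy":   Segment(0.8, 0.7, 0.6),
    "sad":     Segment(0.2, 0.3, -0.5),
    "neutral": Segment(0.4, 0.5, 0.0),
}


def classify_tone(seg: Segment) -> str:
    """Label the segment with the tone whose centroid is closest (Euclidean)."""
    def dist(a: Segment, b: Segment) -> float:
        return sqrt((a.pitch_var - b.pitch_var) ** 2
                    + (a.energy - b.energy) ** 2
                    + (a.sentiment - b.sentiment) ** 2)
    return min(CENTROIDS, key=lambda label: dist(seg, CENTROIDS[label]))


# A segment with lively pitch, high energy, and positive text sits
# nearest the "happy" prototype.
print(classify_tone(Segment(0.75, 0.65, 0.5)))  # → happy
```

A real system would replace the hand-picked centroids with a model trained on labeled conversations, and would update its prediction continuously as new audio arrives.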