Don’t get too emotional about emotion-reading AI
Call it “artificial emotional intelligence” — the kind of artificial intelligence (AI) that can now detect the emotional state of a human user.
Or can it?
More importantly, should it?
Most emotion AI is based on the “basic emotions” theory, which holds that people universally experience six internal emotional states — happiness, surprise, fear, disgust, anger, and sadness — and convey those states through facial expressions, body language, and vocal intonation.
In the post-pandemic, remote-work world, salespeople are struggling to “read” the people they’re selling to over video calls. Wouldn’t it be nice if the software could convey the emotional reaction of the person on the other end of the call?