AI’s Next Big Step: Detecting Human Emotion and Expression

Emotionally intelligent bots may soon understand how you feel when you talk to them and reply with empathy. Is this the future?

(AI-generated image by Midjourney/Big Technology)

The AI field has made remarkable progress with incomplete data. Leading generative models like Claude, Gemini, GPT-4, and Llama can understand text but not emotion. These models can't process your tone of voice, your rhythm of speech, or your emphasis on certain words. They can't read your facial expressions. They are effectively unable to process the non-verbal information at the heart of human communication. And to advance further, they'll need to learn it.

Though much of the AI sector is currently focused on making generative models larger via more data, compute, and energy, the field's next leap may come from teaching the models emotional intelligence.