The software picks out movements of the eyebrows, lips and nose, and tracks head movements such as tilting, nodding and shaking, which it then associates with the emotion the actor was showing. When presented with fresh video clips, the software gets people's emotions right 90 per cent of the time when the clips are of actors, and 64 per cent of the time on footage of ordinary people.
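The pipeline described above, extract facial-motion features, then match them against labelled examples, can be sketched with a toy nearest-neighbour classifier. Everything here is invented for illustration: the feature names (eyebrow raise, lip curve, nose wrinkle, head tilt) and their values are hypothetical stand-ins for whatever measurements the actual software takes.

```python
import math

# Hypothetical labelled examples: (feature vector, emotion).
# Feature order: [eyebrow_raise, lip_curve, nose_wrinkle, head_tilt]
TRAINING = [
    ([0.8, 0.9, 0.1, 0.0], "happy"),
    ([0.1, -0.7, 0.2, 0.3], "sad"),
    ([0.9, -0.2, 0.8, 0.1], "surprised"),
    ([0.2, -0.5, 0.9, 0.0], "disgusted"),
]

def classify(features):
    """Return the emotion label of the nearest training example."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TRAINING, key=lambda ex: dist(ex[0], features))[1]

print(classify([0.7, 0.8, 0.0, 0.1]))  # nearest to the "happy" example
```

The gap between 90 per cent accuracy on actors and 64 per cent on ordinary people is exactly what you would expect from a scheme like this: actors produce exaggerated, prototypical feature vectors that sit close to the training examples, while ordinary people's expressions are subtler and land between categories.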
This is really exciting. It means that in the near future I can videotape my interactions at cocktail parties and let the software analyze how amusing I was throughout the evening, kind of like going over my comedy sets. I'll know which stories to retell and which to leave out. I can turn small talk into a science.
It's too bad I can't tape the facial expressions of those who read this. But then again, maybe I wouldn't want to know.