Reading someone else’s emotions isn’t always easy. Decades of rigorous anthropological study of how we portray emotions with our faces have identified seven universal expressions: anger, contempt, disgust, fear, happiness, sadness, and surprise. But sometimes they’re hard to distinguish, and those certainly aren’t the only emotions we feel. There’s a lot of gray area when we try to guess how the person in front of us actually feels.
I would guess that’s at least part of the reason there’s now an app for Google Glass that aids in reading emotion. It’s called the Shore Human Emotion Detector, and it uses Glass’s built-in camera to capture a live feed of whoever is in front of the lenses.
Its reading speed is only 10 frames per second, but it picks up the basic facial features of displayed emotions. Based on what it detects, it shows meters for how strongly each emotion registers. It also estimates gender and even a rough age group.
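To make that output concrete, here’s a minimal sketch of what one frame of that kind of per-emotion “meter” reading might look like. The field names, score scale, and structure are all my assumptions for illustration; this is not the real Shore API.

```python
from dataclasses import dataclass

# Hypothetical sketch of a single frame's reading from an emotion-detector
# app: per-emotion meter values plus gender and age-group guesses.
# None of these names come from the actual Shore software.

@dataclass
class FrameReading:
    emotion_scores: dict   # emotion name -> meter value from 0 to 100 (assumed scale)
    gender: str            # "male" / "female" guess
    age_range: tuple       # rough (low, high) age estimate in years

    def dominant_emotion(self) -> str:
        """Return whichever emotion's meter is currently highest."""
        return max(self.emotion_scores, key=self.emotion_scores.get)

# At 10 frames per second, the app would produce ten of these per second.
reading = FrameReading(
    emotion_scores={"anger": 5, "disgust": 2, "fear": 1,
                    "happiness": 78, "sadness": 3, "surprise": 11},
    gender="female",
    age_range=(25, 35),
)
print(reading.dominant_emotion())  # happiness
```

The point of the sketch is just that the app reduces a face to a handful of numeric meters, which is exactly what the rest of this piece objects to.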
To be honest, I don’t like this innovation, solely because it seems the MOST subversive to our naturally occurring interactions and conversations. I don’t care how accurate an emotion-detector app could possibly be; I’d rather be completely wrong while trying to navigate emotional ambiguity than be wrong for having placed any confidence in a device. Why would anyone want to rely on technology to read the person in front of them? This app is truly insulting because it threatens to rob us of our intuition and our capacity for emotional maturity. Plus, wouldn’t it just look weird to be paying attention to what gets projected in the HUD over a person’s face?