Emotion AI Researchers Feel Their Work is Misunderstood

News reports have covered the emergence of AI systems that conduct job interviews across Asia and at some companies in the US. However, like most scientific advances, this news should be taken with a grain of salt: many of the techniques these AI systems use to judge a candidate's suitability aren't supported by scientific evidence. The field is still emerging, with plenty of rough edges to smooth out, including the methodologies the AI uses to draw its conclusions. Sadly, many AI researchers feel their work is being misrepresented in popular culture, according to interviews conducted by MIT Technology Review.

Facial Expression is a Bad Predictor of Emotion

A recent meta-study found that humans can't reliably judge emotions from facial expression alone. The same limitation applies to AI, since it draws on our understanding of emotion to reach its conclusions. Researchers already agree that the idea of an AI reading emotions from facial expressions alone is a stretch: an AI's ability to read a person's emotional state depends on more than just their face. Even with additional input channels to aid the AI, researchers maintain that it would be impossible for an artificial intelligence system to tell whether a person is a proper fit for a job within a company. That limitation hasn't stopped certain companies from capitalizing on a combination of misinformation and the public's dreams of what AI should be able to do for them. The problem arises when these AI systems are unable to deliver what they promise.

Putting the Profession into Disrepute

Commercialization isn't just giving the technology a bad name; it's also using dodgy science to make determinations that can affect people's lives. Facial recognition, a significant component of emotion-recognition algorithms, has already come under fire from lawmakers, and regulation of the technology is pending. Similar provisions may eventually be needed to limit the commercialization of AI emotion-detection technology.