The idea of technology that can pick up on emotions has floated around for a while, but artificial intelligence gives companies the means to actually try building it. According to a Bloomberg report, Amazon is working on a device that can allegedly read human emotions.
The project, code-named Dylan, is focused on developing a voice-activated wearable device. Bloomberg reviewed internal Amazon documents and spoke to a source familiar with the program. The device is a collaboration between Lab126, Amazon’s hardware development group, and the Alexa voice software team.
Unlike most affect recognition systems, which claim to read emotions from people’s faces, this device wouldn’t rely on seeing your face at all. Instead, Bloomberg reported:
“Designed to work with a smartphone app, the device has microphones paired with software that can discern the wearer’s emotional state from the sound of his or her voice, according to the documents and a person familiar with the program. Eventually the technology could be able to advise the wearer how to interact more effectively with others, the documents show.”
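Bloomberg’s description doesn’t explain how the software would work, but voice-based emotion classifiers generally follow the same rough recipe: extract acoustic features such as loudness and pitch from the waveform, then feed them to a trained model. The sketch below is a toy illustration of that pipeline on synthetic audio, not Amazon’s actual method; the features, labels, and classifier are all illustrative assumptions.

```python
# Toy illustration of voice-based emotion classification, NOT Amazon's
# (undisclosed) method. Assumes emotion correlates with crude prosodic
# features: overall loudness, a pitch proxy, and loudness variability.
import numpy as np
from sklearn.linear_model import LogisticRegression

SR = 16_000  # sample rate in Hz

def prosodic_features(audio: np.ndarray) -> np.ndarray:
    """Extract three crude prosodic features from a mono waveform."""
    rms = np.sqrt(np.mean(audio ** 2))                  # overall loudness
    zcr = np.mean(np.abs(np.diff(np.sign(audio)))) / 2  # zero-crossing rate, a rough pitch proxy
    frames = audio[: len(audio) // 100 * 100].reshape(100, -1)
    energy_var = np.var(np.sqrt(np.mean(frames ** 2, axis=1)))  # loudness variability
    return np.array([rms, zcr, energy_var])

# Synthetic stand-ins for labeled speech: "calm" = quiet, steady low tones;
# "agitated" = loud, noisy, fluctuating bursts.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, SR)
calm = [0.2 * np.sin(2 * np.pi * 120 * t) + 0.01 * rng.normal(size=SR)
        for _ in range(20)]
agitated = [0.9 * np.sin(2 * np.pi * 300 * t) * rng.uniform(0.2, 1.0, SR)
            + 0.3 * rng.normal(size=SR) for _ in range(20)]

X = np.array([prosodic_features(a) for a in calm + agitated])
y = np.array([0] * 20 + [1] * 20)  # 0 = calm, 1 = agitated

clf = LogisticRegression().fit(X, y)
print(clf.predict([prosodic_features(calm[0])]))      # expected: [0]
print(clf.predict([prosodic_features(agitated[0])]))  # expected: [1]
```

Real systems use far richer features and models, but the core design is the same, and so are the problems: the labels encode someone’s opinion of what a “calm” or “agitated” voice sounds like, and the model simply reproduces that opinion.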
Affect recognition has been denounced by some AI researchers for its reliance on pseudoscience. That includes the AI Now Institute, a New York University research group focused on the social implications of artificial intelligence.
In its 2018 report, AI Now described affect recognition as a “subclass of facial recognition that claims to detect things such as personality, inner feelings, mental health, and ‘worker engagement’ based on images or video of faces.”
The group went on to write, “These claims are not backed by robust scientific evidence, and are being applied in unethical and irresponsible ways that often recall the pseudosciences of phrenology and physiognomy.”
Alongside AI Now’s warning, research has also found a racial bias in affect recognition. In a 2018 test using pictures of professional basketball players, Lauren Rhue, an assistant professor of information systems and analytics at Wake Forest University, found that Black players were consistently read as having more negative emotions than white players.
Although Amazon’s device doesn’t focus on faces, the claim that it could discern someone’s emotional state from the sound of their voice alone rests on equally shaky science, and it is deeply concerning.
In addition, the suggestion that it could advise people on how to interact with each other is a dystopian twist. Ultimately, that would let Amazon set the standard for what counts as an “appropriate” emotion in any given interaction.
It’s unclear whether, or when, Amazon plans to release this device. Either way, it’s definitely another sign of the company venturing into murky waters.