According to a study published in Science Robotics, a humanoid robot named Emo can mimic human facial expressions in real time. Its technology enables it to predict a human smile 839 milliseconds before it happens and respond with a smile of its own.
The creators aimed to address the limitations of current social robots, which often disappoint users with delayed reactions. “I think a lot of people actually interacting with a social robot for the first time are disappointed by how limited it is. Improving robots’ expression in real-time is important,” noted Chaona Chen, a human-robot interaction researcher at the University of Glasgow.
According to Yuhang Hu, a roboticist at Columbia University who, along with colleagues, created Emo, this development could alleviate the loneliness epidemic by providing a sense of connection through synced facial expressions.
The researchers equipped the machine with cameras in its eyes to detect subtle human expressions, which the 26 actuators underneath its soft, blue face then emulate. To train it, they first placed Emo in front of a camera for several hours; as the robot watched itself, they ran random motor commands on the actuators, letting it learn the relationship between activating the actuators in its face and the expressions that resulted. “Then the robot knows, OK, if I want to make a smiley face, I should actuate these ‘muscles,’” Hu added.
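A minimal sketch of what such a self-modeling loop might look like is below, assuming a hypothetical `robot` interface with a `set_actuators()` method and a `camera` feed; the `extract_landmarks` helper is likewise a placeholder, and none of this reproduces the actual architecture from the Science Robotics study.

```python
# Hypothetical sketch of Emo-style "motor babbling": the robot issues
# random actuator commands, watches the result through a camera, and
# fits an inverse model mapping facial expressions back to commands.
# The `robot` and `camera` interfaces here are illustrative.
import numpy as np
from sklearn.neural_network import MLPRegressor

NUM_ACTUATORS = 26  # Emo's face is driven by 26 actuators


def extract_landmarks(frame):
    """Placeholder: in practice, a face-landmark detector would go here."""
    raise NotImplementedError


def babble(robot, camera, n_samples=5000):
    """Collect (expression, command) pairs by driving random motor commands."""
    commands, landmarks = [], []
    for _ in range(n_samples):
        cmd = np.random.uniform(0.0, 1.0, size=NUM_ACTUATORS)
        robot.set_actuators(cmd)          # apply a random activation pattern
        frame = camera.capture()          # the robot watches its own face
        landmarks.append(extract_landmarks(frame))
        commands.append(cmd)
    return np.array(landmarks), np.array(commands)


def fit_inverse_model(landmarks, commands):
    """Learn which actuator activations produce a desired expression."""
    model = MLPRegressor(hidden_layer_sizes=(128, 128), max_iter=500)
    model.fit(landmarks, commands)
    return model
```

With an inverse model like this, "make a smiley face" reduces to feeding the target smile's landmarks to `model.predict()` and sending the resulting activations to the face.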
The researchers then trained Emo on 800 videos, allowing it to learn muscle movements and anticipate human expressions. Apart from smiling, it can produce expressions such as raising its eyebrows and frowning, Hu said.
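A hedged illustration of the anticipation step follows, assuming the input is a short window of a person's facial landmarks extracted from video frames; the model shape is an assumption, not the paper's design, and the lead time of 839 milliseconds corresponds to roughly 25 frames at an assumed 30 fps.

```python
# Illustrative sketch of expression anticipation: given a short window of
# a person's recent facial landmarks, predict their landmarks ~839 ms
# ahead so the robot can begin matching the expression in time.
import torch
import torch.nn as nn

FPS = 30
LEAD_FRAMES = int(0.839 * FPS)   # ~839 ms ahead, about 25 frames at 30 fps
WINDOW = 16                      # frames of context fed to the model
LANDMARK_DIM = 2 * 68            # e.g. 68 (x, y) facial landmarks


class ExpressionPredictor(nn.Module):
    def __init__(self, hidden=256):
        super().__init__()
        self.rnn = nn.GRU(LANDMARK_DIM, hidden, batch_first=True)
        self.head = nn.Linear(hidden, LANDMARK_DIM)

    def forward(self, window):            # window: (batch, WINDOW, LANDMARK_DIM)
        _, h = self.rnn(window)
        return self.head(h[-1])           # predicted landmarks LEAD_FRAMES later


def train_step(model, optimizer, window, future):
    """One step on a (context window, frame LEAD_FRAMES later) pair cut from video."""
    optimizer.zero_grad()
    loss = nn.functional.mse_loss(model(window), future)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Chaining the two pieces gives the headline behaviour: the predictor forecasts the human's upcoming smile, and the inverse model converts that forecast into actuator commands before the smile fully appears.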
Its blue skin helps avoid the uncanny valley effect, making it easier for people to accept it as a new species. Hu said that if people think a robot is supposed to look like a human, “then they will always find some difference or become sceptical.” But with Emo’s rubbery blue face, one can “think about it as a new species. It doesn’t have to be a real person.”
Plans include integrating generative AI chatbot functionality to enhance Emo’s reactions and enable verbal responses. For now, though, Emo’s mouth needs some work: like many robots, it relies on the jaw for all the talking rather than the lips, and people notice. “They quickly lose interest… It’s really weird,” Hu added.
Once the robot has more realistic lips and chatbot capabilities, it may be a better companion. “Maybe when I’m working at midnight, we can complain to each other about why there’s so much work or tell a few jokes,” Hu said.