Emo – an experimental new robot – can make eye contact and use AI models to anticipate and replicate a person’s smile
It is undeniably the age of artificial intelligence (AI). There is constant buzz about innovations built on AI and large language models (LLMs). The newest one might make you smile – though a robot could now beat you to it. Researchers have unveiled a robot that can predict a forthcoming smile and express it at the same time as the person.
One of the long-standing challenges in robotics has been building a robot that can not only make diverse facial expressions but also know when to use them. To address this, researchers at the Columbia University School of Engineering and Applied Science have designed Emo – a human-like head whose face is equipped with 26 actuators, enabling a range of nuanced facial expressions.
The robot makes eye contact and uses two AI models to anticipate and replicate a person’s smile. It has even learned to predict a forthcoming smile about 840 milliseconds before the person smiles and to co-express it with them, according to the university’s press statement.
The head is covered with soft silicone skin attached via a magnetic system, which allows for easy customization. For more lifelike interactions, the researchers also embedded high-resolution cameras within the pupil of each eye, enabling Emo to make eye contact.
Emo’s two AI models divide the work: the first predicts human facial expressions by analysing subtle changes in the person’s face, while the second generates the motor commands needed to produce the corresponding expressions on the robot’s own face.
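To make that division of labour concrete, here is a minimal sketch of such a two-model pipeline. It is purely illustrative: the class names, the landmark and window dimensions, and the placeholder linear models are assumptions, not the architecture the Columbia team actually used; only the count of 26 facial actuators comes from the article.

```python
# Hypothetical sketch of a two-model co-expression pipeline (not the real Emo code).
import numpy as np

N_LANDMARKS = 113   # assumed number of facial-landmark coordinates per frame
N_ACTUATORS = 26    # Emo's face is reported to use 26 actuators
WINDOW = 30         # assumed number of recent frames fed to the predictor


class ExpressionPredictor:
    """Model 1: anticipate the person's upcoming expression from recent frames."""

    def __init__(self):
        rng = np.random.default_rng(0)
        # Placeholder linear map standing in for a trained predictive model.
        self.weights = rng.normal(size=(WINDOW * N_LANDMARKS, N_LANDMARKS)) * 0.01

    def predict_future_expression(self, landmark_window: np.ndarray) -> np.ndarray:
        # landmark_window: (WINDOW, N_LANDMARKS) -> landmarks predicted ~0.84 s ahead.
        return landmark_window.reshape(-1) @ self.weights


class InverseFaceModel:
    """Model 2: map a target expression to motor commands for the facial actuators."""

    def __init__(self):
        rng = np.random.default_rng(1)
        self.weights = rng.normal(size=(N_LANDMARKS, N_ACTUATORS)) * 0.01

    def motor_commands(self, target_expression: np.ndarray) -> np.ndarray:
        # Clip to a normalised actuator range, e.g. [0, 1].
        return np.clip(target_expression @ self.weights, 0.0, 1.0)


def co_expression_step(landmark_window, predictor, face) -> np.ndarray:
    """One control step: predict the upcoming expression, then drive the actuators."""
    anticipated = predictor.predict_future_expression(landmark_window)
    return face.motor_commands(anticipated)


if __name__ == "__main__":
    window = np.random.default_rng(2).normal(size=(WINDOW, N_LANDMARKS))
    commands = co_expression_step(window, ExpressionPredictor(), InverseFaceModel())
    print(commands.shape)  # (26,) -> one command per actuator
```

The point of the split is that prediction lets the robot start moving before the person’s smile fully forms, so both faces change together rather than the robot lagging a beat behind.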
In the statement, Yuhang Hu, lead author of the study, said accurately predicting human facial expressions is a “revolution”, as robots have not traditionally been designed to take human expressions into account during interactions. “When a robot makes co-expressions with people in real-time, it not only improves the interaction quality but also helps in building trust between humans and robots. In the future, when interacting with a robot, it will observe and interpret your facial expressions, just like a real person,” Hu added. The study was published recently in the journal Science Robotics.
By designing robots that can interpret and mimic human expressions accurately, the world is moving closer to a future where robots can seamlessly integrate into people’s daily lives and even offer companionship, assistance, and empathy, the researchers said in the statement.
Although such advancements are “exciting”, as the researchers put it, they also inevitably bring with them anxieties about equipping robots with emotional intelligence.