Robot Emo Predicts Human Smiles 0.9 Seconds in Advance, Smiles Along with Humans

Image: the Emo robot predicting a human facial expression
Emo's eyes are equipped with cameras, and beneath its blue, flexible silicone skin lie 26 motors, similar to human facial muscles, that power its expressions. It can predict an upcoming smile 839 milliseconds before it appears and smile along with the person. Emo can also predict expressions of sadness, anger, and surprise.

Large models have rapidly advanced robots' verbal communication capabilities, but non-verbal communication has not kept pace. Now, researchers at Columbia University have developed a robot that observes human faces and, from subtle facial changes, predicts a person's smile 0.9 seconds in advance, responding with a smile of its own. The research was published in Science Robotics on March 27, 2024.

Artificial intelligence can mimic human language, but robots have yet to replicate the complex non-verbal cues and behaviors crucial to communication. Humanoid robots can communicate through sound, but expression through facial movement poses a dual challenge: physically driving a face capable of rich expressions is hard, and deciding which expressions to generate so that the robot appears realistic, natural, and well-timed is complex.

Researchers at Columbia University suggest that training robots to predict future facial expressions and execute these expressions simultaneously with humans could mitigate these challenges. A team led by Hod Lipson, Professor at Columbia University’s Creative Machines Lab, developed a robot named Emo that uses AI models and high-resolution cameras to predict and attempt to replicate human facial expressions.

According to Latest.com, the robot could predict an upcoming smile 839 milliseconds before it happens and express the smile simultaneously with the person. Yuhang Hu, the paper's lead author and a Ph.D. student at Columbia University's Creative Machines Lab, explained that before a smile fully forms, there is a brief moment when the corners of the mouth begin to lift and the eyes start to crinkle slightly. Emo captures these subtle changes in a person's face to predict the expression that follows. The researchers demonstrated this ability using a robot face with 26 degrees of freedom.
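To make the prediction step concrete, here is a minimal sketch (not the authors' published architecture) of how a model might ingest a short window of facial-landmark frames and classify the expression expected a fraction of a second later. The landmark count, window length, and layer sizes are illustrative assumptions.

```python
import torch
import torch.nn as nn

class ExpressionPredictor(nn.Module):
    """Predict a future expression label from a sequence of face landmarks."""

    def __init__(self, n_landmarks=68, hidden=128, n_expressions=4):
        super().__init__()
        # Each frame is the flattened (x, y) coordinates of the face landmarks.
        self.encoder = nn.GRU(input_size=n_landmarks * 2,
                              hidden_size=hidden, batch_first=True)
        # Logits over the expressions the article mentions:
        # smile, sadness, anger, surprise.
        self.head = nn.Linear(hidden, n_expressions)

    def forward(self, landmark_seq):
        # landmark_seq: (batch, frames, n_landmarks * 2)
        _, h = self.encoder(landmark_seq)
        return self.head(h[-1])

# Example: a 30-frame window (~1 s of video at 30 fps) of 68 2-D landmarks.
model = ExpressionPredictor()
window = torch.randn(1, 30, 68 * 2)
future_logits = model(window)  # prediction for roughly 0.9 s ahead
```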

Emo details

The robot includes 26 motors and uses position control. Three motors control the neck’s movement across three axes. Twelve motors control the upper face, including the eyeballs, eyelids, and eyebrows. Eleven motors control the mouth and jaw.
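As a rough illustration of that motor layout, the grouping below mirrors the counts given above; the group names, motor IDs, and command interface are hypothetical, since the article does not describe Emo's control API.

```python
from dataclasses import dataclass

# Motor counts as reported in the article; group names are illustrative.
MOTOR_GROUPS = {
    "neck": 3,         # movement across three axes
    "upper_face": 12,  # eyeballs, eyelids, eyebrows
    "mouth_jaw": 11,   # mouth and jaw
}
assert sum(MOTOR_GROUPS.values()) == 26

@dataclass
class MotorCommand:
    """A position setpoint for one motor (position control, not torque)."""
    motor_id: int
    position: float  # normalized to [0, 1]

def expression_to_commands(targets):
    """Clamp per-motor position targets and emit a command list."""
    return [MotorCommand(mid, min(1.0, max(0.0, pos)))
            for mid, pos in targets.items()]

# e.g., raise both mouth corners slightly at the predicted onset of a smile
commands = expression_to_commands({15: 0.7, 16: 0.7})
```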

The robot uses two neural networks: one observes human faces and predicts their expressions, while the other learns how to produce those expressions on the robot's own face. The first network was trained on videos from video-sharing websites; the second let the robot train itself by watching its own expressions through a live camera.

“When it pulls all these muscles, it knows what its face will look like,” said Lipson. “It’s a bit like a person looking in a mirror, who knows what their face will look like even when smiling with their eyes closed.”
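The mirror analogy maps naturally onto a self-supervised training loop: the robot commands a random facial pose, observes the result through a camera, and fits an inverse model from observed landmarks back to motor positions. The sketch below assumes hypothetical send_motor_positions and capture_own_landmarks interfaces and illustrates the idea, not the paper's exact method.

```python
import torch
import torch.nn as nn

N_MOTORS, N_LANDMARKS = 26, 68

# Inverse model: observed face landmarks -> the motor positions that made them.
inverse_model = nn.Sequential(
    nn.Linear(N_LANDMARKS * 2, 256), nn.ReLU(),
    nn.Linear(256, N_MOTORS), nn.Sigmoid(),  # positions normalized to [0, 1]
)
optimizer = torch.optim.Adam(inverse_model.parameters(), lr=1e-3)

def babble_step(send_motor_positions, capture_own_landmarks):
    """One round of 'looking in the mirror': pose randomly, observe, learn."""
    motors = torch.rand(1, N_MOTORS)       # random facial pose
    send_motor_positions(motors)           # actuate the 26 motors (hypothetical)
    landmarks = capture_own_landmarks()    # (1, N_LANDMARKS * 2) from camera
    pred = inverse_model(landmarks)        # guess the pose that produced it
    loss = nn.functional.mse_loss(pred, motors)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()
```

Once trained, the same inverse model can be run the other way in spirit: given a target human expression, the robot looks up the motor positions that reproduce it on its own face.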

The researchers hope this technology will make human-robot interaction more lifelike. They believe that robots must first learn to predict and mimic human expressions before advancing to more spontaneous and self-driven expressive communication.

In addition to smiles, Emo can also predict expressions of sadness, anger, and surprise. However, it cannot produce every human expression, since it has only 26 facial “muscles.” The researchers plan to expand the robot's expressive range and hope to train it to respond to what people say rather than simply mimic them. They are also integrating verbal communication into Emo using large models, so that it can answer questions and hold conversations.
