
23 April 2024

Engineers at Columbia University's Creative Machines Lab have developed Emo, a robot capable of mimicking human facial expressions.
While robots have made strides in verbal communication through advancements like ChatGPT, their ability to express facial cues has lagged. The researchers believe Emo is a significant advance in non-verbal communication between humans and robots.
Emo, described in a study published in Science Robotics, can anticipate human facial expressions and mimic them simultaneously, even predicting a forthcoming smile around 840 milliseconds (0.84 seconds) before it happens.
The study's lead author explained that the team faced two challenges: developing a mechanically expressive face and determining when to generate expressions so that they feel natural and timely.
"The primary goal of the Emo project is to develop a robotic face that can enhance the interactions between humans and robots," explained PhD student and lead author, Yuhang Hu.
"As robots become more advanced and complicated like those powered by AI models, there's a growing need to make these interactions more intuitive," he added.
Emo's human-like head uses 26 actuators for a range of facial expressions and is covered with silicone skin. It features high-resolution cameras in its eyes for lifelike interactions and eye contact, crucial for non-verbal communication. The team used AI models to predict human facial expressions and generate corresponding robot facial expressions.
"Emo is equipped with several AI models including detecting human faces, controlling facial actuators to mimic facial expressions and even anticipating human facial expressions. This allows Emo to interact in a way that feels timely and genuine," Hu added.
The robot was trained using a process termed "self-modelling": Emo made random movements in front of a camera, learning the correlation between its motor commands and the facial expressions they produced. After watching videos of human expressions, Emo could then predict a person's forthcoming expression from the subtle facial changes that appear just before, for example, a smile.
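To make the self-modelling idea concrete, here is a minimal NumPy sketch of the babble-observe-fit loop under simplifying assumptions: a random linear map stands in for the robot's real motors-to-face physics, and the self-model is fitted by least squares rather than the learned models used in the study. All names and dimensions (such as NUM_LANDMARKS) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
NUM_ACTUATORS = 26   # motor command dimension (from the article)
NUM_LANDMARKS = 40   # hypothetical size of the observed-expression vector

# Simulated "camera": in the real study, Emo watched itself move. Here a
# fixed random linear map stands in for the true motors-to-face physics
# that the self-model must discover.
true_face_physics = rng.normal(size=(NUM_ACTUATORS, NUM_LANDMARKS))

def observe_own_face(motor_commands: np.ndarray) -> np.ndarray:
    """What the camera sees when the given motor commands are applied."""
    noise = rng.normal(scale=0.01, size=NUM_LANDMARKS)
    return motor_commands @ true_face_physics + noise

# 1) Babbling phase: issue random motor commands, record what results.
commands = rng.uniform(0.0, 1.0, size=(1000, NUM_ACTUATORS))
observations = np.array([observe_own_face(c) for c in commands])

# 2) Fit the self-model (least squares here, for simplicity).
self_model, *_ = np.linalg.lstsq(commands, observations, rcond=None)

# 3) Mimicry: given a target expression, solve for the motor commands
#    whose predicted face best matches it. The target here is generated
#    from random motor commands so it is reachable in this toy setup.
target_expression = observe_own_face(rng.uniform(0.0, 1.0, size=NUM_ACTUATORS))
motor_solution, *_ = np.linalg.lstsq(self_model.T, target_expression, rcond=None)

predicted = motor_solution @ self_model
print("mimicry error:", np.linalg.norm(predicted - target_expression))
```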
The team says the study marks a shift in human-robot interaction (HRI): robots that can factor human expressions into their responses should improve the quality of interactions and foster trust.
The team plans to integrate verbal communication into Emo to allow the robot "to engage in more complex and natural conversations."
With inputs from Reuters
