Thursday, 13 June 2024

Robot Avatar Enables Remote Sensation and Perception


A person wearing a VR headset and haptic feedback gloves can control the iCub 3 robot and experience being somewhere else

A humanoid robot can transmit video and tactile sensations to a person wearing haptic feedback gloves and a virtual reality (VR) headset, even when the two are hundreds of kilometres apart, offering a way to attend events remotely without the need for travel.

The iCub 3 robot, standing at 125 centimeters tall and weighing 52 kilograms, boasts 54 points of articulation distributed across its aluminum alloy and plastic frame. Positioned in its head are two cameras, mimicking human eyes, and an internet-connected computer serving as its virtual brain. Equipped with sensors throughout its body, the robot gathers data, which is then transmitted to its ‘brain.’ These sensory inputs are faithfully replicated on a suit and VR headset worn by a human operator situated remotely.

A person wearing a VR headset and haptic feedback gloves can see and feel what the robot touches

As the operator responds to visual and tactile stimuli, the sensors embedded in the suit detect their movements, and the robot mirrors them accordingly. Stefano Dafarra, a member of the iCub 3 team at the Italian Institute of Technology, emphasizes the importance of translating every signal into numeric data that can be transmitted over the network. While there can be a delay of up to 100 milliseconds in capturing and transmitting visual footage, operators can compensate by moving at a slightly slower pace than usual.
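The effect of that delay on teleoperation can be illustrated with a minimal sketch. The model below is an assumption for illustration only, not the iCub 3 team's actual implementation: it treats the robot as always mirroring the operator pose captured one latency period earlier, so the lag-induced tracking error scales with how fast the operator moves, which is why moving slower helps.

```python
# Hypothetical model of a teleoperation loop with network latency.
# All names and numbers are illustrative assumptions, not the real
# iCub 3 control software.

LATENCY_S = 0.100  # the ~100 ms delay mentioned above


def robot_pose_at(t: float, operator_pose) -> float:
    """The robot mirrors the operator pose captured LATENCY_S ago."""
    return operator_pose(max(0.0, t - LATENCY_S))


def tracking_error(speed_m_per_s: float) -> float:
    """Steady-state lag error for an operator moving at constant speed:
    the robot is always LATENCY_S behind, so error = speed * latency."""
    operator = lambda t: speed_m_per_s * t  # position after t seconds
    t = 1.0
    return operator(t) - robot_pose_at(t, operator)


# Halving the operator's speed halves the lag-induced error,
# consistent with the advice to move slightly slower than usual.
print(tracking_error(0.5))   # 0.5 m/s hand motion -> 0.05 m of lag
print(tracking_error(0.25))  # half speed          -> 0.025 m of lag
```

Under this simple model, a hand moving at 0.5 m/s is tracked 5 cm behind its true position, while moving at half that speed cuts the gap to 2.5 cm.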

The team showcased the robot at the Venice Biennale, where it roamed through an exhibition while its operator was situated 290 kilometers away in Genoa.

Dafarra envisions the iCub 3 being utilized for remote event participation, minimizing the necessity for travel. However, he notes that currently, a fall could have severe consequences for the robot, and it remains uncertain whether it can autonomously regain an upright position.

“iCub 3 is an interesting robot and offers clear advantages over the previous iteration,” says Jonathan Aitken at the University of Sheffield, UK, whose laboratory owns a prior version of the robot. However, he is disappointed that the team’s research did not spell out the new version’s data transmission requirements. “It would be good to know just how much data was required, and what the upper and lower bounds were,” he says.

