Friday, 23 February 2024

Robot Avatar Enables Remote Sensation and Perception

A person wearing a VR headset and haptic feedback gloves can control the iCub 3 robot and experience being somewhere else

A humanoid robot can transmit video and tactile sensations to a person wearing haptic feedback gloves and a virtual reality (VR) headset, even when the two are hundreds of kilometres apart, offering a way to attend events remotely without travelling.

The iCub 3 robot, standing at 125 centimeters tall and weighing 52 kilograms, boasts 54 points of articulation distributed across its aluminum alloy and plastic frame. Positioned in its head are two cameras, mimicking human eyes, and an internet-connected computer serving as its virtual brain. Equipped with sensors throughout its body, the robot gathers data, which is then transmitted to its ‘brain.’ These sensory inputs are faithfully replicated on a suit and VR headset worn by a human operator situated remotely.

A person wearing a VR headset and haptic feedback gloves can see and feel what the robot touches

As the operator responds to these visual and tactile stimuli, sensors embedded in the suit detect their movements, and the robot mirrors them. Stefano Dafarra, a member of the iCub 3 team at the Italian Institute of Technology, emphasizes the importance of translating every signal into numeric data that can be transmitted over the network. Capturing and transmitting the visual feed introduces a delay of up to 100 milliseconds, but operators can compensate by moving at a slightly slower pace than usual.
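The article does not describe the iCub 3 software, but the loop it outlines (sample the suit, send numeric data over the network, watch the delay stay near 100 ms) can be sketched roughly. The message format, field names, and latency budget below are all illustrative assumptions, not details of the actual system:

```python
import json
import time
from dataclasses import dataclass, asdict

# Approximate visual delay reported for the system (hypothetical budget here).
LATENCY_BUDGET_S = 0.100

@dataclass
class PoseMessage:
    """One sample from the operator's suit (illustrative fields)."""
    sent_at: float             # sender clock, seconds since epoch
    joint_angles: list[float]  # radians, one per tracked joint

def encode(msg: PoseMessage) -> bytes:
    """Serialize a pose sample as JSON bytes for transmission."""
    return json.dumps(asdict(msg)).encode()

def decode(raw: bytes) -> PoseMessage:
    """Reconstruct a pose sample on the robot side."""
    return PoseMessage(**json.loads(raw))

def one_way_delay(msg: PoseMessage, received_at: float) -> float:
    """Estimated network delay, assuming both clocks are synchronized."""
    return received_at - msg.sent_at

def within_budget(delay_s: float) -> bool:
    """True when the sample arrived inside the latency budget."""
    return delay_s <= LATENCY_BUDGET_S

# A sample that took 80 ms to arrive is within the 100 ms budget;
# one that took 150 ms is not, so the operator should slow down.
sample = PoseMessage(sent_at=time.time(), joint_angles=[0.1, -0.4, 0.9])
delay = one_way_delay(decode(encode(sample)), sample.sent_at + 0.080)
```

In a real system the clock-synchronization assumption matters: one-way delay cannot be measured from timestamps alone unless both ends share a clock, which is why teleoperation stacks typically estimate it from round-trip time instead.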

The team showcased the robot at the Venice Biennale, where it roamed through an exhibition while its operator was situated 290 kilometers away in Genoa.

Dafarra envisions the iCub 3 being utilized for remote event participation, minimizing the necessity for travel. However, he notes that currently, a fall could have severe consequences for the robot, and it remains uncertain whether it can autonomously regain an upright position.

“iCub 3 is an interesting robot and offers clear advantages over the previous iteration,” says Jonathan Aitken at the University of Sheffield, UK, whose laboratory owns a prior version of the robot. However, he is disappointed that the team’s research did not specify the new version’s data transmission requirements. “It would be good to know just how much data was required, and what the upper and lower bounds were,” he says.

