With virtual reality headsets, you may one day be able to control robots in settings ranging from the battlefield to the operating room.
End-user robot programming using mixed reality
Researchers from the Microsoft Mixed Reality and AI Lab and ETH Zurich recently developed a new method that combines mixed reality and robotics. The term “mixed reality” (MR or MxR) refers to the merging of real and virtual worlds to produce new environments, and it’s part of a growing effort to find better ways to use robots remotely.
“Current systems require training and practice for operators to learn the abstract commands,” Todd Richmond, an IEEE fellow and director of the Tech + Narrative Lab at Pardee RAND Graduate School, told Lifewire in an email interview. “MxR can provide an operator with a more ‘embodied’ control system and can use more literal body movements for command and control (e.g., a human moving an arm to move the robot arm).”
The researchers tested their MR and robotics system using a HoloLens headset. One application lets an operator plan missions in which a robot inspects an environment.