Researchers have developed an app for the Apple Vision Pro mixed reality headset that lets users operate a robot using only hand and head movements. It could be used to remotely control devices in a range of situations, from pulling practical jokes to navigating a disaster area.
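The article does not describe how the app translates tracked movements into robot commands, but a minimal sketch of one possible mapping is shown below. Everything here is a hypothetical illustration, not the researchers' implementation: the pose structures, field names, and thresholds are assumptions, standing in for whatever data the headset's hand and head tracking actually provides.

```python
import math
from dataclasses import dataclass

@dataclass
class HeadPose:
    yaw: float    # radians; left/right rotation of the head
    pitch: float  # radians; up/down rotation of the head

@dataclass
class HandState:
    pinch_strength: float  # 0.0 (open hand) to 1.0 (fully pinched)

def pose_to_command(head: HeadPose, hand: HandState,
                    max_speed: float = 0.5, max_turn: float = 1.0):
    """Map headset-tracked poses to a simple (linear, angular, gripper) command.

    Hypothetical mapping: head pitch drives forward speed, head yaw drives
    turning, and a pinch gesture closes the gripper.
    """
    # Clamp normalized pitch/yaw to [-1, 1] before scaling to robot limits.
    linear = max_speed * max(-1.0, min(1.0, -head.pitch / math.radians(30)))
    angular = max_turn * max(-1.0, min(1.0, -head.yaw / math.radians(45)))
    gripper_closed = hand.pinch_strength > 0.8
    return linear, angular, gripper_closed

# Example: looking slightly down while pinching drives forward with the gripper closed.
cmd = pose_to_command(HeadPose(yaw=0.0, pitch=-math.radians(15)),
                      HandState(pinch_strength=0.9))
print(cmd)
```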
Younghyo Park, the app's developer and a doctoral student at MIT, posted a video of the application in use on X, formerly known as Twitter. In the video, Gabe Margolis, an MIT graduate student and co-author of the study, walks viewers through how the app works, using his hands and body to operate the four-legged robot, reports TomorrowsWorldToday.
Margolis demonstrates how the software works by instructing the robot to use its gripper to open a closed door and let itself in. He also directs the robot to pick up a piece of trash and drop it in a trash can. In another scene in the video, the robot mirrors Margolis's movements, bending down when he does.
Although the Apple Vision Pro has many benefits, it is not without drawbacks. Because the headset relies on motion tracking, it is limited in confined spaces such as elevators and moving cars. Users also need to be mindful of hand placement to ensure accurate tracking; for instance, tracking is limited when a user's hands are at their waist or by their sides.
Still, the researchers believe that combining robots with the Apple Vision Pro has a lot of potential. According to Park and Margolis's paper, the longer the Apple Vision Pro is used, the more data it yields that can be used to train robots to move. Additional functionality for robotic applications is reportedly planned.