Developing an XR User Interface for High-Level Robot Teleoperation

Many real-world tasks carried out by robots require human participation, whether on-site or remote. However, operating robots typically demands that users undergo extensive robot-specific training to perform low-level tasks effectively and safely. I will describe an experimental XR user interface that our team is developing for assigning and managing assembly tasks remotely through high-level, goal-based instructions rather than low-level direct control. The user manipulates virtual replicas of task objects to prescribe 6DoF destination poses, without needing to be familiar with specific robots and their capabilities. To make this possible, our user interface is integrated with a robot-planning system that determines, verifies, and executes robot-specific actions. I will present some of the issues involved in designing XR user interfaces that asynchronously support high-level task assignment by users and low-level task performance by robots.
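
To make the goal-based flow concrete, here is a minimal sketch of the kind of message such an interface might pass from the XR client to the planner. Nothing here reflects the team's actual system: the types, field names, and submit_goal function are all hypothetical, illustrating only the idea of assigning a 6DoF destination pose to a task object and leaving motion planning to the robot side.

```python
# Hypothetical sketch: all names and fields are illustrative assumptions,
# not the actual interface described in the talk.
from dataclasses import dataclass


@dataclass
class Pose6DoF:
    """A 6DoF destination pose: position in meters, orientation as a unit quaternion."""
    position: tuple[float, float, float]            # (x, y, z)
    orientation: tuple[float, float, float, float]  # (qx, qy, qz, qw)


@dataclass
class GoalAssignment:
    """High-level instruction: 'put this object at this pose'.
    The user never specifies joint angles or robot-specific motions."""
    object_id: str    # the task object whose virtual replica the user moved
    target: Pose6DoF  # destination pose prescribed by placing the replica


def submit_goal(goal: GoalAssignment) -> None:
    """Hand the goal to the robot-planning system, which determines,
    verifies, and executes robot-specific actions asynchronously."""
    # Stub: a real system might publish this over ROS, gRPC, etc.
    print(f"Goal assigned: {goal.object_id} -> {goal.target.position}")


# The operator moves the virtual replica of a part and confirms its placement:
submit_goal(GoalAssignment(
    object_id="bracket_3",
    target=Pose6DoF(position=(0.42, -0.10, 0.75),
                    orientation=(0.0, 0.0, 0.0, 1.0)),
))
```

The appeal of this decoupling is that the same goal message can, in principle, serve any robot the planner knows how to drive, which is what frees the operator from robot-specific training.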

Wednesday, June 19, 2024
12:00 PM - 12:25 PM (PDT)
Room 101A
Developer