AURaPath is an end-to-end augmented reality system for spatial waypoint determination, digital-twin preview, and execution on a physical Universal Robots UR10 collaborative robot using Microsoft HoloLens 2.
This repository contains the full implementation used in our IEEE paper, including:
- A Unity-based AR application for HoloLens 2
- A PC-based middleware server for robot communication
- Live execution on a physical UR10 robot
🔗 This repository accompanies the paper
“Human–Robot Interaction for Robot Programming using Augmented Reality and Digital Twin”
Demo videos:
- `twin.-.execute.mp4`
- `ARUR.mp4`
These images correspond directly to the system described and demonstrated in the paper.
Figure 1 — AR Waypoint Determination
Users place and edit 6-DoF waypoints directly in 3D space using hand interaction on HoloLens 2.
Figure 2 — Digital Twin Preview (Preview-before-Execute)
The authored trajectory is animated on a synchronized UR10 digital twin before execution.
Figure 3 — Physical Execution on UR10
The validated trajectory is executed on the real UR10 robot in a lab workspace.
AURaPath enables users to specify robot motion directly in the workspace using augmented reality, rather than traditional teach pendants or offline programming tools.
The system follows a structured, multi-stage workflow that makes the author–validate–execute pipeline explicit to the user:
1. **Configuration**: Users register and position the UR10 digital twin within the AR environment. The digital twin can be spatially aligned with the physical robot for in-situ programming, or placed elsewhere (e.g., on a nearby table).
2. **Determining Trajectory**: Users place and edit 6-DoF end-effector waypoints relative to the digital twin using hand-based interaction. Waypoints define the desired robot motion and can be iteratively refined.
3. **Preview**: The authored trajectory is validated and visually inspected using an animated digital twin, enabling preview-before-execute verification without moving the physical robot.
4. **Execute**: The validated trajectory is transmitted to the PC middleware server and executed on the physical UR10 robot.
This workflow supports intuitive spatial programming while reducing trial-and-error and improving safety in collaborative environments.
**AR Client (HoloLens 2)**
- Unity application built with Mixed Reality Toolkit (MRTK)
- Provides hand-based interaction for waypoint placement and editing
- Visualizes a UR10 digital twin and trajectory previews
- Serializes waypoint data and transmits it to the PC server over Wi-Fi
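The exact message schema is defined by the Unity client and server implementations; as a rough illustration, a waypoint batch could be serialized as JSON along the following lines (field names and values here are hypothetical, not the repository's actual format):

```python
import json

# Hypothetical waypoint payload as it might be sent from the HoloLens 2 client.
# Positions are in metres and orientations are quaternions in the AR world frame;
# the real schema used by the Unity app may differ.
raw = json.dumps({
    "frame": "ar_world",
    "waypoints": [
        {"position": [0.45, -0.12, 0.30], "orientation": [0.0, 0.707, 0.0, 0.707]},
        {"position": [0.50, 0.05, 0.25], "orientation": [0.0, 0.707, 0.0, 0.707]},
    ],
})

msg = json.loads(raw)  # what the PC server would decode on receipt
print(len(msg["waypoints"]), "waypoints in frame", msg["frame"])
```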
**PC Middleware Server**
- Receives waypoint data from the AR client
- Transforms poses from the AR coordinate frame into the UR10 base frame
- Performs reachability checking before preview or execution
- Generates and sends executable robot commands via URSocket / RTDE
**UR10 Robot**
- Executes validated waypoint trajectories on physical hardware
- Reports joint state to initialize and synchronize the digital twin
This separation allows the AR device to focus on interaction and visualization, while the PC server handles robot-specific computation and execution.
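To make the server-side steps concrete, the sketch below shows one way the AR-to-base pose transformation and a coarse reachability check could look. The calibration matrix, function names, and the simple sphere test are illustrative assumptions; the actual implementation lives in the server code and may use full inverse kinematics for reachability checking.

```python
import numpy as np

# Assumed 4x4 homogeneous transform from the AR world frame to the UR10 base
# frame, obtained during the configuration/registration step (identity here is
# only a placeholder).
T_BASE_FROM_AR = np.eye(4)

UR10_REACH_M = 1.3  # nominal UR10 reach, used for a coarse feasibility test


def ar_to_base(position_ar):
    """Map a 3-D waypoint position from the AR frame into the UR10 base frame."""
    p = np.append(np.asarray(position_ar, dtype=float), 1.0)  # homogeneous point
    return (T_BASE_FROM_AR @ p)[:3]


def is_roughly_reachable(position_base):
    """Sphere test around the base; a full check would run inverse kinematics."""
    return np.linalg.norm(position_base) <= UR10_REACH_M


p_base = ar_to_base([0.45, -0.12, 0.30])
print(p_base, is_roughly_reachable(p_base))
```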
Additional screenshots of the AR user interface, including waypoint manipulation, menus, and interaction widgets, are provided in this repository.
These images document the full interaction design and UI elements used in the system and correspond to the interfaces described in the paper.
The current implementation of AURaPath has been validated using the following hardware:
- Microsoft HoloLens 2
- Used for AR interaction, waypoint authoring, and digital twin visualization
- Universal Robots UR10
- 6-DoF collaborative robotic arm
- Ethernet-enabled controller
- PC / Laptop (Middleware Server)
- Connected to the UR10 via Ethernet
- Connected to the HoloLens 2 via Wi-Fi
- Used for trajectory validation and robot command execution
⚠️ At present, the system supports UR10 only. Adapting to other robot platforms would require changes to the kinematic model and communication interface.
**AR Application (HoloLens 2)**
- Unity (with Universal Windows Platform support enabled)
- Mixed Reality Toolkit (MRTK) for HoloLens 2
- Windows SDK compatible with HoloLens 2 deployment
- Visual Studio (for UWP build and deployment)
**PC Middleware Server**
- Python or C# (.NET) runtime (depending on server implementation)
- URSocket / RTDE access enabled on the UR10 controller
- Network access to the robot controller over Ethernet
**UR10 Robot**
- UR10 PolyScope (standard installation)
- Network communication enabled
- Power on the UR10 robot and controller
- Ensure the robot is in Remote mode to allow external control
- Connect the robot controller to the PC via Ethernet
- Note the robot IP address (static IP is recommended)
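As a quick sanity check of the Ethernet link before starting the middleware, you can verify that the controller's standard interfaces are reachable from the PC. The IP address below is an example; port 30002 is the UR secondary/script interface and 30004 is RTDE.

```python
import socket

ROBOT_IP = "192.168.1.10"  # example address; substitute the IP noted above

for port in (30002, 30004):
    try:
        with socket.create_connection((ROBOT_IP, port), timeout=2.0):
            print(f"OK: {ROBOT_IP}:{port} is reachable")
    except OSError as exc:
        print(f"FAILED: {ROBOT_IP}:{port} ({exc})")
```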
The PC middleware server is responsible for receiving waypoint data from the HoloLens 2, transforming poses into the UR10 base frame, and transmitting executable commands to the robot controller.
- A PC connected to the UR10 controller via Ethernet
- Network connectivity to the HoloLens 2 (same LAN/Wi-Fi)
- Python 3.x installed
- URSocket package installed via pip
⚠️ No additional third-party Python packages are required.
The server implementation relies only on standard Python libraries and URSocket communication supported by the UR controller.
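For reference, a single motion command can be pushed to the controller over the UR script socket interface as plain URScript text. The snippet below is a minimal sketch, not the repository's server code; the IP address and motion parameters are example values.

```python
import socket

ROBOT_IP = "192.168.1.10"   # example controller address
SCRIPT_PORT = 30002         # UR secondary/script interface

# Target pose in the UR10 base frame: x, y, z in metres and rx, ry, rz as an
# axis-angle rotation vector, matching the URScript p[...] representation.
pose = [0.45, -0.12, 0.30, 0.0, 3.14, 0.0]

# movel() is a standard URScript command; a and v are tool acceleration (m/s^2)
# and speed (m/s), chosen conservatively here.
cmd = "movel(p[{:.4f}, {:.4f}, {:.4f}, {:.4f}, {:.4f}, {:.4f}], a=0.3, v=0.1)\n".format(*pose)

with socket.create_connection((ROBOT_IP, SCRIPT_PORT), timeout=2.0) as s:
    s.sendall(cmd.encode("utf-8"))
```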
- Navigate to the server directory: `cd Server`
- Start the middleware server: `python server.py`
- Before execution, ensure that:
  - The UR10 IP address configured in `server.py` matches the robot controller
  - The robot is powered on and ready to accept external commands
  - Network firewalls allow communication between the PC, robot, and HoloLens 2
Once running, the server listens for incoming waypoint data from the AR client and manages trajectory validation and execution on the physical robot.
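Conceptually, the receive path is a plain TCP listener that decodes a JSON waypoint batch and hands it to the validation and execution logic. The sketch below uses a hypothetical port and the illustrative message format shown earlier, not the exact code in `server.py`.

```python
import json
import socket

HOST, PORT = "0.0.0.0", 9000   # hypothetical port for the AR client connection

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen(1)
    print(f"Waiting for the AR client on {HOST}:{PORT} ...")
    conn, addr = srv.accept()
    with conn:
        data = conn.recv(65536)                      # one waypoint batch
        msg = json.loads(data.decode("utf-8"))
        print(f"Received {len(msg['waypoints'])} waypoints from {addr[0]}")
        # ...transform poses, check reachability, preview, then execute...
```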





