Engineers developing robotics software today face a serious limitation. They can visualize the robot's perception of the environment, but they get no straightforward impression of how that perceived data differs from the real physical world. They typically rely on point clouds or heatmaps rendered on a computer screen, detached from any context about the robot's surroundings, which makes it hard to spot flaws and debug the system.
Our toolkit overlays the robot's sensed data onto the real world, so robotics engineers can see directly what their robot perceives simply by exploring the environment around them with a HoloLens. This simplifies the development of ROS (Robot Operating System) software, letting engineers fully leverage sensor data by visualizing it in mixed reality.
The most challenging, and most interesting, part of this project is fusing the sensor visualizations with the real world: the "calibration" between the visualization and reality.
ROS sensor topics carry metadata describing the position of each sensor relative to a reference point on the robot or in the world. Depending on whether the robot provides usable transform data, we can locate the visualization in one of two ways, sketched below.
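As a concrete illustration, the minimal sketch below (Python with rospy and tf2; the topic and frame names are placeholder assumptions, not the toolkit's actual configuration) reads the frame metadata from a point-cloud header and looks up the corresponding transform:

```python
# Hedged sketch: inspect the frame metadata of a sensor topic and resolve
# the sensor's pose with tf2. '/points' and 'base_link' are placeholders.
import rospy
import tf2_ros
from sensor_msgs.msg import PointCloud2

tf_buffer = None

def on_cloud(msg):
    # header.frame_id says which frame the points are expressed in,
    # header.stamp says when they were captured.
    frame = msg.header.frame_id
    try:
        # Pose of the sensor frame in the robot's base frame at capture time.
        tf = tf_buffer.lookup_transform('base_link', frame, msg.header.stamp,
                                        rospy.Duration(0.1))
        rospy.loginfo("Cloud in frame %s, sensor at %s",
                      frame, tf.transform.translation)
    except (tf2_ros.LookupException,
            tf2_ros.ConnectivityException,
            tf2_ros.ExtrapolationException) as e:
        rospy.logwarn("No transform available yet: %s", e)

if __name__ == '__main__':
    rospy.init_node('frame_inspector')
    tf_buffer = tf2_ros.Buffer()
    listener = tf2_ros.TransformListener(tf_buffer)  # keeps the buffer filled
    rospy.Subscriber('/points', PointCloud2, on_cloud)
    rospy.spin()
```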
In the easiest scenario, odometry estimators running on the robot provide usable transform data. As the robot moves, the position of its sensors in the world frame is known at all times, so the position and orientation of every type of visualization is handled automatically.
In this case, the user only has to manually position and orient a static anchor, placing it at the "reference point" that the ROS transforms map to.
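The sketch below illustrates this composition with plain NumPy: a single user-placed anchor pose is combined with a sensor pose reported in the ROS reference frame to obtain the pose of the visualization in the mixed-reality scene. The matrix convention, frame names, and numeric values are illustrative assumptions, not the toolkit's actual code.

```python
# Hedged sketch of the static-anchor case: the user places one anchor at the
# ROS reference frame, and every pose reported in that frame is composed with
# the anchor's pose to land in the headset's world coordinates.
import numpy as np

def pose_to_matrix(translation, quaternion):
    """Build a 4x4 homogeneous transform from (x, y, z) and (x, y, z, w)."""
    x, y, z, w = quaternion
    R = np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w)],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w)],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y)],
    ])
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = translation
    return T

# Anchor pose in the headset's world frame, set once by hand (example values).
T_scene_anchor = pose_to_matrix([1.2, 0.0, 0.5], [0, 0, 0, 1])
# Sensor pose in the ROS reference frame, as reported by tf (example values).
T_anchor_sensor = pose_to_matrix([0.3, 0.1, 0.8], [0, 0, 0.7071, 0.7071])

# Where to draw the sensor's visualization in the mixed-reality scene.
T_scene_sensor = T_scene_anchor @ T_anchor_sensor
print(T_scene_sensor[:3, 3])  # position of the visualization
```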
If the robot is not aware of its own position, it is up to us to estimate the robot's odometry in order to place it in the augmented-reality scene.
In this case, we track the robot's position using undistorted camera images or 3D object-tracking methods. The sensor visualizations are then transformed into a moving anchor that follows the robot, as in the sketch below.
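For instance, one way to recover such a pose estimate from a single undistorted image is a PnP solve over known fiducial points on the robot. The hedged sketch below uses OpenCV for this, with placeholder point correspondences and intrinsics rather than the toolkit's actual tracking pipeline.

```python
# Hedged sketch of the moving-anchor case: estimate the robot's pose from an
# undistorted image of known fiducial points, then use that pose as the anchor
# that sensor visualizations are parented to. All values are placeholders.
import cv2
import numpy as np

# 3D positions of fiducial points in the robot's own frame (metres).
object_points = np.array([
    [0.00, 0.00, 0.00],
    [0.20, 0.00, 0.00],
    [0.20, 0.20, 0.00],
    [0.00, 0.20, 0.00],
], dtype=np.float64)

# Where those points were detected in the undistorted image (pixels).
image_points = np.array([
    [310.0, 240.0],
    [402.0, 238.0],
    [405.0, 330.0],
    [312.0, 333.0],
], dtype=np.float64)

# Camera intrinsics; distortion is zero because the image is already undistorted.
K = np.array([[525.0,   0.0, 319.5],
              [  0.0, 525.0, 239.5],
              [  0.0,   0.0,   1.0]])

ok, rvec, tvec = cv2.solvePnP(object_points, image_points, K, None)
if ok:
    R, _ = cv2.Rodrigues(rvec)
    # Pose of the robot (the moving anchor) in the camera frame; anything
    # expressed in the robot frame can be transformed into the scene with it.
    T_cam_robot = np.eye(4)
    T_cam_robot[:3, :3] = R
    T_cam_robot[:3, 3] = tvec.ravel()
    print(T_cam_robot)
```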