In this project we investigate how to build maps that capture the robot's uncertainty about the occupancy of objects in the environment.
We have shown that the resulting maps can increase the safety of global navigation: planning trajectories that avoid areas of high uncertainty enables a higher degree of autonomy for mobile robots in indoor settings.
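To make the planning idea concrete, here is a minimal, hypothetical sketch (not the project's actual implementation): a Dijkstra grid search whose step cost grows with per-cell map uncertainty, so the planner prefers well-observed free space. The array encodings, the `risk_weight` parameter, and the function name are all illustrative assumptions.

```python
import heapq

import numpy as np


def plan_risk_aware(occupancy, uncertainty, start, goal, risk_weight=5.0):
    """Dijkstra search on a grid whose step cost grows with map uncertainty.

    occupancy:   2-D array, 1 = occupied, 0 = free (hypothetical encoding)
    uncertainty: 2-D array in [0, 1], e.g. per-cell entropy of the occupancy belief
    """
    rows, cols = occupancy.shape
    dist = {start: 0.0}
    prev = {}
    pq = [(0.0, start)]
    while pq:
        d, cell = heapq.heappop(pq)
        if cell == goal:
            break
        if d > dist.get(cell, float("inf")):
            continue  # stale queue entry
        r, c = cell
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if not (0 <= nr < rows and 0 <= nc < cols):
                continue
            if occupancy[nr, nc]:
                continue  # never step into occupied cells
            # penalise uncertain cells so the planner detours around them
            step = 1.0 + risk_weight * uncertainty[nr, nc]
            nd = d + step
            if nd < dist.get((nr, nc), float("inf")):
                dist[(nr, nc)] = nd
                prev[(nr, nc)] = cell
                heapq.heappush(pq, (nd, (nr, nc)))
    # reconstruct the path by walking the predecessor map backwards
    path, cell = [], goal
    while cell != start:
        path.append(cell)
        cell = prev[cell]
    path.append(start)
    return path[::-1]
```

With `risk_weight` set high, the trade-off favours a longer route through low-uncertainty cells over a shorter one through poorly observed space; setting it to zero recovers plain shortest-path planning.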
We released the dataset for our IROS 2018 paper “Hallucinating Robots: Inferring Obstacle Distances from Partial Laser Measurements”.
The source code for the article “Hypermap Mapping Framework and its Application to Autonomous Semantic Exploration” is available on GitHub.
The toy dataset is a new RGB-D dataset captured with a Kinect sensor. It depicts typical children’s toys and contains a total of 449 RGB-D images along with annotated ground-truth images.