The Newer College Dataset: Handheld LiDAR, Inertial and Vision with Ground Truth (2003.05691v2)

Published 12 Mar 2020 in cs.RO

Abstract: In this paper we present a large dataset with a variety of mobile mapping sensors collected using a handheld device carried at typical walking speeds for nearly 2.2 km through New College, Oxford. The dataset includes data from two commercially available devices - a stereoscopic-inertial camera and a multi-beam 3D LiDAR, which also provides inertial measurements. Additionally, we used a tripod-mounted survey grade LiDAR scanner to capture a detailed millimeter-accurate 3D map of the test location (containing $\sim$290 million points). Using the map we inferred centimeter-accurate 6 Degree of Freedom (DoF) ground truth for the position of the device for each LiDAR scan to enable better evaluation of LiDAR and vision localisation, mapping and reconstruction systems. This ground truth is the particular novel contribution of this dataset and we believe that it will enable systematic evaluation which many similar datasets have lacked. The dataset combines both built environments, open spaces and vegetated areas so as to test localization and mapping systems such as vision-based navigation, visual and LiDAR SLAM, 3D LIDAR reconstruction and appearance-based place recognition. The dataset is available at: ori.ox.ac.uk/datasets/newer-college-dataset

Citations (160)

Summary

  • The paper introduces a dataset with precise 6 DoF ground truth achieved by registering LiDAR scans to a high-resolution, survey-grade map using ICP.
  • It details a handheld sensor setup combining an Intel RealSense D435i and an Ouster OS-1 LiDAR for robust SLAM, visual-inertial odometry, and 3D reconstruction tests.
  • The dataset facilitates benchmarking autonomous navigation systems across diverse environments, advancing research in localization and mobile mapping.

The Newer College Dataset: A Comprehensive Resource for Mobile Mapping Systems

The Newer College Dataset is a collection of data captured with a handheld device combining a 3D LiDAR and a stereoscopic-inertial camera, supplemented by high-accuracy ground truth. It addresses several gaps in existing localization and mapping datasets and serves as a valuable resource for evaluating autonomous navigation systems.

Data Collection and Novel Contributions

The Newer College Dataset, captured around New College, Oxford, covers nearly 2.2 km of varied environments, including structured buildings, open spaces, and vegetated areas. This variety supports testing of localization and mapping systems such as vision-based navigation, visual and LiDAR SLAM, and appearance-based place recognition. Unlike most datasets collected from robotic platforms, the data here were captured with a handheld device, producing motion closer to the erratic movement of a UAV or quadruped.

The dataset's distinctive contribution is a precise 6 Degrees of Freedom (DoF) ground truth pose for each LiDAR scan. This accuracy stems from a survey-grade LiDAR scanner used to build a detailed prior map of approximately 290 million points, against which the individual LiDAR scans are registered using Iterative Closest Point (ICP). This approach achieves higher local accuracy than datasets relying solely on GPS/INS fusion.
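A minimal sketch of this kind of scan-to-map registration is shown below, using Open3D's point-to-plane ICP as a stand-in for the paper's pipeline. The file names, voxel sizes, and distance thresholds are illustrative assumptions, not values from the paper.

```python
# Sketch: registering one LiDAR scan against a survey-grade prior map with
# point-to-plane ICP, in the spirit of the dataset's ground-truth pipeline.
# Paths and numeric parameters are illustrative assumptions.
import numpy as np
import open3d as o3d

prior_map = o3d.io.read_point_cloud("new_college_prior_map.pcd")  # survey-grade map
scan = o3d.io.read_point_cloud("ouster_scan_000123.pcd")          # one OS-1 scan

# Downsample and give the target map normals so point-to-plane ICP is well conditioned.
map_ds = prior_map.voxel_down_sample(voxel_size=0.05)
scan_ds = scan.voxel_down_sample(voxel_size=0.05)
map_ds.estimate_normals(o3d.geometry.KDTreeSearchParamHybrid(radius=0.3, max_nn=30))

# Initial guess, e.g. taken from an odometry or SLAM trajectory being refined.
T_init = np.eye(4)

result = o3d.pipelines.registration.registration_icp(
    scan_ds, map_ds,
    max_correspondence_distance=0.5,
    init=T_init,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPlane(),
)
print("Refined 6 DoF pose for this scan:\n", result.transformation)
print("fitness:", result.fitness, "inlier RMSE:", result.inlier_rmse)
```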

Sensor Platforms and Calibration

The data was collected using a handheld device integrating commercially available low-cost sensors: an Intel RealSense D435i stereoscopic-inertial camera and an Ouster OS-1 LiDAR. The sensors are rigidly mounted on a common structure, and their intrinsic and extrinsic parameters are calibrated with the Kalibr toolbox. Synchronization is handled by comparing timestamps and angular velocities and compensating for any drift over extended periods of operation. This careful setup yields diverse, synchronously recorded data, from IMU readings to stereoscopic images, supporting robust algorithm testing.
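One common way to check and correct such inter-sensor time offsets is to cross-correlate the angular-velocity magnitudes reported by the two IMUs. The sketch below illustrates that idea under assumed variable names and resampling rate; it is not the paper's exact procedure.

```python
# Sketch: estimating the time offset between two IMU streams by
# cross-correlating their angular-speed signals. Names and the
# resampling rate are illustrative assumptions.
import numpy as np

def estimate_time_offset(t_a, gyro_a, t_b, gyro_b, rate_hz=200.0):
    """Return the offset (seconds) by which stream A lags stream B."""
    # Resample both angular-speed signals onto a common uniform time base.
    t0 = max(t_a[0], t_b[0])
    t1 = min(t_a[-1], t_b[-1])
    t = np.arange(t0, t1, 1.0 / rate_hz)
    a = np.interp(t, t_a, np.linalg.norm(gyro_a, axis=1))
    b = np.interp(t, t_b, np.linalg.norm(gyro_b, axis=1))
    a -= a.mean()
    b -= b.mean()
    # The peak of the cross-correlation gives the relative delay in samples.
    corr = np.correlate(a, b, mode="full")
    lag = np.argmax(corr) - (len(b) - 1)
    return lag / rate_hz  # positive: events in A occur later than in B
```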

Demonstrated Applications

Several use cases exemplify the dataset's broad applicability to mobile robotics research:

  1. LiDAR SLAM: Running LiDAR SLAM on the dataset yields robust ego-motion estimates and demonstrates effective loop closures.
  2. Appearance-Based Loop Closure: Using DBoW2 with ORB features, the paper illustrates visual place recognition and loop-closure detection, which improve localization accuracy.
  3. 3D Reconstruction: Leveraging the ground truth map, reconstructions from LiDAR scans produce high-fidelity meshes, validating practical applications in 3D mapping systems.
  4. Visual-Inertial Odometry: Tests with ORB-SLAM2 and VILENS confirm the dataset's suitability for evaluating visual-inertial navigation solutions, providing insights into potential improvements and integrations in real-world applications (a minimal evaluation sketch follows this list).
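As an example of the benchmarking that the per-scan ground truth enables, the following sketch computes an absolute trajectory error (ATE) between an estimated trajectory and the ground-truth positions after a rigid alignment. The Umeyama/Kabsch alignment step and the function name are illustrative assumptions rather than the dataset's official evaluation tooling.

```python
# Sketch: ATE RMSE of an estimated trajectory vs. ground-truth positions,
# after a rigid (rotation + translation) alignment of the estimate.
# Inputs are N x 3 arrays of matched positions; matching is assumed done.
import numpy as np

def ate_rmse(gt_xyz, est_xyz):
    """RMSE of translation error after rigidly aligning est_xyz to gt_xyz."""
    mu_gt, mu_est = gt_xyz.mean(axis=0), est_xyz.mean(axis=0)
    # Kabsch: best-fit rotation from centered estimate to centered ground truth.
    H = (est_xyz - mu_est).T @ (gt_xyz - mu_gt)
    U, _, Vt = np.linalg.svd(H)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # avoid reflections
    R = Vt.T @ S @ U.T
    t = mu_gt - R @ mu_est
    aligned = (R @ est_xyz.T).T + t
    err = np.linalg.norm(aligned - gt_xyz, axis=1)
    return float(np.sqrt(np.mean(err ** 2)))
```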

Implications for Future Developments

The Newer College Dataset facilitates rigorous testing and benchmarking of algorithms for a range of navigation and mapping tasks. The availability of reliable centimeter-scale ground truth paves the way for improvements in short-range odometry and in methods addressing large-scale drift. The dataset is poised to be a valuable tool for advancing autonomous systems, helping refine methodologies for real-world applicability and enabling more complex scenario testing.

The structured and diverse data enables systematic evaluation and comparison with existing datasets, providing insights into the efficacy and applicability of different algorithms under varied conditions. As this dataset continues to expand, it offers an invaluable foundation for future innovations in mobile robotics and autonomous navigation technologies.