
Target-free Extrinsic Calibration of a 3D-Lidar and an IMU (2104.12280v3)

Published 25 Apr 2021 in cs.RO

Abstract: This work presents a novel target-free extrinsic calibration algorithm for a 3D LiDAR and an IMU pair using an Extended Kalman Filter (EKF) which exploits the *motion-based calibration constraint* for the state update. The steps include: data collection by motion excitation of the LiDAR-inertial sensor suite along all degrees of freedom; determination of the inter-sensor rotation using the rotational component of the aforementioned *motion-based calibration constraint* in a least-squares optimization framework; and, finally, determination of the inter-sensor translation using the *motion-based calibration constraint* for the state update in an EKF framework. We experimentally validate our method using data collected in our lab and open-source our contribution for the robotics research community (https://github.com/unmannedlab/imu_lidar_calibration).

Citations (22)

Summary

  • The paper introduces a target-free calibration algorithm that uses an Extended Kalman Filter to accurately align 3D-Lidar and IMU data.
  • It employs a two-stage process, first estimating rotation via least squares and then refining rotation, translation, and biases with an EKF.
  • Experiments with an Ouster 128-channel LiDAR and a VectorNav VN-300 IMU demonstrated effective mitigation of motion distortion and validated the calibration accuracy.

Target-Free Extrinsic Calibration of a 3D-Lidar and an IMU

The paper by Mishra et al. introduces a target-free extrinsic calibration algorithm for aligning a 3D LiDAR with an Inertial Measurement Unit (IMU). The approach leverages an Extended Kalman Filter (EKF) to address the motion distortion that affects LiDAR scans during dynamic maneuvers of autonomous systems. This calibration is crucial for accurately integrating LiDAR and IMU data into a unified reference frame, thereby improving the performance of tasks such as LiDAR odometry and Simultaneous Localization and Mapping (SLAM).

Approach and Methodology

The procedure begins with data collection, during which the LiDAR-IMU suite is excited along all degrees of freedom to gather informative motion data. Calibration then proceeds in two primary stages: an initial estimate of the inter-sensor rotation obtained in a least-squares optimization framework, followed by determination of the inter-sensor translation in an EKF framework. The EKF not only refines the initial rotation estimate but also concurrently estimates the translation parameters, the IMU poses at scan timestamps, and the sensor biases.
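The rotation stage is an instance of the classical rotational hand-eye constraint: each incremental IMU rotation and the corresponding incremental LiDAR rotation (e.g., from scan matching) are related by the fixed extrinsic rotation, q_I ⊗ q_ext = q_ext ⊗ q_L. Below is a minimal NumPy sketch of one standard linear solution that stacks quaternion multiplication matrices and takes the singular vector with the smallest singular value; the function names, the (w, x, y, z) quaternion convention, and the input format are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def quat_left(q):
    """Left-multiplication matrix: quat_left(q1) @ q2 == q1 * q2, q = (w, x, y, z)."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w, -z,  y],
                     [y,  z,  w, -x],
                     [z, -y,  x,  w]])

def quat_right(q):
    """Right-multiplication matrix: quat_right(q2) @ q1 == q1 * q2."""
    w, x, y, z = q
    return np.array([[w, -x, -y, -z],
                     [x,  w,  z, -y],
                     [y, -z,  w,  x],
                     [z,  y, -x,  w]])

def estimate_extrinsic_rotation(imu_dqs, lidar_dqs):
    """Solve q_I * q_ext = q_ext * q_L in the least-squares sense.
    imu_dqs / lidar_dqs: matched lists of unit quaternions describing
    incremental IMU and LiDAR rotations over the same time intervals."""
    A = np.vstack([quat_left(qi) - quat_right(ql)
                   for qi, ql in zip(imu_dqs, lidar_dqs)])
    _, _, vt = np.linalg.svd(A)        # smallest singular vector spans the (near-)null space
    q = vt[-1]
    return q / np.linalg.norm(q)       # unit extrinsic quaternion (up to sign)
```

With noise-free data the stacked matrix has an exact null vector; with real measurements the smallest singular vector is the least-squares minimizer, and the global sign ambiguity of the quaternion is harmless.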

This method sets itself apart by not requiring calibration targets or auxiliary sensors such as GPS receivers or cameras, which have been integral to related procedures such as LiDAR-camera or camera-IMU calibration. It also stands in contrast to prior frameworks that rely on additional environmental features or heavier modeling machinery such as Gaussian Process regression. Instead, the algorithm uses a discrete-time IMU state propagation model, which also lets it mitigate motion distortion during the calibration process itself.
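As context for that propagation model, a single discrete-time IMU integration step typically applies bias-corrected gyroscope and accelerometer measurements through rigid-body kinematics. The following sketch uses simple Euler integration and a fixed world-frame gravity vector; the state layout, names, and integration scheme are assumptions for illustration and may differ from the paper's exact model.

```python
import numpy as np

GRAVITY = np.array([0.0, 0.0, -9.81])   # world-frame gravity (assumed convention)

def propagate_imu_state(R, v, p, gyro, accel, bg, ba, dt):
    """One discrete-time propagation step.
    R: 3x3 world-from-IMU rotation, v: velocity, p: position (world frame),
    gyro/accel: raw IMU measurements, bg/ba: gyroscope/accelerometer biases."""
    w = gyro - bg                        # bias-corrected angular rate
    a = accel - ba                       # bias-corrected specific force
    # Orientation update via the exponential map (Rodrigues' formula).
    theta = np.linalg.norm(w) * dt
    if theta > 1e-12:
        k = w / np.linalg.norm(w)        # unit rotation axis
        K = np.array([[0.,   -k[2],  k[1]],
                      [k[2],  0.,   -k[0]],
                      [-k[1], k[0],  0.]])
        dR = np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)
    else:
        dR = np.eye(3)                   # negligible rotation over this step
    a_world = R @ a + GRAVITY            # specific force rotated into the world frame
    p_new = p + v * dt + 0.5 * a_world * dt**2
    v_new = v + a_world * dt
    return R @ dR, v_new, p_new
```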

Experimental Setup and Validation

The system was validated on an experimental setup comprising an Ouster 128-channel LiDAR and a VectorNav VN-300 IMU, and the authors openly share their calibration software on GitHub, providing a resource for further research and replication. The experiments demonstrated convergence of the algorithm across varying LiDAR configurations, regardless of the number of channels in operation, indicating its versatility. The resulting calibration enabled effective deskewing of LiDAR scans, producing sharp, well-aligned edges in the point cloud data.
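Deskewing re-expresses every point of a sweeping LiDAR scan in a single reference frame by sampling the estimated trajectory at each point's timestamp. A minimal SciPy-based sketch follows; the interpolation scheme (SLERP for rotation, linear for translation) and the choice of the scan-start frame are illustrative assumptions rather than the paper's exact procedure.

```python
import numpy as np
from scipy.spatial.transform import Rotation, Slerp

def deskew_scan(points, stamps, pose_times, rotations, translations):
    """Re-express each LiDAR point in the frame at the scan start.
    points: (N, 3) raw points, stamps: (N,) per-point timestamps,
    pose_times: (M,) trajectory timestamps covering all stamps,
    rotations: scipy Rotation of length M, translations: (M, 3) positions."""
    slerp = Slerp(pose_times, rotations)
    R_t = slerp(stamps)                            # per-point sensor orientation
    p_t = np.stack([np.interp(stamps, pose_times, translations[:, i])
                    for i in range(3)], axis=1)    # per-point sensor position
    world = R_t.apply(points) + p_t                # points lifted into the world frame
    t0 = stamps.min()                              # scan-start reference time
    R0 = slerp(t0)
    p0 = np.array([np.interp(t0, pose_times, translations[:, i])
                   for i in range(3)])
    return R0.inv().apply(world - p0)              # back into the scan-start frame
```

Once good extrinsics are available, the IMU-propagated trajectory itself can supply these rotations and translations, which is what makes deskewing and calibration mutually reinforcing.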

Because the sensor suite was assembled from separately procured components, no ground-truth extrinsics were available, so calibration accuracy was validated indirectly: the IMU was displaced by a known distance, and the subsequently estimated translation matched this expected spatial adjustment with minimal error.

Implications and Future Prospects

The paper's contributions matter most for robotics labs and researchers assembling sensor suites from disparate sources, who often lack pre-calibrated setups. This work simplifies the calibration process without compromising accuracy, thereby facilitating the integration of heterogeneous sensors in autonomous systems. Looking ahead, this research could inspire improvements in sensor-fusion strategies and calibration algorithms, especially in settings with scarce or no external reference data.

As autonomous systems continue to expand their applicability in vehicles, drones, and mobile robots, the need for accurate and robust sensor calibrations remains paramount. This paper's calibration algorithm could serve as a foundation for future developments in such systems, pushing forward the capabilities in precise environmental perception and interaction.
