Article

Automatic Extrinsic Calibration Method for LiDAR and Camera Sensor Setups

Journal

IEEE Transactions on Intelligent Transportation Systems
Volume 23, Issue 10, Pages 17677-17689

Publisher

IEEE (Institute of Electrical and Electronics Engineers), Inc.
DOI: 10.1109/TITS.2022.3155228

Keywords

Calibration; Laser radar; Cameras; Robot sensing systems; Three-dimensional displays; Performance evaluation; Machine vision; Automatic calibration; extrinsic parameters; LiDAR; monocular cameras; stereo cameras

Funding

  1. Madrid Government (Comunidad de Madrid)
  2. Universidad Carlos III de Madrid (UC3M) [SEGVAUTO-4.0-CM P2018/EMT-4362]
  3. Spanish Government [RTI2018-096036-B-C21]

Summary

Most sensor setups for onboard autonomous perception consist of LiDARs and vision systems, and fusing their data requires an accurate extrinsic calibration between the sensors. We propose a method to calibrate the extrinsic parameters of any pair of such sensors; it can handle devices with very different resolutions and poses and outperforms existing methods.

Abstract

Most sensor setups for onboard autonomous perception are composed of LiDARs and vision systems, as they provide complementary information that improves the reliability of the different algorithms necessary to obtain a robust scene understanding. However, the effective use of information from different sources requires an accurate calibration between the sensors involved, which usually implies a tedious and burdensome process. We present a method to calibrate the extrinsic parameters of any pair of sensors involving LiDARs, monocular or stereo cameras, of the same or different modalities. The procedure is composed of two stages: first, reference points belonging to a custom calibration target are extracted from the data provided by the sensors to be calibrated, and second, the optimal rigid transformation is found through the registration of both point sets. The proposed approach can handle devices with very different resolutions and poses, as usually found in vehicle setups. In order to assess the performance of the proposed method, a novel evaluation suite built on top of a popular simulation framework is introduced. Experiments on the synthetic environment show that our calibration algorithm significantly outperforms existing methods, whereas real data tests corroborate the results obtained in the evaluation suite. Open-source code is available at https://github.com/beltransen/velo2cam_calibration.
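The abstract only sketches the two-stage pipeline, so the following is a minimal, illustrative example of the second stage: recovering the rigid transformation that registers the reference points extracted from the two sensors. It uses the standard SVD-based (Kabsch/Umeyama) closed-form solution under the assumption that the point correspondences are already known; function and variable names are hypothetical and do not come from the authors' velo2cam_calibration code.

```python
import numpy as np

def rigid_transform_from_correspondences(p_lidar, p_cam):
    """Estimate the rigid transform (R, t) mapping p_lidar onto p_cam.

    p_lidar, p_cam: (N, 3) arrays of corresponding reference points,
    e.g. target keypoints recovered from each sensor (illustrative only).
    Solves the least-squares registration problem with the SVD-based
    Kabsch/Umeyama closed-form solution.
    """
    # Centre both point sets
    mu_l = p_lidar.mean(axis=0)
    mu_c = p_cam.mean(axis=0)
    q_l = p_lidar - mu_l
    q_c = p_cam - mu_c

    # Cross-covariance matrix and its SVD
    H = q_l.T @ q_c
    U, _, Vt = np.linalg.svd(H)

    # Guard against a reflection in the estimated rotation
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T

    # Translation that completes the rigid transform
    t = mu_c - R @ mu_l
    return R, t

# Example usage with matched reference points (placeholder data):
# R, t = rigid_transform_from_correspondences(lidar_pts, cam_pts)
# cam_pts_est = (R @ lidar_pts.T).T + t
```

Given matched reference points from the LiDAR and the camera, the resulting R and t define an extrinsic transformation from the LiDAR frame into the camera frame; the paper's actual registration step may differ in detail.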

Authors

Jorge Beltrán; Carlos Guindel; Arturo de la Escalera; Fernando García