Camera-LiDAR Extrinsic Calibration Using Constrained Optimization with Circle Placement

Daeho Kim, Seunghui Shin, Hyoseok Hwang

Research output: Contribution to journal › Article › peer-review

1 Citation (Scopus)

Abstract

Monocular camera-LiDAR data fusion has demonstrated remarkable environmental perception capabilities in various fields. The success of data fusion relies on accurately matching corresponding features from images and point clouds. In this letter, we propose a target-based camera-LiDAR extrinsic calibration method that matches correspondences between the two modalities. Specifically, to extract accurate features from the point cloud, we propose a novel method that estimates circle centers by optimizing a probability distribution starting from an initial position. This optimization generates the probability distribution of circle centers from circle edge points and uses the Lagrangian multiplier method to estimate the optimal positions of the circle centers. We conduct two types of experiments: simulations for quantitative results and real-system evaluations for qualitative assessment. Our method demonstrates a 21% improvement in simulation calibration performance over existing methods for 20 target poses with LiDAR noise of 0.03 m, and also shows high visual quality when reprojecting point clouds onto images in real-world scenarios.
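
As an illustration of the kind of constrained estimation the abstract describes (a minimal sketch, not the authors' implementation), the Python snippet below refines two circle centers from noisy LiDAR edge points while forcing their spacing to match the known circle placement on the calibration target; SciPy's SLSQP solver with an equality constraint stands in for the Lagrange-multiplier formulation, and the radius R_HOLE, spacing D_CENTERS, and all function names are assumptions made for the example.

    import numpy as np
    from scipy.optimize import minimize

    R_HOLE = 0.12     # circle radius on the target [m]  (assumed value)
    D_CENTERS = 0.30  # known center-to-center spacing on the board [m]  (assumed value)

    def refine_centers(edges_a, edges_b, c0_a, c0_b):
        # edges_a, edges_b: (N, 2) circle-edge points in the target plane.
        # c0_a, c0_b: initial center guesses (e.g., centroids of the edge points).
        edges_a = np.asarray(edges_a, float)
        edges_b = np.asarray(edges_b, float)

        def cost(x):
            # Squared deviation of each edge point's distance-to-center from R_HOLE.
            c_a, c_b = x[:2], x[2:]
            ra = np.linalg.norm(edges_a - c_a, axis=1) - R_HOLE
            rb = np.linalg.norm(edges_b - c_b, axis=1) - R_HOLE
            return np.sum(ra**2) + np.sum(rb**2)

        def spacing(x):
            # Equality constraint: estimated spacing must equal the board layout.
            return np.linalg.norm(x[:2] - x[2:]) - D_CENTERS

        x0 = np.concatenate([np.asarray(c0_a, float), np.asarray(c0_b, float)])
        res = minimize(cost, x0, method="SLSQP",
                       constraints={"type": "eq", "fun": spacing})
        return res.x[:2], res.x[2:]

In this toy formulation the cost term plays the role of the data likelihood over edge points, while the equality constraint encodes the known circle placement that the abstract's Lagrangian multiplier step enforces.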

Original language: English
Pages (from-to): 883-890
Number of pages: 8
Journal: IEEE Robotics and Automation Letters
Volume: 10
Issue number: 2
DOIs
Publication status: Published - 2025

Bibliographical note

Publisher Copyright:
© 2016 IEEE.

Keywords

  • Calibration and identification
  • intelligent transportation systems
  • sensor fusion
