Article

Active hand-eye calibration via online accuracy-driven next-best-view selection

Journal

The Visual Computer
Volume 39, Issue 1, Pages 381-391

Publisher

Springer
DOI: 10.1007/s00371-021-02336-7

Keywords

Hand-eye calibration; Automatic calibration; Next-best-view selection

Abstract

We propose a novel high-accuracy active hand-eye calibration approach in which both the robot movement and the camera view selection are driven by, and targeted at, improving calibration accuracy. During calibration, data acquisition is guided by an online estimated discrete viewing quality field (DVQF) that represents the calibration quality of different views at various 3D locations. The quality of a view is measured by how much it reduces the uncertainty of the calibration result and increases the diversity of robot poses, both of which contribute to calibration precision. Based on the DVQF, we select the next-best-view as the target pose for each time step. A fully automatic system performs the overall hand-eye calibration process without any human intervention. Extensive experiments are conducted in both real-world and simulated scenarios, where the proposed algorithm outperforms other approaches in accuracy and robustness.
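The abstract does not give the exact scoring function, so the sketch below is only an illustration of accuracy-driven next-best-view selection over a discrete set of candidate views. It assumes a covariance-reduction-style proxy for calibration uncertainty and a nearest-neighbour distance as the pose-diversity term; the names view_quality, next_best_view, and the weight alpha are hypothetical and are not the paper's formulation or API.

```python
import numpy as np

# Hedged sketch of next-best-view (NBV) selection over a discrete viewing
# quality field (DVQF). The scoring terms (uncertainty-reduction proxy and
# minimum-distance pose diversity) and the weight `alpha` are illustrative
# assumptions, not the paper's exact method.

def view_quality(candidate, visited, cov_reduction, alpha=0.5):
    """Score one candidate view.

    candidate     -- 6-D robot pose (x, y, z, roll, pitch, yaw) of the view
    visited       -- array of poses already used for calibration
    cov_reduction -- estimated reduction of calibration-result uncertainty
                     if this view were added (e.g. drop in covariance trace)
    alpha         -- trade-off between uncertainty reduction and diversity
    """
    if len(visited) == 0:
        diversity = 1.0
    else:
        # Pose diversity: distance to the closest previously visited pose.
        diversity = np.min(np.linalg.norm(visited - candidate, axis=1))
    return alpha * cov_reduction + (1.0 - alpha) * diversity


def next_best_view(candidates, visited, cov_reductions):
    """Pick the candidate view with the highest DVQF score."""
    scores = [view_quality(c, visited, r)
              for c, r in zip(candidates, cov_reductions)]
    return candidates[int(np.argmax(scores))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    candidates = rng.uniform(-1, 1, size=(20, 6))   # discretized view poses
    visited = rng.uniform(-1, 1, size=(3, 6))       # poses already measured
    cov_reductions = rng.uniform(0, 1, size=20)     # per-view uncertainty gain
    print("next target pose:", next_best_view(candidates, visited, cov_reductions))
```

In this reading, the selected pose would be sent to the robot, a new measurement taken, the DVQF and uncertainty estimates updated online, and the loop repeated until the calibration converges.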
