Article

Confidence-Based Hybrid Tracking to Overcome Visual Tracking Failures in Calibration-Less Vision-Guided Micromanipulation

Journal

IEEE Transactions on Automation Science and Engineering

Publisher

IEEE (Institute of Electrical and Electronics Engineers)
DOI: 10.1109/TASE.2019.2932724

Keywords

Visualization; Target tracking; Robot sensing systems; Imaging; Uncertainty; Cell manipulation; Robot vision systems

Funding

  1. Zhejiang University/University of Illinois at Urbana-Champaign Institute
  2. Natural Science Foundation of Zhejiang Province [LQ19F030013]
  3. SUTD-MIT International Design Centre (IDC)

Abstract

This article proposes a confidence-based approach for combining two visual tracking techniques to minimize the influence of unforeseen visual tracking failures and achieve uninterrupted vision-based control. Despite research efforts in vision-guided micromanipulation, existing systems are not designed to overcome visual tracking failures such as inconsistent illumination conditions, regional occlusion, unknown structures, and nonhomogeneous background scenes. A gap therefore remains in extending current procedures beyond the laboratory environment toward practical deployment of vision-guided micromanipulation systems. A hybrid tracking method, which combines motion-cue feature detection and score-based template matching, is incorporated into an uncalibrated vision-guided workflow capable of self-initialization and recovery during micromanipulation. A weighted average, based on the respective confidence indices of the motion-cue feature localization and template-based trackers, is inferred from the statistical accuracy of feature locations and the similarity scores of template matches. Results show improved tracking performance with hybrid tracking: its mean errors remain at the subpixel level under adverse experimental conditions, whereas the original template matching approach has mean errors of 1.53, 1.73, and 2.08 pixels. The method is also shown to be robust in a nonhomogeneous scene containing an array of plant cells. By proposing a self-contained fusion method that overcomes unforeseen visual tracking failures using a pure vision approach, we demonstrate the robustness of our low-cost micromanipulation platform.

Note to Practitioners: Cell manipulation is traditionally performed in highly specialized facilities under controlled environments. Existing vision-based methods do not readily fulfill the unique requirements of cell manipulation, including prospective plant cell-related applications. Robust visual tracking is needed to overcome tracking failure during automated vision-guided micromanipulation. To address the gap in maintaining continuous tracking under unforeseen visual tracking failures, we propose a purely data-driven hybrid tracking approach based on vision alone. Our confidence-based approach combines two tracking techniques to minimize the influence of scene uncertainties and thereby achieve uninterrupted vision-based control. Because of its readily deployable design, the method can be generalized to a wide range of vision-guided micromanipulation applications. It has the potential to significantly expand the capability of cell manipulation technology, including prospective applications with plant cells that are yet to be explored.
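The confidence-weighted fusion described in the abstract can be sketched in a few lines of code. The following Python/OpenCV snippet is a minimal illustration under assumed conventions, not the authors' implementation: the confidence mappings (the normalized correlation score for the template tracker, and an inverse function of feature-point scatter as a stand-in for the statistical accuracy of feature locations) and all function names are hypothetical.

# Hedged sketch: confidence-weighted fusion of a template-matching tracker
# and a motion-cue feature tracker. Names and confidence mappings are
# illustrative assumptions, not the paper's implementation.
import cv2
import numpy as np

def template_estimate(frame_gray, template):
    # Best-match location and its normalized correlation score (confidence).
    result = cv2.matchTemplate(frame_gray, template, cv2.TM_CCOEFF_NORMED)
    _, score, _, top_left = cv2.minMaxLoc(result)
    h, w = template.shape
    center = np.array([top_left[0] + w / 2.0, top_left[1] + h / 2.0])
    return center, max(score, 0.0)

def feature_estimate(prev_gray, frame_gray, prev_pts):
    # Track feature points with sparse optical flow; confidence decreases
    # as the surviving points scatter (a proxy for statistical accuracy).
    next_pts, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, frame_gray,
                                                   prev_pts, None)
    good = next_pts[status.flatten() == 1].reshape(-1, 2)
    if len(good) == 0:
        return None, 0.0
    spread = good.std(axis=0).mean()      # pixel scatter of tracked cluster
    return good.mean(axis=0), 1.0 / (1.0 + spread)

def fuse(est_t, conf_t, est_f, conf_f):
    # Confidence-weighted average of the two position estimates, with
    # fallback to a single tracker when the other has failed.
    if est_f is None or conf_f < 1e-6:
        return est_t if conf_t > 1e-6 else None
    if est_t is None or conf_t < 1e-6:
        return est_f
    return (conf_t * est_t + conf_f * est_f) / (conf_t + conf_f)

Feature points for the motion-cue tracker could come from, for example, cv2.goodFeaturesToTrack on the previous frame. The point of the weighting is that when either cue degrades (a low similarity score, or widely scattered feature points), its weight shrinks and the other tracker dominates, which is how a fused estimate can stay continuous through conditions that would break a single tracker.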
