Journal
IEEE TRANSACTIONS ON ULTRASONICS FERROELECTRICS AND FREQUENCY CONTROL
Volume 69, Issue 5, Pages 1691-1702
Publisher
IEEE-INST ELECTRICAL ELECTRONICS ENGINEERS INC
DOI: 10.1109/TUFFC.2022.3162097
Keywords
Kalman filters; Image reconstruction; Filtering; Three-dimensional displays; Estimation; Ultrasonic imaging; Target tracking; Kalman filtering; neural networks; object tracking; optical ultrasound (OpUS); out-of-plane artifacts
Funding
- Academy of Finland [336796, 326291, 338408]
- CMIC-EPSRC Platform [EP/M020533/1]
- Wellcome Trust [203145Z/16/Z]
- Engineering and Physical Sciences Research Council (EPSRC) [NS/A000050/1]
- Rosetrees Trust [PGS19-2/10006]
Abstract
Many interventional surgical procedures rely on medical imaging to visualize and track instruments. Such imaging methods must not only be real-time capable but also provide accurate and robust positional information. In ultrasound (US) applications, typically only 2-D data from a linear array are available, and as such, obtaining accurate positional estimation in three dimensions is nontrivial. In this work, we first train a neural network, using realistic synthetic training data, to estimate the out-of-plane offset of an object together with the associated axial aberration in the reconstructed US image. The obtained estimate is then combined with a Kalman filtering approach that utilizes position estimates from previous time frames to improve localization robustness and reduce the impact of measurement noise. The accuracy of the proposed method is evaluated using simulations, and its practical applicability is demonstrated on experimental data obtained using a novel optical US imaging setup. Accurate and robust positional information is provided in real time. Axial and lateral coordinates of out-of-plane objects are estimated with a mean error of 0.1 mm for simulated data and 0.2 mm for experimental data. The 3-D localization is most accurate for elevational distances larger than 1 mm, with a maximum distance of 6 mm considered for a 25-mm aperture.
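The abstract describes fusing per-frame neural-network position estimates with a Kalman filter that carries information across time frames. The sketch below illustrates that second stage only, with a generic constant-velocity Kalman filter smoothing a sequence of noisy 1-D position measurements; the model, the function name `kalman_smooth`, and the noise parameters `q` and `r` are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def kalman_smooth(measurements, dt=1.0, q=1e-3, r=0.04):
    """Smooth noisy 1-D position estimates with a constant-velocity
    Kalman filter. q: process-noise intensity, r: measurement-noise
    variance (both hypothetical tuning parameters)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])      # state transition (pos, vel)
    H = np.array([[1.0, 0.0]])                 # only position is observed
    Q = q * np.array([[dt**3 / 3, dt**2 / 2],
                      [dt**2 / 2, dt]])        # process-noise covariance
    R = np.array([[r]])                        # measurement-noise covariance

    x = np.array([[measurements[0]], [0.0]])   # initial state estimate
    P = np.eye(2)                              # initial state covariance
    out = []
    for z in measurements:
        # predict: propagate state and covariance one frame forward
        x = F @ x
        P = F @ P @ F.T + Q
        # update: correct with the new measurement
        y = np.array([[z]]) - H @ x            # innovation
        S = H @ P @ H.T + R                    # innovation covariance
        K = P @ H.T @ np.linalg.inv(S)         # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out
```

In the paper's setting, each measurement would come from the network's per-frame offset estimate; the filter then trades off that noisy measurement against the motion prediction, which is what reduces the impact of measurement noise across frames.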