4.3 Article

Augmented Reality Surgical Navigation System Integrated with Deep Learning

Journal

BIOENGINEERING-BASEL
Volume 10, Issue 5

Publisher

MDPI
DOI: 10.3390/bioengineering10050617

Keywords

surgical navigation; mixed reality; augmented reality; neurosurgery; deep learning; automatic scanning; EVD surgery

Abstract

Most current surgical navigation methods rely on optical navigators, with images displayed on an external screen. However, minimizing distractions during surgery is critical, and the spatial information displayed in this arrangement is non-intuitive. Previous studies have proposed combining optical navigation systems with augmented reality (AR) to provide surgeons with intuitive imaging during surgery through planar and three-dimensional imagery. However, these studies have focused mainly on visual aids and have paid relatively little attention to real surgical guidance aids. Moreover, the use of augmented reality reduces system stability and accuracy, and optical navigation systems are costly. Therefore, this paper proposes an augmented reality surgical navigation system based on image positioning that offers low cost, high stability, and high accuracy, and that provides intuitive guidance for the surgical target point, entry point, and trajectory. Once the surgeon uses the navigation stick to indicate the position of the surgical entry point, the connection between the surgical target and the surgical entry point is immediately displayed on the AR device (tablet or HoloLens glasses), and a dynamic auxiliary line is shown to assist with the incision angle and depth. Clinical trials were conducted for extra-ventricular drainage (EVD) surgery, and surgeons confirmed the system's overall benefit. A virtual object automatic scanning method is proposed to achieve a high accuracy of 1 ± 0.1 mm for the AR-based system. Furthermore, a deep learning-based U-Net segmentation network is incorporated to enable automatic identification of the hydrocephalus location by the system. The system achieves recognition accuracy, sensitivity, and specificity of 99.93%, 93.85%, and 95.73%, respectively, a significant improvement over previous studies.
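The abstract states that a deep learning-based U-Net segmentation network identifies the hydrocephalus location automatically. As a rough illustration of what such a network looks like, the sketch below builds a small U-Net-style encoder-decoder in PyTorch; the channel widths, single-channel 256x256 input slices, and binary output mask are assumptions made for illustration, not the authors' exact configuration.

```python
# Minimal sketch of a U-Net-style segmentation network, assuming single-channel
# 256x256 CT/MR slices and a binary (hydrocephalus vs. background) mask.
# Illustrative only; not the authors' exact architecture or hyperparameters.
import torch
import torch.nn as nn

def double_conv(in_ch, out_ch):
    # Two 3x3 conv + BN + ReLU layers: the basic U-Net building block.
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.BatchNorm2d(out_ch),
        nn.ReLU(inplace=True),
    )

class UNet(nn.Module):
    def __init__(self, in_ch=1, out_ch=1, base=32):
        super().__init__()
        # Encoder (contracting path)
        self.enc1 = double_conv(in_ch, base)
        self.enc2 = double_conv(base, base * 2)
        self.enc3 = double_conv(base * 2, base * 4)
        self.pool = nn.MaxPool2d(2)
        # Bottleneck
        self.bottleneck = double_conv(base * 4, base * 8)
        # Decoder (expanding path) with skip connections
        self.up3 = nn.ConvTranspose2d(base * 8, base * 4, kernel_size=2, stride=2)
        self.dec3 = double_conv(base * 8, base * 4)
        self.up2 = nn.ConvTranspose2d(base * 4, base * 2, kernel_size=2, stride=2)
        self.dec2 = double_conv(base * 4, base * 2)
        self.up1 = nn.ConvTranspose2d(base * 2, base, kernel_size=2, stride=2)
        self.dec1 = double_conv(base * 2, base)
        # 1x1 conv producing the per-pixel segmentation logit
        self.head = nn.Conv2d(base, out_ch, kernel_size=1)

    def forward(self, x):
        e1 = self.enc1(x)
        e2 = self.enc2(self.pool(e1))
        e3 = self.enc3(self.pool(e2))
        b = self.bottleneck(self.pool(e3))
        d3 = self.dec3(torch.cat([self.up3(b), e3], dim=1))
        d2 = self.dec2(torch.cat([self.up2(d3), e2], dim=1))
        d1 = self.dec1(torch.cat([self.up1(d2), e1], dim=1))
        return self.head(d1)  # raw logits; apply sigmoid for a probability mask

if __name__ == "__main__":
    model = UNet()
    dummy_slice = torch.randn(1, 1, 256, 256)  # placeholder input slice
    print(model(dummy_slice).shape)            # torch.Size([1, 1, 256, 256])
```

The skip connections concatenate encoder features with upsampled decoder features, which is what lets a U-Net recover the fine spatial detail needed to localize a structure such as an enlarged ventricle before the result is registered into the AR guidance view.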
