Abstract
Deep needle insertion to a target often poses a significant challenge, requiring a combination of specialized skills, assistive technology, and extensive training. One frequently encountered medical scenario demanding such expertise is needle insertion into a femoral vessel in the groin. Conventionally, deep needle insertion is guided by ultrasound (US) imaging. However, utilizing US for needle tracking demands specialized training and skill to manipulate the probe effectively and interpret the imaging accurately. To address this challenge, this article presents an innovative technology for real-time needle tip tracking. This advancement will be instrumental in facilitating robotic-guided needle insertion toward an identified target. Specifically, our approach generates scattering images using an optical fiber-equipped needle and applies convolutional neural network (CNN)-based algorithms to estimate the needle tip's position and orientation in real time during insertion procedures. The efficacy of the proposed technology was rigorously evaluated through three experiments. The first two experiments involved rubber and bacon phantoms simulating groin anatomy, with positional errors averaging 2.3±1.5 and 2.0±1.2 mm and orientation errors averaging 0.2±0.11 and 0.16±0.1 rad, respectively. Furthermore, the system's capabilities were validated through experiments on a fresh porcine phantom mimicking more complex anatomical structures, yielding a positional accuracy of 3.2±3.1 mm and an orientational accuracy of 0.19±0.1 rad. Given an average femoral arterial radius of 4-5 mm, the proposed system demonstrates great potential for precise needle guidance in femoral artery insertion procedures. In addition, the findings highlight the system's broader potential applications in the medical field.
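As a rough illustration of the pipeline described above, the sketch below shows a generic CNN regressor that maps a single scattering image to the needle tip's position and orientation. The architecture, input resolution, and the five-value output parameterization (x, y, z in mm plus two orientation angles in rad) are assumptions chosen only for illustration; they do not reflect the authors' published implementation.

```python
# Minimal sketch (assumed architecture, not the authors' model): a CNN that
# regresses the needle tip pose from a 1 x 128 x 128 scattering image.
import torch
import torch.nn as nn


class NeedleTipPoseCNN(nn.Module):
    def __init__(self):
        super().__init__()
        # Three conv/pool stages reduce the 128x128 image to 64 feature maps of 16x16.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Fully connected head regresses [x, y, z, pitch, yaw] (assumed parameterization).
        self.regressor = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 128), nn.ReLU(),
            nn.Linear(128, 5),
        )

    def forward(self, scatter_img: torch.Tensor) -> torch.Tensor:
        return self.regressor(self.features(scatter_img))


# Example forward pass on a dummy scattering image batch.
model = NeedleTipPoseCNN()
dummy = torch.randn(1, 1, 128, 128)
pose = model(dummy)  # shape (1, 5): position (mm) and orientation (rad)
```

In practice, such a regressor would be trained on scattering images paired with ground-truth tip poses (e.g., from an external tracking system), with a mean-squared-error loss over the pose vector; these training details are likewise assumptions for the sketch.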
Original language | English |
---|---|
Journal | IEEE Sensors Journal |
Volume | 24 |
Issue number | 17 |
Pages (from-to) | 28145-28153 |
ISSN | 1530-437X |
DOI | |
Status | Published - 2024 |