CASPER: computer-aided segmentation of imperceptible motion—a learning-based tracking of an invisible needle in ultrasound

  • Original Article
  • Published: 2017
International Journal of Computer Assisted Radiology and Surgery

Abstract

Purpose

This paper presents a new micro-motion-based approach to track a needle in ultrasound images captured by a handheld transducer.

Methods

We propose a novel learning-based framework to track a handheld needle by detecting micro-scale variations in motion dynamics over time. The current state of the art in motion-based needle detection relies on absolute motion and hence works well only when the transducer is static. We introduce and evaluate novel spatiotemporal and spectral features, obtained from the phase image, within a self-supervised tracking framework that uses incremental training to improve detection accuracy in subsequent frames. The proposed tracking method involves volumetric feature selection and differential flow analysis to incorporate neighboring pixels and to mitigate the effect of the subtle tremor motion of a handheld transducer. To evaluate detection accuracy, the method is tested in vivo on porcine tissue during needle insertion into the biceps femoris muscle.
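
As a rough illustration of the kind of pipeline described above (a minimal sketch, not the authors' implementation), the Python code below extracts per-pixel temporal spectral features from a phase-image stack and scores pixels with an incrementally updated linear classifier. The array shapes, the Hilbert-transform approximation of the phase image, the number of spectral bins, and the use of scikit-learn's SGDClassifier in place of the paper's incrementally trained learner are all assumptions made for illustration.

    # Minimal sketch (not the authors' implementation).
    # Assumptions: `frames` is a (T, H, W) array of consecutive ultrasound
    # frames, the phase image is approximated by the analytic signal along
    # the axial (row) direction, and an SGD-trained linear classifier stands
    # in for the paper's incrementally trained learner.
    import numpy as np
    from scipy.signal import hilbert
    from sklearn.linear_model import SGDClassifier

    def phase_stack(frames):
        """Instantaneous phase of each frame via the axial analytic signal."""
        return np.angle(hilbert(frames, axis=1))            # (T, H, W)

    def temporal_spectral_features(phase, n_bins=8):
        """Per-pixel magnitude spectrum of the phase variation over time."""
        unwrapped = np.unwrap(phase, axis=0)                 # unwrap along time
        centred = unwrapped - unwrapped.mean(axis=0)         # remove static phase
        spec = np.abs(np.fft.rfft(centred, axis=0))          # (T//2+1, H, W); needs T >= 2*n_bins
        _, H, W = phase.shape
        return spec[1:n_bins + 1].reshape(n_bins, H * W).T   # (H*W, n_bins), DC bin skipped

    clf = SGDClassifier(loss="hinge")                        # linear model with partial_fit

    def update_and_predict(frames, labels=None):
        """One step: extract features, optionally update, return a score map.

        The first call must supply `labels` (e.g. a coarse needle mask from the
        previous frame) so the classifier is fitted before scoring.
        """
        X = temporal_spectral_features(phase_stack(frames))
        if labels is not None:
            clf.partial_fit(X, labels.ravel(), classes=[0, 1])
        H, W = frames.shape[1:]
        return clf.decision_function(X).reshape(H, W)        # needle-likelihood map

In the self-supervised setting described here, `labels` would come from the detection result in the preceding frames rather than from manual annotation, so the classifier is refined incrementally as the insertion proceeds.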

Results

Experimental results show mean, standard deviation, and root-mean-square errors of \(1.28^{\circ }\), \(1.09^{\circ }\), and \(1.68^{\circ }\) in the insertion angle, and 0.82, 1.21, and 1.47 mm in the needle tip position, respectively.
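
As a consistency check on these figures (assuming the standard deviation is computed over the same signed error samples with population normalization), the reported RMSE follows from the mean \(\mu \) and standard deviation \(\sigma \) via \(\mathrm{RMSE}=\sqrt{\mu ^{2}+\sigma ^{2}}\): \(\sqrt{1.28^{2}+1.09^{2}}\approx 1.68^{\circ }\) for the insertion angle and \(\sqrt{0.82^{2}+1.21^{2}}\approx 1.46\) mm for the tip position, matching the reported values to within rounding.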

Conclusions

Compared to appearance-based detection approaches, the proposed method is especially suitable for needles whose ultrasonic characteristics are imperceptible in a static image and to the naked eye.



Acknowledgements

This work is jointly funded by the Natural Sciences and Engineering Research Council of Canada (NSERC) and the Canadian Institutes of Health Research (CIHR). Thanks to Philips Ultrasound for supplying the ultrasound machine and research interface.

Author information

Corresponding author

Correspondence to Parmida Beigi.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

All applicable international, national, and/or institutional guidelines for the care and use of animals were followed.


About this article


Cite this article

Beigi, P., Rohling, R., Salcudean, S.E. et al. CASPER: computer-aided segmentation of imperceptible motion—a learning-based tracking of an invisible needle in ultrasound. Int J CARS 12, 1857–1866 (2017). https://doi.org/10.1007/s11548-017-1631-4
