
Augmented Reality-Based Lung Ultrasound Scanning Guidance

  • Conference paper
  • In: Medical Ultrasound, and Preterm, Perinatal and Paediatric Image Analysis (ASMUS 2020, PIPPI 2020)

Abstract

Lung ultrasound (LUS) is an established non-invasive imaging method for diagnosing respiratory illnesses. With the rise of SARS-CoV-2 (COVID-19) as a global pandemic, LUS has been used to detect pneumopathy for triaging and monitoring patients with diagnosed or suspected COVID-19 infection. While LUS is cost-effective, radiation-free, and more portable than chest X-ray and CT, its accessibility is limited by its user dependency and by the small number of physicians and sonographers who can perform appropriate scanning and diagnosis. In this paper, we propose an augmented reality framework for guiding LUS scanning, in which the procedure is guided by projecting the scanning trajectory onto the patient's body. To develop such a system, we implement a computer vision-based detection algorithm to classify different regions on the human body. The DensePose algorithm is used to obtain body mesh data for the upper body captured with a mono-camera. The torso sub-mesh is used to extract and overlay the eight regions corresponding to the anterior and lateral chest for LUS guidance. To minimize the instability of the DensePose mesh coordinates across different frontal angles of the camera, a machine learning regression algorithm is applied to predict an angle-specific projection model for the chest. ArUco markers are used to acquire the ground-truth chest regions for training, and a single additional ArUco marker is used to detect the center-line of the body. The augmented scanning regions are highlighted one by one to guide the probe along the scanning path during the LUS procedure. We demonstrate the feasibility of guiding the LUS scanning procedure through the combination of augmented reality, computer vision, and machine learning.
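To make the pipeline above concrete, the following is a minimal Python sketch of the marker-based training and guidance steps, assuming OpenCV (4.7 or later) for ArUco detection and scikit-learn's support vector regression for the angle-specific projection model. The function names, marker dictionary, and data layout are illustrative assumptions, not the authors' published implementation.

    import cv2
    from sklearn.multioutput import MultiOutputRegressor
    from sklearn.svm import SVR

    NUM_REGIONS = 8  # anterior and lateral chest regions, as in the abstract

    def fit_angle_model(angles, region_centers):
        """Regress the eight region centers from the camera's frontal angle.

        angles: (n_samples, 1) frontal angles of the camera.
        region_centers: (n_samples, NUM_REGIONS * 2) flattened (x, y) centers,
        collected during training with ArUco markers placed on each region.
        """
        model = MultiOutputRegressor(SVR(kernel="rbf"))
        model.fit(angles, region_centers)
        return model

    # Single center-line marker, detected per frame (dictionary is an assumption).
    aruco_dict = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_50)
    detector = cv2.aruco.ArucoDetector(aruco_dict, cv2.aruco.DetectorParameters())

    def centerline_x(frame):
        """Return the mean x-coordinate of the center-line marker's corners."""
        corners, ids, _ = detector.detectMarkers(frame)
        if ids is None:
            return None
        return float(corners[0][0][:, 0].mean())

    def draw_guidance(frame, centers, active_idx):
        """Highlight the region to scan next; dim the other seven."""
        for i, (x, y) in enumerate(centers):
            color = (0, 255, 0) if i == active_idx else (120, 120, 120)
            cv2.circle(frame, (int(round(x)), int(round(y))), 30, color, 3)
        return frame

In the full system, the candidate region centers would come from the DensePose torso sub-mesh, corrected by the regression model for the estimated frontal angle; the highlighted region advances once each of the eight scans is completed.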



Acknowledgment

Financial support was provided through the Worcester Polytechnic Institute's internal fund and in part by the National Institutes of Health (DP5 OD028162).

Author information

Corresponding author: Keshav Bimbraw.



Copyright information

© 2020 Springer Nature Switzerland AG

About this paper


Cite this paper

Bimbraw, K., Ma, X., Zhang, Z., Zhang, H. (2020). Augmented Reality-Based Lung Ultrasound Scanning Guidance. In: Hu, Y., et al. (eds.) Medical Ultrasound, and Preterm, Perinatal and Paediatric Image Analysis. ASMUS PIPPI 2020. Lecture Notes in Computer Science, vol. 12437. Springer, Cham. https://doi.org/10.1007/978-3-030-60334-2_11


  • DOI: https://doi.org/10.1007/978-3-030-60334-2_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-60333-5

  • Online ISBN: 978-3-030-60334-2

  • eBook Packages: Computer Science, Computer Science (R0)
