



Conformer: A Parallel Segmentation Network Combining Swin Transformer and Convolutional Neutral Network

  • Conference paper
  • First Online:
Fast, Low-resource, and Accurate Organ and Pan-cancer Segmentation in Abdomen CT (FLARE 2023)

Abstract

Abdominal organ segmentation helps doctors obtain a more intuitive view of abdominal organ structure and tissue lesions, thereby improving the accuracy of disease diagnosis. Accurate segmentation results can provide valuable information for clinical diagnosis and follow-up, such as organ size, location, boundary status, and the spatial relationships among multiple organs. Because manual labels are precious and difficult to obtain in medical segmentation, the use of pseudo-labels is an inevitable trend. In this paper, we demonstrate that pseudo-labels enrich the learning samples and enhance the model's ability to learn features of abdominal organs and tumors. We propose a semi-supervised parallel segmentation model that simultaneously aggregates local and global information using parallel CNN and Transformer modules at high scales. A two-stage strategy and a lightweight network make our model extremely efficient. Our method achieved average DSC scores of 89.75% and 3.78% for organs and tumors, respectively, on the testing set; the corresponding average NSD scores were 93.51% and 1.82%. The average running time is 14.85 s, and the average area under the GPU memory-time curve is 15963 MB.
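The DSC reported above is the standard Dice Similarity Coefficient between a predicted and a ground-truth binary mask. As a minimal sketch of how such a score is computed for a single structure (this is an illustration, not the challenge's official evaluation code), one might write:

```python
import numpy as np

def dice_score(pred: np.ndarray, gt: np.ndarray) -> float:
    """Dice Similarity Coefficient between two binary masks of the same shape."""
    pred = pred.astype(bool)
    gt = gt.astype(bool)
    denom = pred.sum() + gt.sum()
    if denom == 0:
        return 1.0  # both masks empty: conventionally scored as a perfect match
    return 2.0 * np.logical_and(pred, gt).sum() / denom

# Toy 1-D example: 3 overlapping voxels, 4 predicted and 4 true foreground voxels
pred = np.array([1, 1, 1, 1, 0, 0])
gt   = np.array([0, 1, 1, 1, 1, 0])
print(dice_score(pred, gt))  # 2*3 / (4+4) = 0.75
```

In practice the score is computed per organ (or tumor) label in the 3-D CT volume and then averaged across labels and cases; the NSD additionally compares segmentation boundaries within a fixed tolerance rather than voxel overlap.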


References

  1. Tang, Y., et al.: High-resolution 3D abdominal segmentation with random patch network fusion. Med. Image Anal. 69, 101894 (2021)

  2. Liu, W., Xu, W., Yan, S., Wang, L., Li, H., Yang, H.: Combining self-training and hybrid architecture for semi-supervised abdominal organ segmentation. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 281–292. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_25

  3. Liu, W., et al.: PHTrans: parallelly aggregating global and local representations for medical image segmentation. In: Wang, L., Dou, Q., Fletcher, P.T., Speidel, S., Li, S. (eds.) MICCAI 2022. LNCS, vol. 13435, pp. 235–244. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-16443-9_23

  4. Cao, H., et al.: Swin-Unet: Unet-like pure transformer for medical image segmentation. In: Karlinsky, L., Michaeli, T., Nishino, K. (eds.) ECCV 2022. LNCS, vol. 13803, pp. 205–218. Springer, Cham (2023). https://doi.org/10.1007/978-3-031-25066-8_9

  5. Liu, Z., et al.: Swin Transformer: hierarchical vision transformer using shifted windows. In: Proceedings of the IEEE/CVF International Conference on Computer Vision, pp. 10012–10022 (2021)

  6. Huang, Z., et al.: Revisiting nnU-Net for iterative pseudo labeling and efficient sliding window inference. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 178–189. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_16

  7. Wang, E., Zhao, Y., Wu, Y.: Cascade dual-decoders network for abdominal organs segmentation. In: Ma, J., Wang, B. (eds.) FLARE 2022. LNCS, vol. 13816, pp. 202–213. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-23911-3_18

  8. Ma, J., et al.: Fast and low-GPU-memory abdomen CT organ segmentation: the FLARE challenge. Med. Image Anal. 82, 102616 (2022)

  9. Ma, J., et al.: Unleashing the strengths of unlabeled data in pan-cancer abdominal organ quantification: the FLARE22 challenge. arXiv preprint arXiv:2308.05862 (2023)

  10. Clark, K., et al.: The Cancer Imaging Archive (TCIA): maintaining and operating a public information repository. J. Digit. Imaging 26(6), 1045–1057 (2013)

  11. Bilic, P., et al.: The liver tumor segmentation benchmark (LiTS). Med. Image Anal. 84, 102680 (2023)

  12. Simpson, A.L., et al.: A large annotated medical image dataset for the development and evaluation of segmentation algorithms. arXiv preprint arXiv:1902.09063 (2019)

  13. Heller, N., et al.: The state of the art in kidney and kidney tumor segmentation in contrast-enhanced CT imaging: results of the KiTS19 challenge. Med. Image Anal. 67, 101821 (2021)

  14. Heller, N., et al.: An international challenge to use artificial intelligence to define the state-of-the-art in kidney and kidney tumor segmentation in CT imaging. Am. Soc. Clin. Oncol. 38(6), 626 (2020)

  15. Gatidis, S., et al.: A whole-body FDG-PET/CT dataset with manually annotated tumor lesions. Sci. Data 9(1), 601 (2022)

  16. Gatidis, S., et al.: The autoPET challenge: towards fully automated lesion segmentation in oncologic PET/CT imaging. Preprint at Research Square (Nature Portfolio) (2023)

  17. Wasserthal, J., et al.: TotalSegmentator: robust segmentation of 104 anatomic structures in CT images. Radiol. Artif. Intell. 5(5), e230024 (2023)

  18. Ma, J., et al.: AbdomenCT-1K: is abdominal organ segmentation a solved problem? IEEE Trans. Pattern Anal. Mach. Intell. 44(10), 6695–6714 (2022)

  19. Yushkevich, P.A., Gao, Y., Gerig, G.: ITK-SNAP: an interactive tool for semi-automatic segmentation of multi-modality biomedical images. In: Annual International Conference of the IEEE Engineering in Medicine and Biology Society, pp. 3342–3345 (2016)

  20. Isensee, F., Jaeger, P.F., Kohl, S.A.A., Petersen, J., Maier-Hein, K.H.: nnU-Net: a self-configuring method for deep learning-based biomedical image segmentation. Nat. Methods 18(2), 203–211 (2021)

  21. Ma, J., He, Y., Li, F., Han, L., You, C., Wang, B.: Segment anything in medical images. Nat. Commun. 15, 654 (2024)

  22. Pavao, A., et al.: CodaLab Competitions: an open source platform to organize scientific challenges. J. Mach. Learn. Res. 24(198), 1–6 (2023)


Acknowledgements

The authors of this paper declare that the segmentation method they implemented for participation in the FLARE 2023 challenge used no pre-trained models and no datasets other than those provided by the organizers. The proposed solution is fully automatic, without any manual intervention. We thank all the data owners for making the CT scans publicly available and CodaLab [22] for hosting the challenge platform.

This work was supported by the National Natural Science Foundation of China (62271149) and the Fujian Provincial Natural Science Foundation (2021J02019, 2021J01578).

Author information


Corresponding author

Correspondence to Mingjing Yang.



Copyright information

© 2024 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Chen, Y., Wu, Z., Chen, H., Yang, M. (2024). Conformer: A Parallel Segmentation Network Combining Swin Transformer and Convolutional Neutral Network. In: Ma, J., Wang, B. (eds) Fast, Low-resource, and Accurate Organ and Pan-cancer Segmentation in Abdomen CT. FLARE 2023. Lecture Notes in Computer Science, vol 14544. Springer, Cham. https://doi.org/10.1007/978-3-031-58776-4_20

Download citation

  • DOI: https://doi.org/10.1007/978-3-031-58776-4_20

  • Published:

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-58775-7

  • Online ISBN: 978-3-031-58776-4

  • eBook Packages: Computer Science, Computer Science (R0)
