
Sechenov Medical Journal


Segmentation of renal structures based on contrast computed tomography scans using a convolutional neural network

https://doi.org/10.47093/2218-7332.2023.14.1.39-49

Abstract

   Aim. To develop a convolutional neural network for building 3D models of kidney neoplasms and adjacent structures.
   Materials and methods. DICOM (Digital Imaging and Communications in Medicine) data from 41 patients with kidney neoplasms were used; the data included all phases of contrast-enhanced multislice computed tomography. The data were split into 32 observations for the training set and 9 for the validation set. At the labeling stage, the arterial, venous, and excretory phases were taken, affine registration was performed to align the kidneys across phases, and noise was removed with a median filter and a non-local means filter. Masks of the arteries, veins, ureters, kidney parenchyma, and kidney neoplasms were then annotated. The model used the SegResNet architecture. To assess segmentation quality, its Dice score was compared with those of the AHNet and DynUNet models and of three variants of the nnU-Net model (lowres, fullres, cascade).
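The median-filter denoising step described above can be sketched as follows; the kernel size is illustrative and not taken from the paper, and the non-local means step the authors also apply is omitted for brevity.

```python
import numpy as np
from scipy import ndimage

def denoise_ct_volume(volume: np.ndarray, size: int = 3) -> np.ndarray:
    """Suppress impulse noise in a CT volume with a 3D median filter.

    A non-local means filter would normally follow as a second pass;
    the kernel size here is a hypothetical example value.
    """
    return ndimage.median_filter(volume, size=size)

# Toy volume with a single bright noise voxel
vol = np.zeros((5, 5, 5), dtype=np.float32)
vol[2, 2, 2] = 100.0
clean = denoise_ct_volume(vol)
print(clean[2, 2, 2])  # the isolated spike is suppressed -> 0.0
```

The median filter removes isolated high-intensity voxels while preserving organ boundaries better than a simple Gaussian blur, which is why it is a common first pass before non-local means denoising.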
   Results. On the validation subset, the Dice scores of the SegResNet architecture were 0.89 for normal kidney parenchyma, 0.58 for kidney neoplasms, 0.86 for arteries, 0.80 for veins, and 0.80 for ureters. The mean Dice scores for SegResNet, AHNet, and DynUNet were 0.79, 0.67, and 0.75, respectively. Compared with the nnU-Net model, SegResNet achieved a higher Dice score for the kidney parenchyma: 0.89 versus 0.69 (lowres), 0.70 (fullres), and 0.69 (cascade). For kidney neoplasms the Dice scores were comparable: 0.58 for SegResNet and 0.59 for nnU-Net fullres, while lowres and cascade scored lower, at 0.37 and 0.45, respectively.
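The Dice score used throughout the comparison above is the standard overlap metric between a predicted and a reference binary mask; a minimal NumPy implementation:

```python
import numpy as np

def dice_score(pred: np.ndarray, target: np.ndarray, eps: float = 1e-8) -> float:
    """Dice coefficient between two binary masks: 2|A∩B| / (|A| + |B|)."""
    pred = pred.astype(bool)
    target = target.astype(bool)
    intersection = np.logical_and(pred, target).sum()
    return 2.0 * intersection / (pred.sum() + target.sum() + eps)

# Toy example: two 4x4 masks of 8 voxels each, overlapping in 4 voxels
a = np.zeros((4, 4), dtype=bool); a[:2, :] = True
b = np.zeros((4, 4), dtype=bool); b[1:3, :] = True
print(round(dice_score(a, b), 2))  # -> 0.5
```

A score of 1.0 means perfect overlap and 0.0 means none, so the 0.89 reported for parenchyma indicates near-complete agreement with the annotated masks, while 0.58 for neoplasms reflects substantially weaker overlap.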
   Conclusion. The resulting SegResNet neural network segments vessels and kidney parenchyma well. Kidney neoplasms are harder to detect, possibly because of their small size and the network's false-positive predictions. We plan to increase the sample size to 300 observations and to apply post-processing operations to improve the model.

About the Authors

I. M. Chernenkiy
Sechenov First Moscow State Medical University (Sechenov University)
Russian Federation

Ivan M. Chernenkiy, software engineer

Institute of Urology and Reproductive Health

Center for Neural Network Technologies

8/2, Trubetskaya str., Moscow, 119991



M. M. Chernenkiy
Sechenov First Moscow State Medical University (Sechenov University)
Russian Federation

Michail M. Chernenkiy, engineer-physicist

Institute of Urology and Reproductive Health

Center for Neural Network Technologies

8/2, Trubetskaya str., Moscow, 119991



D. N. Fiev
Sechenov First Moscow State Medical University (Sechenov University)
Russian Federation

Dmitry N. Fiev, Dr. of Sci. (Medicine), urologist

Institute of Urology and Reproductive Health

8/2, Trubetskaya str., Moscow, 119991



E. S. Sirota
Sechenov First Moscow State Medical University (Sechenov University)
Russian Federation

Evgeny S. Sirota, Dr. of Sci. (Medicine), Senior Researcher

Institute of Urology and Reproductive Health

8/2, Trubetskaya str., Moscow, 119991






Creative Commons License
This work is licensed under a Creative Commons Attribution 4.0 License.


ISSN 2218-7332 (Print)
ISSN 2658-3348 (Online)