
Efficient and visualizable convolutional neural networks for COVID-19 classification using Chest CT

Rocca, Marianna La;
2022-01-01

Abstract

With coronavirus disease 2019 (COVID-19) cases rising rapidly, deep learning has emerged as a promising diagnostic technique. However, identifying the most accurate models to characterize COVID-19 patients is challenging because comparing results obtained with different types of data and acquisition processes is non-trivial. In this paper, we designed, evaluated, and compared the performance of 20 convolutional neural networks in classifying patients as COVID-19 positive, healthy, or suffering from other pulmonary infections based on chest computed tomography (CT) scans; this is the first study to consider the EfficientNet family for COVID-19 diagnosis and to employ intermediate activation maps for visualizing model performance. All models were trained and evaluated in Python using 4173 chest CT images from the dataset entitled "A COVID multiclass dataset of CT scans," with 2168, 758, and 1247 images of patients who are COVID-19 positive, healthy, or suffering from other pulmonary infections, respectively. EfficientNet-B5 was identified as the best model, with an F1 score of 0.9769 ± 0.0046, accuracy of 0.9759 ± 0.0048, sensitivity of 0.9788 ± 0.0055, specificity of 0.9730 ± 0.0057, and precision of 0.9751 ± 0.0051. On an alternate 2-class dataset, EfficientNet-B5 obtained an accuracy of 0.9845 ± 0.0109, F1 score of 0.9599 ± 0.0251, sensitivity of 0.9682 ± 0.0099, specificity of 0.9883 ± 0.0150, and precision of 0.9526 ± 0.0523. Intermediate activation maps and Gradient-weighted Class Activation Mappings offered human-interpretable evidence of the model's perception of ground-glass opacities and consolidations, hinting towards a promising use case for artificial-intelligence-assisted radiology tools. With a prediction speed of under 0.1 s on GPUs and 0.5 s on CPUs, our proposed model offers a rapid, scalable, and accurate diagnostic tool for COVID-19.
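The abstract states only that the models were trained and evaluated in Python. As an illustration of the kind of pipeline described, below is a minimal, hypothetical sketch of an EfficientNet-B5 three-class CT classifier together with a Grad-CAM heatmap computation, written with the tf.keras API. The framework choice, the 456 × 456 input resolution, the dropout rate, the optimizer, and the "top_conv" layer name are assumptions for the sketch, not details reported in the paper.

    # Hypothetical sketch (not the authors' released code): EfficientNet-B5
    # transfer learning for 3-class chest-CT classification, plus Grad-CAM.
    import numpy as np
    import tensorflow as tf
    from tensorflow.keras import layers, Model

    NUM_CLASSES = 3  # COVID-19 positive, healthy, other pulmonary infection

    def build_classifier(input_shape=(456, 456, 3)):
        """EfficientNet-B5 backbone pretrained on ImageNet with a new softmax head."""
        backbone = tf.keras.applications.EfficientNetB5(
            include_top=False, weights="imagenet", input_shape=input_shape)
        x = layers.GlobalAveragePooling2D()(backbone.output)
        x = layers.Dropout(0.3)(x)  # assumed regularization value
        outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
        model = Model(backbone.input, outputs)
        model.compile(optimizer="adam",
                      loss="sparse_categorical_crossentropy",
                      metrics=["accuracy"])
        return model

    def grad_cam(model, image, last_conv_layer_name="top_conv"):
        """Gradient-weighted Class Activation Map for one preprocessed CT image."""
        grad_model = Model(model.input,
                           [model.get_layer(last_conv_layer_name).output, model.output])
        with tf.GradientTape() as tape:
            conv_out, preds = grad_model(image[np.newaxis, ...])
            class_idx = int(tf.argmax(preds[0]))       # predicted class
            class_score = preds[:, class_idx]
        grads = tape.gradient(class_score, conv_out)    # d(score)/d(feature map)
        weights = tf.reduce_mean(grads, axis=(0, 1, 2)) # per-channel importance
        cam = tf.nn.relu(tf.reduce_sum(conv_out[0] * weights, axis=-1))
        return (cam / (tf.reduce_max(cam) + 1e-8)).numpy()  # normalized heatmap

The returned heatmap can be upsampled to the CT slice resolution and overlaid on the image to highlight the regions, such as ground-glass opacities and consolidations, that drive the prediction.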


Use this identifier to cite or link to this document: https://hdl.handle.net/11586/417714

Citations
  • PMC: 14
  • Scopus: 36
  • Web of Science: 27