A benchmarking study of deep learning techniques applied for breath analysis

Dentamaro Vincenzo (Conceptualization); Impedovo Donato (Conceptualization); Pirlo Giuseppe
2023-01-01

Abstract

In Machine Learning, new architectures are continually proposed, making it difficult to evaluate which configurations best fit specific fields and tasks. The most reliable way to overcome this issue is to test them on the same data with the same parameters. In this work, five state-of-the-art deep neural network architectures have been benchmarked in a promising field of health technology: breath analysis. In particular, it is reported that standard convolutional neural networks, despite exploiting inductive bias, do not perform as well as the AUCO ResNet, an architecture designed for audio classification. In addition, the Vision Transformer model needs a large amount of data to learn patterns, showing the limitations of this technique even when transfer learning is performed. © 2023 CEUR-WS. All rights reserved.
Files in this product:

File: paper5[1].pdf
Access: open access
Type: Published (Editorial) Version
License: Creative Commons
Size: 684.72 kB
Format: Adobe PDF

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/520520
Citations
  • Scopus: 1