Using Voice and Biofeedback to Predict User Engagement during Product Feedback Interviews

Alessio Ferrari; Nicole Novielli; Daniela Girardi
In press

Abstract

Capturing users’ engagement is crucial for gathering feedback about the features of a software product. In a market-driven context, current approaches to collecting and analyzing users’ feedback are based on techniques that leverage information extracted from product reviews and social media. These approaches are hardly applicable in contexts where online feedback is limited, as is the case for the majority of apps and for software in general. In such cases, companies need to resort to face-to-face interviews to get feedback on their products. In this paper, we propose to use biometric data, in terms of physiological and voice features, to complement product feedback interviews with information about the user’s engagement with product-relevant topics. We evaluate our approach by interviewing users while gathering their physiological data (i.e., biofeedback) with an Empatica E4 wristband and capturing their voice with the default audio recorder of a common laptop. Our results show that we can predict users’ engagement by training supervised machine learning algorithms on biofeedback and voice data, and that voice features alone can be sufficiently effective. The best configurations evaluated achieve an average F1 of about 70% in terms of classification performance, and they use voice features only. This work is one of the first studies in requirements engineering in which biometrics are used to identify emotions, and one of the first studies in software engineering that considers voice analysis. The use of voice features can be particularly helpful for emotion-aware feedback collection in remote communication, whether performed by human analysts or by voice-based chatbots, and can also be exploited to support the analysis of meetings in software engineering research.
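
As a concrete illustration of the approach summarized above, the sketch below extracts simple prosodic and spectral voice features from an interview recording and trains a supervised classifier to predict engagement labels. It is a minimal sketch under assumed tooling: the libraries (librosa, scikit-learn), the feature set, the frame length, and the random-forest model are illustrative choices and are not stated in the abstract.

```python
# Illustrative sketch (assumed tooling): per-frame voice features plus a
# supervised classifier for engagement. The paper's actual features and
# models may differ.
import numpy as np
import librosa
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

def voice_features(wav_path, frame_s=3.0, sr=16000):
    """Split a recording into fixed-length frames and compute per-frame
    pitch, energy, and MFCC statistics (common prosodic/spectral features)."""
    y, _ = librosa.load(wav_path, sr=sr)
    hop = int(frame_s * sr)
    feats = []
    for start in range(0, len(y) - hop + 1, hop):
        frame = y[start:start + hop]
        f0 = librosa.yin(frame, fmin=65, fmax=400, sr=sr)       # fundamental frequency track
        mfcc = librosa.feature.mfcc(y=frame, sr=sr, n_mfcc=13)   # spectral envelope
        rms = librosa.feature.rms(y=frame)                       # energy
        feats.append(np.hstack([f0.mean(), f0.std(),
                                rms.mean(), rms.std(),
                                mfcc.mean(axis=1), mfcc.std(axis=1)]))
    return np.vstack(feats)

# X: feature matrix built from the interview recordings; y: engagement labels
# (e.g., from participants' self-assessment). Both are placeholders here.
# X, y = ..., ...
# clf = RandomForestClassifier(n_estimators=200, random_state=0)
# print(cross_val_score(clf, X, y, scoring="f1_macro", cv=5).mean())
```

Cross-validated macro-F1, as in the commented lines, is one plausible way to obtain an aggregate score comparable to the roughly 70% average F1 reported in the abstract; the paper's actual evaluation protocol may differ.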

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/454000