
Assessing student engagement from facial behavior in on-line learning

Buono, Paolo
;
De Carolis, Berardina;D'Errico, Francesca;Palestra, Giuseppe
2023-01-01

Abstract

The automatic monitoring and assessment of learners' engagement in distance education may help in understanding problems and providing personalized support during the learning process. This article investigates how student engagement can be assessed from facial behavior and proposes a model based on Long Short-Term Memory (LSTM) networks that predicts the level of engagement from facial action units, gaze, and head pose. The model was trained on one of the EmotiW 2019 challenge datasets. To test its performance in learning contexts, an experiment involving students attending an online lecture was performed, with the aim of comparing the engagement perceived by the students themselves with the engagement assessed by the model. During the experiment we collected videos of the students' behavior and, at the end of each session, asked the students to fill in a questionnaire assessing their perceived engagement. The collected videos were then analyzed automatically with software that implements the model and provides an interface for the visual analysis of its output. Results show that, globally, the engagement predicted from the students' facial behavior was only weakly correlated with their subjective answers. However, when considering only the emotional dimension of engagement, the correlation is stronger: facial action units and head pose (facial movements) are positively correlated with it, while gaze is inversely correlated, meaning that the more engaged a student feels, the fewer gaze movements they make.
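The abstract describes an LSTM that maps a sequence of per-frame facial features (action units, gaze, head pose) to an engagement score. The following is a minimal NumPy sketch of that idea, not the authors' implementation: the feature dimension, hidden size, sequence length, and random weights are placeholders, and `predict_engagement` is a hypothetical name.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h, c, W, U, b):
    # One LSTM step: gates computed from the current frame features x
    # and the previous hidden state h; c is the cell state.
    z = W @ x + U @ h + b            # stacked pre-activations, shape (4*H,)
    H = h.shape[0]
    i = sigmoid(z[0:H])              # input gate
    f = sigmoid(z[H:2 * H])          # forget gate
    o = sigmoid(z[2 * H:3 * H])      # output gate
    g = np.tanh(z[3 * H:4 * H])      # candidate cell update
    c = f * c + i * g
    h = o * np.tanh(c)
    return h, c

def predict_engagement(frames, W, U, b, w_out, b_out):
    # frames: (T, n_features) sequence of per-frame AU/gaze/head-pose features.
    # Returns a scalar engagement level in (0, 1) from the last hidden state.
    H = U.shape[1]
    h, c = np.zeros(H), np.zeros(H)
    for x in frames:
        h, c = lstm_step(x, h, c, W, U, b)
    return sigmoid(w_out @ h + b_out)

# Toy usage with random weights (placeholder sizes: 35 features, 16 hidden units).
rng = np.random.default_rng(0)
n_feat, H, T = 35, 16, 30
W = 0.1 * rng.normal(size=(4 * H, n_feat))
U = 0.1 * rng.normal(size=(4 * H, H))
b = np.zeros(4 * H)
w_out, b_out = rng.normal(size=H), 0.0
frames = rng.normal(size=(T, n_feat))   # stand-in for one video clip's features
score = predict_engagement(frames, W, U, b, w_out, b_out)
```

In a trained version of such a model, the weights would be fitted on labeled clips (e.g. the EmotiW engagement labels) rather than drawn at random.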
Files in this item:
No files are associated with this item.

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/419294
Warning: the displayed data have not been validated by the university.

Citations
  • PMC 0
  • Scopus 2
  • Web of Science 4