
Explainable Artificial Intelligence for Human-Centric Data Analysis in Virtual Learning Environments

Gabriella Casalino
2019

Abstract

The amount of data to analyze in virtual learning environments (VLEs) grows exponentially every day. Students' daily interaction with VLE platforms leaves a digital footprint of their engagement with the learning materials and activities. This large and valuable source of information must be managed and processed to be useful. Educational Data Mining and Learning Analytics are two research branches that have recently emerged to analyze educational data. Artificial Intelligence techniques are commonly used to extract hidden knowledge from data and to build models that can, for example, predict students' outcomes. However, in the educational field, where the interaction between humans and AI systems is a central concern, there is a need to develop Explainable AI (XAI) systems able to communicate data analysis results in a human-understandable way. In this paper, we use an XAI tool called ExpliClas to facilitate data analysis in the decision-making processes carried out by all the stakeholders involved in the educational process. The Open University Learning Analytics Dataset (OULAD) has been used to predict students' outcomes, and both graphical and textual explanations of the predictions show the need for, and the effectiveness of, XAI in the educational field.
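The core idea the abstract describes — predicting a student's outcome while also producing a human-readable justification — can be illustrated with a minimal sketch. This is not the paper's pipeline and does not use ExpliClas or the real OULAD schema: the feature names (`vle_clicks`, `avg_assessment_score`) and thresholds are hypothetical, and a hand-written rule set stands in for a trained interpretable classifier.

```python
# Minimal, hypothetical sketch of XAI-style outcome prediction:
# a transparent rule-based classifier over invented VLE features
# that returns a textual explanation alongside each prediction.
# Feature names and thresholds are illustrative, not from OULAD.

def predict_with_explanation(student):
    """Classify a student's outcome and explain the decision in words."""
    rules_fired = []
    if student["vle_clicks"] < 100:
        rules_fired.append("low VLE engagement (clicks < 100)")
    if student["avg_assessment_score"] < 40:
        rules_fired.append("low average assessment score (< 40)")

    # Any fired risk rule predicts "Fail"; the fired rules ARE the explanation.
    outcome = "Fail" if rules_fired else "Pass"
    if rules_fired:
        explanation = "Predicted Fail because of: " + "; ".join(rules_fired) + "."
    else:
        explanation = "Predicted Pass: engagement and scores are above the thresholds."
    return outcome, explanation

outcome, why = predict_with_explanation(
    {"vle_clicks": 42, "avg_assessment_score": 55}
)
print(outcome)  # Fail
print(why)      # Predicted Fail because of: low VLE engagement (clicks < 100).
```

The point of the sketch is the return type: the model never emits a bare label, so a teacher or student always sees *why* an at-risk prediction was made — the property the paper argues for with ExpliClas's graphical and textual explanations.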
ISBN: 978-3-030-31283-1

Documents in IRIS are protected by copyright and all rights are reserved, unless otherwise indicated.

Use this identifier to cite or link to this document: https://hdl.handle.net/11586/242708

Citations
  • Scopus: 28