
Social Robots to Support Gestural Development in Children with Autism Spectrum Disorder

De Carolis B.; D'Errico F. (Methodology); Palestra G.
2021-01-01

Abstract

Children with Autism Spectrum Disorders (ASD) are characterized by impairments in communication and social skills, including difficulties in understanding and producing gestures. Following the approach of robot-based imitation games, in this paper we propose the prototype of an imitation game that aims at improving the non-verbal communication skills, gestures in particular, of children with ASD. Building on an application we previously developed in another domain, the social inclusion of migrant children, we use a social robot to teach children to recognize and produce social gestures through an imitation game. To enable the robot to recognize gestures, we trained an LSTM-based model that uses MediaPipe to extract hand positions and landmarks. The model was trained to recognize the patterns of six selected gestures, and the resulting module is used by the robot during the game. Results in terms of recognition accuracy are encouraging and show that the proposed approach is suitable for showing and recognizing predefined gestures; however, we are aware that it might not perform as well in the wild with children with ASD. For this reason, in the near future we will carry out a study to assess the efficacy of the approach with children with ASD and revise the model and the game accordingly.
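The abstract describes the gesture-recognition pipeline only at a high level. The following Python sketch, which is not the authors' code, illustrates how such a pipeline could be assembled: MediaPipe Hands extracts 21 hand landmarks per frame, the per-frame coordinates are stacked into fixed-length sequences, and a small Keras LSTM classifies them into the six gesture classes. The sequence length (30 frames), the network sizes, and the zero-padding strategy are illustrative assumptions.

```python
import cv2
import mediapipe as mp
import numpy as np
import tensorflow as tf

mp_hands = mp.solutions.hands

SEQ_LEN = 30        # assumed number of frames per gesture clip
NUM_COORDS = 63     # 21 MediaPipe hand landmarks x (x, y, z)
NUM_GESTURES = 6    # the paper trains on six selected gestures

def extract_landmark_sequence(video_path, max_frames=SEQ_LEN):
    """Extract a fixed-length sequence of hand-landmark vectors from a video clip."""
    cap = cv2.VideoCapture(video_path)
    frames = []
    with mp_hands.Hands(static_image_mode=False, max_num_hands=1) as hands:
        while len(frames) < max_frames:
            ok, frame = cap.read()
            if not ok:
                break
            result = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
            if result.multi_hand_landmarks:
                lm = result.multi_hand_landmarks[0].landmark
                frames.append([c for p in lm for c in (p.x, p.y, p.z)])
            else:
                frames.append([0.0] * NUM_COORDS)  # no hand detected in this frame
    cap.release()
    # Zero-pad short clips so every sequence has the same length.
    while len(frames) < max_frames:
        frames.append([0.0] * NUM_COORDS)
    return np.array(frames, dtype=np.float32)

# A small LSTM classifier over landmark sequences (sizes are illustrative).
model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(SEQ_LEN, NUM_COORDS)),
    tf.keras.layers.LSTM(64, return_sequences=True),
    tf.keras.layers.LSTM(32),
    tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In the game, a model of this kind would be queried by the robot on the sequence captured from its camera to decide whether the child's imitation matches the gesture just demonstrated.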
Use this identifier to cite or link to this item: https://hdl.handle.net/11586/535560