Facial expression recognition based on emotional artificial intelligence for tele-rehabilitation

Ciraolo, Davide; Fazio, Maria; Villari, Massimo; Celesti, Antonio
2024-01-01

Abstract

Tele-rehabilitation aims to improve clinical outcomes while reducing costs and improving patients' quality of life (QoL). However, two main challenges must be addressed to ensure its effectiveness: remote motor rehabilitation and remote cognitive rehabilitation. In this research work, we focus on the latter. Our idea is to integrate the concept of Emotional AI into tele-rehabilitation by monitoring patients' facial expressions during motor rehabilitation exercises. In this way, we can assess the patient's cognitive and emotional state, with the objective of determining the relationship between motor and cognitive rehabilitation outcomes. This study therefore considers a Cloud/Edge continuum tele-rehabilitation scenario in which a Hospital Cloud interacts with remote rehabilitation and monitoring Edge devices placed in patients' homes and/or rehabilitation centres. Specifically, we assess the performance of a Facial Expression Recognition (FER) system that can be deployed at the Edge. To achieve our goal, we employ the MediaPipe suite of libraries, which is optimized to run on low-resource devices. In particular, we use its Face Mesh module, which generates a face mesh (i.e., a set of 3D face points) from an input image. The features of the mesh are then used to train a classifier that identifies the facial expressions defined in Ekman's model (i.e., anger, fear, happiness, sadness, surprise, and neutral). In our experiments, we tested several combinations of datasets, face meshes (FMs), face feature maps (FFMs), and classifiers to identify the best-performing solution and demonstrate the applicability of this approach in a tele-rehabilitation environment.
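As a minimal sketch of the pipeline step the abstract describes (turning a face mesh into classifier features), the helper below converts a list of 3D landmark points, such as the 468 landmarks MediaPipe Face Mesh produces, into a translation- and scale-invariant feature vector suitable for a standard classifier. The function name and the specific normalization (centroid subtraction plus mean-radius scaling) are illustrative assumptions, not the face feature maps defined in the paper.

```python
import math


def mesh_to_features(landmarks):
    """Flatten a face mesh (list of (x, y, z) points, e.g. MediaPipe
    Face Mesh's 468 landmarks) into a normalized feature vector.

    Normalization: subtract the centroid, then divide by the mean
    distance of the points from it, so the features do not depend on
    where the face sits in the frame or how large it appears.
    """
    n = len(landmarks)
    # Centroid of the mesh.
    cx = sum(p[0] for p in landmarks) / n
    cy = sum(p[1] for p in landmarks) / n
    cz = sum(p[2] for p in landmarks) / n
    centered = [(x - cx, y - cy, z - cz) for x, y, z in landmarks]
    # Mean distance from the centroid, used as the scale factor.
    scale = sum(math.sqrt(x * x + y * y + z * z) for x, y, z in centered) / n
    # Flatten to a single feature vector of length 3 * n.
    return [coord / scale for point in centered for coord in point]
```

Feature vectors built this way, one per labelled image, could then train any off-the-shelf classifier (e.g. an SVM or random forest) over the six Ekman expression classes.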
Use this identifier to cite or link to this document: https://hdl.handle.net/11570/3307810
Citations: Scopus 16 · Web of Science 10