A machine vision and electromyographic-based approach for hand gesture recognition
Emilia Currò; Antonino Quattrocchi; Cristiano De Marchis; Dario Milone; Giovanni Gugliandolo; Nicola Donato
2024-01-01
Abstract
The complex articulation of hand gestures is an essential characteristic of the human species. For this reason, hand rehabilitation and hand-arm prostheses should be supported by appropriate feedback to verify the correct execution of such gestures. This research proposes and designs a comprehensive methodology for the robust identification of hand status (i.e., open and closed positions) and wrist movements (i.e., extension, neutral, and flexion) through electromyography (EMG) signals combined with machine vision-based acquisition techniques. The study employs a camera to capture hand gestures, open-source libraries to extract information on hand status and wrist movements, and a machine learning algorithm to identify and recognize these specific gestures from the acquired EMG signals. The synergistic combination of these techniques provides an accurate representation of the user's motor intentions, particularly for the most common daily activities, by analyzing simple EMG signals. The robustness of the system is supported by its high performance in preliminary tests, where the training model exhibited very good accuracy, precision, and recall. Furthermore, the trained model correctly predicts 81.70% of hand statuses and 72.05% of a random real-time sequence of hand gestures.
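The abstract does not name the open-source libraries or the machine learning algorithm used. The following is a minimal sketch, assuming MediaPipe Hands for camera-based landmark extraction, simple time-domain EMG features, and a scikit-learn SVM; all function names, landmark heuristics, and thresholds are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a vision-supervised EMG gesture classifier.
# Assumptions: MediaPipe Hands provides the hand landmarks, an RBF-kernel SVM
# is the classifier, and three time-domain features describe each EMG window.
import cv2
import mediapipe as mp
import numpy as np
from sklearn.svm import SVC


def hand_status_from_frame(frame_bgr, hands):
    """Return 'open', 'closed', or None for a single BGR camera frame."""
    result = hands.process(cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB))
    if not result.multi_hand_landmarks:
        return None
    lm = result.multi_hand_landmarks[0].landmark
    wrist = np.array([lm[0].x, lm[0].y])
    # Heuristic: a finger counts as extended when its tip lies farther from
    # the wrist than its knuckle (MCP joint). Indices follow MediaPipe Hands.
    tips, mcps = [8, 12, 16, 20], [5, 9, 13, 17]
    extended = 0
    for tip, mcp in zip(tips, mcps):
        d_tip = np.linalg.norm(np.array([lm[tip].x, lm[tip].y]) - wrist)
        d_mcp = np.linalg.norm(np.array([lm[mcp].x, lm[mcp].y]) - wrist)
        extended += int(d_tip > d_mcp)
    return "open" if extended >= 3 else "closed"


def emg_features(window):
    """RMS, mean absolute value, and waveform length of one EMG window."""
    window = np.asarray(window, dtype=float)
    return np.array([
        np.sqrt(np.mean(window ** 2)),    # root mean square
        np.mean(np.abs(window)),          # mean absolute value
        np.sum(np.abs(np.diff(window))),  # waveform length
    ])


def train_classifier(emg_windows, vision_labels):
    """Fit an SVM on EMG feature vectors labelled by the camera pipeline."""
    X = np.vstack([emg_features(w) for w in emg_windows])
    y = np.asarray(vision_labels)
    clf = SVC(kernel="rbf")
    clf.fit(X, y)
    return clf


if __name__ == "__main__":
    # Illustrative usage: label frames from a webcam while recording EMG
    # windows elsewhere, then train on the synchronized (window, label) pairs.
    hands = mp.solutions.hands.Hands(static_image_mode=False, max_num_hands=1)
    cap = cv2.VideoCapture(0)
    ok, frame = cap.read()
    if ok:
        print(hand_status_from_frame(frame, hands))
    cap.release()
```

In this sketch the camera acts only as a labelling oracle during training; at inference time the classifier operates on EMG features alone, which matches the abstract's claim of predicting gestures from simple EMG signals in real time.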