EMG-based Hand Gesture Recognition in VR
The use of virtual reality (VR) technology is expanding, with new developments pushing VR beyond the realm of video games and into the Metaverse as a social space.
As VR continues to be a popular research topic, there is a growing focus on making the VR experience more comfortable and more immersive for users. These elements are crucial for an enjoyable gaming or social experience, but they are even more essential in serious game contexts, such as training simulations or therapeutic use. In these cases, biosignal-powered biofeedback tools are often used to closely monitor the user's comfort, health, and emotional state to enhance the overall experience.
EMG-based techniques have the potential to track hand movements in VR applications more accurately and precisely. A recent study developed a puzzle-like VR application that uses EMG-based techniques to recognize and estimate grasp movements for simple tasks such as grabbing and relocating objects. The application uses EMG signals recorded via the MindRove armband and streamed and processed with the MindRove SDK. It relies on deep neural networks to estimate the hands' continuous motion and gestures, which could be particularly helpful in the rehabilitation of people with hand dysfunctions or amputations.
The research used the MindRove armband and the MindRove SDK to collect sEMG data. The armband has 8 channels, a sampling rate of 500 Hz, and a resolution of 24 bits, with electrodes positioned in a circular formation around the forearm. Additionally, it incorporates an inertial measurement unit consisting of a 3-axis gyroscope and a 3-axis accelerometer.
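Before classification, a continuous multi-channel sEMG stream like this is typically segmented into short overlapping windows. The paper does not specify its windowing parameters, so the sketch below uses illustrative values (200 ms windows with 50% overlap at 500 Hz) and simulated data in place of real MindRove SDK calls:

```python
import numpy as np

FS = 500          # sampling rate of the MindRove armband (Hz)
N_CHANNELS = 8    # number of sEMG channels on the armband
WIN_MS = 200      # window length in ms (illustrative, not from the paper)
HOP_MS = 100      # hop size in ms, i.e. 50% overlap (illustrative)

def segment_windows(emg, fs=FS, win_ms=WIN_MS, hop_ms=HOP_MS):
    """Slice a (n_samples, n_channels) sEMG recording into overlapping windows."""
    win = int(fs * win_ms / 1000)
    hop = int(fs * hop_ms / 1000)
    starts = range(0, emg.shape[0] - win + 1, hop)
    return np.stack([emg[s:s + win] for s in starts])  # (n_windows, win, n_channels)

# Simulated 2-second recording standing in for data streamed via the SDK
emg = np.random.randn(2 * FS, N_CHANNELS)
windows = segment_windows(emg)
print(windows.shape)  # (19, 100, 8)
```

Each window can then be fed to the classifier independently, which keeps the recognition latency low enough for interactive VR use.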
Combining CNN and BiLSTM for accurate hand gesture recognition and tracking in VR
In this study, the researchers' puzzle-like VR application recognizes and estimates grasp movements from EMG signals in real time.
The application combines a convolutional neural network (CNN) and a bidirectional LSTM (BiLSTM) to recognize and estimate hand gestures. Both models have simple architectures, yet achieved good performance.
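The paper pairs a CNN classifier with a BiLSTM regressor, but the exact layer configuration is not given here, so the sizes below are assumptions. A minimal PyTorch sketch of the two building blocks, assuming 8-channel input windows, 6 grasp classes, and 10 estimated joint angles, might look like:

```python
import torch
import torch.nn as nn

class GraspCNN(nn.Module):
    """Classifies a fixed-length sEMG window into one of 6 grasp classes.
    Layer sizes are illustrative, not taken from the paper."""
    def __init__(self, n_channels=8, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.MaxPool1d(2),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time axis
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

class JointAngleBiLSTM(nn.Module):
    """Regresses continuous joint angles from an sEMG sequence.
    Hidden size and number of joints are illustrative."""
    def __init__(self, n_channels=8, hidden=64, n_joints=10):
        super().__init__()
        self.lstm = nn.LSTM(n_channels, hidden, batch_first=True, bidirectional=True)
        self.head = nn.Linear(2 * hidden, n_joints)  # 2x hidden: both directions

    def forward(self, x):              # x: (batch, time, channels)
        out, _ = self.lstm(x)
        return self.head(out)          # (batch, time, n_joints)

cnn_logits = GraspCNN()(torch.randn(4, 8, 100))        # 4 windows of 100 samples
angles = JointAngleBiLSTM()(torch.randn(4, 100, 8))    # per-timestep joint angles
print(cnn_logits.shape, angles.shape)
```

The split of responsibilities mirrors the paper: the CNN gives a discrete gesture label per window, while the BiLSTM outputs a continuous angle estimate at every time step for smooth hand animation.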
The CNN classified 6 specific grasp tasks with an accuracy of 80 ± 2.36%, a performance comparable with similar works evaluated in an inter-subject setting. The BiLSTM model provided online feedback on the joint angles while a task was being performed, achieving a high average correlation coefficient (CC) of 90% and a low average normalized root mean square error (NRMSE) of 13%.
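For reference, CC and NRMSE for a joint-angle estimate can be computed as follows; normalizing the RMSE by the range of the measured signal is one common convention, which the paper may or may not use:

```python
import numpy as np

def correlation_coefficient(y_true, y_pred):
    """Pearson correlation between measured and estimated joint angles."""
    return np.corrcoef(y_true, y_pred)[0, 1]

def nrmse(y_true, y_pred):
    """RMSE normalized by the range of the measured signal (one common choice)."""
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    return rmse / (y_true.max() - y_true.min())

# Toy example: a sinusoidal "measured" joint angle plus a slightly noisy estimate
np.random.seed(0)
t = np.linspace(0, 2 * np.pi, 500)
measured = np.sin(t)
estimated = np.sin(t) + 0.05 * np.random.randn(500)
cc, err = correlation_coefficient(measured, estimated), nrmse(measured, estimated)
print(cc, err)
```

A CC near 1 and an NRMSE near 0 indicate that the estimated trajectory closely follows the measured one, which is what the reported 90% CC and 13% NRMSE express.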
The application uses EMG signals measured via the MindRove armband, rather than VR controllers, to drive the virtual hands, making it easier to deploy for people with hand dysfunctions.
Overall, this research presents a promising approach for hand gesture recognition and tracking in VR applications and has the potential to be used in real-world scenarios.
Find out more about this unique approach in their conference paper and check their poster below.