Manuel Eggimann, Abbas Rahimi, et al.
IEEE TCAS-I
Wearable devices that monitor muscle activity via surface electromyography could aid the development of hand-gesture-recognition applications. Such devices typically rely on machine-learning models, run either locally or externally, for gesture classification. However, most devices with local processing cannot train or update the machine-learning model during use, resulting in suboptimal performance under practical conditions. Here we report a wearable surface electromyography biosensing system that is based on a screen-printed, conformal electrode array and has in-sensor adaptive learning capabilities. Our system locally implements a neuro-inspired hyperdimensional computing algorithm for real-time gesture classification, as well as model training and updating under variable conditions such as different arm positions and sensor replacement. The system can classify 13 hand gestures with 97.12% accuracy for two participants when trained with a single trial per gesture. High accuracy (92.87%) is preserved when the gesture set is expanded to 21 gestures, and accuracy can be recovered by 9.5% by implementing model updates in response to varying conditions, without additional computation on an external device.
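The abstract above describes gesture classification with hyperdimensional computing (HDC): samples are encoded into high-dimensional bipolar vectors, class prototypes are built by bundling encoded training trials, and inference is a nearest-prototype search. The sketch below is an illustrative minimal HDC classifier, not the paper's implementation; the channel count, level quantization, and similarity metric are assumptions chosen for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)
D = 10000  # hypervector dimensionality (values around 10,000 are typical in HDC)

def random_hv():
    """Random bipolar hypervector in {-1, +1}^D."""
    return rng.choice([-1, 1], size=D)

# Item memory: one random hypervector per sEMG channel (4 channels is illustrative,
# not the paper's electrode count) and per quantized amplitude level.
n_channels = 4
channel_hvs = [random_hv() for _ in range(n_channels)]
n_levels = 8
level_hvs = [random_hv() for _ in range(n_levels)]

def encode(sample):
    """Encode per-channel amplitudes (values in [0, 1)) by binding each
    channel hypervector with its quantized level hypervector (element-wise
    product) and bundling the results (element-wise majority via sign)."""
    bound = [channel_hvs[c] * level_hvs[int(v * n_levels)]
             for c, v in enumerate(sample)]
    return np.sign(np.sum(bound, axis=0))

def train(trials_by_class):
    """One-shot training: each class prototype is the bundled encoding
    of that class's training trials (a single trial suffices)."""
    return {label: np.sign(np.sum([encode(t) for t in trials], axis=0))
            for label, trials in trials_by_class.items()}

def classify(model, sample):
    """Inference: nearest prototype by cosine similarity."""
    q = encode(sample)
    return max(model, key=lambda lbl: np.dot(model[lbl], q) /
               (np.linalg.norm(model[lbl]) * np.linalg.norm(q) + 1e-12))

def update(model, label, sample):
    """In-sensor model update: bundle a newly labelled encoding into the
    existing prototype, adapting it to changed conditions."""
    model[label] = np.sign(model[label] + encode(sample))
```

Because binding and bundling are element-wise operations on ±1 vectors, both training and updating reduce to additions and sign operations, which is what makes this style of model cheap enough to retrain entirely on a wearable device.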
Robert Guirado, Abbas Rahimi, et al.
IJCNN 2022
Denis Kleyko, Mike Davies, et al.
Proceedings of the IEEE
Michael Hersche, Stefan Lippuner, et al.
Brain Informatics