Researchers are looking to employ onboard artificial intelligence to improve the control and sophistication of prosthetic hands, using deep learning approaches that read and react to nerve signals transmitted through the arm.
Electromyography, the practice of tracking the natural electrical impulses the brain sends to individual muscles, has long been used to operate prosthetic limbs and hands, as well as wheelchairs and other devices. But performance gaps remain when it comes to fine motor control of the fingers and hand.
By running a neural network in real time on a dedicated processing unit within the prosthetic, researchers at the University of Texas at Dallas (UT Dallas) hope to speed up responses for faster hand movements. In addition, the proposed system could be retrained on the actions of the individual user to increase its accuracy.
According to lead researcher Mohsen Jafarzadeh of UT Dallas, the system uses a convolutional neural network—the kind typically used in image recognition and visual analysis. By applying the network directly to raw electromyography data taken from electrodes on the arm, the team can skip the labor-intensive steps of isolating and characterizing specific signals within the noise, work that is typically done by hand to prepare training data for such algorithms.
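To illustrate the idea in the abstract, the sketch below shows what "raw signal in, gesture out" looks like for a 1D convolutional network, with no hand-crafted feature extraction in between. This is not the UT Dallas team's actual architecture; the channel count, window length, gesture count, and randomly initialized weights are all illustrative assumptions standing in for a trained model.

```python
import numpy as np

rng = np.random.default_rng(0)

def conv1d(x, kernels, stride=1):
    """Valid 1D convolution. x: (channels, samples);
    kernels: (n_filters, channels, width) -> (n_filters, out_samples)."""
    n_f, _, width = kernels.shape
    out_len = (x.shape[1] - width) // stride + 1
    out = np.zeros((n_f, out_len))
    for f in range(n_f):
        for t in range(out_len):
            out[f, t] = np.sum(x[:, t * stride : t * stride + width] * kernels[f])
    return out

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

# Hypothetical sizes: 8 electrode channels, a 200-sample window, 5 gestures.
N_CHANNELS, WINDOW, N_GESTURES = 8, 200, 5

# Random weights stand in for a network trained end to end on labeled EMG.
kernels = rng.standard_normal((16, N_CHANNELS, 11)) * 0.1
w_out = rng.standard_normal((N_GESTURES, 16)) * 0.1

def predict(raw_window):
    """Raw EMG window in, gesture probabilities out -- no feature engineering."""
    h = np.maximum(conv1d(raw_window, kernels), 0.0)  # conv + ReLU
    pooled = h.mean(axis=1)                           # global average pooling
    return softmax(w_out @ pooled)                    # linear layer + softmax

probs = predict(rng.standard_normal((N_CHANNELS, WINDOW)))
```

In a conventional pipeline, `predict` would instead receive a vector of hand-designed features (mean absolute value, zero crossings, and the like) computed from the window; feeding the raw samples straight into learned convolutional filters is what "removing feature extraction" means in practice.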
“Removing the feature extraction and feature description is an important step toward the paradigm of end-to-end optimization,” Jafarzadeh said in a statement. “Our results are a solid starting point to begin designing more sophisticated prosthetic hands.”
The work still has far to go, the researchers said, including collecting electromyography data from more people to train the networks, improve their accuracy, and allow for more complex hand movements.
The research was presented at the 2019 IEEE International Symposium on Measurement and Control in Robotics in Houston last month and was published by IEEE, the Institute of Electrical and Electronics Engineers.