Using onboard AI to power quicker, more complex prosthetic hands

True artificial intelligence remains out of reach today, but predictive analytics and machine learning are already at work behind the scenes.
By employing an artificial intelligence network typically used for image recognition, researchers at the University of Texas at Dallas aim to skip labor-intensive processing steps while reacting to raw nerve signal data in real time. (Getty Images/Pixtum)

Researchers are looking to employ onboard artificial intelligence systems to improve the control and sophistication of prosthetic hands by using deep learning approaches that read and react to nerve signals transmitted through the arm.

The practice of tracking the natural electric impulses sent by the brain to control individual muscles, known as electromyography, has been used to operate prosthetic limbs and hands before, as well as wheelchairs and other devices. But performance gaps remain when it comes to the fine motor control of fingers and hands.

By running a neural network in real-time on a dedicated processing unit within the prosthetic, researchers at the University of Texas at Dallas (UT Dallas) hope to speed up responses for faster hand movements. In addition, the proposed system could be retrained based on the actions of the user to increase its accuracy.
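At a high level, such an onboard controller runs a fixed loop: buffer a window of raw signal samples, run the model on that window, and emit a gesture decision for the hand's actuators. The sketch below illustrates that loop in plain Python; the `classify` function is a hypothetical placeholder (a simple energy threshold), not the team's actual network.

```python
from collections import deque

def classify(window):
    """Placeholder for the onboard neural network: maps a window of
    raw EMG samples to a gesture label. A real system would run a
    trained model here instead of this toy energy threshold."""
    energy = sum(s * s for s in window) / len(window)
    return "grip" if energy > 0.25 else "rest"

def control_loop(samples, window_size=4):
    """Slide a fixed-size window over the incoming signal and emit
    one gesture decision per new sample, as a real-time loop would."""
    buf = deque(maxlen=window_size)
    decisions = []
    for s in samples:
        buf.append(s)            # newest sample in, oldest dropped
        if len(buf) == window_size:
            decisions.append(classify(list(buf)))
    return decisions
```

Because the loop produces one decision per incoming sample once the buffer fills, the latency between a nerve impulse and a hand command is bounded by the window length plus a single inference pass, which is why running the model on a dedicated onboard processor matters for responsiveness.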


According to lead researcher Mohsen Jafarzadeh of UT Dallas, the system uses a convolutional neural network—the kind typically used in image recognition and visual analysis. By applying that network to raw electromyography data taken from electrodes on the arm, the team can skip the labor-intensive steps of isolating and characterizing specific signals within the noise, features that are typically extracted by hand to train such algorithms.
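The "end-to-end" idea can be illustrated with a minimal 1D convolution applied directly to a raw signal window, standing in for the first layer of such a network. This is an illustrative sketch only; the actual architecture, filter sizes, and weights of the UT Dallas network are not described here, and the kernel below is a toy example rather than a learned filter.

```python
def conv1d(signal, kernel):
    """Valid-mode 1D convolution (cross-correlation), the core
    operation a convolutional layer applies to raw samples."""
    n = len(signal) - len(kernel) + 1
    return [sum(signal[i + j] * kernel[j] for j in range(len(kernel)))
            for i in range(n)]

def relu(xs):
    """Standard rectified-linear activation applied elementwise."""
    return [max(0.0, x) for x in xs]

# A raw EMG window feeds the layer directly -- no hand-built feature
# extraction (RMS energy, zero crossings, etc.) happens beforehand.
window = [0.0, 0.2, 0.9, 0.1, -0.4, -0.8, -0.1, 0.3]
edge_kernel = [1.0, -1.0]   # toy filter responding to sharp changes
feature_map = relu(conv1d(window, edge_kernel))
```

In a full network, many such filters would be learned jointly with the classifier during training, which is what removes the separate feature-extraction and feature-description stages Jafarzadeh refers to.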


“Removing the feature extraction and feature description is an important step toward the paradigm of end-to-end optimization,” Jafarzadeh said in a statement. “Our results are a solid starting point to begin designing more sophisticated prosthetic hands.”

The work still has far to go, the researchers said, including the collection of more electromyography data from more people to train and improve the accuracy of their networks and allow for more complex hand movements.

The research was presented at the 2019 IEEE International Symposium on Measurement and Control in Robotics in Houston last month and was published by IEEE, the Institute of Electrical and Electronics Engineers.
