Using onboard AI to power quicker, more complex prosthetic hands

By employing an artificial intelligence network typically used for image recognition, researchers at the University of Texas at Dallas aim to skip labor-intensive processing steps while reacting to raw nerve signal data in real time. (Getty Images/Pixtum)

Researchers are looking to employ onboard artificial intelligence systems to improve the control and sophistication of prosthetic hands by using deep learning approaches that read and react to nerve signals transmitted through the arm.

The practice of tracking the natural electric impulses sent by the brain to control individual muscles, known as electromyography, has been used to operate prosthetic limbs and hands before, as well as wheelchairs and other devices. But performance gaps remain when it comes to the fine motor control of fingers and hands.

By running a neural network in real time on a dedicated processing unit within the prosthetic, researchers at the University of Texas at Dallas (UT Dallas) hope to speed up responses for faster hand movements. In addition, the proposed system could be retrained based on the actions of the user to increase its accuracy.
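To make the real-time idea concrete, here is a minimal sketch of a sliding-window control loop: samples stream in from the electrodes, and a classifier runs on overlapping windows so commands can be issued with low latency. The window size, stride, threshold, and the placeholder mean-absolute-value classifier are all illustrative assumptions, not the UT Dallas team's actual design.

```python
from collections import deque
import math

WINDOW, STRIDE = 200, 50  # assumed: re-classify every 50 new samples for low latency

def classify(window):
    # Stand-in for onboard network inference: threshold the mean rectified amplitude.
    mav = sum(abs(s) for s in window) / len(window)
    return "close_hand" if mav > 0.5 else "rest"

def control_loop(sample_stream):
    buf = deque(maxlen=WINDOW)  # rolling buffer holds the most recent samples
    commands = []
    for i, sample in enumerate(sample_stream, 1):
        buf.append(sample)
        if len(buf) == WINDOW and i % STRIDE == 0:
            commands.append(classify(buf))  # would be sent to the prosthetic actuators
    return commands

# Simulated stream: 400 low-amplitude "rest" samples, then 400 strong-activation samples.
stream = [0.05 * math.sin(0.3 * i) for i in range(400)] + \
         [1.0 * math.sin(0.9 * i) for i in range(400)]
print(control_loop(stream))
```

Because consecutive windows overlap, the loop can emit a fresh command every STRIDE samples rather than waiting for a whole new window, which is the kind of latency saving a dedicated onboard processor makes practical.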


According to lead researcher Mohsen Jafarzadeh of UT Dallas, the system uses a convolutional neural network, the kind typically used in image recognition and visual analysis. By applying that network to raw electromyography data taken from electrodes on the arm, the researchers can skip the labor-intensive steps of isolating and characterizing specific signals within the noise, steps typically performed by hand to prepare training data for such algorithms.
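To illustrate the end-to-end idea, below is a toy one-dimensional convolutional network applied directly to a raw multi-channel EMG window, with no hand-crafted features (no root-mean-square, zero-crossing counts, or similar descriptors computed first). The channel count, window length, filter sizes, and gesture classes are invented for the sketch, and the weights are randomly initialized rather than trained; it shows the shape of the computation, not the published network.

```python
import random

# Hypothetical shapes: 8 electrode channels, a 200-sample window, 5-tap filters.
CHANNELS, WINDOW, KERNEL, FILTERS, GESTURES = 8, 200, 5, 4, 6

random.seed(0)

# A raw multi-channel EMG window, fed in as-is -- no feature extraction step.
emg = [[random.gauss(0.0, 1.0) for _ in range(WINDOW)] for _ in range(CHANNELS)]

# Randomly initialized weights stand in for a trained network.
conv_w = [[[random.gauss(0.0, 0.1) for _ in range(KERNEL)]
           for _ in range(CHANNELS)] for _ in range(FILTERS)]
fc_w = [[random.gauss(0.0, 0.1) for _ in range(FILTERS)] for _ in range(GESTURES)]

def conv1d_relu(signal, weights):
    """Slide each filter across all channels of the raw window, then apply ReLU."""
    out = []
    for f in weights:
        feats = []
        for t in range(WINDOW - KERNEL + 1):
            s = sum(f[c][k] * signal[c][t + k]
                    for c in range(CHANNELS) for k in range(KERNEL))
            feats.append(max(0.0, s))
        out.append(feats)
    return out

def predict(signal):
    maps = conv1d_relu(signal, conv_w)
    pooled = [sum(m) / len(m) for m in maps]           # global average pooling
    scores = [sum(w * p for w, p in zip(row, pooled)) for row in fc_w]
    return scores.index(max(scores))                   # predicted gesture class

gesture = predict(emg)
print(gesture)  # an integer class index in [0, GESTURES)
```

The point of the end-to-end arrangement is that the convolutional filters themselves learn which properties of the raw signal matter, so the feature-engineering stage that a human would otherwise design by hand disappears from the pipeline.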

RELATED: Brain-computer interface allows paralyzed patients to use off-the-shelf tablet

“Removing the feature extraction and feature description is an important step toward the paradigm of end-to-end optimization,” Jafarzadeh said in a statement. “Our results are a solid starting point to begin designing more sophisticated prosthetic hands.”

The work still has far to go, the researchers said, including the collection of more electromyography data from more people to train and improve the accuracy of their networks and allow for more complex hand movements.

The research was presented at the 2019 IEEE International Symposium on Measurement and Control in Robotics in Houston last month and was published by IEEE, the Institute of Electrical and Electronics Engineers.
