DALLAS, Oct. 2, 2019 /PRNewswire/ — Scientists from The University of Texas at Dallas, led by Mohsen Jafarzadeh, presented a groundbreaking approach to improving prosthetic control with artificial intelligence (AI) at the 2019 IEEE International Symposium on Measurement and Control in Robotics this month. The findings mark a significant step toward the goal of fully end-to-end optimization of electromyography (EMG)-controlled prosthetic hands.

There are more than 40 million amputees across the globe, according to the World Health Organization. Recent advances in prosthetic hand and limb technology have greatly improved the quality of life for upper-limb amputees. However, gaps remain in the control of prosthetic hands, particularly in using the electrical signals naturally generated by the patient's muscles.

Muscles move the body in response to nerve impulses, and EMG measures the electrical activity that those impulses produce in the muscle. EMG-based control is currently the most effective and convenient way to operate prosthetic hands, and EMG signals have been used extensively to identify user intent in assistive devices such as smart wheelchairs, exoskeletons, and prosthetic devices. Even so, the performance of conventional assistive devices remains unsatisfactory.

The new EMG-based control system, developed at The University of Texas at Dallas under Jafarzadeh's direction, uses deep learning to control prosthetic hands directly from raw EMG signals. Because the proposed convolutional neural network requires no preprocessing, it classifies data faster and more accurately, which translates into faster hand movements for the user. The researchers also retrained the system on individual users' data, personalizing its responses to each user. The control system is implemented in Python with the TensorFlow deep learning library and runs in real time on an embedded general-purpose graphics processing unit (GPGPU) developer kit.
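The release does not detail the network architecture, so the following is only a minimal sketch of what a convolutional classifier over raw EMG windows might look like in TensorFlow/Keras. The channel count, window length, layer sizes, and number of gesture classes are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a 1D CNN that maps raw EMG windows to hand-gesture
# classes, in the spirit of the approach described above. All shapes and
# hyperparameters are illustrative assumptions, not the paper's values.
import tensorflow as tf

NUM_CHANNELS = 8      # assumed number of surface EMG electrodes
WINDOW_SAMPLES = 200  # assumed samples per raw-EMG input window
NUM_GESTURES = 6      # assumed number of hand-gesture classes

def build_emg_cnn() -> tf.keras.Model:
    """Builds a small convolutional classifier over raw EMG windows.

    No feature extraction or other preprocessing is applied; the network
    consumes the raw signal directly, as in the end-to-end paradigm.
    """
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW_SAMPLES, NUM_CHANNELS)),
        tf.keras.layers.Conv1D(32, kernel_size=7, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.Conv1D(64, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(pool_size=2),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(64, activation="relu"),
        tf.keras.layers.Dense(NUM_GESTURES, activation="softmax"),
    ])

model = build_emg_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

The per-user personalization the release mentions could then, in principle, be a brief fine-tuning pass (for example, calling model.fit on a new user's recordings), though the release does not specify how retraining is done.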

“Our solution uses a novel deep convolutional neural network to eschew the feature-engineering step,” said Mohsen Jafarzadeh, lead researcher, The University of Texas at Dallas. “Removing the feature extraction and feature description is an important step toward the paradigm of end-to-end optimization. Our results are a solid starting point to begin designing more sophisticated prosthetic hands.”
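For context, a conventional EMG pipeline first reduces each signal window to hand-crafted features before classification. The sketch below shows three classic time-domain features (mean absolute value, root mean square, zero crossings) as generic examples of the feature-engineering step that the end-to-end network removes; they are not features named in the paper.

```python
# Classic time-domain EMG features that a conventional pipeline would
# compute before classification. The end-to-end CNN skips this step
# entirely and consumes the raw window instead. These particular
# features are generic examples, not ones cited in the paper.
import numpy as np

def mean_absolute_value(window: np.ndarray) -> np.ndarray:
    # Average rectified amplitude per channel.
    return np.mean(np.abs(window), axis=0)

def root_mean_square(window: np.ndarray) -> np.ndarray:
    # RMS amplitude per channel.
    return np.sqrt(np.mean(window ** 2, axis=0))

def zero_crossings(window: np.ndarray) -> np.ndarray:
    # Count sign changes along the time axis for each channel.
    signs = np.signbit(window)
    return np.sum(signs[1:] != signs[:-1], axis=0)
```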

For the study, titled “Deep learning approach to control of prosthetic hands with electromyography signals,” the researchers used data from six subjects for training and validation and from two subjects for testing. Findings showed the proposed convolutional neural network ran in real time and transmitted user commands to the prosthetic hand's low-level controller with an error probability of zero on the test data.
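A schematic version of that real-time loop might look like the following. Here, read_emg_window and send_to_hand_controller are hypothetical placeholders for the device I/O, which the release does not describe.

```python
# Schematic real-time loop: classify each incoming raw EMG window and
# forward the predicted gesture to the prosthesis' low-level controller.
# read_emg_window() and send_to_hand_controller() are hypothetical
# placeholders for hardware I/O not described in the release.
import numpy as np

def control_loop(model, read_emg_window, send_to_hand_controller):
    while True:
        window = read_emg_window()                   # shape: (WINDOW_SAMPLES, NUM_CHANNELS)
        probs = model.predict(window[np.newaxis, ...], verbose=0)
        gesture = int(np.argmax(probs, axis=-1)[0])  # most likely user intent
        send_to_hand_controller(gesture)             # low-level controller executes it
```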

The research was presented at the 2019 IEEE International Symposium on Measurement and Control in Robotics in Houston this September and published by IEEE. The symposium covered international research, applications, and trends in robotic innovations for the benefit of humanity, including advanced human-robot systems; applied technologies in robotics, telerobotics, simulator platforms and environments, and mobile work machines; and virtual reality, augmented reality, and 3D modeling and simulation.

Contact:
Mohsen Jafarzadeh
Phone: 727-228-2208
Email: [email protected]

Related Links

Mohsen Jafarzadeh

SOURCE Mohsen Jafarzadeh
