Neuroprosthetics: Fraunhofer IBMT in EU joint project SOMA
Bidirectional control of prosthetic hands using ultrasonic sensors
For anyone who has lost a hand, a functional prosthetic hand is of enormous benefit when it comes to everyday activities. Researchers at Fraunhofer are therefore working as part of an EU research project to improve the control of prosthetic hands down to the level of individual fingers. Instead of conventional electrodes that detect nerve impulses in the muscle tissue of the arm, the new approach relies on ultrasonic sensors. This means commands can be executed with far greater accuracy and sensitivity. In the next stage, the researchers want to make the design bidirectional, so that the brain also receives sensory stimuli from the prosthesis.
Together with their partners in the project, Fraunhofer researchers have shown that the control of prosthetic hands can be significantly improved by using ultrasonic sensors. Someone who has lost a hand following an accident, for example, might be able to control individual fingers on the prosthesis better and move them more precisely than was previously possible with myoelectric prostheses, to use the technical term. Myoelectric prostheses usually work with electrodes placed on the skin, which pick up the electrical signals from muscle contractions and forward them to an electronics module, which in turn controls the prosthesis.
With the SOMA project (Ultrasound peripheral interface and in-vitro model of human somatosensory system and muscles for motor decoding and restoration of somatic sensations in amputees), scientists at the Fraunhofer Institute for Biomedical Engineering IBMT in Sulzbach, Saarland, have adopted a new approach. They are using ultrasonic sensors that continuously send sound pulses into the muscle tissue in the forearm. Unlike electrical impulses, sound waves are reflected by tissue. The time taken for the reflected signals to propagate provides information about the physical depth of the muscle strand that is reflecting the respective sound wave. This allows contractions in the muscle tissue triggered by nerve stimuli in the brain to be studied in great detail. This in turn means typical activation patterns in the muscle, ones that represent specific hand or finger movements, can be identified. The aim of the project is for AI-controlled software in a compact electronics box worn on the patient’s body to take over the job of identification. The electronics could send the decoded signals as a command to the actuators in the prosthetic hand, thus triggering movement of the prosthetic fingers. Control commands are detected, analyzed, and transmitted all in real time.
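To illustrate the time-of-flight principle, here is a minimal sketch of how an echo's round-trip time can be converted into the depth of the reflecting muscle layer, assuming the standard approximation of roughly 1540 m/s for the speed of sound in soft tissue; the constant and function name are illustrative and not part of the SOMA software.

```python
# Illustrative sketch: converting an ultrasound echo delay into tissue depth.
# Assumes an average speed of sound in soft tissue of about 1540 m/s; the
# factor 1/2 accounts for the round trip of the pulse (out and back).

SPEED_OF_SOUND_TISSUE_M_S = 1540.0  # common approximation for soft tissue

def echo_delay_to_depth_mm(delay_s: float) -> float:
    """Estimate the depth (in millimeters) of the reflecting muscle layer
    from the round-trip time of an ultrasound pulse."""
    depth_m = SPEED_OF_SOUND_TISSUE_M_S * delay_s / 2.0
    return depth_m * 1000.0

# An echo arriving 26 microseconds after the pulse corresponds to a
# reflector at roughly 20 mm depth.
print(round(echo_delay_to_depth_mm(26e-6), 1))  # 20.0
```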
This EU project, centered on fundamental research, is currently still in the laboratory phase. Ultrasonic transducers and electronics generate the pulses and decode the sound waves that are reflected back. The data is then passed to a PC, where the AI performs the analysis, and the decoded signals are sent as commands to the actuators in the prosthetic hand, thereby triggering finger movement. The advantages of this technology are already clearly evident. “The ultrasonic-based control acts with greater sensitivity and accuracy than would be possible with electrodes. The sensors are able to detect varying degrees of freedom such as flexing, extending or rotating,” says Dr. Marc Fournelle, head of the Sensors & Actuators group at Fraunhofer IBMT, who is responsible for developing the SOMA ultrasonic sensors within the project.
Time differences reveal depth and location information
In order to achieve high precision and reliability, the piezoelectric sound transducers send impulses into the muscle tissue dozens of times per second at a frequency ranging between 1 and 4 MHz. Furthermore, a minimum of 20 sensors are interconnected. Besides the depth information, each sensor also provides data about the position of the muscle strand that has just sent back a wave. The data gathered about the location and depth of the signals are pre-sorted before the AI gets to work. “The AI then has to analyze the ultrasound signals, identify an activation pattern, convert it into a control command and send it to the corresponding finger on the prosthesis. From a technical perspective, the AI analyzes the amplitude and time profile of the electrical voltages that each sensor module supplies,” explains Fournelle.
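As a rough illustration of the kind of signal summary described here, the following sketch extracts a peak amplitude and a peak time from each sensor's echo trace. The array shapes and feature choice are assumptions made for the example, not the project's actual processing chain.

```python
import numpy as np

def extract_features(traces: np.ndarray, sample_rate_hz: float) -> np.ndarray:
    """Summarize the amplitude and time profile of each sensor's echo trace.

    traces: array of shape (n_sensors, n_samples), one sampled voltage
    trace per sensor (at least 20 sensors in the setup described above).
    Returns an (n_sensors, 2) array with peak amplitude and peak time in seconds.
    """
    envelope = np.abs(traces)                    # coarse stand-in for an echo envelope
    peak_amplitude = envelope.max(axis=1)        # how strongly each sensor's target reflects
    peak_time_s = envelope.argmax(axis=1) / sample_rate_hz  # when the echo arrives
    return np.stack([peak_amplitude, peak_time_s], axis=1)
```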
The sensors are integrated into a bracelet that might at a later stage be fitted into the shaft of the prosthetic hand. To link the muscle signals correctly with the right finger and the desired movement, subjects have to complete a short training session in which they try to move various parts of the hand and fingers. The activity patterns generated in this way are stored as a base reference in the system. This means a link can be established between the corresponding finger or part of the hand and the desired movement. Training takes just a few minutes.
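The short training session could be pictured along these lines: a few repetitions of each attempted movement are averaged into a reference pattern, and later signals are matched to the closest stored pattern. The class, the labels and the nearest-pattern rule are illustrative assumptions, not the SOMA implementation.

```python
import numpy as np

class MovementReference:
    """Stores one averaged activation pattern per attempted movement."""

    def __init__(self) -> None:
        self.patterns: dict[str, np.ndarray] = {}

    def calibrate(self, movement: str, repetitions: list[np.ndarray]) -> None:
        # Average a few repetitions (e.g. flattened feature arrays) into
        # the base reference for this movement.
        self.patterns[movement] = np.mean(np.stack(repetitions), axis=0)

    def classify(self, features: np.ndarray) -> str:
        # Return the stored movement whose reference pattern is closest.
        return min(self.patterns,
                   key=lambda m: np.linalg.norm(features - self.patterns[m]))
```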
Andreas Schneider-Ickert, project manager in the Active Implants unit and innovation manager at Fraunhofer IBMT, says: “Trials on test subjects have shown that the technology works. It is very easy to use and non-invasive. We are now working on making the system even more inconspicuous.”
Project partners across five countries
This technology has been developed together with several project partners. A total of seven partners from five countries are working together as part of the SOMA consortium. Fraunhofer IBMT's experts bring decades of experience in sensor development and in fields including neuroprosthetics and implants; the team developed the specially adapted ultrasonic transducers as well as the electronics box. Working alongside the Fraunhofer researchers, Imperial College of Science Technology and Medicine in London developed the AI process for recognizing movement patterns and carried out initial testing on subjects. “We have also been working very closely for a number of years with the Università Campus Bio-Medico di Roma (UCBM), which is coordinating the whole SOMA project and which approached us with the idea for the sensors,” explains Schneider-Ickert.
Work on SOMA continues apace following the proof of concept and the positive feedback from the test subjects. In the next stage, the researchers want to further improve the temporal resolution of the sensors and make the electronics smaller, so that the prosthesis can be controlled even more accurately and comfortably. The sensor bracelet will be hidden away in the cuff of the prosthetic hand. With a view to improved suitability for everyday use, it is also conceivable that the AI and control software may one day be integrated into a smartphone. For instance, after being decoded by the electronics box, signals might be transmitted to the smartphone and back using Bluetooth.
Sensory feedback from the prosthetic hand
The consortium is also working on making the system bidirectional. The prosthetic hand should not only be able to execute commands but also return feedback that the person wearing the prosthesis can feel as a sensory stimulus and react to. “When someone who hasn’t lost their hand picks up a glass of water and holds it to their mouth, they get constant feedback from their fingers on how tightly to hold the glass so that, on the one hand, it doesn’t slip and fall and, on the other, it doesn’t shatter from being squeezed too tightly. Such functionality is also being investigated within SOMA and could one day be integrated into prosthetic hands,” explains Schneider-Ickert.
However, rather than using ultrasonic sensors, the feedback could be delivered via electrodes implanted in or onto nerves. From there they transmit signals sent by the prosthesis to the brain as a sensory stimulus in the form of specific nerve stimulation. This means the person’s brain is getting feedback from the artificial hand and can send back commands that, for example, tighten or loosen the fingers. The person cannot feel the electrode, which is made of biologically compatible material and implanted in the nerve tissue. “This means a closed loop is set up, where the brain and prosthetic hand are communicating with each other constantly and in real time,” explains Fournelle. Fraunhofer IBMT has already developed and tested the relevant technology and electrodes.
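The closed loop can be pictured as a simple control cycle: the prosthesis reports its grip force, and the command is adjusted until a target force is held. Everything in this sketch, including read_grip_force and set_grip_command, is a hypothetical placeholder used only to show the loop structure, not part of the SOMA system.

```python
def hold_object(read_grip_force, set_grip_command, target_force_n: float,
                gain: float = 0.05, steps: int = 200) -> None:
    """Hypothetical closed-loop grip: feedback from the hand adjusts the command.

    read_grip_force():   returns the currently measured grip force in newtons.
    set_grip_command(c): sends a normalized grip command (0.0 = open, 1.0 = closed).
    """
    command = 0.0
    for _ in range(steps):
        error = target_force_n - read_grip_force()            # feedback from the prosthesis
        command = min(1.0, max(0.0, command + gain * error))  # proportional correction
        set_grip_command(command)                             # command back to the actuators
```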
User acceptance and usability
Usability and user acceptance are crucial factors at every stage of the project. The SOMA project team has been gathering feedback from test subjects at every stage. In the current phase, these are subjects who still have their hands. “The feedback from the test subjects helps us make this innovative prosthetic hand even better. People who have lost a hand have endured a long period of suffering. A functioning prosthetic hand is of enormous benefit when it comes to everyday activities and also restores some quality of life,” explains Schneider-Ickert.
Development of the innovative prosthetic hand is also giving a noticeable boost to the market for myoelectric prostheses. An estimated three million people worldwide have had an arm or hand amputated, and this number continues to grow. These people stand to benefit from the improved functionality and comfort of myoelectric prostheses.
The SOMA project
Objective: To design a bidirectional, non-invasive human-machine interface based on ultrasound
Project duration: September 1, 2020 – September 1, 2024
Funding program: Horizon 2020 EU innovation funding program, Future and Emerging Technologies (FET)
Funding amount: Three million euros (grant agreement: 899822)
Project partners:
• Università Campus Bio-Medico di Roma, Italy (project lead)
• Fraunhofer Institute for Biomedical Engineering IBMT, Germany
• Università degli Studi di Napoli Federico II, Italy
• Imperial College of Science Technology and Medicine, England
• University College London, England
• Universidad Autònoma de Barcelona, Spain
• The company Össur hf, Iceland
Websites:
SOMA project
Horizon 2020 EU innovation funding program