Fine-tuned brain-computer interface makes prosthetic limbs feel more real
Two new papers document progress in neuroprosthetic technology that lets people feel the shape and movement of objects across the "skin" of a bionic hand
University of Chicago Medical Center
Image: A testing participant controls a bionic hand through a brain-computer interface that allows him to feel pressure changes as the steering wheel moves in the hand.
Credit: Charles Greenspon, University of Chicago
You can probably complete an amazing number of tasks with your hands without looking at them. But if you put on gloves that muffle your sense of touch, many of those simple tasks become frustrating. Take away proprioception — your ability to sense your body’s relative position and movement — and you might even end up breaking an object or injuring yourself.
“Most people don’t realize how often they rely on touch instead of vision — typing, walking, picking up a flimsy cup of water,” said Charles Greenspon, PhD, a neuroscientist at the University of Chicago. “If you can’t feel, you have to constantly watch your hand while doing anything, and you still risk spilling, crushing or dropping objects.”
Greenspon and his research collaborators recently published papers in Nature Biomedical Engineering and Science documenting major progress on a technology designed to address precisely this problem: direct, carefully timed electrical stimulation of the brain that can recreate tactile feedback to give nuanced “feeling” to prosthetic hands.
The science of restoring sensation
These new studies build on years of collaboration among scientists and engineers at UChicago, the University of Pittsburgh, Northwestern University, Case Western Reserve University and Blackrock Neurotech. Together they are designing, building, implementing and refining brain-computer interfaces (BCIs) and robotic prosthetic arms aimed at restoring both motor control and sensation in people who have lost significant limb function.
On the UChicago side, the research was led by neuroscientist Sliman Bensmaia, PhD, until his unexpected passing in 2023.
The researchers’ approach to prosthetic sensation involves placing tiny electrode arrays in the parts of the brain responsible for moving and feeling the hand. On one side, a participant can move a robotic arm by simply thinking about movement, and on the other side, sensors on that robotic limb can trigger pulses of electrical activity called intracortical microstimulation (ICMS) in the part of the brain dedicated to touch.
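To make that architecture concrete, here is a minimal sketch, in Python, of the sensory-feedback half of such a loop. Everything in it is assumed for illustration: the hand regions, electrode IDs and the pressure-to-amplitude mapping are invented, and the helper names stand in for the group's actual decoding and stimulation pipeline, which is not described at this level of detail here.

```python
import numpy as np

# Hypothetical map from hand regions to the somatosensory electrodes whose
# projected fields cover those regions (electrode IDs are invented).
ELECTRODES_FOR_REGION = {"thumb_tip": [3, 17], "index_tip": [5, 21]}

def pressure_to_amplitude(pressure: float) -> float:
    """Map a normalized contact pressure (0-1) to an ICMS amplitude in microamps.
    The 20-80 uA range is an illustrative assumption, not a reported value."""
    return float(np.clip(20 + 60 * pressure, 20, 80))

def feedback_commands(sensor_readings: dict) -> list:
    """Turn prosthetic-hand sensor readings into (electrode, amplitude) pairs."""
    commands = []
    for region, pressure in sensor_readings.items():
        if pressure > 0:
            amp = pressure_to_amplitude(pressure)
            commands += [(e, amp) for e in ELECTRODES_FOR_REGION.get(region, [])]
    return commands

# Example: the index fingertip of the robotic hand touches an object.
print(feedback_commands({"index_tip": 0.4, "thumb_tip": 0.0}))
# -> [(5, 44.0), (21, 44.0)]
```

The only point of the sketch is the data flow: sensor readings from the robotic limb are translated into stimulation commands aimed at the electrodes covering the matching part of the hand's sensory map.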
For about a decade, Greenspon explained, this stimulation of the touch center could only provide a simple sense of contact in different places on the hand.
“We could evoke the feeling that you were touching something, but it was mostly just an on/off signal, and often it was pretty weak and difficult to tell where on the hand contact occurred,” he said.
The newly published results mark important milestones in moving past these limitations.
Advancing understanding of artificial touch
In the first study, published in Nature Biomedical Engineering, Greenspon and his colleagues focused on ensuring that electrically evoked touch sensations are stable, accurately localized and strong enough to be useful for everyday tasks.
By delivering short pulses to individual electrodes in participants’ touch centers and having them report where and how strongly they felt each sensation, the researchers created detailed “maps” of brain areas that corresponded to specific parts of the hand. The testing revealed that when two closely spaced electrodes are stimulated together, participants feel a stronger, clearer touch, which can improve their ability to locate and gauge pressure on the correct part of the hand.
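As a rough illustration of that mapping procedure, the sketch below tabulates hypothetical participant reports into per-electrode "projected fields" and flags electrode pairs that report to the same hand region as candidates for co-stimulation. The trial data, hand regions and intensity scale are invented, not taken from the study.

```python
from collections import Counter, defaultdict

# Hypothetical trials: (electrode_id, reported hand region, reported intensity 0-10).
trials = [
    (3, "thumb_tip", 4), (3, "thumb_tip", 5), (3, "thumb_pad", 3),
    (17, "thumb_tip", 2), (17, "thumb_tip", 3),
    (5, "index_tip", 6),
]

# Group the reports by electrode.
reports = defaultdict(list)
for electrode, region, intensity in trials:
    reports[electrode].append((region, intensity))

# Assign each electrode the region it most often evokes, plus a mean intensity.
projected_fields = {}
for electrode, obs in reports.items():
    region = Counter(r for r, _ in obs).most_common(1)[0][0]
    mean_intensity = sum(i for _, i in obs) / len(obs)
    projected_fields[electrode] = (region, mean_intensity)

# Electrodes projecting to the same region are candidates for co-stimulation,
# following the finding that stimulating nearby electrodes together feels stronger.
pairs = [(a, b) for a in projected_fields for b in projected_fields
         if a < b and projected_fields[a][0] == projected_fields[b][0]]

print(projected_fields)  # {3: ('thumb_tip', 4.0), 17: ('thumb_tip', 2.5), 5: ('index_tip', 6.0)}
print(pairs)             # [(3, 17)]
```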
The researchers also conducted exhaustive tests to confirm that the same electrode consistently creates a sensation corresponding to a specific location.
“If I stimulate an electrode on day one and a participant feels it on their thumb, we can test that same electrode on day 100, day 1,000, even many years later, and they still feel it in roughly the same spot,” said Greenspon, who was the lead author on this paper.
From a practical standpoint, any clinical device would need to be stable enough for a patient to rely on it in everyday life. An electrode that continually shifts its “touch location” or produces inconsistent sensations would be frustrating and require frequent recalibration. By contrast, the long-term consistency this study revealed could allow prosthetic users to develop confidence in their motor control and sense of touch, much as they would in their natural limbs.
Adding feelings of movement and shapes
The complementary Science paper went a step further to make artificial touch even more immersive and intuitive. The project was led by first author Giacomo Valle, PhD, a former postdoctoral fellow at UChicago who is now continuing his bionics research at Chalmers University of Technology in Sweden.
“Two electrodes next to each other in the brain don’t create sensations that ‘tile’ the hand in neat little patches with one-to-one correspondence; instead, the sensory locations overlap,” explained Greenspon, who shared senior authorship of this paper with Bensmaia.
The researchers decided to test whether they could use this overlapping nature to create sensations that could let users feel the boundaries of an object or the motion of something sliding along their skin. After identifying pairs or clusters of electrodes whose “touch zones” overlapped, the scientists activated them in carefully orchestrated patterns to generate sensations that progressed across the sensory map.
Participants described feeling a gentle gliding touch passing smoothly over their fingers, despite the stimulus being delivered in small, discrete steps. The scientists attribute this result to the brain’s remarkable ability to stitch together sensory inputs and interpret them as coherent, moving experiences by “filling in” gaps in perception.
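The sketch below illustrates the sequencing idea in the simplest possible terms: pulse trains on electrodes whose overlapping projected fields lie along a path are started one after another, so the evoked sensation sweeps across the finger. The electrode IDs and timing values are assumptions for illustration, not parameters reported in the paper.

```python
# Electrodes ordered so that their overlapping projected fields trace a path
# across the fingertip (IDs are invented for illustration).
path = [5, 21, 8, 34]

def motion_schedule(electrodes, onset_gap_ms=100, train_ms=150):
    """Stagger pulse-train onsets so successive, overlapping fields are activated
    in order; the overlap between consecutive trains (gap < train length) is what
    lets the brain 'fill in' the steps and perceive one smooth, moving touch."""
    return [(e, i * onset_gap_ms, i * onset_gap_ms + train_ms)
            for i, e in enumerate(electrodes)]

for electrode, start_ms, stop_ms in motion_schedule(path):
    print(f"electrode {electrode}: stimulate {start_ms}-{stop_ms} ms")
# electrode 5: stimulate 0-150 ms
# electrode 21: stimulate 100-250 ms
# ...
```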
The approach of sequentially activating electrodes also significantly improved participants’ ability to distinguish complex tactile shapes and respond to changes in the objects they touched. They could sometimes identify letters of the alphabet electrically “traced” on their fingertips, and they could use a bionic arm to steady a steering wheel when it began to slip through the hand.
These advancements help move bionic feedback closer to the precise, complex, adaptive abilities of natural touch, paving the way for prosthetics that enable confident handling of everyday objects and responses to shifting stimuli.
The future of neuroprosthetics
The researchers hope that as electrode designs and surgical methods continue to improve, the coverage across the hand will become even finer, enabling more lifelike feedback.
“We hope to integrate the results of these two studies into our robotics systems, where we have already shown that even simple stimulation strategies can improve people’s abilities to control robotic arms with their brains,” said co-author Robert Gaunt, PhD, associate professor of physical medicine and rehabilitation and lead of the stimulation work at the University of Pittsburgh.
Greenspon emphasized that the motivation behind this work is to enhance independence and quality of life for people living with limb loss or paralysis.
“We all care about the people in our lives who get injured and lose the use of a limb — this research is for them,” he said. “This is how we restore touch to people. It’s the forefront of restorative neurotechnology, and we’re working to expand the approach to other regions of the brain.”
The approach also holds promise for people with other types of sensory loss. In fact, the group has also collaborated with surgeons and obstetricians at UChicago on the Bionic Breast Project, which aims to produce an implantable device that can restore the sense of touch after mastectomy.
Although many challenges remain, these latest studies offer evidence that the path to restoring touch is becoming clearer. With each new set of findings, researchers come closer to a future in which a prosthetic body part is not just a functional tool, but a way to experience the world.
“Evoking stable and precise tactile sensations via multi-electrode intracortical microstimulation of the somatosensory cortex” was published in Nature Biomedical Engineering in December 2024. Authors include Charles M. Greenspon, Giacomo Valle, Natalya D. Shelchkova, Thierri Callier, Ev I. Berger-Wolf, Elizaveta V. Okorokova, Efe Dogruoz, Anton R. Sobinov, Patrick M. Jordan, Emily E. Fitzgerald, Dillan Prasad, Ashley Van Driesche, Qinpu He, David Satzer, Peter C. Warnke, John E. Downey, Nicholas G. Hatsopoulos and Sliman J. Bensmaia from the University of Chicago; Taylor G. Hobbs, Ceci Verbaarschot, Jeffrey M. Weiss, Fang Liu, Jorge Gonzalez-Martinez, Michael L. Boninger, Jennifer L. Collinger and Robert A. Gaunt from the University of Pittsburgh; Brianna C. Hutchison, Robert F. Kirsch, Jonathan P. Miller, Abidemi B. Ajiboye and Emily L. Graczyk from Case Western Reserve University; Lee E. Miller from Northwestern University; and Ray C. Lee from Schwab Rehabilitation Hospital.
“Tactile edges and motion via patterned microstimulation of the human somatosensory cortex” was published in Science in January 2025. Authors include Giacomo Valle, now at Chalmers University in Sweden; Ali H. Alamri, John E. Downey, Patrick M. Jordan, Anton R. Sobinov, Linnea J. Endsley, Dillan Prasad, Peter C. Warnke, Nicholas G. Hatsopoulos, Charles M. Greenspon and Sliman J. Bensmaia from the University of Chicago; Robin Lienkämper, Michael L. Boninger, Jennifer L. Collinger and Robert A. Gaunt from the University of Pittsburgh; and Lee E. Miller from Northwestern University.
Journal
Science
Method of Research
Experimental study
Subject of Research
People
Article Title
Tactile edges and motion via patterned microstimulation of the human somatosensory cortex
Article Publication Date
16-Jan-2025
Most advanced artificial touch for brain-controlled bionic hand
Image: An illustration showing a paralyzed individual with a spinal cord injury, implanted with intracortical electrodes in the brain. This brain-computer interface (BCI) allows the individual to control a bionic limb that is not attached to the body, directly with thoughts, to reach and grasp a coffee mug. Due to embedded sensors, the bionic hand senses the grasped object as if it were being grasped with the human hand, communicating the touch sensations to the user’s brain via advanced neurostimulation.
Credit: Chalmers University of Technology | Boid | David Ljungberg
For the first time ever, a complex sense of touch for individuals living with spinal cord injuries is a step closer to reality. A new study published in Science paves the way for complex touch sensation through brain stimulation whilst using an extracorporeal bionic limb attached to a chair or wheelchair.
The researchers, who are all part of the US-based Cortical Bionics Research Group, have discovered a unique method for encoding natural touch sensations of the hand via specific microstimulation patterns delivered through electrodes implanted in the brain. This allows individuals with spinal cord injuries not only to control a bionic arm with their brain, but also to feel tactile edges, shapes, curvature and motion, sensations that until now have not been possible.
“In this work, for the first time the research went beyond anything that has been done before in the field of brain-computer interfaces (BCI) – we conveyed tactile sensations related to orientation, curvature, motion and 3D shapes for a participant using a brain-controlled bionic limb. We are in another level of artificial touch now. We think this richness is crucial for achieving the level of dexterity, manipulation, and a highly dimensional tactile experience typical of the human hand,” says Giacomo Valle, lead author of the study and Assistant Professor at Chalmers University of Technology, in Sweden.
The importance of the sense of touch
A sense of touch brings richness and independence to our everyday lives. For individuals living with a spinal cord injury, the electrical signals travelling from the hand to the brain that should allow them to feel tactile sensations are blocked by the injury, and that sense of touch is lost. A bionic limb controlled by the user’s brain signals can bring back some functionality and independence to someone with a paralysed hand, but without the sense of touch it is very difficult to lift, hold and manipulate objects. Previously, a bionic hand would not be perceived by the user as part of the body, since it would not provide any sensory feedback the way a biological hand does. This study aimed to improve the usability of an extracorporeal bionic limb, which would be mounted on a wheelchair or similar equipment close to the user.
Implantable technology for controlling bionic limbs with the brain
For the study, two BCI participants were fitted with chronic brain implants in the sensory and motor regions of the brain that represent the arm and hand. Over the course of several years, the researchers recorded and decoded the different patterns of electrical activity in the brain related to motor intentions for the arm and hand. This was possible because the electrical activity was still present in the brain; the paralysis was only blocking it from reaching the hand. This approach to decoding and deciphering brain signals is unique and allows the participants to directly control a bionic arm and hand with the brain to interact with the environment.
Complex touch typed into the brain
The participants were able to complete a series of complex experiments that required rich tactile sensations. To do this, the researchers “typed” specific stimulation patterns directly into the participants’ brains via the implants.
“We found a way to type these ‘tactile messages’ via microstimulation using the tiny electrodes in the brain and we found a unique way to encode complex sensations. This allowed for more vivid sensory feedback and experience while using a bionic hand,” says Valle.
The participants could feel the edge of an object, as well as the direction of motion along the fingertips.
By utilising the brain-computer interface, the researchers could decode the intention to move from the participant’s brain in order to control a bionic arm. Since the bionic arm has sensors on it, when an object comes into contact with those sensors, stimulation is sent to the brain and the participant feels the sensation as if it were in their own hand. This means that participants could potentially complete complex tasks with a bionic arm more accurately than was previously possible, such as picking up an object and moving it from one location to another.
The future of complex touch for neural prosthetics
This research is just the first step towards patients with spinal cord injuries being able to feel this level of complex touch. To capture all the features of complex touch that the researchers are now able to encode and convey to the user, more sophisticated sensors and robotic technology (for example, prosthetic skin) are needed. The implantable stimulation technology would also require further development to increase the repertoire of sensations.
More about the research:
The study, “Tactile edges and motion via patterned microstimulation of the human somatosensory cortex”, published in Science, was led by Giacomo Valle, now Assistant Professor at the Department of Electrical Engineering at Chalmers University of Technology in Sweden, who worked in the Bensmaia Lab at the University of Chicago, USA, at the time of the study.
More about Cortical Bionics Research Group:
This area of research is already of major business interest in the USA, with multiple research institutions and private companies now starting to commercialise implantable neurotechnology. Part of the system used for this study is now being developed by an American neurotech company. There is less happening in Europe, however, where a different regulatory landscape could affect the translation of emerging neurotechnologies. This study brings Chalmers into the Cortical Bionics Research Group and aims to establish a European hub for this area of neurotech research.
The Cortical Bionics Research Group is made up of three North American universities – the University of Pittsburgh, the University of Chicago and Northwestern University. The mission of the Cortical Bionics Research Group is to build next-generation intracortical Brain-Computer Interfaces that enable dexterous control of bionic hands by people with paralysis or amputation.
For more information, please contact:
Giacomo Valle, Assistant Professor at the Department of Electrical Engineering, Chalmers University of Technology, Sweden, +46 70 83 88 515 valleg@chalmers.se
Charles Greenspon, Research Assistant Professor at the Department of Organismal Biology & Anatomy, University of Chicago, USA, +1 312 88 94 029 cmgreenspon@uchicago.edu
The contact persons speak English. They are available for live and pre-recorded interviews. At Chalmers, we have podcast studios and broadcast filming equipment on site and would be able to assist a request for a television, radio or podcast interview. The BCI participants in America are available for interview through Giacomo Valle.
More about the scientific article:
The article, Tactile edges and motion via patterned microstimulation of the human somatosensory cortex, was published in Science.
The researchers involved in the study are Giacomo Valle, Ali H. Alamri, John E. Downey, Robin Lienkämper, Patrick M. Jordan, Anton R. Sobinov, Linnea J. Endsley, Dillan Prasad, Michael L. Boninger, Jennifer L. Collinger, Peter C. Warnke, Nicholas G. Hatsopoulos, Lee E. Miller, Robert A. Gaunt, Charles M. Greenspon, Sliman J. Bensmaia.
At the time of the study, the researchers were active at University of Chicago, USA; Chalmers University of Technology, Sweden; University of Pittsburgh, USA; and Northwestern University, USA.
The research was funded by the National Institute of Neurological Disorders and Stroke of the National Institutes of Health under Award Number UH3NS107714 to R.A.G., by R35 NS122333 to S.J.B. and by the University of Chicago (Chicago Postdoctoral Fellowship) to G.V.
You can read a recent Nature article that has been written about the study here.
- The study won a prestigious worldwide award for innovative BCI applications from gTec.
- The company developing part of the system that was used in the study and the researchers' industrial partner can be found here.
- BCI participants association involved can be found here.
- BCI network of worldwide experts in which the researchers collaborate can be found here.
- Patent submitted on the presented neurotech: “GV is an inventor on a pending international (PCT) patent application submitted by The University of Chicago and The University of Pittsburgh that covers multi-channel microstimulation of the somatosensory cortex (no. 23-0518-WO)”
Journal
Science
Method of Research
Experimental study
Subject of Research
People
Article Title
Tactile edges and motion via patterned microstimulation of the human somatosensory cortex
Article Publication Date
17-Jan-2025
COI Statement
N.G.H. and R.A.G. serve as consultants for Blackrock Neurotech, Inc. R.A.G. is also on the scientific advisory board of Neurowired, LLC. M.L.B., J.L.C., and R.A.G. have received research funding from Blackrock Neurotech, Inc., though that funding did not support the work presented here. P.W. served as a consultant for Medtronic. A.R.S. served as a consultant for Google DeepMind. S.J.B., C.M.G., G.V., R.A.G., and R.L. are inventors on a pending international (PCT) patent application submitted by The University of Chicago and The University of Pittsburgh that covers multichannel microstimulation of the somatosensory cortex (no. 23-0518-WO).