Monday, November 17, 2025

 

Wearable lets users control machines and robots while on the move




University of California - San Diego
Image: Wearable technology uses everyday gestures to reliably control robotic devices even under excessive motion noise, such as when the user is running, riding in a vehicle or in environments with turbulence.

Credit: David Baillot/UC San Diego Jacobs School of Engineering






Engineers at the University of California San Diego have developed a next-generation wearable system that enables people to control machines using everyday gestures — even while running, riding in a car or floating on turbulent ocean waves.

The system, published on Nov. 17 in Nature Sensors, combines stretchable electronics with artificial intelligence to overcome a long-standing challenge in wearable technology: reliable recognition of gesture signals in real-world environments.

Wearable technologies with gesture sensors perform well when a user is sitting still, but their signals fall apart under excessive motion noise, explained study co-first author Xiangjun Chen, a postdoctoral researcher in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering. This limits their practicality in daily life. “Our system overcomes this limitation,” Chen said. “By integrating AI to clean noisy sensor data in real time, the technology enables everyday gestures to reliably control machines even in highly dynamic environments.”

The technology could enable patients in rehabilitation or individuals with limited mobility, for example, to use natural gestures to control robotic aids without relying on fine motor skills. Industrial workers and first responders could potentially use the technology for hands-free control of tools and robots in high-motion or hazardous environments. It could even enable divers and remote operators to command underwater robots despite turbulent conditions. In consumer devices, the system could make gesture-based controls more reliable in everyday settings.

The work was a collaboration between the labs of Sheng Xu and Joseph Wang, both professors in the Aiiso Yufeng Li Family Department of Chemical and Nano Engineering at the UC San Diego Jacobs School of Engineering.

To the researchers’ knowledge, this is the first wearable human-machine interface that works reliably across a wide range of motion disturbances. As a result, it can work with the way people actually move.

The device is a soft electronic patch glued onto a cloth armband. It integrates motion and muscle sensors, a Bluetooth microcontroller and a stretchable battery into a compact, multilayered system. The system was trained on a composite dataset of real gestures and conditions, from running and shaking to the movement of ocean waves. Signals from the arm are captured and processed by a customized deep-learning framework that strips away interference, interprets the gesture, and transmits a command in real time to control a machine such as a robotic arm.
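The release does not include code, but the pipeline it describes (denoise the raw signals, classify the gesture, transmit a command) can be sketched. Below is a minimal, hypothetical PyTorch sketch; the architecture, window length, channel count and gesture vocabulary are illustrative assumptions, not the published model.

```python
# Hypothetical sketch of the denoise-then-classify pipeline described
# above. Layer sizes, window length and gesture set are assumptions;
# this is not the published system's actual model.
import torch
import torch.nn as nn

WINDOW = 256      # samples per inference window (assumed)
CHANNELS = 6      # e.g., motion plus muscle-signal channels (assumed)
GESTURES = ["fist", "open", "left", "right", "up", "down"]  # illustrative

class Denoiser(nn.Module):
    """1-D convolutional stage that strips motion interference."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(32, CHANNELS, kernel_size=7, padding=3),
        )
    def forward(self, x):
        return self.net(x)

class Classifier(nn.Module):
    """Maps a cleaned window to one of the gesture classes."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(CHANNELS, 32, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, len(GESTURES)),
        )
    def forward(self, x):
        return self.net(x)

def infer(window: torch.Tensor, denoiser, classifier) -> str:
    """One real-time step: clean the raw window, then decode the gesture."""
    with torch.no_grad():
        clean = denoiser(window)
        logits = classifier(clean)
        return GESTURES[int(logits.argmax(dim=1))]

if __name__ == "__main__":
    den, clf = Denoiser(), Classifier()
    raw = torch.randn(1, CHANNELS, WINDOW)  # stand-in for sensor data
    print("decoded gesture:", infer(raw, den, clf))
```

The sketch mirrors the division of labor the release describes: a denoising stage that absorbs the motion interference, so the gesture classifier downstream sees a cleaner signal even when the wearer is running or being shaken.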

“This advancement brings us closer to intuitive and robust human-machine interfaces that can be deployed in daily life,” Chen said.

The system was tested in multiple dynamic conditions. Subjects used the device to control a robotic arm while running, while exposed to high-frequency vibrations, and under a combination of disturbances. The device was also validated under simulated ocean conditions using the Scripps Ocean-Atmosphere Research Simulator at UC San Diego’s Scripps Institution of Oceanography, which recreated both lab-generated and real sea motion. In all cases, the system delivered accurate, low-latency performance.

The project was originally inspired by the idea of helping military divers control underwater robots. But the team soon realized that motion interference isn’t a problem unique to underwater environments. It is a common challenge across the field of wearable technology, one that has long limited the performance of such systems in everyday life.

“This work establishes a new method for noise tolerance in wearable sensors,” Chen said. “It paves the way for next-generation wearable systems that are not only stretchable and wireless, but also capable of learning from complex environments and individual users.”

Full study: “A noise-tolerant human-machine interface based on deep learning-enhanced wearable sensors.” Co-first authors on the study are UC San Diego researchers Xiangjun Chen, Zhiyuan Lou, Xiaoxiang Gao and Lu Yin.

This work was supported by the Defense Advanced Research Projects Agency (DARPA, contract number HR001120C0093).

Image: The wearable system glued onto a cloth armband.

Credit: David Baillot/UC San Diego Jacobs School of Engineering

Using robotic testing to spot overlooked sensory deficits in stroke survivors



University of Delaware study could pave the way for more precise rehabilitation



University of Delaware

Image: Joanna Hoh, a biomechanics and movement science (BIOMS) doctoral student at the University of Delaware, tests Don Lewis’ sensory loss in his arm post-stroke using a KINARM robotic exoskeleton.

Credit: Ashley Barnas Larrimore/University of Delaware





A decade ago, at age 55, Don Lewis suffered a stroke in his sleep. When he woke up, he couldn’t move his left arm or leg. Lewis’ neighbor realized his truck hadn’t moved in two days and called 911 for a welfare check. When paramedics found him, he was paralyzed on one side.

“At the hospital, they told me an aneurysm caused my stroke,” he said.

He would remain there for two months, and after extensive physical therapy, Lewis regained use of his left leg. His left arm remains paralyzed.

“I feel pain when I hit it or scrape it walking through a doorway, but I can’t control the motion.”

Since then, Lewis, who is also a cancer survivor, has had two more strokes.

Now, Lewis is helping University of Delaware researchers understand one of the most overlooked challenges in stroke recovery: proprioception, the body’s ability to sense movement and position.

“To simplify the concept, in class, I tell my undergraduates to close their eyes and touch their nose; if people can’t do that, it means they likely have impaired proprioception,” said Jennifer Semrau, associate professor of kinesiology and applied physiology, in the College of Health Sciences.

In findings recently published in Neurorehabilitation and Neural Repair, Semrau and doctoral candidate Joanna Hoh suggest it’s possible to identify hidden sensory losses after stroke without requiring patients to move their affected arm. This advance could make assessments more accessible in clinical settings.

Assessing movement

Inside the lab, Lewis is placed in a KINARM robotic exoskeleton that tracks upper limb movement, allowing Semrau to better understand the neural and behavioral mechanisms that contribute to his recovery of sensory and motor function. 

Semrau’s lab used several tests, including a new one, the single-arm measurement, to gauge perception-based movement. In the test, the robot moves Lewis’ stroke-affected arm while he indicates with his non-affected arm whether he can feel the movement.

“We’re trying to determine the lowest level someone can detect their arm moving,” Semrau said.

A person who hasn’t had a stroke can typically detect arm movement as small as half a centimeter. For people post-stroke, the threshold varies.

“Some can’t tell their arm was moved 10 centimeters, and that could be the difference between touching a hot stove or a knife in the kitchen,” Semrau said.
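The release doesn’t say how the detection threshold itself is computed. In psychophysics, thresholds like these are commonly estimated with an adaptive staircase, which the hypothetical Python sketch below illustrates; the starting displacement, step size and simulated participant are assumptions for demonstration, not the study’s KINARM protocol.

```python
# Hypothetical sketch of a 1-up/1-down adaptive staircase for estimating
# the smallest arm displacement a participant can detect. Illustrates the
# general idea only; not the KINARM protocol used in the study.
import random

def staircase(detects, start_cm=10.0, step_cm=0.25, floor_cm=0.1,
              reversals_needed=8):
    """Shrink the displacement after each detection, grow it after each
    miss, and average the reversal points as the threshold estimate."""
    disp, going_down = start_cm, True
    reversals = []
    while len(reversals) < reversals_needed:
        seen = detects(disp)       # move the arm by `disp` cm; felt it?
        if seen != going_down:     # direction flipped: record a reversal
            reversals.append(disp)
            going_down = seen
        disp = max(floor_cm, disp - step_cm if seen else disp + step_cm)
    return sum(reversals) / len(reversals)

# Simulated participant with a true 0.5 cm threshold (illustrative),
# plus a small false-positive rate.
truth = 0.5
est = staircase(lambda d: d >= truth or random.random() < 0.05)
print(f"estimated detection threshold: {est:.2f} cm")
```

A 1-up/1-down staircase like this converges near the displacement a participant detects about half the time, which matches the “lowest level someone can detect” framing above.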

After a stroke, communication between the brain and the receptors in the muscles that are responsible for detecting movement is disrupted.

“When you move, the receptors lengthen or shorten, and if the information isn’t getting from the brain to those muscle receptors, you can’t properly coordinate movement,” Semrau said. 

However, someone with a proprioceptive deficit could still feel pain and may not have a touch impairment.

“Pain is part of the somatosensory system and is relayed on a different set of nerves. After a stroke, some may have increased or decreased sensitivity to pain, and it’s the same with touch,” Semrau said. “Every person is a fingerprint; impairments each person has after a stroke are unique and require individualized treatment.”

The difficulty Semrau faces: it’s challenging to tease apart sensory deficits from motor deficits because they’re deeply intertwined. 

“It’s hard to determine whether the issue is the person’s ability to feel the arm or their ability to move,” she said. “The tasks we’re studying in our lab get to the heart of the matter.”

From the clinic to the classroom

Hoh, an occupational therapist, became interested in upper-limb stroke research after working with patients in rehabilitation. 

“We often think about movement through motor function,” said Hoh. “I had a blind spot to the sensory system in terms of stroke recovery and realized this is an avenue we don’t consider enough as clinicians.”

That inspired her to pursue her doctorate in biomechanics and movement science at UD. Her dissertation focuses on individuals with sensory issues following a stroke and how these issues affect their daily activity levels.

Semrau hopes their ongoing research will raise awareness of the problem and encourage more clinicians to integrate this kind of precision testing. 

“In one of our studies, we found that just 1% of clinicians assess proprioception in people with stroke,” Semrau said. “It’s a newer area, but research also shows that without sensory recovery, a person will not gain full recovery of function after a stroke.” 

To develop a personalized medicine approach to treatment, both Semrau and Hoh emphasized the need for a better understanding of post-stroke impairments.

“The onus is on clinicians and researchers to ensure they’re testing for sensory deficits. Just because someone is impaired motorically, it doesn’t mean they will or won’t be impaired sensory-wise,” Hoh said. 

Semrau added, “Understanding the connection between motor and sensory impairments that affect function is key to better targeting therapies and tailoring recovery for each individual.” 
