A leg up on better running data
Machine learning, wearable sensors could help reduce injury, improve form
Image: Study co-author Andrew Chin runs with wearable sensors on the Harvard McCurdy Outdoor Track.
Credit: Eliza Grinnell / Harvard SEAS
Key Takeaways
- Wearable inertial measurement units combined with machine learning can measure a runner’s braking and propulsion forces.
- A generalized machine learning model trained on lab data can accurately predict real-world running forces.
- The research opens the door to future wearable products that help runners avoid injury and improve their form.
Today’s GPS smartwatches and other wearable devices give millions of runners reams of data about their pace, location, heart rate and more. But one thing your Garmin can’t measure is plain old physics: How much force is being generated when your foot hits the ground and takes off again.
These backward-forward, braking and propulsion forces a runner generates with each stride are closely associated with performance and injury. Biomechanics experts in the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) think wearable sensor technology is poised to help runners better understand these forces and ultimately stay healthier.
A recent study in PLOS One from the lab of Conor Walsh, the Paul A. Maeder Professor of Engineering and Applied Sciences at SEAS, shows that simple, commercially available sensors worn on the body can provide useful data on what researchers call ground-reaction forces. These insights could open avenues to devices and products that deliver this data to users in real time.
“Wearable sensors, combined with machine learning, can accurately estimate the forces acting on a runner’s body – not just in the lab, but out in the real world,” said lead author Lauren Baker, a recent Ph.D. graduate from Walsh’s lab (and a runner herself).
Overstriding and braking
The study built on foundational work by co-author Daniel Lieberman, the Edwin M. Lerner Professor of Biological Sciences in the Faculty of Arts and Sciences at Harvard, who studies running in the context of human evolution and is a longtime collaborator with the Walsh lab. Lieberman’s team had highlighted the relationship between overstriding – when a runner’s foot lands far ahead of their hips and knees – and larger braking forces. The two teams also collaborated on a 2024 study that showed that wearable sensors could capture braking and overstriding data, and they wondered: What other relevant forces could they capture?
Baker chose to focus on the horizontal forces of braking and propulsion, not only because they relate to overstriding, but because most ground-reaction force studies focus on the vertical direction – most closely associated with body weight and speed.
Capturing and interpreting data
For the study, Baker and colleagues took ground-reaction force data from 15 volunteer runners in the Harvard Motion Capture Lab, which has a motion capture camera system, a treadmill with force-sensitive plates below the belts, and a mini track with force-sensitive plates embedded in the floor. Each runner also wore a set of sensors called inertial measurement units, which are widely used in phones, watches and gaming applications. These units capture motion and orientation data but cannot directly measure force. That’s where machine learning comes in.
The raw force data were fed into a machine learning model, a version of which the lab had used in other studies to estimate walking propulsion forces in people recovering from stroke. The new running-optimized model interpreted the relationship between the wearable sensor data and the force data collected in the lab.
The results showed that a generalized model trained on lab data could accurately predict overstriding-related running forces for new runners who weren't in the original training data. Adding a small amount of user data (about eight steps) fine-tuned the predictions, personalizing them to the individual.
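The paper's actual model and features aren't reproduced here. As a rough sketch of the two-stage idea described above (a generalized model pooled across many runners, then a few-step personalization for a new runner), the following uses synthetic data and ridge regression in place of the lab's real pipeline; all names and numbers below are illustrative assumptions, not the study's:

```python
import numpy as np

rng = np.random.default_rng(0)

def make_runner(n_steps, gain, noise=0.05):
    """Synthetic stand-in for one runner: IMU features -> braking force.
    Runners share a common mapping but differ by an individual gain."""
    X = rng.normal(size=(n_steps, 3))           # e.g. hip/shank motion features
    w_true = np.array([1.0, -0.5, 0.3]) * gain  # per-runner variation
    y = X @ w_true + rng.normal(scale=noise, size=n_steps)
    return X, y

def ridge_fit(X, y, lam=1e-3):
    """Closed-form ridge regression."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

# 1) Generalized model: pool lab data from 15 runners.
Xs, ys = zip(*(make_runner(200, gain=rng.uniform(0.8, 1.2)) for _ in range(15)))
w_general = ridge_fit(np.vstack(Xs), np.concatenate(ys))

# 2) A new runner who was not in the training data.
X_new, y_new = make_runner(100, gain=1.3)
err_general = np.mean((X_new @ w_general - y_new) ** 2)

# 3) Personalize with ~8 steps of the new runner's own data:
#    fit only the residual, shrinking toward the generalized weights.
X_few, y_few = X_new[:8], y_new[:8]
w_personal = w_general + ridge_fit(X_few, y_few - X_few @ w_general, lam=1.0)
err_personal = np.mean((X_new @ w_personal - y_new) ** 2)

print(err_general, err_personal)  # personalization should lower the error
```

The residual-fit step mirrors the paper's finding that a handful of user-specific steps is enough to improve a pooled model, without retraining from scratch.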
To bring real-world applicability into the study, Baker then tested the lab-trained model by outfitting a subset of five runners with the same small, commercially available inertial measurement sensors worn on the hip and lower leg. These runners wore the sensors as they ran on the Harvard outdoor track. The model estimated braking and propulsion forces from the outdoor runners based on the sensor-captured motion data alone.
The study tackled the challenge of predicting braking and propulsion during real-world, outdoor running, where direct measurement of forces is impractical or impossible.
“It struck me that a lot of biomechanics research takes place on a treadmill,” Baker said. “And a lot of running does not.”
Lieberman added: “Since we can learn so much by studying people on treadmills, this research will help us test hypotheses about running biomechanics in the real world, especially to help prevent injuries.”
Future directions
Walsh’s lab continues to explore innovative uses of machine learning and wearable sensor technology for human locomotion and health.
“More data to understand performance for recreational or elite athletes is a trend that is only increasing,” Walsh said. “We recognize there is a big gap between what existing commercial wearables measure, and what we would really like to measure – to really understand running form or athletic performance … I think we’re looking at extending wearables to not just measure steps, speed or heart rate, but to really give a more detailed understanding of how the body is moving, whether that’s walking or running.”
Future directions for the research could include determining the optimal number and placement of sensors on the body and integrating the sensors with existing smartwatch capabilities to give runners real-time feedback on their form.
“Wearable sensors could give runners access to running metrics historically limited to lab collection only,” Baker said.
The paper was co-authored by Fabian C. Weigend, Krithika Swaminathan, Daekyum Kim and Andrew Chin.
Study lead author Lauren Baker attaches wearable sensors to a participant in the lab.
Credit: Eliza Grinnell / Harvard SEAS
Study co-author Andrew Chin runs on the lab treadmill.
Credit: Eliza Grinnell / Harvard SEAS
Journal
PLOS One
Method of Research
Experimental study
Subject of Research
People
Article Title
Estimating braking and propulsion forces during overground running in and out of the lab
FAU Engineering researchers make great ‘strides’ in gait analysis technology
Florida Atlantic University
Image: Microsoft’s Azure Kinect depth camera captures 3D data, color images, and body movements for motion tracking.
Credit: Florida Atlantic University
A study from the College of Engineering and Computer Science and the Sensing Institute (I-SENSE) at Florida Atlantic University reveals that foot-mounted wearable sensors and a 3D depth camera can accurately measure how people walk – even in busy clinical environments – offering a powerful and more accessible alternative to traditional gait assessment tools.
Gait, the pattern of how a person walks, is an increasingly important marker of overall health, used in detecting fall risk, monitoring rehabilitation, and identifying early signs of neurodegenerative diseases such as Parkinson’s disease and Alzheimer’s disease. Although electronic walkways like the Zeno™ Walkway have long been considered the gold standard for gait analysis, their high cost, large footprint and limited portability restrict widespread use – especially outside controlled lab settings.
To overcome these barriers, FAU researchers and collaborators conducted the first known study to simultaneously evaluate three different sensing technologies – APDM wearable inertial measurement units (IMUs), Microsoft’s Azure Kinect depth camera, and the Zeno™ Walkway – under identical, real-world clinical conditions. The depth-sensing camera captures 3D data, color images, and body movements for use in AI, robotics and motion tracking.
The study findings, published in the journal Sensors, reveal that foot-mounted IMUs and the Azure Kinect not only match the accuracy of traditional tools but also enable scalable, remote and cost-effective gait analysis.
“This is the first time these three technologies have been directly compared side by side in the same clinical setting,” said Behnaz Ghoraani, Ph.D., senior author and an associate professor in the FAU Department of Electrical Engineering and Computer Science and the Department of Biomedical Engineering, and an I-SENSE fellow. “We wanted to answer a question the field has been asking for a long time: Can more accessible tools like wearables and markerless cameras reliably match the clinical standard for detailed gait analysis? The answer is yes – especially when it comes to foot-mounted sensors and the Azure Kinect.”
The study recruited 20 adults aged 52 to 82, who completed both single-task and dual-task walking trials – a method often used to mimic real-world walking conditions that require multitasking or divided attention. Each participant’s gait was captured by the three systems at the same time, thanks to a custom-built hardware platform the FAU researchers developed, which precisely synchronized all data sources to the millisecond.
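The release doesn't describe the custom synchronization platform in detail. One common software-side step in this kind of multi-device setup, sketched here with synthetic streams and an assumed fixed clock offset (both hypothetical), is to correct known offsets and interpolate the slower stream onto the faster stream's timestamps so samples can be compared instant by instant:

```python
import numpy as np

# Two hypothetical streams: an IMU at 128 Hz and a depth camera at 30 Hz,
# each stamped by a shared hardware clock (timestamps in seconds).
t_imu = np.arange(0, 5, 1 / 128)
imu = np.sin(2 * np.pi * 1.0 * t_imu)          # stand-in signal

t_cam = np.arange(0, 5, 1 / 30) + 0.012        # assumed fixed 12 ms offset
cam = np.sin(2 * np.pi * 1.0 * (t_cam - 0.012))

# Align: remove the known offset, then interpolate the slower stream
# onto the faster stream's clock.
cam_on_imu_clock = np.interp(t_imu, t_cam - 0.012, cam)

# Compare only where the camera stream actually has data
# (np.interp holds endpoint values outside the sampled range).
mask = t_imu <= (t_cam - 0.012).max()
residual = np.max(np.abs(cam_on_imu_clock[mask] - imu[mask]))
print(residual)  # only small interpolation error remains
```

In practice a hardware trigger (as the FAU team built) is what makes millisecond-level offsets knowable in the first place; the interpolation above only handles resampling once the clocks agree.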
Researchers evaluated 11 different gait markers, including basic metrics like walking speed and step frequency, as well as more detailed indicators such as stride time, support phases and swing time. These markers were analyzed using statistical methods to compare each device’s measurements with those from the Zeno™ Walkway.
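The release doesn't name the specific statistical methods used. A standard choice in device-agreement studies (not necessarily the authors') is the intraclass correlation coefficient; the sketch below implements ICC(2,1) (two-way random effects, absolute agreement) and applies it to synthetic walking-speed data, where a low-noise "foot IMU" and a noisier "lumbar sensor" are illustrative stand-ins:

```python
import numpy as np

def icc_2_1(data):
    """ICC(2,1): two-way random effects, absolute agreement, single measure.
    data: (n_subjects, k_devices) matrix for one gait marker."""
    n, k = data.shape
    grand = data.mean()
    row_means = data.mean(axis=1)   # per-subject means
    col_means = data.mean(axis=0)   # per-device means
    ss_total = ((data - grand) ** 2).sum()
    ss_rows = k * ((row_means - grand) ** 2).sum()
    ss_cols = n * ((col_means - grand) ** 2).sum()
    ss_err = ss_total - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

rng = np.random.default_rng(1)
true_speed = rng.normal(1.2, 0.15, size=20)            # 20 participants, m/s
walkway = true_speed + rng.normal(0, 0.01, size=20)    # reference device
foot_imu = true_speed + rng.normal(0, 0.02, size=20)   # close agreement
lumbar = true_speed + rng.normal(0, 0.10, size=20)     # noisier placement

icc_foot = icc_2_1(np.column_stack([walkway, foot_imu]))
icc_lumbar = icc_2_1(np.column_stack([walkway, lumbar]))
print(icc_foot, icc_lumbar)  # foot-mounted sketch agrees far more closely
```

The qualitative pattern (near-perfect agreement for the foot-mounted stream, weaker agreement for the lumbar one) mirrors the study's reported result, but the numbers here are synthetic.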
The results were clear: foot-mounted sensors showed near-perfect agreement with the walkway across nearly all gait markers. The Azure Kinect also performed impressively, maintaining strong accuracy even in the complex, real-world clinic setting where multiple people, including caregivers and staff, were present in the camera’s field of view. In contrast, lumbar-mounted sensors, which are commonly used in wearable gait studies, demonstrated significantly lower accuracy and consistency, particularly for fine-grained gait cycle events.
Many studies use lower-back sensors because they are easy to mount. However, data from this study shows that they often fail to capture the details clinicians care most about – especially timing-based markers that can reveal early signs of neurological problems.
“By testing these tools in a realistic clinical environment with all the unpredictable visual noise that comes with it, we’ve made great strides toward validating them for everyday use,” said Ghoraani. “This isn’t just a lab experiment. These technologies are ready to meet real-world demands.”
Importantly, the study is the first to benchmark the Azure Kinect against an electronic walkway for micro-temporal gait markers – filling a critical gap in the literature and confirming the device’s potential clinical value.
“The implications of this research are far-reaching,” said Stella Batalama, Ph.D., dean of the FAU College of Engineering and Computer Science. “As health care systems increasingly embrace telehealth and remote monitoring, scalable technologies like wearable foot sensors and depth cameras are emerging as powerful tools. They enable clinicians to track mobility, detect early signs of functional decline, and tailor interventions – without the need for costly, space-intensive equipment.”
Study co-authors are first author Marjan Nassajpour and Mahmoud Seifallahi, both doctoral students in the FAU College of Engineering and Computer Science; Amie Rosenfeld, a physical therapist researcher and assistant director of education; Magdalena I. Tolea, Ph.D., research assistant professor of neurology and associate director of research; and James E. Galvin, M.D., professor of neurology, chief of the Division of Neurology, and director of the Comprehensive Center for Brain Health, the latter three with the University of Miami Miller School of Medicine.
This work was supported by the National Science Foundation and the National Institutes of Health.
Caption: A closeup of Microsoft’s Azure Kinect depth camera, which captures 3D data, color images, and body movements for motion tracking.
Credit: Florida Atlantic University
(From left): Marjan Nassajpour; Behnaz Ghoraani, Ph.D.; Mahmoud Seifallahi, Ph.D. (seated); and Mustafa Shuqair, Ph.D.
Credit: Alex Dolce, Florida Atlantic University
- FAU -
About FAU’s Sensing Institute (I-SENSE)
Florida Atlantic University’s Sensing Institute (I-SENSE) is a university-wide research institute advancing innovation in sensing, smart systems, and real-time situational awareness technologies. As the hub for FAU’s strategic research emphasis in Sensing and Smart Systems, I-SENSE integrates cutting-edge research in sensing, computing, AI/ML, and wireless communication across disciplines and domains. With a mission to catalyze research excellence and deliver high-impact technological solutions, I-SENSE drives interdisciplinary collaboration across academia, industry, and government. From infrastructure systems and weather forecasting to health, behavior, and connected autonomy, I-SENSE-enabled technologies support improved decision-making, automated control, and fine-grained situational awareness at scale. Learn more at isense.fau.edu.
About FAU’s College of Engineering and Computer Science:
The FAU College of Engineering and Computer Science is internationally recognized for cutting-edge research and education in the areas of computer science and artificial intelligence (AI), computer engineering, electrical engineering, biomedical engineering, civil, environmental and geomatics engineering, mechanical engineering, and ocean engineering. Research conducted by the faculty and their teams exposes students to technology innovations that push the current state of the art in these disciplines. The College’s research efforts are supported by the National Science Foundation (NSF), the National Institutes of Health (NIH), the Department of Defense (DOD), the Department of Transportation (DOT), the Department of Education (DOEd), the State of Florida, and industry. The FAU College of Engineering and Computer Science offers degrees with a modern twist that bear specializations in areas of national priority such as AI, cybersecurity, internet-of-things, transportation and supply chain management, and data science. New degree programs include a Master of Science in AI (first in Florida), a Master of Science and a Bachelor’s in Data Science and Analytics, and the new Professional Master of Science and Ph.D. in computer science for working professionals. For more information about the College, please visit eng.fau.edu.
About Florida Atlantic University:
Florida Atlantic University serves more than 32,000 undergraduate and graduate students across six campuses along Florida’s Southeast coast. Recognized as one of only 21 institutions nationwide with dual designations from the Carnegie Classification - “R1: Very High Research Spending and Doctorate Production” and “Opportunity College and University” - FAU stands at the intersection of academic excellence and social mobility. Ranked among the Top 100 Public Universities by U.S. News & World Report, FAU is also nationally recognized as a Top 25 Best-In-Class College and cited by Washington Monthly as “one of the country’s most effective engines of upward mobility.” As a university of first choice for students across Florida and the nation, FAU welcomed its most academically competitive incoming class in university history in Fall 2025. To learn more, visit www.fau.edu.
Journal
Sensors
Method of Research
Observational study
Subject of Research
People
Article Title
Comparison of Wearable and Depth-Sensing Technologies with Electronic Walkway for Comprehensive Gait Analysis