AI Supports Home-based Balance Training

University of Michigan

New machine learning model draws data from wearable sensors to predict how a physical therapist would assess balance training performance

A woman stands on a squishy, plastic dome while wearing black velcro straps around her head, upper arms, wrists, hips, upper thighs, below the knees and feet. Three women surround her with arms extended to catch her if she falls.
Geeta Peethambaran stands on a balance trainer, testing steadiness on a difficult exercise. Emma Nigrelli (left), Wendy Carender (center) and Nadine El-Ghaffir (right) stand close to help if she loses her balance. Image credit: Marcin Szczepanski, Michigan Engineering

Study: Automatic multi-IMU-based deep learning evaluation of intensity during static standing balance training exercises (DOI: 10.1186/s12984-025-01760-3)

Balance training patients may soon be able to get AI feedback during home exercises, with four wearable sensors and a new machine learning model developed at the University of Michigan.

The team hopes their technology could help patients make faster progress during physical therapy and maintain their abilities after the end of their prescribed sessions. It could also give physical therapists additional information to support their care decisions.

Kathleen Sienko

"Our machine learning model used data from wearable sensors to predict how physical therapists would rate patients' performance on balance exercises, providing a basis to make recommendations about the most appropriate set of exercises to perform next. This type of AI-based support would be helpful in between appointments or after people complete their insurance-reimbursed sessions with a clinician," said Kathleen Sienko, an Arthur F. Thurnau Professor of mechanical engineering at U-M and senior author of the study in the Journal of NeuroEngineering and Rehabilitation.

A close up of a smiling woman wearing glasses with a solid plastic bar across the forehead and a camera in the center of the bar. Gray, half rectangles with no lens surround the eyes where the lens would be in eyeglasses.
Wendy Carender, a physical therapist in the Michigan Medicine Department of Otolaryngology, wears eye tracking glasses. A preliminary study used eye tracking technology to measure where physical therapists focus attention during balance testing, helping researchers understand their assessment rationale and where to place sensors. Image credit: Marcin Szczepanski, Michigan Engineering

The model was built from sensor data combined with analysis from physical therapists, and its development was funded by the National Science Foundation and U-M AI & Digital Health Innovation.

Balance training helps reduce the risk of falls, allowing older adults and people with sensory and motor impairments to live independently for longer. Typically, physical therapists assess how much difficulty patients experience while balancing by observing them during in-clinic sessions. The next exercise they suggest must be challenging enough to drive improvement (balance only improves when the neuromuscular system is pushed beyond its current abilities) while still keeping the patient safe.

In addition to supporting the care of patients with local access to physical therapists, the researchers are also exploring the possibility of providing rural patients with remote care, lifting the burden of long drives while keeping a clinician in the loop.

Leia Stirling

"Understanding what the patient and the therapist need has to be part of the algorithms we put together. I'm excited to merge different types of data to create a decision support system for both parties," said study co-author Leia Stirling, U-M professor of industrial and operations engineering and robotics.

To build the model, researchers filmed participants doing standing balance exercises at various levels of difficulty while wearing 13 sensors attached with velcro straps. The sensors, called inertial measurement units, measure acceleration and rotational motion to detect sway and movements of the major body segments.
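
The article does not detail how the raw signals are processed, but as a rough, hypothetical sketch, a single sensor's accelerometer and gyroscope streams could be summarized into simple sway statistics like these. The sampling rate, axis layout and metrics below are illustrative assumptions, not details from the study.

```python
import numpy as np

def sway_summary(accel, gyro, fs=100.0):
    """Summarize one IMU's signals into simple sway statistics.

    accel : (N, 3) array of accelerations (m/s^2), columns = x, y, z
    gyro  : (N, 3) array of angular velocities (rad/s)
    fs    : sampling rate in Hz (assumed; the study's rate may differ)
    """
    # Remove the gravity/offset component so only sway remains.
    accel_detrended = accel - accel.mean(axis=0)

    return {
        # Root-mean-square acceleration per axis: overall sway magnitude.
        "accel_rms": np.sqrt((accel_detrended ** 2).mean(axis=0)),
        # Mean absolute angular velocity: how much the body segment rotates.
        "gyro_mean_abs": np.abs(gyro).mean(axis=0),
        # Duration of the exercise window in seconds.
        "duration_s": len(accel) / fs,
    }

# Example with synthetic data standing in for a 30-second recording.
rng = np.random.default_rng(0)
accel = 9.81 * np.array([0.0, 0.0, 1.0]) + 0.05 * rng.standard_normal((3000, 3))
gyro = 0.02 * rng.standard_normal((3000, 3))
print(sway_summary(accel, gyro))
```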

A closeup of a smiling woman's face. She uses her hands to adjust a black velcro strap positioned around her forehead. The strap has a square, black plastic device in the center.
Geeta Peethambaran positions a sensor on her forehead to participate in a balance training study. Sensor data helped researchers develop a model that aims to provide balance training feedback to patients at home. Image credit: Marcin Szczepanski, Michigan Engineering

Physical therapists participating in the study watched the videos and rated, on a scale from 1 to 5, how hard the balancing participants were working during each exercise.

Using the sensor data, the research team trained convolutional neural networks (a class of machine learning models that learn spatial patterns in grid-like data such as images or multichannel sensor signals) to predict balance difficulty. They then compared the model's predictions with the physical therapists' average score for each balance participant.
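
The article does not publish the network itself, but a minimal, hypothetical sketch of this kind of model, a small 1D convolutional network that maps a window of multichannel IMU data to a predicted 1-to-5 rating, might look like the following. The channel count, window length and layer sizes are assumptions for illustration only.

```python
import torch
import torch.nn as nn

class BalanceCNN(nn.Module):
    """Toy 1D CNN mapping a window of IMU channels to a 1-5 difficulty class.

    Assumes 24 input channels (e.g., 4 sensors x 6 axes) and 500 time steps;
    the study's actual architecture and input shape are not given here.
    """
    def __init__(self, n_channels=24, n_classes=5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv1d(n_channels, 32, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.MaxPool1d(4),
            nn.Conv1d(32, 64, kernel_size=5, padding=2),
            nn.ReLU(),
            nn.AdaptiveAvgPool1d(1),   # collapse the time dimension
        )
        self.classifier = nn.Linear(64, n_classes)

    def forward(self, x):              # x: (batch, channels, time)
        return self.classifier(self.features(x).squeeze(-1))

# One forward pass on a random batch standing in for sensor windows.
model = BalanceCNN()
windows = torch.randn(8, 24, 500)      # 8 exercises, 24 channels, 500 samples
logits = model(windows)                # (8, 5): scores for ratings 1 through 5
predicted_rating = logits.argmax(dim=1) + 1
print(predicted_rating)
```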

The team's model predicted patients' balance ratings with nearly 90% accuracy, within a point of expert ratings on the same scale. While the study tested 13 sensors, a sensitivity analysis found that just four sensors (one on each thigh, one on the low back and one on the upper back) were sufficient to maintain model performance.
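
Sensitivity analyses of this kind generally re-evaluate the model on subsets of sensors and keep the smallest subset whose accuracy stays close to the full 13-sensor setup. The sketch below illustrates that idea with a toy evaluation function standing in for the real training pipeline; the sensor names, tolerance and scores are made up for illustration and do not come from the study.

```python
from itertools import combinations

# Illustrative subset of the sensor sites; the study used 13 in total.
SENSORS = ("head", "upper_back", "low_back", "left_thigh", "right_thigh",
           "left_shank", "right_shank")

def smallest_sufficient_subset(evaluate, full_accuracy, tolerance=0.02):
    """Find the smallest sensor subset whose accuracy stays within
    `tolerance` of the full-sensor accuracy.

    `evaluate` maps a tuple of sensor names to an accuracy; in practice
    it would retrain or re-evaluate the model with only those sensors.
    """
    for size in range(1, len(SENSORS) + 1):
        scored = {subset: evaluate(subset)
                  for subset in combinations(SENSORS, size)}
        best_subset, best_acc = max(scored.items(), key=lambda kv: kv[1])
        if best_acc >= full_accuracy - tolerance:
            return best_subset, best_acc
    return SENSORS, full_accuracy

# Toy evaluation: pretend trunk and thigh sensors carry most of the signal.
def toy_evaluate(subset):
    informative = {"low_back", "upper_back", "left_thigh", "right_thigh"}
    return 0.70 + 0.05 * len(informative & set(subset))

print(smallest_sufficient_subset(toy_evaluate, full_accuracy=0.90))
```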

Xun Huan

"It is very important to understand both the strengths and potential failure modes of machine learning in physical therapy, where people's well-being is directly at stake. For example, an overfitted model may perform poorly with new patients, leading to mispredictions and unsafe exercise recommendations. To protect patients, these systems should be validated on real-world data and used with therapist oversight so unexpected or risky suggestions can be caught before harm," said study co-author Xun Huan, U-M associate professor of mechanical engineering.

Four women gather together to look at a computer monitor in the foreground. One woman points and speaks while the others look on.
Kathleen Sienko, an Arthur F. Thurnau Professor of mechanical engineering at U-M and senior author of the study, analyzes body movement data from the sensors on a computer monitor surrounded by her graduate students (from left to right) Emma Nigrelli, Jean Avaala and Nadine El-Ghaffir. Image credit: Marcin Szczepanski, Michigan Engineering

In a second, related study published in IEEE Transactions on Neural Systems and Rehabilitation Engineering, physical therapists wore eye tracking glasses and provided explanations of their assessments while the participants performed these standing balance exercises. Analyzing where physical therapists focus attention as exercises become more difficult helped researchers understand their decision-making process.

"It was interesting to see how complicated the physical therapists' balance assessments are and to consider how best to capture the factors they consider in our models," said Emma Nigrelli, U-M doctoral student of mechanical engineering and lead author of the eye tracking study.

Moving forward, the team hopes to lay the technical groundwork for machine-learning-assisted balance training technology to become widely available.

Center: A woman stands on one leg while wearing black velcro straps around her head, upper arms, wrists, hips, upper thighs, below the knees and feet. Left: A woman stands behind with her hands stretched to catch if she falls. Right: Two women observe.
Geeta Peethambaran performs a static balance training exercise while wearing 13 sensors on velcro straps. Emma Nigrelli (left) spots while Wendy Carender (right) observes, wearing eye-tracking glasses. Nadine El-Ghaffir looks on. Image credit: Marcin Szczepanski, Michigan Engineering

"In some regions, access to physical therapists specializing in balance rehabilitation may not be possible," Sienko said. "I was excited by the possibility of developing something that could expand access to services like balance training-not only for people in rural areas across the U.S. who may lack regular access to physical therapists, but also for individuals globally."

Safa Jabri, Jeremiah Hauth, Lauro Ojeda and Jenna Wiens of the U-M College of Engineering and Wendy Carender of the Michigan Medicine Department of Otolaryngology also contributed to this research.

The research was funded by the National Science Foundation (CMMI-2125256) and the University of Michigan.

The team is seeking partners to bring the technology to market.
