PULLMAN, Wash. - Researchers have long been able to use information from smartwatches to identify physical movements, such as sitting or walking, that wearers perform in a controlled lab setting.
Now, using a computer algorithm and a large dataset gathered from smartwatches, Washington State University researchers have developed a way to more comprehensively identify what people are doing in everyday settings, such as working, eating, doing hobbies, or running errands.
The work, published in the IEEE Journal of Biomedical and Health Informatics, could someday lead to improved assessment and understanding of cognitive health, rehabilitation, disease management, and surgical recovery. In their study, the researchers were able to accurately identify activities 78% of the time.
"If we want to determine whether somebody needs caregiving assistance in their home or elsewhere and what level of assistance, then we need to know how well the person can perform critical activities," said Diane Cook, WSU Regents Professor in WSU's School of Electrical Engineering and Computer Science who led the work. "How well can they bathe themselves, feed themselves, handle finances, or do their own errands? These are the things that you really need to be able to accomplish to be independent."
One of the big challenges in healthcare is assessing how people who are sick or elderly are managing their everyday lives. Medical professionals often need more comprehensive information about how a person performs functional activities, or higher-level, goal-directed behavior, to really assess their health. As anyone trying to help a distant parent with aging or health challenges knows, information on how well a person pays bills, runs errands, or cooks meals is complex, variable, and difficult to gather - whether in a doctor's office or with a smartwatch.
"Lack of awareness of a person's cognitive and physical status is one of the hurdles that we face as we age, and so having an automated way to give indicators of where a person is will someday allow us to better intervene for them and to keep them not only healthy, but ideally independent," said Cook. "This work lays the foundation for more advanced, behavior-aware applications in digital health and human-centered AI."
For their study, the WSU researchers pooled activity information gathered from multiple studies over eight years.
"Whenever we had a study that collected smartwatch data, we added a question to our data collection app that asked participants to self-label their current activity, and that's how we ended up with so many participants from so many studies," she said. "And then we just dug in to see whether we can perform activity recognition."
Over those eight years, the 503 study participants were asked at random times throughout the day to pick from a scroll-down list of 12 categories to describe what they were doing. The categories included errands, sleeping, traveling, working, eating, socializing, and relaxing. The researchers then analyzed a variety of artificial intelligence methods for their ability to generalize across the population of study participants, as sketched below.
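Generalizing across people is a stricter test than recognizing a single wearer's habits, because the model must work for participants it has never seen. A minimal sketch of how such subject-wise evaluation is commonly set up, using scikit-learn's GroupKFold on placeholder data (the features, classifier choice, and array shapes are illustrative assumptions, not details from the paper):

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GroupKFold, cross_val_score

# Hypothetical placeholders: one row per labeled minute of smartwatch data.
# X: per-minute sensor features; y: one of 12 activity categories;
# subjects: which of the 503 participants each minute came from.
rng = np.random.default_rng(0)
n_minutes, n_features = 10_000, 16
X = rng.normal(size=(n_minutes, n_features))
y = rng.integers(0, 12, size=n_minutes)
subjects = rng.integers(0, 503, size=n_minutes)

# GroupKFold keeps each participant's data entirely in train OR test,
# so the score reflects generalization to people the model never saw.
scores = cross_val_score(
    RandomForestClassifier(n_estimators=100, random_state=0),
    X, y,
    groups=subjects,
    cv=GroupKFold(n_splits=5),
)
print(f"Subject-wise accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```

Evaluating this way, rather than with a random split, prevents a model from scoring well simply by memorizing an individual's routines.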
The researchers assembled a large-scale dataset of more than 32 million labeled data points, each representing one minute of activity, and trained an AI model to predict which functional activity had occurred. Their models predicted activities correctly up to 77.7% of the time.
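Each labeled point pairs one minute of smartwatch readings with the participant's nearest self-reported activity. A hedged sketch of how such minute-level points might be assembled with pandas (the sensor columns and the forward-fill labeling rule are simplifying assumptions, not the study's published pipeline):

```python
import pandas as pd

def build_minute_dataset(sensors: pd.DataFrame, prompts: pd.Series) -> pd.DataFrame:
    """Turn raw smartwatch streams into one labeled feature row per minute.

    Assumes `sensors` is a timestamp-indexed DataFrame of numeric channels
    (e.g., acc_x, acc_y, acc_z, heart_rate) and `prompts` is a timestamp-
    indexed Series of self-reported activity labels, sorted by time.
    """
    # Summarize each one-minute window with simple per-channel statistics.
    feats = sensors.resample("1min").agg(["mean", "std"])
    feats.columns = ["_".join(col) for col in feats.columns]

    # Carry each self-reported label forward onto the minutes that follow,
    # until the next prompt response (a simplifying assumption).
    feats["activity"] = prompts.reindex(feats.index, method="ffill")
    return feats.dropna()
```

Rows built this way, one per minute and tagged with a participant ID, are exactly the kind of table the subject-wise evaluation sketched above would consume.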
"A foundational step is to perform activity recognition because if we can describe a person's behavior in terms of activity in categories that are well recognized, then we can start to talk about their behavior patterns and changes in their patterns," said Cook. "We can use what we sense to try to approximate traditional measures of health, such as cognitive health and functional independence."
The researchers hope to use their model in future studies in areas such as automating clinical diagnoses and exploring links between behavior, health, genetics, and environment. The methods and a de-identified version of the dataset are also publicly available for other researchers to use. The work was funded by the National Institutes of Health.