Clinical Trials Get High-Tech Makeover with Wearables, AI

University College London

A multi-disciplinary team of researchers, including several UCL scientists, has developed a way to monitor the progression of movement disorders using motion-capture technology and AI.

Movement is recorded and analysed by an AI system

In two ground-breaking studies published in Nature Medicine, a cross-disciplinary team of AI and clinical researchers combined human movement data gathered from wearable technology with a new medical AI system. They showed that this approach can identify clear movement patterns, predict future disease progression and significantly increase the efficiency of clinical trials in two very different rare disorders: Friedreich's ataxia (FA) and Duchenne muscular dystrophy (DMD).

Tracking the progression of DMD and FA normally requires intensive testing in a clinical setting. The approach described in these papers offers a significantly more precise assessment and increases the accuracy and objectivity of the data collected.

FA and DMD are rare, degenerative genetic diseases that affect movement and eventually lead to paralysis. There is currently no cure for either disease, but the researchers hope these results will significantly speed up the search for new treatments.

The researchers estimate that using these disease markers would require significantly fewer patients to test a new drug than current methods do. This is particularly important for rare diseases, where it can be hard to identify suitable patients.

Scientists hope that, as well as monitoring patients in clinical trials, the technology could one day be used to monitor or diagnose a range of common disorders that affect movement, such as dementia, stroke and orthopaedic conditions.
