AI tool could speed up dementia diagnosis

  • A new digital tool that can look for early signs of dementia and Alzheimer's disease more quickly and efficiently has been developed by University of Sheffield researchers with funding from the National Institute for Health and Care Research
  • CognoSpeak uses artificial intelligence and speech technology to automatically analyse language and speech patterns for early signs of dementia or Alzheimer's disease that could warrant further specialist investigation
  • Research has found the tool is as accurate at predicting Alzheimer's as the current pen-and-paper-based assessments
  • The tool is now being trialled more widely, with researchers recruiting 700 patients from memory clinics across the UK to develop the system further
  • CognoSpeak could help patients start treatments sooner and reduce the burden on dementia assessment services by freeing up valuable specialist time, improving access to care and aiding earlier diagnosis

A new AI tool that could help doctors assess the early signs of dementia and Alzheimer's more quickly and efficiently has been developed by researchers at the University of Sheffield.

The system, known as CognoSpeak, uses a virtual agent displayed on a screen to engage a patient in a conversation. It asks memory-probing questions inspired by those used in outpatient consultations and conducts cognitive tests, such as picture descriptions and verbal fluency tests.

The tool then uses artificial intelligence and speech technology to analyse language and speech patterns to look for signs of dementia, Alzheimer's disease and other memory disorders.

Researchers behind the technology say CognoSpeak could play a key role in reducing the burden on dementia assessment services, once further testing in GP and secondary care memory clinics across the UK is complete.

The system is being designed to work between primary and secondary care. This means that once fully rolled out, a GP could refer a person with memory complaints to use the technology. CognoSpeak would send the test results back to the GP, who would then decide whether to refer the patient to a memory clinic for further assessment.

CognoSpeak can be accessed through a web browser - meaning patients are able to take the test in the comfort of their home via a computer, laptop or tablet, rather than having to wait for a hospital appointment to take a pen-and-paper-based assessment, which can often cause undue stress and anxiety.

Early trials have shown the technology is as accurate at predicting Alzheimer's as the current pen-and-paper-based tests used to assess or screen for cognitive, memory or thinking impairments. The team has demonstrated accuracies of 90 per cent in distinguishing people with Alzheimer's from people who are cognitively healthy.

Developed by Dr Dan Blackburn and Professor Heidi Christensen from the University of Sheffield's Departments of Neuroscience and Computer Science, the CognoSpeak system is still in the research phase, but thanks to a £1.4 million grant from the National Institute for Health and Care Research (NIHR), the technology is being trialled more widely. The researchers are recruiting 700 participants from memory clinics across the UK to help develop the system further.

Dr Dan Blackburn, from the University of Sheffield's Department of Neuroscience, said: "Waiting for a possible diagnosis of dementia can be a very anxious time for patients and their families. This tool could help patients start treatments sooner, reduce waiting times and give people certainty earlier.

"The CognoSpeak system could transform how dementia and other memory disorders are diagnosed by speeding up assessments. This would also free up clinicians' valuable time and mean that those who need specialist care get access to it as quickly as possible."
