New research has revealed insights into the brain processes of human echolocators - blind people who use mouth clicks and the returning echoes from nearby objects to navigate their environment.
Scientists from Cardiff University and the Smith–Kettlewell Eye Research Institute in San Francisco have, for the first time, uncovered how the brain processes auditory information from echoes, moment by moment, to build up clues about the surrounding environment – showing how brain activity tracks and accumulates spatial information, and how this accumulation relates to echolocation skill.
Haydee Garcia Lazaro, of the Cardiff University Brain Research Imaging Centre, said: "Some blind people can navigate and represent their surroundings using echolocation. By interpreting echoes that bounce back from objects around them, they can work out where objects are, how far away they are, and even their shape and material.
"This ability can be extremely accurate in expert human echolocators – but little is known about how the brain combines information from multiple echoes over time to build a reliable sense of space."
"We wanted to understand how echolocation works in the brain - specifically whether it relies on finding a single 'good' echo in a sequence or on combining information from many echoes."
The researchers asked blind expert echolocators (who regularly use echolocation in their daily life) and sighted people (with no echolocation training) to listen to realistic, computer‑generated mouth clicks and their echoes, which simulated an object placed at a 1-metre distance slightly to the left or right in front of them. During the experiment, they heard sequences ranging from a few clicks to as many as eleven in a row. Their task was to decide whether the object was on the left or the right.
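The kind of stimulus described above can be sketched in a few lines of code. This is a minimal illustration, not the researchers' actual stimulus: the sample rate, click shape, echo attenuation, and interaural lag are all assumed values chosen only to show the basic structure of a click followed by a lateralised echo.

```python
import numpy as np

FS = 44100            # sample rate in Hz (assumed, not stated in the article)
SPEED_OF_SOUND = 343  # metres per second, roughly room temperature

def click_with_echo(distance_m=1.0, side="left", fs=FS):
    """Return a stereo (2 x N) signal: a brief mouth-click plus its echo.

    The echo arrives after the round-trip travel time (2 * distance / c)
    and reaches the ear nearer the object slightly earlier - a simple
    interaural time difference cue. All parameter values are illustrative.
    """
    click_dur = int(0.003 * fs)                      # ~3 ms click
    t = np.arange(click_dur) / fs
    click = np.sin(2 * np.pi * 3000 * t) * np.hanning(click_dur)

    delay = int(round(2 * distance_m / SPEED_OF_SOUND * fs))  # round trip
    itd = int(round(0.0003 * fs))                    # ~0.3 ms interaural lag

    n = delay + itd + click_dur
    left, right = np.zeros(n), np.zeros(n)
    left[:click_dur] += click                        # direct click, both ears
    right[:click_dur] += click
    near, far = (left, right) if side == "left" else (right, left)
    near[delay:delay + click_dur] += 0.3 * click     # attenuated echo, near ear first
    far[delay + itd:delay + itd + click_dur] += 0.3 * click
    return np.stack([left, right])

stim = click_with_echo(1.0, "left")
```

For a 1-metre object, the round trip is about 5.8 ms, so the echo lands roughly 257 samples after the click at this sample rate; the listener's only left/right cue is the small timing and level difference between the two ears.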
During the task, the researchers recorded the participants' brain activity using electroencephalography (EEG). This allowed them to track changes in brain signals over time at the millisecond level.
The researchers found that blind expert echolocators were significantly better at judging where the object was, with some of them identifying its position from just two clicks. Sighted novices, however, struggled with the task, performing at chance and showing little improvement even when more clicks were provided.
The researchers also found that among the blind participants, those who had been blind from birth performed better than those who lost their sight later in life, consistent with earlier studies.
The echolocators became more accurate as they heard more clicks, suggesting that the brain 'adds up' information across clicks, rather than relying on a single moment of sound.
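The idea that accuracy improves as evidence is summed across clicks can be demonstrated with a toy simulation. This is a hypothetical signal-detection sketch, not the study's analysis: each click is modelled as yielding one noisy spatial cue, and the listener averages the cues before deciding left or right. The signal-to-noise ratio and trial counts are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def accuracy(n_clicks, snr=0.5, trials=10_000):
    """Fraction of correct left/right decisions when each click yields a
    noisy spatial cue and the listener averages cues across clicks.
    All values are illustrative, not fitted to the study's data."""
    # True cue = +1 ("right"); each click adds Gaussian noise of sd 1/snr.
    samples = 1.0 + rng.normal(0, 1 / snr, size=(trials, n_clicks))
    decisions = samples.mean(axis=1) > 0   # average cues, then decide
    return decisions.mean()

for n in (1, 2, 5, 11):
    print(n, round(accuracy(n), 3))
```

Because noise averages out, the effective signal-to-noise ratio grows with the square root of the number of clicks, so accuracy climbs steadily with longer sequences – the same qualitative pattern the expert echolocators showed.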