QUT robotics researchers have developed a new robot navigation system that mimics neural processes of the human brain and uses less than 10 per cent of the energy required by traditional systems.
In a study published in the journal Science Robotics, the researchers detail a new system which they call LENS – Locational Encoding with Neuromorphic Systems.
LENS uses brain-inspired computing to set a new, low-energy benchmark for robotic place recognition.
The research, conducted by first author and neuroscientist Dr Adam Hines along with Professor Michael Milford and Dr Tobias Fischer, all from the QUT Centre for Robotics and the QUT School of Electrical Engineering and Robotics, draws on an approach called neuromorphic computing.
"To run these neuromorphic systems, we designed specialised algorithms that learn more like humans do, processing information in the form of electrical spikes, similar to the signals used by real neurons," Dr Hines said.
"Energy constraints are a major challenge in real-world robotics, especially in fields like search and rescue, space exploration and underwater navigation.
"By using neuromorphic computing, our system reduces the energy requirements of visual localisation by up to 99 per cent, allowing robots to operate longer and cover greater distances on limited power supplies.
"We have known neuromorphic systems could be more efficient, but they're often too complex and hard to use in the real world – we developed a new system that we think will change how they are used with robots."
In the study, the researchers developed LENS, a system able to recognise locations along an 8km journey while using only 180KB of storage – almost 300 times less than other systems require.
LENS combines a brain-like spiking neural network with a special camera that only reacts to movement and a low-power chip, all on one small robot.
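For readers curious how such a spiking network differs from a conventional neural network, the following minimal sketch (in Python, and not the authors' code) shows a leaky integrate-and-fire layer accumulating event-style input, firing spikes, and comparing the resulting spike signature against stored place signatures. Every function name, parameter and the matching step here are assumptions for illustration only.

```python
# Illustrative sketch only: a minimal leaky integrate-and-fire (LIF) layer
# processing binary "event" input, in the spirit of the spiking approach
# described in the article. This is NOT the LENS implementation.
import numpy as np

def lif_layer(event_frames, weights, tau=0.9, threshold=1.0):
    """Run binary event frames through one LIF layer and return spike counts.

    event_frames: (T, n_pixels) array of 0/1 events over T time steps.
    weights:      (n_pixels, n_neurons) synaptic weights.
    """
    n_neurons = weights.shape[1]
    membrane = np.zeros(n_neurons)            # membrane potential per neuron
    spike_counts = np.zeros(n_neurons)
    for frame in event_frames:
        membrane = tau * membrane + frame @ weights   # leak, then integrate input
        spikes = membrane >= threshold                # fire when threshold is reached
        spike_counts += spikes
        membrane[spikes] = 0.0                        # reset neurons that fired
    return spike_counts

# Toy usage: compare a query's spike signature against stored place signatures.
rng = np.random.default_rng(0)
weights = rng.uniform(0, 0.2, size=(64, 8))           # 64 "pixels", 8 place neurons
stored_places = [lif_layer(rng.integers(0, 2, (20, 64)), weights) for _ in range(3)]
query = lif_layer(rng.integers(0, 2, (20, 64)), weights)
best = int(np.argmin([np.linalg.norm(query - p) for p in stored_places]))
print("closest stored place:", best)
```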
"This system demonstrates how neuromorphic computing can achieve real-time, energy-efficient location tracking on robots, opening up new possibilities for low-power navigation technology," Dr Hines said.
"Lower energy consumption can allow remotely operated robots to explore for longer and further.
"Our system enables robots to localise themselves using only visual information, in a way that is both fast and energy efficient."
Dr Fischer, an ARC DECRA Fellow, said the key innovation in the LENS system was a new algorithm that exploited two types of promising bio-inspired hardware: sensing, via a special type of camera known as an "event camera", and computing, via a neuromorphic chip.
"Rather than capturing a full image of the scene that takes in every detail in each frame, an event camera continuously senses changes and movement every microsecond," Dr Fischer said.
"The camera detects changes in brightness at each pixel, closely replicating how our eyes and brain process visual information.
"Knowing where you are, also known as visual place recognition, is essential for both humans and robots.
"While people use visual cues effortlessly, it's a challenging task for machines."
Professor Michael Milford, director of the QUT Centre for Robotics, said the study was representative of a key theme of research conducted by the centre's researchers.
"Impactful robotics and tech means both pioneering ground-breaking research, but also doing all the translational work to ensure it meets end user expectations and requirements," Professor Milford said.
"You can't just do one or the other.
"This study is a great example of working towards energy-efficient robotic systems that provide end-users with the performance and endurance they require for those robots to be useful in their application domains."