Desert ants' foraging lives uncovered with new tracking tech

  • New 'decade in the making' low-cost tracking technology reveals remarkable speed at which desert ants memorise homeward journeys
  • Desert ants were previously tracked using pen and paper or GPS, leaving gaps in knowledge of their behaviour
  • The software works across animal types, and its low cost means it could be used by citizen scientists
  • The more complete picture it builds could inspire the next generation of bioinspired robots

Groundbreaking tracking technology that has revealed new insights into how desert ants navigate their complex worlds could inspire the next generation of smart, efficient robots.

An international research collaboration involving the University of Sheffield has developed new tracking technology which uses computer vision - a field of computer science that programs computers to interpret and understand images and videos - to track individual desert ants over their entire foraging lives. The tool documents an ant's journey from when it first leaves its nest until it finds a food site and returns to its colony.

Their new dataset has revealed that the ants learn incredibly quickly, memorising their homeward paths after just one successful trip. Intriguingly, their outward routes continued to evolve over time, indicating different strategies for exploration versus exploitation. The high-precision data also revealed an underlying oscillatory movement, invisible to the human eye, which may explain how ants generate complex search patterns suited to the current conditions.

As the new software works across animal types and uses video captured with standard cameras, it is already being adopted by numerous international research groups and is ideally suited to citizen science projects. The high-precision data it gathers is crucial to understanding how brains guide animals through their complex worlds, which could inspire a new generation of bioinspired robots.

The new technology and dataset - produced by Dr Michael Mangan, a Senior Lecturer in Machine Learning and Robotics at the University's Department of Computer Science, with Lars Haalck and Benjamin Risse of the University of Münster, Antoine Wystrach and Leo Clement of the Centre for Integrative Biology of Toulouse, and Barbara Webb of the University of Edinburgh - is demonstrated in a new study published in the journal Science Advances.

The study describes how CATER (Combined Animal Tracking & Environment Reconstruction) uses artificial intelligence and computer vision to track the position of an insect in video captured using off-the-shelf cameras. The system can detect even tiny objects that are difficult to see by eye, and is robust to background clutter, obstructions and shadows, allowing it to function in the animal's natural habitat where other systems fail.
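The paper, rather than this article, details CATER's actual algorithms. But the general idea of localising a small moving target against a static background can be illustrated with simple frame differencing: compare consecutive frames and take the centroid of pixels that changed. The sketch below is purely illustrative (toy 5x5 "frames", an arbitrary threshold) and is not the authors' implementation.

```python
# Illustrative sketch of video tracking by frame differencing.
# NOT the CATER algorithm -- just the general principle: a moving
# target reveals itself as pixels that change between frames.

def diff_centroid(prev_frame, frame, threshold=30):
    """Return the (row, col) centroid of pixels that brightened by
    more than `threshold` between two greyscale frames (the target
    arriving at its new position), or None if nothing changed."""
    changed = [
        (r, c)
        for r, row in enumerate(frame)
        for c, v in enumerate(row)
        if v - prev_frame[r][c] > threshold
    ]
    if not changed:
        return None
    n = len(changed)
    return (sum(r for r, _ in changed) / n,
            sum(c for _, c in changed) / n)

def frame_with_dot(r, c, size=5):
    """Toy greyscale frame: black except one bright pixel."""
    f = [[0] * size for _ in range(size)]
    f[r][c] = 255
    return f

# A bright dot moves from (1, 1) to (2, 2) to (3, 3) across frames.
positions = [(1, 1), (2, 2), (3, 3)]
track = [
    diff_centroid(frame_with_dot(*prev), frame_with_dot(*cur))
    for prev, cur in zip(positions, positions[1:])
]
# track recovers the dot's new position in each frame pair:
# [(2.0, 2.0), (3.0, 3.0)]
```

Real systems like CATER must additionally cope with moving shadows, clutter and occlusion, which is where the learned, AI-based components described in the study come in; naive differencing like this fails as soon as the background itself changes.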

Dr Michael Mangan, Senior Lecturer in Machine Learning and Robotics at the University of Sheffield, said: "We captured this data during a summer field trip, but it has taken 10 years to build a system capable of extracting the data, so you could say it's been a decade in the making."
