A rather unusual situation recently unfolded inside a laboratory: moths playing a “video game,” beating their wings as they navigated through a virtual forest displayed on a projector screen.
Each of the moths’ movements was tracked by Boston University engineers and University of Washington biologists. The team is leveraging the moth data to develop new navigational programs that give autonomous aerial drones a better sense of direction. The findings of their latest study were recently published online in PLOS Computational Biology. The research was funded in part by a $7.5 million Department of Defense Multidisciplinary University Research Initiative (MURI) grant to develop self-navigating vehicles that can traverse land, sea, and air.
The navigational challenge the moths helped the team overcome? Engineers working on control programs for autonomous vehicles have long struggled with what they call “the curse of dimensionality”: a drone navigating a multidimensional world has such an overwhelming number of options to consider at any given moment (which direction it travels, how far, how fast, and so on) that it struggles to determine the best path to take. Robots just aren’t naturals at making those kinds of decisions, but living beings are.
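To see why this blow-up is so punishing, the combinatorics can be sketched in a few lines. The action counts below are illustrative assumptions, not parameters from the study; they simply show how quickly the number of candidate flight plans explodes as a drone looks further ahead.

```python
# Illustrative sketch of the "curse of dimensionality" in planning.
# Assumed (hypothetical) discrete options a drone weighs at each step:
headings = 8      # e.g., 8 compass directions
speeds = 5        # e.g., 5 speed settings
distances = 4     # e.g., 4 step lengths

# Every step, the drone picks one combination of the three:
actions_per_step = headings * speeds * distances  # 160 options per step

def num_plans(horizon: int) -> int:
    """Number of distinct action sequences over a planning horizon."""
    return actions_per_step ** horizon

for h in (1, 2, 5, 10):
    print(f"horizon {h:>2}: {num_plans(h):,} candidate plans")
```

Even with these modest assumptions, a ten-step lookahead yields 160^10 (over 10^22) possible plans, which is why exhaustive search is hopeless and why borrowing strategies from expert biological navigators is attractive.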
“Humans and animals are ideal navigators; they’re our experts,” says Yannis Paschalidis, a BU College of Engineering professor of electrical and computer, biomedical, and systems engineering, and a senior author on the new study.
“They can learn very fast and navigate quickly in very complex environments,” he says. “And if we can observe them and understand the [navigational strategies] that they are using, we can take these as our starting point. Then, with less computational effort, we can adapt those strategies to fit any new situation and any new drone that needs to navigate in a certain environment.”