AI Dugong Detector makes Google Challenge impact

Software using artificial intelligence (AI) to detect dugongs in aerial photographs has been developed by QUT computer engineer Dr Frederic Maire to help scientists monitor diminishing global seagrass meadows.

The Dugong Detector project, involving QUT, Western Australia's Murdoch University, and Vienna University of Technology researchers in partnership with the United Nations Dugong MOU, is one of 10 finalists in the Google.org Impact Challenge Australia 2018.

Four Challenge winners, three chosen by judges and one by public vote, will each receive $1 million in grant funding and support to further their projects, and the other six finalists will each receive $250,000. Voting for the People’s Choice award is open until October 30.

Dugongs are a barometer of seagrass health. Image: Matthieu Juncker.

Since dugongs feed exclusively on seagrass, the detector project seeks to track their numbers to monitor the health of seagrass ecosystems. These ecosystems not only sustain dugongs in 46 countries, they support half the world’s fisheries, providing 3 billion people with food – but seagrasses are vanishing at a rate of 7 per cent a year globally.

"Dugongs are an excellent barometer of seagrass health," said Murdoch University marine scientist and project leader Dr Amanda Hodgson. "If the seagrass disappears, so do the dugongs."

Marine biologists have traditionally surveyed dugong populations using teams of expert observers in small planes, an approach that is costly and potentially risky for observers flying over remote areas.

The detector project instead uses inexpensive drones equipped with high-resolution cameras to capture images, which are then analysed by deep neural network-based software developed by Dr Maire and trained to recognise the marine mammals.
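As a rough illustration of that pipeline (not the team's actual code), a large aerial survey image might be split into fixed-size tiles, with each tile scored by a trained model; the tile size, stride and `model.predict_proba` interface below are assumptions made for the sketch.

```python
# Illustrative sketch only: split a large aerial image into tiles and
# score each tile with a trained model. Tile size, stride and the
# `model` interface are assumptions, not the project's actual settings.
import numpy as np
from PIL import Image

TILE = 512      # assumed tile size in pixels
STRIDE = 256    # assumed overlap so animals straddling tile edges are not missed

def iter_tiles(image_path):
    """Yield (x, y, tile_array) patches covering the whole image."""
    img = np.asarray(Image.open(image_path).convert("RGB"))
    h, w, _ = img.shape
    for y in range(0, max(h - TILE, 0) + 1, STRIDE):
        for x in range(0, max(w - TILE, 0) + 1, STRIDE):
            yield x, y, img[y:y + TILE, x:x + TILE]

def detect_candidates(image_path, model, threshold=0.5):
    """Return tile coordinates the model flags as likely dugongs."""
    hits = []
    for x, y, tile in iter_tiles(image_path):
        score = model.predict_proba(tile)   # hypothetical model interface
        if score >= threshold:
            hits.append((x, y, score))
    return hits  # reviewers only need to check these tiles, not every image
```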

QUT researcher Dr Frederic Maire.

"A deep neural network is a computational model made of a stack of simple, trainable functions," said Dr Maire, from QUT’s Science and Engineering Faculty.

"It is the same technology used in camera phones or on Facebook that detects faces and puts a box around them.

"The Dugong Detector analyses the image data and automatically pinpoints what is likely to be a dugong.

"While it does throw false positives, it reduces the number of images the marine scientists have to manually review. And you can then use those false positive to refine the network and increase the accuracy. That is the beauty of the system, the more data you input the more you can fine tune it."

Dugongs identified in circled areas.

Dr Hodgson estimated that the Dugong Detector had reduced the hours spent manually reviewing thousands of coastal images by 95 per cent.

Dr Maire has received interest from other organisations to use the detector software to monitor whales, turtles and dolphins.

The research team’s Challenge proposal would advance the technology by improving detection rates, integrating a mapping tool, and establishing a global database through which researchers and communities can share dugong and seagrass information.

Researchers launching a drone to survey dugongs.

Team members with Dr Maire and Dr Hodgson are Dr Christophe Cleguer and Dr Julian Tyne (Murdoch University) and Martin Wieser (Vienna University of Technology).

The public can vote for the People’s Choice award on the Google.org Impact Challenge Australia website.

QUT
