Great Barrier Reef spatial technology

Erin Peterson

Making a splash in 2018 was the groundbreaking citizen science project Virtual Reef Diver. As the ABC’s feature project during Science Week, the app allows amateur marine biologists to assist conservation efforts by classifying imagery on their mobile phones and uploading images of their own.

But behind this innovative approach to categorising data lies a highly sophisticated predictive mapping engine, creating a powerful spatial tool to aid management efforts on the Great Barrier Reef. Project lead Dr Erin Peterson discusses the benefits of spatial technology in ecological conservation.

Erin, could you describe the motivation and intended outcome for this engaging, crowdsourced approach to data categorisation?

We were interested in looking at innovative ways to monitor the Great Barrier Reef.

The challenge is that it’s just massive. It’s 2,300 km long and contains something like 25,000 square kilometres of coral reef, so it’s just physically and financially impossible to go out and monitor coral on the ground comprehensively.

In a lot of cases and environments you can monitor with high-resolution, remotely sensed products, but because coral reefs are underwater, once you get past a certain depth you can’t see them anymore, so you don’t have that option of using remote sensing.

So we were looking for a creative way to increase the spatial and temporal coverage of data on the Great Barrier Reef, so that the managers have better information to make more informed decisions.

My assumption is that this classified imagery will be used to help train a machine learning algorithm. Could you share some specifics about how the inputs from users of Virtual Reef Diver will be used?

Well, we’re not using machine learning but you could. What you see online is the ‘citizen science’ face of it, but Virtual Reef Diver is really an innovative step in the way that we monitor and model the marine environment. With all the new technologies coming online, there are these high volumes of data coming in.

So we’ve built a software platform that ingests data from multiple sources, and those sources might be from professional teams like the Australian Institute of Marine Science or university researchers, or these images that citizens can contribute and classify.

Behind the scenes, that data is being manipulated and modelled automatically and then we predict the maps of coral cover, with estimates of uncertainty.

So the idea is to create this automated platform, so that as data comes in, it can be modelled and used to make predictions based on the most up-to-date information possible.

With hundreds of thousands of images coming in, humans are the bottleneck in that workflow. So that’s the real spatial science innovation here in terms of environmental monitoring.

[Image: Great Barrier Reef]

It feels like a stroke of genius to invite public involvement in this way. Where has the imagery been sourced from, and what processes are in place to validate users’ input?

The citizen science data is just another data source, and each of our data sources has different characteristics, whether they come from professional teams or citizens. The images have different extents and different quantities of points being classified, for example.

When it comes to the citizen data, the quality of the classification is different. We know that people at home aren’t always going to get the answer right, so we weight the data to account for the fact that it is collected using different methods and varying quality. Then we use a spatial statistical model to make predictions at a 500m spatial resolution, with estimates of uncertainty.

This model is quite exciting because, as spatial scientists, we know that when you collect data in a certain place, it’s most likely going to give you some idea of what’s happening nearby, even in places where you didn’t go and sample.

So we know that things are related in space, and we can quantify that in a spatial statistical model, which allows us to make better predictions in areas where we didn’t sample.
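The two ideas above – down-weighting lower-quality observations and borrowing strength from nearby samples – can be illustrated with a minimal sketch. This is not the project’s actual model (which is a full spatial statistical model with uncertainty estimates); it is a toy distance-weighted predictor, and all names, coordinates and quality weights are hypothetical:

```python
import numpy as np

# Hypothetical survey data: (x, y) locations, coral cover (0-1), and a
# source-quality weight (e.g. 1.0 for professional surveys, 0.5 for
# citizen classifications). Values are illustrative only.
obs_xy = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [2.0, 2.0]])
obs_cover = np.array([0.60, 0.50, 0.55, 0.20])
obs_quality = np.array([1.0, 0.5, 1.0, 1.0])

def predict_cover(grid_xy, obs_xy, obs_cover, obs_quality, length_scale=1.0):
    """Predict coral cover at unsampled grid cells.

    Nearby observations get more weight (spatial correlation), and each
    observation is additionally scaled by its source-quality weight.
    """
    # Pairwise distances: one row per prediction point, one column per obs.
    d = np.linalg.norm(grid_xy[:, None, :] - obs_xy[None, :, :], axis=2)
    # Gaussian decay with distance, multiplied by the data-quality weight.
    w = obs_quality * np.exp(-(d / length_scale) ** 2)
    return (w * obs_cover).sum(axis=1) / w.sum(axis=1)

grid = np.array([[0.5, 0.5], [2.0, 1.8]])
print(predict_cover(grid, obs_xy, obs_cover, obs_quality))
```

A cell near the three high-cover sites is predicted close to their (quality-weighted) average, while a cell next to the low-cover site is pulled toward 0.2 – the same intuition the real model formalises, with proper uncertainty quantification.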

How will the outputs of this model, within the larger project, contribute to more comprehensive maps of the Great Barrier Reef?

Right now we’ve made predictions from 2002 to 2015, but the system is designed to make updated maps as new data come in. So, the key issue we are trying to solve is that there are no publicly available coral cover maps of the GBR.

Furthermore, some of the other models used to predict coral cover didn’t use all the available information. For example, a lot of the existing maps are generated based on the Australian Institute of Marine Science Long-term Monitoring Program or the Marine Monitoring Program, which revisit the same 135 sites every other year.

Our map is currently the most comprehensive map of coral cover on the GBR, and the only publicly available one; we make predictions at a 500-metre scale. So if you are monitoring and reporting, you can now summarise cover at different spatial scales – at a reef, at a region or the whole of the GBR – based on the most up-to-date information possible.
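Summarising gridded predictions at different spatial scales is straightforward once each cell carries reef and region identifiers. A minimal sketch, assuming equal-area 500 m cells (so regional cover is just a mean of cell values) and entirely made-up IDs and numbers:

```python
# Hypothetical 500 m grid-cell predictions tagged with reef and region IDs.
cells = [
    {"reef": "A", "region": "north", "cover": 0.62},
    {"reef": "A", "region": "north", "cover": 0.58},
    {"reef": "B", "region": "north", "cover": 0.40},
    {"reef": "C", "region": "south", "cover": 0.25},
]

def summarise(cells, key):
    """Mean predicted cover at a chosen spatial scale ('reef' or 'region')."""
    groups = {}
    for c in cells:
        groups.setdefault(c[key], []).append(c["cover"])
    return {k: sum(v) / len(v) for k, v in groups.items()}

print(summarise(cells, "reef"))    # per-reef mean cover
print(summarise(cells, "region"))  # per-region mean cover
```

The same aggregation run over every cell gives a whole-of-GBR figure, which is what makes a single gridded map usable for reporting at any scale.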

You can really see how you’re doing with coral cover as a reef health indicator.

What will the primary use of these maps be, and how will they contribute to management and conservation efforts for the Great Barrier Reef?

So, the monitoring that goes on now is specifically designed to look at trends at a few representative sites but it doesn’t give us the big picture – the spatial heterogeneity that’s going on and how it’s changing on the reef.

This information is really important when it comes to reef interventions, such as genetically modified coral. We can’t plant coral larvae everywhere, but we can seed reefs that are strongly connected to other reefs through ocean circulation.

With traditional resources, we don’t have a good idea of overall coral cover, which you need to carry out genetic connectivity modelling, monitor disease outbreaks and crown-of-thorns starfish outbreaks, and do other types of analysis for this kind of intervention.

The other key advantage of this new map is that it’s open – being publicly available, it means that people can use it in their own scientific projects or management problems that they’re working on.

This article first appeared in Position magazine and Spatial Source, and is brought to you by the Spatial Industries Business Association. Erin Peterson will present a plenary at the Locate19 Conference at 10:25am on Tuesday, April 9.


©2024 Infrastructure Magazine. All rights reserved