AI, Drones Survey Great Barrier Reef in Last-Ditch Effort to Avoid Catastrophe

An Australian research team is using tech to monitor global climate change’s assault on the world’s largest living organism


On the left, an AIMS boat is shown as the team prepares to test the drone and hyperspectral camera on the Great Barrier Reef. On the right, the hyperspectral camera is shown mounted on the drone.
Photos, left: Scott Bainbridge/AIMS; right: QUT

The stats are daunting. The Great Barrier Reef is 2,300 kilometers long, comprises 2,900 individual coral reefs, and covers an area greater than 344,000 square kilometers, making it the world’s largest living organism and a UNESCO World Heritage Site. 

A team of researchers from Queensland University of Technology (QUT) in Brisbane is monitoring the reef, located off the coast of northeastern Australia, for signs of degradation such as the bleaching caused by a variety of environmental pressures, including industrial activity and global warming.

The team, led by Felipe Gonzalez, an associate professor at QUT, is collaborating with the Australian Institute of Marine Science (AIMS), an organization that has been monitoring the health of the reef for many years. AIMS employs aircraft, in-water surveys, and NASA satellite imagery to collect data on a particular reef’s condition. But these methods have drawbacks, including the relatively low resolution of satellite images and high cost of operating fixed-wing aircraft and helicopters.

So Gonzalez is using an off-the-shelf drone modified to carry both a high-resolution digital camera and a hyperspectral camera. The monitoring is conducted from a boat patrolling the waters 15 to 70 km from the coast. The drone flies 60 meters above the reef, and the hyperspectral camera captures reef data up to three meters below the water’s surface. This has greatly expanded the area of coverage and is helping to verify AIMS’s findings.

The digital camera is used to build up a conventional 3D model of an individual reef under study, explains Gonzalez. But this conventional camera can capture light in only three broad spectral channels (red, green, and blue), which together cover the visible, 380-to-740-nanometer portion of the electromagnetic spectrum. The hyperspectral camera, by contrast, collects the reflected light of 270 spectral bands.
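The practical difference between the two sensors is the shape of the data each one returns. A minimal sketch in Python with NumPy, using hypothetical image dimensions (the article does not give the cameras' resolution):

```python
import numpy as np

# Hypothetical spatial resolution; the article does not specify the sensors'.
height, width = 512, 512

# A conventional camera returns three broad channels: red, green, blue.
rgb_frame = np.zeros((height, width, 3), dtype=np.float32)

# A hyperspectral camera returns a "data cube": the same spatial grid,
# but with 270 narrow spectral bands per pixel, as the article describes.
hyperspectral_cube = np.zeros((height, width, 270), dtype=np.float32)

print(rgb_frame.shape)           # (512, 512, 3)
print(hyperspectral_cube.shape)  # (512, 512, 270)
```

Each pixel thus carries a 270-point reflectance spectrum rather than three intensity values, which is what makes the fine-grained material discrimination described below possible, and also what drives the data volumes discussed later in the article.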


“Hyperspectral imaging greatly improves our ability to monitor the reef's condition based on its spectral properties,” says Gonzalez. “That’s because each component making up a reef’s environment—water, sand, algae, etc.—has its own spectral signature, as do bleached and unbleached coral.”
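One common way to exploit those per-material signatures is to compare each pixel's spectrum against a library of reference spectra and assign the closest match. The sketch below uses the spectral angle mapper, a standard remote-sensing distance measure; the reference signatures here are randomly generated stand-ins (real ones would come from labeled field data), and the component names are illustrative, not the team's actual classes:

```python
import numpy as np

# Hypothetical reference library: one 270-band spectrum per reef component.
# Random vectors stand in for real measured signatures.
rng = np.random.default_rng(0)
bands = 270
signatures = {
    "water": rng.random(bands),
    "sand": rng.random(bands),
    "bleached_coral": rng.random(bands),
    "healthy_coral": rng.random(bands),
}

def spectral_angle(a, b):
    """Spectral angle mapper: the angle between two spectra, in radians.
    Smaller angle means more similar spectral shape."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

def classify_pixel(spectrum, signatures):
    """Label a pixel with the reference signature at the smallest angle."""
    return min(signatures, key=lambda name: spectral_angle(spectrum, signatures[name]))

# A pixel whose spectrum is the bleached-coral signature plus sensor noise
pixel = signatures["bleached_coral"] + rng.normal(0.0, 0.01, bands)
print(classify_pixel(pixel, signatures))  # bleached_coral
```

This per-pixel matching is the basic idea; at the scale of a drone survey it becomes a large labeling problem, which is where the machine-learning tooling described below comes in.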

But this expansion in reef coverage and richness of gathered data presented the team with a new challenge. Whereas AIMS divers can gather information on 40 distinct points on a reef in an underwater session, just one hyperspectral image presents more than 4,000 data points. Consequently, a single drone flight can amass a thousand gigabytes of raw data that has to be processed and analyzed. 

Initially, the team processed the data using a PC, custom software tools, and QUT’s high-performance computer, a workflow that took weeks and consumed a large share of the machine’s run time.

Orthomosaic image of Pandora Reef, Great Barrier Reef, Australia. Images: QUT

So the team applied for and received a Microsoft AI for Earth grant, which makes software tools, cloud computing services, and AI deep learning resources available to researchers working on global environmental challenges. 

“Now we can use Microsoft’s AI tools in the cloud to supplement our own tools and quickly label the different spectral signatures,” says Gonzalez. “So, where processing previous drone sweeps used to take three or four weeks, depending on the data, it now takes two or three days.”

This speedup in data processing is critical. If it took a year or more before the team were able to tell AIMS that a certain part of the reef is degrading rapidly, it might be too late to save it. 

“And by being informed early, the government can then take quicker action to protect an endangered area of the reef,” Gonzalez adds.

He notes that the use of hyperspectral imaging is now a growing area of remote sensing in a variety of fields, including agriculture, mineral surveying, mapping, and location of water resources.

For example, he and colleagues at QUT are also using the technology to monitor forests, wheat crops, and vineyards that can be affected by pathogens, fungi, or aphids.

Meanwhile, over the next two months, Gonzalez will continue processing the spectral data collected from the reef so far. Then, in September, he will begin a second round of drone flights.

“We aim to return to the four reefs AIMS has already studied to monitor any changes,” he says, “then extend the monitoring to new reefs.”
