Powerful climate models have helped dispel any uncertainty about the scale of the climate crisis the world faces. But these models are large global simulations that can't tell us much about how climate change will impact our daily lives or how to respond at a local level. That's where a digital twin of the Earth could help.
A digital twin is a virtual model of a real-world object, machine, or system that can be used to assess how the real-world counterpart is performing, diagnose or predict faults, or simulate how future changes could alter its behavior. Typically, a digital twin combines a digital simulation with live sensor data from the real-world system to keep the model up to date.
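As a toy illustration of that loop (every name and number here is hypothetical, not drawn from any real digital-twin product), a twin can alternate between advancing a simple physical model and correcting its state with each live sensor reading, so the simulation never drifts far from its real-world counterpart:

```python
def advance(temp_c, dt_hours, cooling_rate=0.1):
    """Toy physical model: Newtonian cooling toward a 20 C ambient."""
    return temp_c + cooling_rate * (20.0 - temp_c) * dt_hours

def assimilate(model_temp, sensor_temp, gain=0.5):
    """Blend the model's prediction with a live measurement
    (a crude stand-in for a Kalman-style data-assimilation update)."""
    return model_temp + gain * (sensor_temp - model_temp)

twin = 80.0  # initial machine temperature, deg C
for sensor in [72.0, 66.5, 61.8]:  # readings arriving once per hour
    twin = advance(twin, dt_hours=1.0)  # simulate forward one hour
    twin = assimilate(twin, sensor)     # nudge the state toward reality
print(round(twin, 1))
```

Real digital twins use far richer models and assimilation schemes, but the simulate-then-correct cycle is the same.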
So far, digital twins have primarily been used in industrial contexts. For example, a digital twin could monitor an electric grid or manufacturing equipment. But there's been growing interest in applying similar ideas to climate simulation, to provide a more interactive and detailed way to track and predict changes in the systems that drive the Earth's climate, such as the atmosphere and the oceans.
Now chipmaker Nvidia has committed to building the world's most powerful supercomputer dedicated to modeling climate change. Speaking at the company's GPU Technology Conference, CEO Jensen Huang said Earth-2 would be used to create a digital twin of Earth in the Omniverse—a virtual collaboration platform that is Nvidia's attempt at a metaverse.
"We may finally have a way to simulate the earth's climate 10, 20, or 30 years from now, predict the regional impact of climate change, and take action to mitigate and adapt before it's too late," said Huang.
The announcement was light on details, and a spokesman for Nvidia said the company was currently unable to confirm what the architecture of the computer would look like or who would have access to it. But in his talk Huang emphasized the significant role the company sees for machine learning to boost the resolution and speed of climate models and create a digital twin of the Earth.
Today, most climate simulation is driven by complex equations that describe the physics behind key processes. Many of these equations are very computationally expensive to solve and so, even on the most powerful supercomputers, models normally only achieve resolutions of 10 to 100 kilometers.
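A back-of-envelope calculation (a deliberate simplification, not any particular model's cost function) shows why resolution is so expensive: in a 3-D explicit model the number of grid cells grows as the cube of the inverse grid spacing, and the CFL stability condition shrinks the time step in proportion to the spacing, so total work grows roughly as the fourth power.

```python
def relative_cost(dx_old_km, dx_new_km):
    """Rough work ratio when refining a 3-D explicit model's grid.

    Cell count scales as (1/dx)^3 and the CFL condition ties the
    time step to dx, giving total work proportional to (1/dx)^4.
    """
    return (dx_old_km / dx_new_km) ** 4

# Refining a 100 km grid to 10 km multiplies the work by about 10,000.
print(relative_cost(100, 10))  # -> 10000.0
```

By this rough scaling, reaching cloud-resolving scales of a few meters from a 10 km grid would cost many orders of magnitude more compute, which is why the statistical shortcuts described below exist.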
Some important processes, though, such as the behavior of clouds that reflect the Sun's radiation back to space, operate at scales of just a few meters, said Huang. He thinks machine learning could help here. Alongside Earth-2, the company also unveiled a new machine learning framework called Modulus, designed to help researchers train neural networks to simulate complex physical systems by learning from observed data or the output of physical models.
"The resulting model can emulate physics 1,000 to 100,000 times faster than simulation," said Huang. "With Modulus, scientists will be able to create digital twins to better understand large systems like never before."
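To make the idea of a learned emulator concrete (a minimal sketch, not Nvidia's Modulus API), the simplest possible "surrogate" can recover a physical update rule purely from examples of a solver's inputs and outputs. Here, linear least squares stands in for the neural network and rediscovers the finite-difference stencil of a 1-D heat-diffusion step from data alone:

```python
import numpy as np

rng = np.random.default_rng(0)
r = 0.1  # diffusion number for an explicit 1-D heat-equation step

def physics_step(u):
    """The 'expensive' reference solver: one explicit finite-difference update."""
    out = u.copy()
    out[1:-1] = u[1:-1] + r * (u[:-2] - 2 * u[1:-1] + u[2:])
    return out

# Generate training data by running the physical model on random fields,
# then slice out (3-point neighborhood -> next-step center value) pairs.
u = rng.normal(size=(500, 64))
v = np.array([physics_step(row) for row in u])
X = np.stack([u[:, :-2], u[:, 1:-1], u[:, 2:]], axis=-1).reshape(-1, 3)
y = v[:, 1:-1].reshape(-1)

# "Train" the simplest possible emulator: least squares in place of the
# neural network a framework like Modulus would fit.
w, *_ = np.linalg.lstsq(X, y, rcond=None)
print(np.round(w, 3))  # the emulator rediscovers the stencil [r, 1-2r, r]
```

Once trained, evaluating such an emulator is just a matrix multiply, which is where the speedups Huang cites come from; the open question Stevens raises below is whether the learned weights stay trustworthy outside the regimes they were trained on.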
Improving the resolution of climate models is a key ingredient for an effective digital twin of Earth, says Bjorn Stevens, director of the Max Planck Institute for Meteorology. Today's climate models rely on statistical workarounds that work well for assessing the climate at a global scale but make it hard to understand local effects. Resolving those local effects will be crucial for predicting the regional impacts of climate change so that we can better inform adaptation efforts, he says.
But Stevens is skeptical that machine learning is some kind of magic bullet for this problem. "There is this fantasy somehow that the machine learning will replace the things that we know how to solve physically, but I think it will always have a disadvantage there."
The key to creating a digital twin is making a system that is highly interactive, he says, and the beauty of a physical model is that it replicates every facet of the process in an explainable way. That's something that a machine learning model trained to mimic the process may not be able to do.
That's not to say there is no place for machine learning, he adds. It is likely to prove useful in speeding up workflows, compressing data, and potentially developing new models in areas where we have lots of data but little understanding of the physics—for instance, how water moves through earth on land. But he thinks rapid advances in supercomputing power mean that running physical models at much higher resolution is more a matter of will and resources than of capability.
The European Union hopes to fill that gap with a new initiative called Destination Earth, which was formally launched in January. The project is a joint effort by the European Space Agency, the European Organisation for the Exploitation of Meteorological Satellites, and the European Centre for Medium-Range Weather Forecasts (ECMWF).
The goal is to create a platform that can bring together a wide variety of models, simulating not only key aspects of the climate, like the atmosphere and the oceans, but also human systems, says Peter Bauer, deputy director of research at ECMWF. "So you're not only monitoring and simulating precipitation and temperature, but also what that means for agriculture, or water availability, or infrastructure," he says.
The result won't be a single homogeneous simulation of every aspect of Earth, says Bauer, but an interactive platform that allows users to pull in whatever models and data are necessary to answer the questions they're interested in.
The project will be implemented gradually over the coming decade, but the first two digital twins they hope to deliver will include one aimed at anticipating extreme weather events like floods and forest fires, and another aimed at providing longer-term predictions to support climate adaptation and mitigation efforts.
While Nvidia's announcement of a new supercomputer dedicated to climate modeling is welcome, Bauer says the challenge today is more about software engineering than developing new hardware. Most of the critical models have been developed in isolation using very different approaches, so getting them to talk to each other and finding ways to interface highly disparate data streams is an outstanding problem.
"Part of the challenge is to actually hide the diversity and complexity of these components away from the user and make them work together," Bauer says.
Correction 24 Nov. 2021: An update was made to the description of machine learning’s utility for digital earths—it could be useful, the story now reads, in understanding how water moves through earth on land (not the mechanics of dirt as the original version of the story stated).
Edd Gent is a freelance science and technology writer based in Bangalore, India. His writing focuses on emerging technologies across computing, engineering, energy and bioscience. He's on Twitter at @EddytheGent and email at edd dot gent at outlook dot com. His PGP fingerprint is ABB8 6BB3 3E69 C4A7 EC91 611B 5C12 193D 5DFC C01B. DM for Signal info.