Climate is one of the most complex problems we face, and it takes a very powerful computer to model the Earth system and its future with any real granularity.
I recently listened to Mike Pritchard, director of research at Nvidia and professor at UC Irvine, talk about this process and how it works.
“Physics spans 10 orders of magnitude in space and time,” he said, citing research problems such as determining whether a cloud particle favors the water vapor that gathers around it.
“If you want to simulate the planet hundreds of times, to sample many scenarios of the future, unfortunately you won’t be able, even with the most powerful supercomputers, to do justice to all that complexity,” he said. “Meanwhile, humanity’s questions about future climate are too vast for simulation technology.”
For a real-life example of looking at the issues head-on, he talked about driving from San Diego to Irvine and seeing a particular type of cloud out the window.
“It looks like a gray band on the horizon,” he said. “We call this the marine layer. If this happens one day at the beach, you are disappointed because it makes you cold. But what matters is that it’s the edge of a huge patch of low clouds that you’ll see out your window halfway through the flight from San Diego to Hawaii, and that cloud reflects a lot of energy away from the planet, keeping it cooler than it otherwise would be. So if it dissipates…it will amplify global warming…but if it thickens, which it might, it will mitigate it. And that represents billions of dollars of uncertainty. And it’s a simulation problem. We know that these clouds require very high resolution to simulate, which we cannot yet afford to deploy in climate simulation.”
A cavalcade of systems
Pritchard also used the word “ensemble,” which in machine learning often refers to using multiple models at once, or crowdsourcing results from different LLMs, but which has a different meaning in weather forecasting.
“We don’t predict one hurricane,” he said, “we predict hundreds of hurricanes. You hope for the best and plan for the worst… Atmospheric scientists at the University of Washington use these AI weather models, which were trained on the messiness of the real atmosphere, which is very noisy and messy, and then, after the fact, survey them and ask them whether they learned physics along the way.”
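The ensemble idea Pritchard describes can be illustrated with a toy sketch: run the same model many times from slightly perturbed initial conditions, and the spread of the resulting forecasts measures your uncertainty. The snippet below is purely illustrative (it uses the Lorenz-63 system as a stand-in "atmosphere," not any of the models discussed here):

```python
import numpy as np

def lorenz_step(state, dt=0.01, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """One forward-Euler step of the chaotic Lorenz-63 system (toy 'atmosphere')."""
    x, y, z = state[..., 0], state[..., 1], state[..., 2]
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return state + dt * np.stack([dx, dy, dz], axis=-1)

rng = np.random.default_rng(0)
n_members = 100                     # "we predict hundreds of hurricanes"
base = np.array([1.0, 1.0, 1.0])    # best estimate of today's state
# Each ensemble member starts from a tiny perturbation of the same analysis.
ensemble = base + 1e-3 * rng.standard_normal((n_members, 3))

for _ in range(2000):               # integrate every member forward in time
    ensemble = lorenz_step(ensemble)

mean = ensemble.mean(axis=0)        # ensemble-mean forecast
spread = ensemble.std(axis=0)       # forecast uncertainty per variable
print("ensemble mean:  ", mean)
print("ensemble spread:", spread)
```

Because the system is chaotic, the initially tiny perturbations grow until the members diverge across the attractor; the spread is what lets forecasters "hope for the best and plan for the worst."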
Pritchard explained how this works with the technology and pointed to a growing archive of evidence for the ability of AI models to help us with weather forecasting.
New software and technologies
As an example, Pritchard mentioned the capabilities of NVIDIA’s AI tools, such as Modulus and Earth2Studio, which enable the research, development and validation of such AI forecasting models.
The company, which has risen to the top of the pack in the US stock market, has many research lines and collaborations with the atmospheric science community. The work is published as open source, and here are some of the flagship models:
StormCast – a generative AI model that emulates atmospheric dynamics, examines mesoscale weather phenomena and makes predictions (paper).
CorrDiff – another generative AI model that creates high-resolution weather forecasts (paper). You can learn more about AI-based downscaling using the pre-trained CorrDiff model here.
FourCastNet – this model provides global weather forecasts at 25 km resolution using Spherical Fourier neural operators, recently calibrated for huge ensembles. You can learn more about medium-range global forecasts using the pre-trained FourCastNet model here.
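Global AI forecast models of this kind are typically applied autoregressively: the network maps the gridded atmospheric state at time t to the state one step (e.g. six hours) later, and its output is fed back in to roll the forecast further into the future. Here is a minimal sketch of that rollout loop; the `forecast_step` function is a hypothetical stand-in for a trained network (the actual Earth2Studio and FourCastNet APIs differ):

```python
import numpy as np

def forecast_step(state: np.ndarray) -> np.ndarray:
    """Stand-in for a trained forecast network.

    A real model maps a (channels, lat, lon) atmospheric state at time t
    to the state at t + 6 h; here we apply a fixed damping and shift
    purely so the loop is runnable.
    """
    return 0.99 * np.roll(state, shift=1, axis=-1)

# Toy ~1-degree grid; 25 km-class models use much finer grids (e.g. 721 x 1440).
n_channels, n_lat, n_lon = 4, 181, 360
state = np.random.default_rng(0).standard_normal((n_channels, n_lat, n_lon))

trajectory = [state]
for step in range(20):              # 20 steps x 6 h = a 5-day forecast
    state = forecast_step(state)    # feed the output back in (autoregression)
    trajectory.append(state)

forecast = np.stack(trajectory)     # (time, channels, lat, lon)
print(forecast.shape)               # (21, 4, 181, 360)
```

The appeal of this design is speed: once trained, each step is a single network evaluation, so rolling out hundreds of ensemble members is vastly cheaper than running a conventional numerical simulation for each one.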
Earth-2 Platform is a digital twin cloud platform that helps businesses leverage advances in AI and accelerate traditional digital simulations to reduce the computational bottleneck of climate and weather simulations. By combining these advances with advances in computer graphics like RTX rendering technology, we can create digital twins of Earth’s climate and weather to help scientists explore, analyze and explain the complexity of weather phenomena, particularly in the context of climate change.
Learn more about climate work
Pritchard also spoke about promoting optimal dispersion in large-scale AI weather forecasting and referenced new papers that provide more detail on the emerging science of using AI to simulate low-probability, high-impact climate extremes. He says this will give climate risk modelers new tools to help us understand and protect against extreme weather events.
Back and forth
Here’s another aspect of what Pritchard talked about in terms of useful AI models. He described traditional climate computing processes as “turning to an Oracle”: large simulators create large data sets, he suggested, which users should then leverage to help illuminate scenarios and questions about the future climate. The AI’s predictions, he added, can run forwards and backwards, which should help users more easily understand what might have changed, given a different initial input.
“We could be moving into a future in which we can more easily understand our influence on the future, without having to deal with all the bottlenecks of conventional simulation,” he said.
The power of twins
In conclusion, Pritchard also discussed the idea of digital twinning applied to the most important element of our world: the world itself.
“I think the really important paradigms of interactivity are chains and cascades of AI digital twins… So you can imagine a future that evolves towards AI digital twins of climate, coupled with AI digital twins of extreme weather events.”
With a nod to current research and what everyone is doing around this very complex problem, Pritchard gives us food for thought about how to address the climate of our time with technology that goes well beyond the big datasets. Stay tuned for more on what came out of recent events on AI and the planet here in Boston.