Google researchers have built an artificial intelligence they say can predict weather and climate patterns as well as current physical models while requiring less computer power.
Existing forecasts are based on mathematical models of the atmosphere's physics, run on powerful supercomputers to project how conditions will evolve. Since their first use in the 1950s, these models have become increasingly detailed, requiring more and more computing power.
Several projects aim to replace these computationally intensive tasks with less demanding artificial intelligence, including a DeepMind tool for predicting local rainfall over short periods. But like most AI models, these tools are “black boxes” whose inner workings are a mystery, and the inability to explain or replicate their methods is a problem. Climate scientists also point out that models trained on historical data will struggle to predict unprecedented phenomena driven by climate change.
Now, Dmitrii Kochkov and his colleagues at Google Research in California have created a model called NeuralGCM that they believe strikes a balance between the two approaches.
Typical climate models divide the Earth's surface into grid cells up to 100 kilometers wide; limits on computing power make simulations at higher resolution impractical. Phenomena such as clouds, air turbulence and convection within these cells are only approximated by computer code that is continually adjusted to better match observations. This approach, called parametrization, aims to capture at least some of the small-scale behaviour that the broader physical model cannot resolve.
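The sketch below is a toy illustration of that idea only; it assumes nothing about NeuralGCM's actual code, and the field, grid size and tunable constant are all invented. A fine-scale field is block-averaged into coarse grid cells, discarding sub-grid detail, and a stand-in parametrization then estimates the effect of that lost detail from the coarse state alone.

```python
import numpy as np

# Toy illustration of sub-grid parametrization (not NeuralGCM's code).
# A "true" fine-scale field, e.g. moisture, with detail the coarse model
# cannot represent.
fine = np.random.default_rng(0).random((400, 400))

def coarsen(field, factor):
    """Block-average a 2-D field into coarse grid cells of size factor x factor."""
    h, w = field.shape
    return field.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

# The coarse model only sees big cells: all sub-grid structure is averaged away.
coarse = coarsen(fine, 100)  # shape (4, 4)

# A parametrization estimates the net effect of the missing detail from the
# coarse state alone. Here a crude linear rule with an invented constant;
# classical models hand-tune such constants, NeuralGCM learns the mapping.
ALPHA = 0.1
def parametrized_tendency(coarse_field):
    return ALPHA * (coarse_field - coarse_field.mean())

print(coarse.shape)                         # (4, 4)
print(parametrized_tendency(coarse).shape)  # (4, 4)
```

The point of the sketch is the division of labour: the resolved equations handle the coarse grid, while the parametrization (hand-tuned or learned) supplies what the grid cannot see.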
NeuralGCM is trained to take over this small-scale approximation, reducing computational cost while improving accuracy. The model can work through 70,000 days of simulation in 24 hours on a single chip called a tensor processing unit (TPU), the researchers report in a paper. In comparison, a competing model called X-SHiELD, running on a supercomputer with thousands of processing units, gets through just 19 simulated days in the same time.
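As a back-of-envelope check on those figures (assuming both numbers refer to simulated days per 24 hours of wall-clock time, as reported):

```python
# Throughput figures quoted in the article, in simulated days per
# wall-clock day.
neuralgcm_days_per_day = 70_000  # one TPU
xshield_days_per_day = 19        # supercomputer with thousands of processors

speedup = neuralgcm_days_per_day / xshield_days_per_day
print(f"~{speedup:.0f}x more simulated days per wall-clock day")  # ~3684x
```

That ratio is what makes long climate-scale runs, not just short weather forecasts, plausible on modest hardware.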
The paper also claims that NeuralGCM's prediction accuracy is comparable to, and sometimes better than, that of best-in-class models. Google did not respond to New Scientist's interview request.
Tim Palmer of the University of Oxford said the research is an interesting attempt to find a third way between pure physics and opaque artificial intelligence approximations. “I'm uncomfortable with the idea that we're completely abandoning the equations of motion and just moving to some artificial intelligence system that even the experts would say they don't really fully understand,” he said.
This hybrid approach is likely to spark further debate and research within the modeling community, he said, but only time will tell whether it will be adopted by modellers around the world. “This is a good step in the right direction and this is the type of research we should be doing. It's great to see all these alternatives on the table.”