New Algorithm Promises to Slash Climate Model Simulation Times and Improve Accuracy

Climate models are intricate software systems simulating various aspects of Earth’s systems, such as its atmosphere and oceans. Developed and refined over decades by countless scientists, these models can comprise millions of lines of code – tens of thousands of printed pages – making them computationally expensive. Running these simulations can take months, consuming vast amounts of supercomputing energy.

However, a novel algorithm shows promise in accelerating a key stage of climate model simulations by up to tenfold, a potentially significant boost in the fight against climate change. The extended computing time needed for climate modeling stems from the inherently slow nature of certain simulated processes, such as ocean circulation. It takes thousands of years for water to complete a full circulation cycle, from the surface to the deep ocean and back, compared to the atmosphere’s mixing time of mere weeks.

Scientists have recognized this challenge since the inception of climate models in the 1970s and have employed a technique called ‘spin-up’ to initialize their models with conditions representative of the pre-industrial era. This spin-up process involves running the model until it settles into an equilibrium, a state in which key quantities no longer change significantly from one simulated year to the next. Establishing a stable starting point with minimal drift is crucial for accurately simulating the effects of human-induced climate change. However, the process can take months, even on powerful supercomputers.
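
To make this concrete, here is a minimal sketch in Python of spin-up as a fixed-point iteration. The ‘model’ is a toy two-variable linear map invented purely for illustration (nothing here comes from a real climate model), chosen so that it drifts toward equilibrium as slowly as the deep ocean does: the same yearly step is applied over and over until the year-to-year change becomes negligible.

```python
import numpy as np

def step_one_year(state, forcing):
    """Toy stand-in for one simulated year of an ocean model.

    A real model integrates fluid dynamics; this fixed linear map simply
    relaxes toward equilibrium very slowly, mimicking the ocean's
    multi-millennial adjustment timescale.
    """
    A = np.array([[0.999, 0.001],
                  [0.002, 0.997]])  # eigenvalues just below 1, hence slow drift
    return A @ state + forcing

def spin_up(state, forcing, tol=1e-8, max_years=100_000):
    """Repeat the yearly step until the state stops drifting (equilibrium)."""
    for year in range(max_years):
        new_state = step_one_year(state, forcing)
        drift = np.max(np.abs(new_state - state))
        state = new_state
        if drift < tol:  # negligible year-to-year change: the model is spun up
            return state, year
    return state, max_years

equilibrium, years = spin_up(np.zeros(2), np.array([1.0, 0.5]))
print(f"spun up after {years} simulated years")
```

Run as written, the toy needs tens of thousands of simulated ‘years’ to settle, which is precisely the behaviour that makes spin-up so expensive for real ocean models.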

Increasing supercomputing power alone would not alleviate this bottleneck. Modern supercomputers consist of thousands of individual computer chips, each equipped with dozens of processing units or cores, interconnected by high-speed networks. The most powerful machines comprise hundreds of thousands of cores and can perform many quadrillions of arithmetic operations per second. However, any single simulation typically utilizes only a fraction of this computing power.

Climate models exploit this architecture by dividing the Earth’s surface into smaller regions or subdomains, with each region’s calculations performed simultaneously on different CPUs. While more subdomains generally lead to faster computation, the need for inter-chip communication to exchange information creates a ‘bandwidth limitation.’ This limitation arises because the speed of data transfer between chips is significantly slower than the speed of arithmetic calculations performed by modern chips.
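
The trade-off is easy to see with a rough back-of-the-envelope sketch (the grid size and subdomain counts below are made up for illustration). As the globe is carved into more, smaller subdomains, the boundary ‘halo’ each one must exchange with its neighbours grows relative to the interior it computes on, so communication claims an ever larger share of every time step.

```python
import math

# Hypothetical global ocean grid (not from the article): points in longitude x latitude.
NX, NY = 3600, 1800  # roughly 0.1-degree resolution

for n_subdomains in (64, 1024, 16384):
    side = math.isqrt(n_subdomains)          # carve the grid into side x side tiles
    sub_nx, sub_ny = NX // side, NY // side  # grid points handled by each subdomain
    halo = 2 * (sub_nx + sub_ny)             # one-cell-wide boundary exchanged with neighbours
    interior = sub_nx * sub_ny               # points computed locally, no communication needed
    print(f"{n_subdomains:6d} subdomains: {sub_nx} x {sub_ny} points each, "
          f"halo/interior = {halo / interior:.3f}")
```

Because moving a byte between chips is far slower than performing an arithmetic operation on one, that growing ratio is what eventually caps the speed-up gained by adding more cores.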

The newly developed algorithm addresses this challenge, dramatically reducing the spin-up time of ocean and other Earth system model components. Tests on typical climate models revealed that the algorithm was approximately ten times faster than conventional approaches, reducing the spin-up time from months to weeks.

These time and energy savings will not only accelerate climate research but also enable scientists to calibrate their models against real-world observations, enhancing their accuracy and refining uncertainty estimates in climate projections. Additionally, faster spin-up times will facilitate simulations with finer spatial resolution, allowing models to capture critical ocean phenomena occurring at scales of tens of meters to a few kilometers.

The new algorithm is rooted in an old idea known as ‘sequence acceleration,’ first explored centuries ago by the Swiss mathematician Leonhard Euler. The technique uses the earlier members of a slowly converging sequence to extrapolate to a ‘better’ estimate of where the sequence is heading, and it finds widespread application in chemistry and materials science for calculating atomic and molecular structures.
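
One simple numerical form of Euler’s idea, sketched below on a textbook example rather than anything climate-related, is to take the partial sums of a slowly converging alternating series and repeatedly average neighbouring entries; the averaged sequence homes in on the answer far faster than the raw one.

```python
import numpy as np

# Partial sums of the slowly converging series ln(2) = 1 - 1/2 + 1/3 - 1/4 + ...
N = 12
terms = np.array([(-1.0) ** n / (n + 1) for n in range(N)])
partial_sums = np.cumsum(terms)

# Euler-style acceleration: repeatedly average neighbouring partial sums,
# using the sequence's own history to build a faster-converging one.
accelerated = partial_sums.copy()
while len(accelerated) > 1:
    accelerated = 0.5 * (accelerated[:-1] + accelerated[1:])

print(f"true value                : {np.log(2):.8f}")
print(f"sum of 12 raw terms       : {partial_sums[-1]:.8f}")
print(f"same 12 terms, accelerated: {accelerated[0]:.8f}")
```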

In the 1960s, the Harvard mathematician D.G. Anderson introduced a way to combine multiple previous outputs of an iteration into a single new input, significantly reducing the number of iterations required to reach a final solution. Applying this scheme to the climate model spin-up problem yielded a tenfold reduction in computation time.
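
Below is a minimal sketch of that scheme, now commonly known as Anderson acceleration (or Anderson mixing): a generic Python routine that wraps any fixed-point iteration x ← G(x), keeps a short history of recent inputs and outputs, and blends them through a small least-squares fit to form the next guess. The toy G in the demo is invented for illustration; in the climate setting, G would be the model’s own yearly spin-up step rather than anything shown here.

```python
import numpy as np

def anderson_accelerate(G, x0, memory=5, tol=1e-8, max_iter=500):
    """Accelerate the fixed-point iteration x <- G(x) with Anderson mixing.

    Rather than feeding only the latest output back in, combine the last few
    inputs and outputs, with weights chosen by a least-squares fit to the
    recent residuals, to jump much closer to the fixed point.
    """
    xs = [np.asarray(x0, dtype=float)]  # history of inputs
    gs = [G(xs[0])]                     # history of outputs
    for k in range(max_iter):
        f = gs[-1] - xs[-1]             # residual of the latest iterate
        if np.linalg.norm(f) < tol:
            return xs[-1], k
        m = min(memory, len(xs) - 1)
        if m == 0:
            x_new = gs[-1]              # plain fixed-point step to get started
        else:
            # Differences of residuals and of outputs across the stored history.
            dF = np.column_stack(
                [(gs[-m + i] - xs[-m + i]) - (gs[-m + i - 1] - xs[-m + i - 1])
                 for i in range(m)])
            dG = np.column_stack([gs[-m + i] - gs[-m + i - 1] for i in range(m)])
            gamma, *_ = np.linalg.lstsq(dF, f, rcond=None)
            x_new = gs[-1] - dG @ gamma  # combined, accelerated update
        xs.append(x_new)
        gs.append(G(x_new))
    return xs[-1], max_iter

# Demo: a slowly converging linear fixed-point map, standing in for a yearly
# ocean step. Plain iteration needs tens of thousands of steps to settle;
# the accelerated iteration gets there in a handful.
A = np.array([[0.999, 0.001], [0.002, 0.997]])
b = np.array([1.0, 0.5])
solution, n_iter = anderson_accelerate(lambda x: A @ x + b, np.zeros(2))
print(f"converged after {n_iter} accelerated iterations")
```

The extra least-squares step is tiny compared with a pass through the model itself, so essentially all of the saving comes from needing far fewer of those passes.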

While developing a new algorithm is a significant achievement, ensuring its adoption by the scientific community is equally important. Encouragingly, the UK Met Office and other climate modeling centers are already evaluating the algorithm.

The next major assessment report from the Intergovernmental Panel on Climate Change (IPCC) is scheduled for 2029. While this may seem distant, the extensive time required for model development and simulation means that preparations are already underway. The simulations that will form the foundation of this report are coordinated by an international collaboration known as the Coupled Model Intercomparison Project (CMIP). It is thrilling to envision that the new algorithm and software could contribute to these efforts and advance our understanding of climate change and its impacts.
