
MIT-derived algorithm helps predict frequency of extreme weather events

To assess a community's risk of extreme weather events, policymakers rely first on global climate models, which can be run decades and even centuries ahead, but only at coarse resolution. For example, these models can be used to estimate future climate conditions for the northeastern United States, but not specifically for Boston.

To estimate Boston's future risk of extreme weather events such as flooding, policymakers can combine a coarse model's large-scale predictions with a finer-resolution model tuned to estimate how often Boston is likely to experience damaging floods as the climate warms. But this risk analysis is only as accurate as the predictions of that first, coarser climate model.

“If you get the large-scale environment wrong, you miss everything that determines what extreme events will look like at smaller scales, such as over individual cities,” says Themistoklis Sapsis, the William I. Koch Professor and director of the Center for Ocean Engineering in the Department of Mechanical Engineering at MIT.

Sapsis and his colleagues have now developed a way to “correct” the predictions from coarse climate models. By combining machine learning with dynamical systems theory, the team's approach “nudges” a climate model's simulations into more realistic patterns over large scales. When paired with smaller-scale models that predict specific weather events, such as tropical cyclones or floods, the approach produced more accurate predictions of how often certain locations will be affected by those events over the next few decades, compared with forecasts made without the correction scheme.

According to Sapsis, the new correction scheme is general and can be applied to any global climate model. Once corrected, the models can help determine where and how often extreme weather events will strike as global temperatures rise in the coming years.

“Climate change will impact every aspect of human life, and every type of life on the planet, from biodiversity to food security to the economy,” says Sapsis. “If we are able to know exactly how extreme weather will change, particularly in specific locations, it can make a big difference in terms of preparation and developing the right technology and solutions. This is the method that can pave the way there.”

The team's results appear today in the . MIT co-authors of the study include postdoc Benedikt Barthel Sorensen and Alexis-Tzianni Charalampoulos SM '19, PhD '23, together with Shixuan Zhang, Bryce Harrop, and Ruby Leung of the Pacific Northwest National Laboratory in Washington state.

Under the hood

Today's large-scale climate models simulate weather features such as average temperature, humidity, and precipitation around the world on a grid-by-grid basis. Running simulations of these models requires enormous computing power, and to simulate how weather features interact and evolve over periods of decades or longer, the models average features over grid cells roughly 100 kilometers across.

“It is a very demanding computation that requires supercomputers,” notes Sapsis. “But these models still do not resolve very important processes, such as clouds or storms, that occur at smaller scales of a kilometer or less.”

To improve the resolution of these coarse climate models, scientists have typically tried to correct a model's underlying dynamical equations, which describe how phenomena in the atmosphere and oceans should physically interact.

“People have tried to dissect climate model codes that have been developed over the last 20 to 30 years, which is a nightmare, because you can lose a lot of stability in your simulation,” explains Sapsis. “What we are doing is a completely different approach, in that we are not trying to correct the equations, but instead correct the output of the model.”

The team's new approach takes a model's output, or simulation, and overlays it with an algorithm that nudges the simulation toward something that more closely reflects real-world conditions. The algorithm is based on a machine-learning scheme that takes in data, such as historical information on temperature and humidity around the world, and learns relationships within the data that represent fundamental dynamics among weather features. The algorithm then uses these learned associations to correct the model's predictions.
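As a rough illustration of the idea of correcting a model's output rather than its equations, the sketch below learns a simple mapping from a coarse model's fields to reference "observations," then applies that mapping to a new, unseen simulation. The linear ridge-regression operator, synthetic data, and variable names are assumptions made for illustration; the actual method combines machine learning with dynamical systems theory and is not spelled out in this article.

```python
# Minimal sketch of an output-correction ("nudging") operator: learn a map from
# coarse-model fields to observation-like fields, then apply it to new output.
# The linear operator and synthetic data are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: rows are time snapshots, columns are grid points.
n_train, n_grid = 500, 64
truth = rng.standard_normal((n_train, n_grid))                   # "observed" fields
model_out = 0.8 * truth + 0.3 * rng.standard_normal(truth.shape) # biased coarse model

# Fit a linear correction operator W minimizing ||model_out @ W - truth||^2 + ridge penalty.
ridge = 1e-2
A = model_out.T @ model_out + ridge * np.eye(n_grid)
W = np.linalg.solve(A, model_out.T @ truth)

# Apply the learned correction to a new, unseen simulation.
new_sim = 0.8 * rng.standard_normal((100, n_grid))
corrected = new_sim @ W
print("correction operator:", W.shape, "corrected output:", corrected.shape)
```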

“We are trying to correct the dynamics, as in, what an extreme weather feature, such as the wind speeds during a Hurricane Sandy event, would look like in the coarse model versus in reality,” says Sapsis. “The method learns dynamics, and dynamics is universal. The correct dynamics ultimately lead to correct statistics, for example the frequency of rare extreme events.”

Climate correction

As a first test of their new approach, the team used the machine-learning scheme to correct simulations produced by the Energy Exascale Earth System Model (E3SM), a U.S. Department of Energy climate model that simulates climate patterns around the world at a resolution of 110 kilometers. The researchers used eight years of historical data on temperature, humidity, and wind speed to train their new algorithm, which learned dynamical relationships between the measured weather features and the E3SM model. They then ran the climate model forward for about 36 years and applied the trained algorithm to the model's simulations. They found that the corrected version produced climate patterns that more closely matched real-world observations from the last 36 years, which were not used for training.
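A hedged sketch of the kind of held-out evaluation described here: fit a correction on an early slice of data, apply it to a later, unseen period, and compare how often an extreme threshold is exceeded against reference observations. The data split, threshold, and simple mean/variance rescaling below are illustrative assumptions, not the study's actual protocol.

```python
# Sketch of a held-out evaluation: train a simple bias correction on ~8 years,
# apply it to a later period, and compare the frequency of extreme exceedances.
# Split, threshold, and rescaling are illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(1)

# Synthetic daily temperatures (deg F): "observations" and a biased, damped model.
days = 365 * 44
obs = 85 + 10 * rng.standard_normal(days)
model = 80 + 7 * rng.standard_normal(days)   # too cool, too little variability

train = slice(0, 365 * 8)      # ~8 years used to fit the correction
test = slice(365 * 8, None)    # held-out period for evaluation

# Fit a mean/variance rescaling on the training slice only.
scale = obs[train].std() / model[train].std()
shift = obs[train].mean() - scale * model[train].mean()
corrected = scale * model[test] + shift

threshold = 105.0  # extreme-heat threshold (deg F), chosen for illustration
for name, series in [("raw model", model[test]), ("corrected", corrected), ("observed", obs[test])]:
    print(f"{name:10s}: fraction of days above {threshold} F = {np.mean(series > threshold):.4f}")
```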

“We are not talking about big differences in absolute numbers,” says Sapsis. “An extreme event in the uncorrected simulation might be 105 degrees Fahrenheit, versus 115 degrees with our corrections. But for people experiencing it, that is a big difference.”

When the team then paired the corrected coarse model with a specific, finer-resolution model of tropical cyclones, they found that the approach accurately reproduced the frequency of extreme storms at specific locations around the world.

“We now have a coarse model that can get you the right frequency of events for the present climate. It has gotten much better,” says Sapsis. “Once we correct the dynamics, this is a relevant correction even when the global average temperature is different, and it can be used to understand what wildfires, floods, and heat waves will look like in a future climate. Our ongoing work is focused on analyzing future climate scenarios.”

“The results are particularly impressive because the method shows promising results for E3SM, a state-of-the-art climate model,” says Pedram Hassanzadeh, an associate professor who leads the Climate Extremes Theory and Data group at the University of Chicago and was not involved in the study. “It would be interesting to see what climate change projections this framework yields once future greenhouse gas emission scenarios are incorporated.”

This work was supported, in part, by the U.S. Defense Advanced Research Projects Agency.
