Storm forecasting for infrastructure damage mitigation
February 1, 2022
The deadliest natural disaster in U.S. history was a hurricane in the year 1900 that devastated the island of Galveston, Texas. An estimated 6,000–12,000 people were killed. Afterward, in a feat of engineering and a spirit of defiance, the city was raised by as much as 17 feet, rebuilt, and protected by a newly erected 10-mile seawall to shield Galveston’s inhabitants and infrastructure against future storms. This brute-force approach to fortification was extreme yet largely effective. More recent storms—Katrina, Sandy, Harvey, Maria—wrought similar devastation on coastal cities and cost hundreds of billions of dollars, hammering home the message that storms are relentless and it’s up to humanity to prepare for them.
Modern human society relies heavily on infrastructure—buildings, roads, power, and water supplies—that is susceptible to damage from extreme weather. More extreme weather threatens to bring more damage, so Los Alamos scientists are helping determine the best approaches to weather the coming storms.
“We are already seeing the effects of climate change,” explains Laboratory physicist Donatella Pasqualini. “Hurricanes, sea-level rise, winter storms, tornadoes—these are all increasing in frequency and intensity. How can Los Alamos, as a national security laboratory, help local and regional decision makers adapt?”
In two ways, actually: through future planning with investment in new systems and through real-time analysis for situational awareness.
An ounce of prevention
Natural systems and engineered systems coevolve. For example, the beach in Galveston—wide and inviting before the storm—quickly eroded once the seawall was built, taking with it much of the island’s beach-based tourism and compounding its post-storm financial woes. Pasqualini includes the physics of this coevolution in the computer models she builds. She and her team are looking at how coastlines evolve—how the vegetation and geomorphology might change tomorrow, next year, ten years from now—in order to determine where and when to invest money in hardening complex networked infrastructure.
“If I have an idea how a particular coastal region will be 30 years from now, and how the climate will change in that time, that can tell me how to invest today,” says Pasqualini. “It will help me decide where and how to build new power-system assets so they’ll be physically resilient, and whether or not to harden existing assets.”
Pasqualini’s physics-based computer model, the New Science for Multisectors Adaptation (NeSMA) framework, mimics the coevolution between natural and engineered systems by integrating models of the coastal environment with models of energy infrastructure. NeSMA estimates future climate and weather threats, including their statistical uncertainty, to assess the risk to the infrastructure and to inform an optimal adaptation plan. For example, if an existing power plant is deemed likely to flood under conditions that might arise every ten years, might the plant be reconfigured to avoid damage? Or can its buildings be reinforced to resist the rising water? Or should the power plant be replaced entirely by a new one located somewhere safer?
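To make that tradeoff concrete, here is a minimal sketch, in Python, of the kind of expected-damage comparison such a framework might perform. Every scenario probability, cost, and option name below is an illustrative placeholder, not a value or interface from NeSMA.

```python
# Illustrative expected-damage comparison of adaptation options for one site.
# All numbers and option names are placeholders, not NeSMA data.

HORIZON_YEARS = 30  # planning horizon for the investment decision

# Flood scenarios: annual probability and damage ($M) under each option.
SCENARIOS = [
    (0.10, {"do nothing": 50.0,  "reinforce": 10.0,  "relocate": 0.0}),   # "ten-year" flood
    (0.02, {"do nothing": 200.0, "reinforce": 60.0,  "relocate": 5.0}),   # "fifty-year" flood
    (0.01, {"do nothing": 400.0, "reinforce": 150.0, "relocate": 10.0}),  # "hundred-year" flood
]

# Up-front capital cost ($M) of each adaptation option.
CAPITAL_COST = {"do nothing": 0.0, "reinforce": 40.0, "relocate": 250.0}

def expected_total_cost(option: str) -> float:
    """Capital cost plus expected flood damage accumulated over the horizon."""
    annual_expected_damage = sum(p * damage[option] for p, damage in SCENARIOS)
    return CAPITAL_COST[option] + HORIZON_YEARS * annual_expected_damage

if __name__ == "__main__":
    for option in sorted(CAPITAL_COST, key=expected_total_cost):
        print(f"{option:12s} expected {HORIZON_YEARS}-year cost: ${expected_total_cost(option):6.1f}M")
```

With these made-up numbers, reinforcing the buildings beats both doing nothing and relocating the plant; surfacing that kind of comparison, with realistic physics and uncertainty behind the numbers, is what the framework is for.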
Asset fortification isn’t just about threats to physical buildings and brick-and-mortar solutions, though buildings can be costly to repair. Asset fortification is also about adjusting the power-generation capacity within the power plant, changing distribution routes between substations, or altering the network connections that control operations, so as to maintain a certain level of functionality or, at the very least, avoid complete destruction.
Billions of dollars are spent on infrastructure rebuilding and hardening after every weather-related disaster. But the beauty of NeSMA is that it can help direct those dollars to where they’ll be most effective. Rather than automatically hardening the critical assets that were damaged in past events, would it perhaps deliver more bang for the buck to harden different assets? Every circumstance is unique, so NeSMA lets decision makers explore à la carte solutions and choose the best ones, not just knee-jerk ones.
Nearly 40 percent of Americans, along with all the critical infrastructure that supports them, live in coastal areas. So when those facilities are damaged and the power cuts out or the heating goes off, it’s a major problem for a lot of people. Pasqualini’s goal is to help power systems and other infrastructure assets to be as resilient as possible to long-term and short-term climate change. Whereas NeSMA mainly addresses the long term, Pasqualini’s team is also modeling for the short term to provide real-time analysis and support during weather events as they unfold.
A pound of cure
Pasqualini developed an automated workflow that takes data from the National Oceanic and Atmospheric Administration (NOAA) and forecasts the intensity and location, by county, of possible power outages as well as the number of people affected. The ability to forecast outages before they occur is a powerful situational-awareness tool for local authorities facing imminent extreme weather who need to decide what to do right now.
NOAA continually collects and collates data from many different climate and weather models. For example, each path in the “spaghetti plots” that illustrate possible trajectories of an incoming hurricane comes from a unique mathematical model. NOAA collects them all, and Pasqualini’s model automatically downloads them from NOAA every 15 minutes. If the data have changed, the model updates, recalculates, and adjusts its outage forecast accordingly. This information can then be used by utility companies or government officials to decide which short-term mitigation steps to take, such as closing switches in a network, removing power from less populated areas, or isolating certain substations to prevent outage propagation to other stations.
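For illustration, that poll-and-update cycle might look something like the Python sketch below. The URL, the change check, and the forecast_outages() function are hypothetical stand-ins; the article doesn’t specify which NOAA products or interfaces the workflow actually uses.

```python
# Minimal sketch of a 15-minute poll-and-update cycle; all endpoints and
# functions here are hypothetical placeholders, not the team's workflow.
import hashlib
import time
import urllib.request

FORECAST_URL = "https://example.noaa.gov/latest-hurricane-guidance"  # placeholder URL
POLL_SECONDS = 15 * 60

def fetch_forecast() -> bytes:
    """Download the latest forecast guidance as raw bytes."""
    with urllib.request.urlopen(FORECAST_URL) as response:
        return response.read()

def forecast_outages(raw_forecast: bytes) -> None:
    """Placeholder for the outage model: recompute county-level outage estimates."""
    print(f"Recomputing outage forecast from {len(raw_forecast)} bytes of guidance...")

def main() -> None:
    last_digest = None
    while True:
        raw = fetch_forecast()
        digest = hashlib.sha256(raw).hexdigest()
        if digest != last_digest:  # only rerun the model when the data have changed
            forecast_outages(raw)
            last_digest = digest
        time.sleep(POLL_SECONDS)

if __name__ == "__main__":
    main()
```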
This real-time analysis tool uses machine learning (ML). That means the computer model is fed a bunch of historical data about real storms—tracks, gusts, sustained wind durations, etc.—as well as the conditions of the storm-hit area—population density, land cover, surface soil moisture, etc.—and it builds a predictive algorithm based on trends it detects. With each new hurricane season, the data from the previous season, including information about outages, get added to the ML training data, and the model is retrained, so it becomes more accurate every year. So far, eight years of outage data are included, along with many more years’ worth of hurricane data.
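A minimal, self-contained sketch of training this kind of tabular outage model is shown below, assuming a gradient-boosted regression model and synthetic stand-in data; the article doesn’t say which algorithm or exact feature set the team uses.

```python
# Illustrative training of an outage model on tabular storm and land-surface
# features. The features, model choice, and synthetic data are assumptions.
import numpy as np
from sklearn.ensemble import HistGradientBoostingRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 5000  # synthetic county-storm records standing in for historical data

# Features: max gust (m/s), sustained-wind duration (h), distance to track (km),
# population density (people/km^2), soil moisture (fraction), forest cover (fraction)
X = np.column_stack([
    rng.uniform(10, 70, n),
    rng.uniform(1, 48, n),
    rng.uniform(0, 300, n),
    rng.uniform(5, 2000, n),
    rng.uniform(0.05, 0.45, n),
    rng.uniform(0.0, 0.9, n),
])

# Synthetic target: fraction of customers without power (stand-in for outage records)
y = np.clip(
    0.01 * X[:, 0] + 0.002 * X[:, 1] - 0.001 * X[:, 2] + 0.2 * X[:, 5]
    + rng.normal(0, 0.05, n),
    0.0, 1.0,
)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = HistGradientBoostingRegressor(max_iter=300, random_state=0)
model.fit(X_train, y_train)  # retraining with each season's new records would repeat this step
print("mean absolute error:", mean_absolute_error(y_test, model.predict(X_test)))
```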
To improve the ML model further, Pasqualini is collaborating with scientists at the University of Connecticut to incorporate more accurate variables, such as better historical estimates of soil moisture. Historical power-outage data can also be tricky due to inconsistencies and inaccuracies. Pasqualini’s team collaborates with Los Alamos data scientist Paolo Patelli in a project to improve these data. Only once the historical data have been improved as much as possible do they get added to the ML training data.
While coastal areas are ground zero for a lot of the consequences of climate change, the principles of Pasqualini’s models are more broadly applicable. Plunging temperatures, rising rivers, and screaming winds threaten inland infrastructure just as much as hurricanes threaten the coasts. Pasqualini believes her work will help fortify all of these communities against what’s coming.
Storms and storm damage are inevitable, and so too are the impacts of Earth’s changing climate. Long-term climate change and episodic extreme-weather events present critical national security challenges. Helping the nation prepare for the future while being nimble in the present is just one way that Los Alamos scientists are rising to meet those challenges head on.