
A new era in modeling catastrophic risk

The days of one-size-fits-all disaster models are over, ushering in a push for more localized data and predictive analytics.

This article first appeared on Brink.

The 2018 hurricane season opened Monday with the arrival of Subtropical Storm Alberto on the Florida coast.

Natural disasters such as this one regularly imperil human lives and trillions of dollars' worth of infrastructure. Although we can't stop them, we can limit their financial repercussions by developing more accurate predictions based on an updated approach to modeling catastrophic risk.

The flawed assumption

Stationarity is the statistical concept that a system's properties, such as its averages and extremes, remain unchanged, or stationary, over time. Applied to climate science, it is the assumption that the earth's climate is not changing. The vast majority of climate scientists consider that assumption incorrect, and any approach built on it is fundamentally flawed.

Yet traditional catastrophic climate risk models are built on the assumption of stationarity. They project the future based on past statistics and the assumption of a static climate. Insurers actually use this approach with reasonable success for regional, national and international insurance policy portfolios.
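To make the contrast concrete, the short Python sketch below (an illustration only, not any insurer's or Jupiter's actual model) estimates a "100-year" flood level from synthetic data two ways: once pooling all years as if the climate were static, and once accounting for a linear trend and re-centering on today's conditions. The data, the assumed trend of one centimeter per year and the simple Gumbel fit are all hypothetical choices, made only to show how a stationary fit can understate present-day risk.

```python
# Minimal sketch (not Jupiter's method): why a stationarity assumption can
# understate present-day risk when the underlying climate is trending.
# Synthetic data and a simple Gumbel fit are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 70 years of annual-maximum flood heights (meters) with a
# slow upward trend, standing in for a non-stationary climate signal.
years = np.arange(1950, 2020)
trend = 0.01 * (years - years[0])            # +1 cm per year, assumed
annual_max = trend + rng.gumbel(loc=2.0, scale=0.3, size=years.size)

def gumbel_fit(x):
    """Method-of-moments Gumbel fit: returns (location, scale)."""
    scale = np.std(x) * np.sqrt(6) / np.pi
    loc = np.mean(x) - 0.5772 * scale        # Euler-Mascheroni constant
    return loc, scale

def return_level(loc, scale, period=100):
    """Flood height exceeded on average once per `period` years."""
    return loc - scale * np.log(-np.log(1 - 1 / period))

# Stationary view: pool all years as if the climate never changed.
loc_s, scale_s = gumbel_fit(annual_max)

# Trend-aware view: remove the linear trend, fit the residuals, then
# re-center the distribution on today's conditions.
slope, intercept = np.polyfit(years, annual_max, 1)
residuals = annual_max - (slope * years + intercept)
loc_r, scale_r = gumbel_fit(residuals)
loc_today = loc_r + slope * years[-1] + intercept

print(f"100-year level, stationary fit:  {return_level(loc_s, scale_s):.2f} m")
print(f"100-year level, trend-aware fit: {return_level(loc_today, scale_r):.2f} m")
```

In this toy setup, the pooled fit averages away the trend, so the level it labels a once-in-a-century event is lower than what the trend-aware fit implies for present-day conditions.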

However, when stationarity is applied to risk analyses for specific structures or large commercial properties, the model breaks down.

Localized assets

The problem is that risks to localized assets are not homogeneous across regions and properties. Localized predictions require data that accounts for the dynamics of the local environment.

Those dynamics include not only a changing climate, but human-engineered alterations as well. Simply breaking ground for a new building affects potential flooding scenarios. To accurately assess and mitigate potential risk, developers, municipalities and insurance companies need models that resolve the individual block and street and are not constrained by stationarity.

Creating a dynamic model that collects and analyzes data with such localized resolution is not a simple matter of "downscaling" old methods. It requires a different strategy and discipline, with single-site analysis as a core objective.

Risk modeling reimagined

Incorporating natural and human-architected factors in a dynamic, integrated model is fundamental to an asset-focused solution that delivers accurate, actionable information. Such a solution requires comprehensive and current data, powerful big data analytics, and a flexible design that can easily incorporate new modeling techniques as they become available.

My company, Jupiter Intelligence, provides climate change and weather event risk-prediction services. The company’s solution is built on a cloud-based platform designed specifically for the rigors of climate analysis. The Jupiter ClimateScore Intelligence Platform links data, probabilistic and scenario-based models and advanced validation in an integrated environment.

ClimateScore runs multiple models that account for a changing climate, such as the Weather Research and Forecasting (WRF) model. Porting these models to the cloud creates a massive, fast, flexible platform that links physics-based and artificial intelligence models.

ClimateScore’s models are continuously fine-tuned using petabytes of constantly refreshed data from millions of ground-based and orbital sensors. Novel machine learning techniques reduce local biases of scientific simulations and help the system continually improve as new observations become available.
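The article does not detail those machine learning techniques, so the sketch below uses empirical quantile mapping, a standard bias-correction method, purely as an illustrative stand-in: it aligns a simulation's output distribution with local observations. The synthetic rainfall data, the parameters and the quantile_map helper are all hypothetical.

```python
# Minimal sketch (an illustrative stand-in, not ClimateScore's technique):
# empirical quantile mapping, a common way to reduce a simulation's local
# bias by aligning its output distribution with station observations.
# All data here are synthetic assumptions.
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical daily rainfall: the simulation runs drier than what a
# local gauge actually recorded.
observed = rng.gamma(shape=2.0, scale=6.0, size=5000)    # mm/day, gauge
simulated = rng.gamma(shape=2.0, scale=4.0, size=5000)   # mm/day, model

def quantile_map(new_sim, sim_ref, obs_ref, n_quantiles=100):
    """Correct `new_sim` by mapping simulated quantiles onto observed ones."""
    q = np.linspace(0.0, 1.0, n_quantiles)
    sim_q = np.quantile(sim_ref, q)
    obs_q = np.quantile(obs_ref, q)
    # Interpolate each new simulated value through the quantile mapping.
    return np.interp(new_sim, sim_q, obs_q)

# Apply the correction to a fresh batch of model output.
new_run = rng.gamma(shape=2.0, scale=4.0, size=1000)
corrected = quantile_map(new_run, simulated, observed)

print(f"observed mean:  {observed.mean():6.2f} mm/day")
print(f"raw model mean: {new_run.mean():6.2f} mm/day")
print(f"corrected mean: {corrected.mean():6.2f} mm/day")
```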

Seeking the best prediction

To incorporate the necessary volume of data and stay current, ClimateScore uses alternative and proprietary data sources, including government and commercial satellites and sensors deployed on land, water, aircraft and drones.

The architecture is designed to be easily adjusted by Jupiter's Earth and Ocean Systems experts to provide the best possible predictions with known information, even as the science and observations evolve.

Forgoing stationarity and combining the flexibility of a cloud platform, current data from multiple sources, and state-of-the-art analytics, machine learning and artificial intelligence produces asset-level predictions that are accurate from two hours to 50 years in the future.

Case study: Miami

Localized data, down to the individual block and street, shows how developed Miami's coast has become and can help insurance companies, municipalities and developers assess potential risk and determine cost-effective solutions.

Engineering firms need this data to evaluate the potential effects of flooding at a particular site and to simulate how effectively individual coastal protection measures shield properties and neighborhoods from flooding over the life of those structures.

Public agencies also need this granularity to figure out how vulnerable their assets (ports, airports, transit, wastewater treatment and drinking water facilities) are to a changing climate.

Similarly, private entities want to assess exposed assets (substations, buildings, generators and data centers) and critical systems that may need to be redesigned to handle changing conditions. One critical condition to evaluate is the capacity of the electrical grid to handle peak demand during longer and more intense heat waves.

New risk-transfer mechanisms

Stationarity-based catastrophic risk models were never intended to assess risks to specific assets. To mitigate asset-level risk, every part of the private sector, as well as government bodies at the international, national and local levels, must make informed decisions based on accurate, current, highly localized data.

Property values, liability risk and lives are at stake. With dynamic models, current data and modern analytics, mitigating risk is feasible. This type of information resource will also support new risk-transfer mechanisms, including private insurance, and help reform obsolete mitigation strategies.
