Ten years after Katrina — and still learning

In the decade since Hurricane Katrina, the costliest natural disaster on record, much has changed in the industry's understanding of natural disasters and the models used to price the associated risk. Two risk modelling specialists describe some of the advances made in recent years in wind and earthquake modelling.

1. Understanding storm surge

A large percentage of the losses from Hurricane Katrina were ultimately attributed to storm surge. Since then, the industry has invested heavily in getting a better handle on this previously little understood phenomenon, as Dr Sylvie Lorsolo, atmospheric scientist at AIR Worldwide, explains.

To provide the best assessment of risk in today's marketplace, AIR developed a storm surge module that incorporates state-of-the-art modelling techniques. The new storm surge module, part of the June 2015 US Hurricane Model update, uses the latest research on storm surge hazard to assess risk accurately at high resolution.

It does this by integrating storm parameters with high-resolution elevation data to simulate location-specific storm surge inundation depth and extent. It uses a 30-metre national elevation dataset developed by the US Geological Survey (USGS) and leverages a highly customised and optimised variation of the National Oceanic and Atmospheric Administration’s well-established Sea, Lake, and Overland Surges from Hurricanes (SLOSH) model.
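The core idea of combining a simulated water-surface elevation with terrain data can be sketched very simply. The following is a minimal, hypothetical illustration, not AIR's actual module: it assumes surge and ground elevations are given in metres above a common datum for cells of the same grid, and the function name and data layout are invented for this example.

```python
# Hypothetical sketch: inundation depth as simulated surge water-surface
# elevation minus ground elevation, floored at zero for dry cells.
# Neither the function name nor the data layout comes from AIR's module.

def inundation_depth(surge_elevation, ground_elevation):
    """Return flood depth per grid cell (metres); dry cells get 0.0."""
    return [
        max(surge - ground, 0.0)
        for surge, ground in zip(surge_elevation, ground_elevation)
    ]

# Example: a four-cell transect from the shoreline inland.
surge = [3.0, 3.0, 2.5, 2.5]   # simulated water-surface elevation (m)
ground = [1.0, 2.5, 3.5, 2.0]  # terrain elevation from a DEM (m)
print(inundation_depth(surge, ground))  # prints [2.0, 0.5, 0.0, 0.5]
```

The real module resolves far more physics (hydrodynamics, tides, defences), but this differencing of water surface against a high-resolution elevation dataset is why the 30-metre USGS terrain data matters: depth, and therefore damage, is driven by local elevation.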

The module incorporates regional and seasonal data on tide heights. It also contains the most up-to-date data on levees, seawalls, floodgates, pump systems, and other mitigating structures and equipment—including the most recent US Army Corps of Engineers levee information available for New Orleans.

The update also includes a new catalogue based on the revised Atlantic Hurricane database (HURDAT2) and the latest reanalysis data, newly published land use/land cover information from USGS, explicit treatment of square footage for high-value homes, and enhancements to manufactured home vulnerability.

The new storm surge damage functions use a component-based approach, reflecting the contribution of different building components to the building’s replacement value. AIR’s approach allows the model to provide robust estimates of damage at different levels of inundation, which can vary dramatically depending on building characteristics.

The model estimates physical damage and time-element losses using damage functions that reflect local building codes and regional design practices, as well as damage survey findings, claims analyses and engineering research.
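The component-based approach described above can be illustrated with a toy example. Everything below is invented for illustration, not AIR's actual damage functions: the component names, their shares of replacement value, and the step-wise damage ratios are all assumptions.

```python
# Hypothetical sketch of a component-based damage function: total building
# damage is the replacement-value-weighted sum of per-component damage
# ratios at a given inundation depth. All names and numbers are illustrative.

COMPONENTS = {
    # name: (share of replacement value, [(depth threshold in m, damage ratio)])
    "foundation": (0.15, [(0.5, 0.05), (2.0, 0.20)]),
    "structure":  (0.45, [(0.5, 0.10), (2.0, 0.50)]),
    "interior":   (0.30, [(0.1, 0.30), (1.0, 0.90)]),
    "roof":       (0.10, [(3.0, 0.40)]),
}

def component_damage(curve, depth):
    """Damage ratio for one component: highest threshold not exceeding depth."""
    ratio = 0.0
    for threshold, r in curve:
        if depth >= threshold:
            ratio = r
    return ratio

def building_damage_ratio(depth):
    """Value-weighted mean damage ratio across all components."""
    return sum(
        share * component_damage(curve, depth)
        for share, curve in COMPONENTS.values()
    )
```

Under these illustrative curves, half a metre of water already produces meaningful damage because interior finishes (a large share of value, damaged at shallow depths) dominate, while the roof contributes nothing until very deep inundation. That interplay between component value share and depth sensitivity is what lets a component-based model vary dramatically with building characteristics.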

The vulnerability module incorporates the latest observational data on the impact of square footage on wind losses for large, high-value homes and important updates that reflect the latest findings on the vulnerability of manufactured homes.

Large, high-value homes generally exhibit high-quality construction and maintenance. They may feature complex architecture with elaborate roof geometries containing multiple gable ends and corners. The combination of high-quality construction and lower wind speeds generally reduces the wind vulnerability of large homes.

At the other end of the spectrum from large, high-value homes are so-called manufactured (prefabricated) homes. AIR engineers completed an investigation of manufactured home vulnerability that included the effects of the year of manufacture and where they are sited, as well as environmental deterioration.

Engineers investigated the impact of home size on wind damage using computational fluid dynamics and have incorporated their findings into the updated model.

The highly granular estimates produced by the model can help the insurance industry better understand risk from hurricane wind and storm surge, supporting improved risk selection, portfolio management, and risk transfer decisions.

Understanding the potentially devastating impact of storm surge will allow companies to plan effectively for this serious risk. With the new hydrodynamic storm surge module in AIR's US Hurricane Model, companies will have a greatly refined and thoroughly validated view of storm surge loss potential, supported by broad, accurate and reliable storm surge information.

Canadian storms

AIR has also turned its attention to Canada with the July release of its Winter Storm and Tropical Cyclone models, alongside comprehensive updates to the hazard and engineering components of the Canada Severe Thunderstorm Model. Together, these models provide a more complete view of atmospheric peril risk and account for insurance policy conditions specific to Canada.

Updates to the AIR Severe Thunderstorm Model account for losses incurred by straight-line winds, hail, and tornadoes on insured properties. The model simulates daily convective storm activity, allowing it to better capture both small and large loss events. The new Tropical Cyclone Model for Canada captures the effects of damaging winds on insured properties in seven Canadian provinces.

Damage functions for the Canada Severe Thunderstorm, Winter Storm and Tropical Cyclone models are informed by a comprehensive study of the evolution of building codes across Canada and leverage the latest research from the Boundary Layer Wind Tunnel at Western University in Ontario; the Wind Engineering, Energy, and Environment Research Institute (WindEEE); the tornado simulator at Iowa State University; the Insurance Research Lab for Better Homes and the Insurance Institute for Business & Home Safety.

In addition to residential, commercial and auto lines of business, damage functions have been developed for large, complex industrial facilities using a component-based engineering approach. 


2. Learning from losses

The industry has enjoyed two relatively benign years in terms of earthquake losses: payouts totalled $313 million in 2014 and $45 million in 2013, according to data from Swiss Re. But only four years ago, in 2011, the industry paid out $45 billion in earthquake losses following devastating events in New Zealand and Japan. Maiclaire Bolton, senior product manager at CoreLogic, gives an update on earthquake modelling techniques.

We are in exciting times right now from a cat modelling perspective. There have been devastating earthquakes recently and, while they are tragic in terms of human casualties, we have also learned a lot about how they occur, how they generate ground motion and how buildings react.

The US Geological Survey (USGS) leads the development of the National Seismic Hazard Mapping Project (NSHMP). These maps are primarily developed for setting design standards in the National Building Code, but can be adapted to form the basis of earthquake risk models. The USGS updates its maps every six years, with the most recent released in July 2014.

CoreLogic took a lead role in the development of these new national seismic hazard maps through the participation of lead earthquake scientist Dr Ken Campbell, a member of the NSHMP steering committee.

It's standard practice across the industry for cat modellers to extract information from national bodies such as the USGS and incorporate it into their US earthquake models. A model is an estimate, and the better and more up-to-date the underlying data, the better the model we can produce.

The Uniform California Earthquake Rupture Forecast (UCERF) is a working group led by the USGS, the California Geological Survey and the Southern California Earthquake Centre.

Currently on version three, the UCERF3 model provides a comprehensive model of future earthquake occurrence and formed the basis for the California portion of the national seismic hazard maps.

The new maps reflect the most up-to-date scientific view of seismic hazard for the US, and contain much more relevant information than the previous version of six years ago. 

Key updates in UCERF3 include the use of new supercomputing power, which has expanded the model's capabilities. The model has also relaxed segmentation along faults to allow multi-fault ruptures, and it accounts for modelling uncertainty using a logic-tree approach with 1,440 alternative logic-tree branches.
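The logic-tree idea itself is simple to sketch: each branch represents one combination of modelling choices with an assigned weight, and the forecast is the weight-averaged result across branches. The miniature example below is purely illustrative: it uses three invented branches rather than UCERF3's 1,440, and the rates and weights are made up.

```python
# Hypothetical sketch of a seismic-hazard logic tree: each branch is one
# combination of modelling choices with a weight; the forecast is the
# weighted average across branches. Branches, rates and weights are
# invented for illustration (UCERF3 uses 1,440 branches).

branches = [
    # (branch weight, annual rate of large events under that branch)
    (0.5, 0.040),
    (0.3, 0.050),
    (0.2, 0.030),
]

# Weights over the alternative branches must sum to 1.
assert abs(sum(w for w, _ in branches) - 1.0) < 1e-12

mean_rate = sum(w * rate for w, rate in branches)
```

Carrying every branch, rather than picking a single "best" model, is what lets the forecast express epistemic uncertainty; the spread of the branch results around `mean_rate` quantifies how much the modelling choices themselves matter.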

In addition, from a risk perspective, the most notable changes are to earthquakes in the magnitude 6.5 to 7.0 range, whose modelled frequency has decreased slightly (it was overestimated in UCERF2).

These USGS seismic hazard maps, along with data from the UCERF project, form the scientific basis for the CoreLogic US Earthquake Model.

We know that the model we have in place today will change in the future, but it is based on the best knowledge available at the time. And although many modellers are guided by the results of geological surveys, not all agencies use the same technology.

Similarly, there will never be a one-size-fits-all solution for the client. Every country is different, and we offer earthquake cover in 98 countries around the world. Every exposure is different and every insurer will have different needs—that’s why different models exist and need constant revision. 


Bermuda Re