16 July 2013 | ILS

Severe floods a return to new ‘normal’

In the aftermath of the recent floods in Europe, Bermuda:Re spoke to Petr Puncochar, head of flood model development for Impact Forecasting’s international team, about the impact of the European floods on the future of flood modelling.

Why does the industry struggle so much with flood and water damage, both in terms of modelling and preventative measures?

Flood has always been one of the more challenging perils to model due to the extremely high data demands for all components of the model.

There are many possible theories for the increased frequency of floods, including climate change, solar cycles and man-made interventions – but none individually provides a conclusive answer. Looking back at long historical records, it seems the 20th century was a calm period for floods and now we are simply returning to ‘normal’. It’s important to remember that today’s exposure has increased, with more property in flood-prone areas, coupled with higher insurance penetration.

Generally, short-term records of flow observations do not allow the analyses of extreme frequencies implemented in CAT models. However, people are very adaptable and their responses to potential floods have improved significantly in the last decade, so comparing modelled losses with historical observations can be misleading.
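To illustrate why short flow records are such a constraint, the minimal Python sketch below fits a Gumbel distribution to a hypothetical 30-year series of annual-maximum discharges and extrapolates to rare return periods. The data and figures are purely illustrative assumptions, not Impact Forecasting's methodology, but they show how far beyond the observations the extreme frequencies used in CAT models must reach.

```python
# Minimal sketch: fit a Gumbel distribution to a short record of
# annual-maximum river flows and extrapolate to rare return periods.
# All flow values below are hypothetical, purely for illustration.
import numpy as np
from scipy.stats import gumbel_r

# Hypothetical 30-year record of annual-maximum discharge (m^3/s)
annual_max_flow = np.array([
    310, 420, 295, 510, 380, 450, 330, 605, 290, 470,
    395, 520, 345, 430, 310, 560, 405, 365, 480, 320,
    615, 350, 440, 300, 530, 410, 375, 495, 335, 455,
])

# Fit location and scale parameters to the observed annual maxima
loc, scale = gumbel_r.fit(annual_max_flow)

# Discharge associated with a return period T: the exceedance
# probability in any one year is 1/T.
for T in (10, 100, 1000):
    q_T = gumbel_r.ppf(1 - 1 / T, loc=loc, scale=scale)
    print(f"{T:>5}-year flow: {q_T:,.0f} m^3/s")

# With only ~30 observations, the 100- and 1,000-year estimates are
# extrapolations far beyond the data, which is why short flow records
# leave the extreme frequencies in CAT models so uncertain.
```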

How effective are current models?

The recent flood situation in central Europe provided us with a unique opportunity to validate existing models and compare modelled and estimated flood losses. The situation differs per client, but on market losses we managed to match the estimated losses to within 10 to 15 percentage points.

On the other hand, looking at floods as a physical phenomenon, there are still some limitations in accurately describing and assessing their impact. For example, in the vulnerability component there is still high uncertainty when the flood hazard is converted to losses, despite intensive research and statistical sampling. It is also particularly important to have high-quality geocoding data from insurers’ portfolios.
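The vulnerability step can be pictured with the short sketch below, which converts a modelled flood depth at a geocoded location into a loss via a depth-damage curve. The curve and property values are hypothetical assumptions for illustration only, not actual vulnerability functions; in practice the shape of such curves carries much of the uncertainty described above.

```python
# Minimal sketch of the vulnerability step: convert a modelled flood
# depth at a geocoded location into a loss using a depth-damage curve.
# The curve and sum insured here are hypothetical illustrations.
import numpy as np

# Hypothetical depth-damage curve: flood depth (m) -> damage ratio (0-1)
depths = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 5.0])
damage_ratios = np.array([0.00, 0.10, 0.25, 0.45, 0.60, 0.80])

def expected_loss(flood_depth_m: float, sum_insured: float) -> float:
    """Interpolate the damage ratio for a given depth and apply it to the sum insured."""
    ratio = np.interp(flood_depth_m, depths, damage_ratios)
    return ratio * sum_insured

# Example: a property with a 500,000 sum insured, flooded to 1.5 m,
# gives a 0.35 damage ratio and a 175,000 loss under this hypothetical curve.
print(expected_loss(1.5, 500_000))
```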

Have you seen any marked difference in losses as modelling capabilities improve?

In the last five years, catastrophe models have evolved from simple assessment tools to detailed expert systems that can accurately analyse every single policy.

Comparing the modelled results of the older and latest generations of models, there is no consistent trend of overall numbers being either higher or lower, but the uncertainty in the results seems to be smaller, particularly when focusing on detailed areas for single historical or synthetic events.