Recent advances in technology and data analysis have opened new opportunities in catastrophe risk modelling, allowing companies to tackle perils that were previously considered ‘un-modellable’, according to Tom Larsen, Senior Director, Content Strategy at CoreLogic.
Larsen explained that risk models support re/insurers in many aspects of loss mitigation for large catastrophes, such as informing risk selection and underwriting guidelines, as well as enhancing risk pricing and portfolio management.
They can also provide a price signal against which the cost of strengthening properties can be weighed, and can help both insurers and reinsurers offer financially competitive pricing, a function that has become particularly important given the industry’s significant catastrophe losses in 2017.
“Perils such as flood and wildfire, which we saw from Harvey and the California events last year, respectively, are extremely localized and require incredibly granular resolutions to appropriately model and assess,” said Larsen.
“Advances in risk modelling data and technology have enabled more robust insights for these types of perils, providing the re/insurance markets with reliable tools to inform and serve as a foundation for their underwriting guidelines.”
Flood risk in particular has historically been regarded as too complex and unpredictable to assess accurately, but new products such as CoreLogic’s U.S. flood model have leveraged new technologies to provide higher resolutions and unprecedented insights into this peril.
“The technological challenge within a catastrophe modelling platform, like those for flood events, is the need to both deliver more granular and complete risk models, requiring significantly larger data sets and computing power, while balancing the client need to constrain costs and implement modelling output more completely and efficiently into their workflows,” Larsen explained.
At CoreLogic, this challenge has been met by using cloud hosting and integrated workflows to transform what has previously been a capital cost into a manageable operational expense.
Advances in technology and Big Data applications have also created new opportunities to directly combine known exposure data and modelling frameworks up front, fundamentally changing the workflow needs and processes for generating catastrophe risk model insights, Larsen added.
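The idea of combining exposure data with the modelling framework up front can be illustrated with a minimal, entirely hypothetical sketch: geocoded exposure records are joined directly to a modelled hazard footprint, and a vulnerability curve converts hazard intensity into loss. The location identifiers, flood depths, and damage ratios below are invented for illustration and do not reflect any CoreLogic model.

```python
# Hypothetical sketch: joining exposure data to a hazard footprint up front.
# All identifiers, depths, and damage ratios are illustrative assumptions.

# Exposure: one record per insured property (location id -> replacement value).
exposure = {
    "loc_1": 400_000,
    "loc_2": 250_000,
    "loc_3": 600_000,
}

# Hazard footprint: modelled flood depth (metres) at each location for one event.
flood_depth_m = {"loc_1": 0.0, "loc_2": 0.4, "loc_3": 1.2}

def damage_ratio(depth_m: float) -> float:
    """Toy depth-damage (vulnerability) curve: share of value lost."""
    if depth_m <= 0.0:
        return 0.0
    return min(1.0, 0.25 * depth_m)  # saturates at total loss

def event_loss(exposure: dict, footprint: dict) -> float:
    """Ground-up event loss: sum of value * damage ratio per location."""
    return sum(value * damage_ratio(footprint[loc])
               for loc, value in exposure.items())

loss = event_loss(exposure, flood_depth_m)
```

Because the exposure and hazard data live in one workflow, the loss calculation needs no separate export/import step, which is the workflow change the paragraph above describes.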
Integrating new data and technologies into risk models is also becoming increasingly important as many catastrophe risks intensify due to factors such as climate change, rising populations, and the urbanisation of at-risk areas.
Larsen pointed to the recent California wildfires as a striking example of this trend; wildfire now presents a considerable solvency risk to insurers with concentrated exposure, with insured losses in California in 2017 almost double homeowners’ collective premium.
However, by leveraging insights and data from recent fires, new catastrophe models can deliver analytics that enable better risk segmentation, selection and pricing as well as enhanced portfolio management, allowing insurers to extend coverage to potentially risky areas, Larsen stated.
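As a rough illustration of the kind of per-risk analytic behind segmentation and pricing, a modelled event loss table can be collapsed into an average annual loss (AAL) and then grossed up into a technical premium. The event rates, losses, and the 30% load below are assumptions for the sketch, not CoreLogic figures or methodology.

```python
# Illustrative sketch (not CoreLogic's method): turning a modelled
# event-loss table into an average annual loss (AAL) and a loaded
# technical premium, supporting per-risk segmentation and pricing.

# Hypothetical event loss table for one property: (annual rate, loss).
event_loss_table = [
    (0.02, 50_000),    # frequent, moderate wildfire loss
    (0.005, 300_000),  # rare, severe loss
]

def average_annual_loss(elt: list) -> float:
    """Expected annual loss: sum of rate * loss over modelled events."""
    return sum(rate * loss for rate, loss in elt)

def technical_premium(elt: list, expense_load: float = 0.3) -> float:
    """AAL grossed up by a simple expense/uncertainty load (assumed 30%)."""
    return average_annual_loss(elt) * (1.0 + expense_load)

aal = average_annual_loss(event_loss_table)    # 0.02*50k + 0.005*300k
premium = technical_premium(event_loss_table)  # aal * 1.3
```

A property whose modelled AAL supports a premium the market will bear can be written rather than declined outright, which is how such analytics let insurers extend coverage into riskier areas.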
These kinds of insights can also support re/insurers in the insurance-linked securities (ILS) market, where catastrophe modellers often act as a third party providing a modelled risk view of the triggers in a transaction, which is foundational to the credit rating of the structure and influences its pricing.
Purchasers of ILS also use catastrophe models to support their own bid price, and “many purchasers of ILS (reinsurers and investors) manage a portfolio of ILS that contain partial to full correlation and these parties use catastrophe models to manage this aggregate risk,” said Larsen.
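The aggregate-risk point can be sketched with a toy example: positions in an ILS portfolio are partially correlated because they respond to some of the same modelled events, so summing payouts event by event (rather than position by position) reveals the joint tail. The event set, rates, and payouts below are invented purely for illustration.

```python
# Hedged sketch of managing aggregate risk across partially correlated ILS
# positions via a shared modelled event set. All figures are invented.

# Event id -> (annual rate, payout by ILS position). Events that pay out on
# several positions at once are the source of correlation in the portfolio.
events = {
    "HU_florida_1": (0.01,  {"bond_A": 40_000_000, "bond_B": 25_000_000}),
    "HU_gulf_2":    (0.02,  {"bond_A": 10_000_000, "bond_B": 0}),
    "EQ_calif_3":   (0.005, {"bond_A": 0,          "bond_B": 60_000_000}),
}

def expected_aggregate_payout(events: dict) -> float:
    """Expected annual payout across the whole ILS portfolio."""
    return sum(rate * sum(payouts.values())
               for rate, payouts in events.values())

def exceedance_rate(events: dict, threshold: float) -> float:
    """Annual rate of any single event whose combined payout across all
    positions exceeds the threshold; jointly triggering events drive this."""
    return sum(rate for rate, payouts in events.values()
               if sum(payouts.values()) > threshold)

exp_payout = expected_aggregate_payout(events)
tail_rate = exceedance_rate(events, 50_000_000)
```

Note that "HU_florida_1" breaches the 50m threshold only because it hits both positions at once; a position-by-position view would miss that concentration, which is exactly the aggregate risk Larsen describes.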
“ILS are far more liquid than indemnity contracts,” he added, “and prices can vary before, during and after a catastrophe occurs. Catastrophe models and modeller advice is used to support decisions in the interval between knowing that a potential loss has occurred and the severity of the loss.”