
Cyber model evaluation needed as risk horizon evolves: Gallagher Re

11th November 2022 - Author: Kane Wells

According to a report from re/insurance broker Gallagher Re, cyber models have done well to keep pace with the cyber insurance market in recent years. However, the ever-changing size and scope of the industry means that claims often arrive with a vector, frequency or magnitude that the models do not yet capture.

The report suggests that updates to models are typically made in response to a new understanding of the risk being modelled, a change in the behaviour of the peril, or an improvement to the functionality and capabilities of the modelling systems and software.

It adds that given the dynamism of the cyber insurance market, the constantly shifting threat landscape and the immaturity of the modelling systems and software, cyber models undergo more frequent and significant updates than their natural catastrophe equivalents.

The report also references the early 2010s, when many leading Bermudian reinsurance companies were quick to establish natural catastrophe research teams to conduct in-depth evaluations of the vendor models, which ultimately led to adjustments to the models so that they better reflected each company's view of the risk.

This investment in models and model evaluation helped, in part, to free up capital to challenge the established reinsurance markets that were slower to adopt and adapt catastrophe models, it explains.


However, Gallagher observes that cyber models will not experience the same period of unfettered model development that helped establish the first generation of natural catastrophe models and modellers.

The report cites Cody Stumpo, Senior Director of Product Management at CyberCube, who stated, “Nat cat models are on version 20 or so, with centuries of recorded history, frequent billion-dollar events, processes that change on geologic time, and multiple components of the model verifiable via computational physics and structural engineering. Even with all that, there remains much debate and competition.

“Cyber risk modelling has none of these advantages, and yet the size of the risk demands we still put our most rigorous foot forward.

“Model validation is not a binary process that outputs ‘yes, this model is valid.’ It is a process that reveals how best to work with well-thought-out models to deal with the uncertainty inherent in the world.”

Many cyber model vendors have stated that they are aiming to reduce the frequency and severity of their model updates in the coming years, says the report, in response to some re/insurers voicing their desire to use the models as inputs into capital modelling, which requires greater year-on-year stability.

Gallagher writes that despite companies showing caution around relying on cyber models that remain untested by large and significant catastrophe events, or hesitancy about burdening themselves with the task of validating models that are still so uncertain and prone to large updates, the next stage in the cyber model development journey will see the models generally accepted for use in solvency calculations.

The firm adds that, for that reason, there is a need for evaluation, and for understanding the nuances involved in evaluating cyber models, as the cyber risk modelling industry is still in a period of maturation.

The report concludes that while cyber models remain outside of capital calculations and reinsurance pricing is not driven by model outputs, cyber model vendors should continue to push for ambitious and broad model development road maps, and the models should be seen as helping the market understand a risk rather than dictating its losses.

