Insurers are tightening budgets for risk modeling and moving away from multi-model risk management toward an approach that relies on a detailed understanding of a single core model, as they come under pressure to improve the bottom line in a competitive market, said Guy Carpenter.
Guy Carpenter’s Matthew Eagle, Head of International Analytics, has cautioned against insurers resorting to a passive stance in estimating their own risks in an environment where “the resource intensive model validation required by Solvency II and the ability to maintain multi-model strategies is coming under significant pressure.”
He said: “We have seen models under- or over-estimating event losses, we have seen significant changes to model results following new releases, and of course, we have experienced the losses that were not yet on the radar screen,” highlighting the limitations of vendors’ catastrophe models.
Increasingly, insurers appear to be adopting an approach built on a detailed assessment of the strengths and weaknesses of a single core model, and using this to back policy and underwriting strategies.
However, this can place re/insurers in a challenging position when the model’s risk expectations deviate substantially from actual losses.
Eagle said this kind of approach “places greater onus on practitioners to play a proactive role in the process, rather than accepting commercial models at face value.”
“The catastrophic events of last year, including the Fort McMurray wildfire, have demonstrated that catastrophe risk remains not only a capital issue but also an earnings issue. The lower earnings reported by a number of insurers and reinsurers in 2016 reflect this,” said Eagle.
The modeling landscape is no exception to the change the industry is currently undergoing, and new platforms on which models can run are increasingly being built.
Eagle explained that while many developers have been building platforms to run their own models, “there are many other potential providers of models or at least model components which do not have the resources or skills to build a platform.”
“As a result, we have supported initiatives such as the OASIS Loss Modeling Framework, which have not only helped to provide some standards for model components, but also created the computational engine that links the components together and carries out the loss calculations.”
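The idea of standard model components linked by a shared computational engine can be illustrated with a minimal sketch. This is not the OASIS Loss Modeling Framework's actual API; every function name, damage curve, and number below is a hypothetical stand-in for the hazard, vulnerability, and financial components a real framework would connect.

```python
# Illustrative sketch only: a component-based catastrophe loss calculation.
# All component names and figures are hypothetical, not the OASIS API.

def hazard(event_id):
    """Hazard component: peril intensity (e.g. wind speed index) per event."""
    intensities = {1: 0.3, 2: 0.7}  # hypothetical event intensities
    return intensities[event_id]

def vulnerability(intensity):
    """Vulnerability component: damage ratio as a function of intensity."""
    return min(1.0, intensity ** 2)  # hypothetical damage curve

def financial(damage_ratio, sum_insured, deductible=0.0):
    """Financial component: apply policy terms to the ground-up loss."""
    ground_up = damage_ratio * sum_insured
    return max(0.0, ground_up - deductible)

def run_engine(events, exposures):
    """Engine: links the components together and carries out the losses."""
    losses = {}
    for event_id in events:
        damage = vulnerability(hazard(event_id))
        losses[event_id] = sum(
            financial(damage, e["sum_insured"], e["deductible"])
            for e in exposures
        )
    return losses
```

The point of the separation is that the engine stays fixed while any one component can be validated, or replaced, independently.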
Another change is a move towards an integrated, collaborative approach to risk modeling where, Eagle said, traditional commercial vendors are increasingly opening up their platforms to allow third-party models to be run from their environments.
“We believe we will increasingly see a distinction between the platform and the models themselves, although we should not lose sight of platform implementation issues such as correlation and uncertainty.”
As vendors open up their platforms, insurers gain more opportunities to develop bespoke models, and the analytics expert recommended leveraging “widely used and validated model components of existing models” while replacing “components with bespoke elements reflecting the portfolio specifics.”
“A new peril model may be perfectly reasonable, but we all know that each model comes with its own assumptions. Unless one is introducing new science and research,” he recommended, “clients and their brokers focus on the pieces that leverage their own data.”
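The component swap Eagle describes can be sketched as follows. The functions and damage curves here are hypothetical: a "vendor" vulnerability curve stands in for a widely used, validated component, and a "bespoke" curve stands in for an element fitted to an insurer's own claims data, plugged into an otherwise unchanged loss calculation.

```python
# Illustrative sketch only: swapping one model component for a bespoke
# element while keeping the rest of the model. Names and curves are
# hypothetical, not any vendor's actual API.

def vendor_vulnerability(intensity):
    """Widely used, validated damage curve (hypothetical)."""
    return min(1.0, intensity ** 2)

def bespoke_vulnerability(intensity):
    """Curve reflecting the insurer's own claims data (hypothetical)."""
    return min(1.0, 0.8 * intensity)

def expected_loss(intensities, sum_insured,
                  vulnerability=vendor_vulnerability):
    """Loss calculation with a pluggable vulnerability component."""
    return sum(vulnerability(i) * sum_insured for i in intensities)
```

Calling `expected_loss` with `vulnerability=bespoke_vulnerability` replaces only that component; the rest of the calculation, and any other validated components, are left intact.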