Getting Comfortable with Catastrophe Models: Questions Executives Should Ask Model Providers
June 10, 2011
Over their nearly 25-year history, catastrophe models have evolved into pervasive risk analysis tools. For companies exposed to natural and man-made hazards, catastrophe model output is a critical input into many significant business decisions. However, the rush to deploy results has sometimes outpaced critical evaluation of the models themselves, of the data being input to the software, of the workflow generating those input files, and finally of the way users take actions based on the output.
One problem is that many busy executives are wary of "drowning" in the quantitative and scientific details and therefore either defer model evaluation to those with scientific training or avoid it entirely. Regardless of whether you have scientists on staff, keeping a few questions in mind and asking them often can go a long way towards performing the basic due diligence necessary to gain comfort that your organization is viewing the models with an unbiased, even skeptical eye. A good place to begin your evaluation of the impact of models on business performance is by vetting the model and its provider. Here are some useful questions to ask:
"What is your approach to model validation?"
All components of a catastrophe model should be independently developed and validated to the extent possible, but appropriate oversight needs to take place to ensure that the component parts produce a coherent whole. Model providers should be able to demonstrate that their internal processes ensure that final model output is consistent with basic physical expectations of the underlying hazard, and unbiased when tested against both historical and real-time information.
"How has your view of the risk evolved or shifted over the past few years? Have you changed your positions on key areas of scientific research and debate? If so, why?"
Models change as new science is vetted, new data becomes available, and the user marketplace demands solutions to new problems. Change itself is not a sign of weakness, but neither is it a sign of strength. Change should be thoroughly justified and competently managed, with an emphasis on communication and understanding of the end business consequences. In the end, the numbers need to make sense.
"What are your processes for helping your clients manage the impact of model updates and changes?"
Business consequences are often determined not only by what comes out of the models, but by the degree of stability of model results. The impact of model changes anticipated to dramatically affect results should be considered very carefully by model providers prior to implementation. When such updates are made, a client-specific service infrastructure should be in place in advance, accompanied by thorough and transparent communication of component changes, underlying reasons, estimated impacts, and suggestions for transition.
"Why should I trust your scientific, engineering and actuarial expertise? What puts you on the cutting edge, yet distinguishes you from the other firms?"
Model providers all employ many experts, but they should be prepared to demonstrate their capital investments, human and otherwise, and whether those investments have paid off in terms of a robust and credible view of natural hazard risk. If model components are outsourced by the model provider, what quality controls are in place to ensure that final model output makes sense?
"Has your model been reviewed by independent outside experts?"
As in all scientific endeavors, independent peer review is the gold standard of credibility. The model provider should be able to provide you with names and credentials of reviewers who are independent and recognized as experts in their respective fields.
"What is unique about your model architecture? What's the weakest link in it and why?"
Despite certain common underpinnings in science, engineering, and historical observations, all models are not built the same way. Architecture affects everything from the ability to accept refined inputs to the credibility and versatility of outputs. Model providers should be able to honestly assess both their strengths and limitations.
"How does your model "roll up" insurable losses from policy coverage to the corporate level? Is the process of adjusting losses for policy conditions and aggregating across portfolios and simulated events transparent and accurate?"
Complex mathematics is required to assemble the individual modeled losses for each event, location, and coverage into a lucid, broad picture of enterprise catastrophe risk. Insurance policies themselves are complex, with various types of deductibles, limits, and triggers. The model provider should be able to justify its approach—which often has a large impact on the final metrics—and explain it transparently.
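To make the roll-up concrete, here is a deliberately simplified sketch, not any vendor's actual method: a per-occurrence deductible and limit applied at the location level, then summed to a portfolio total for each simulated event. All function names, policy terms, and loss figures are hypothetical.

```python
# Illustrative sketch only: a simplified loss "roll-up".
# Real models handle far richer policy structures (coinsurance,
# aggregate deductibles, reinstatements, multi-tier reinsurance);
# every name and number below is hypothetical.

def apply_policy_terms(ground_up_loss, deductible, limit):
    """Apply a simple per-occurrence deductible and limit."""
    gross = max(ground_up_loss - deductible, 0.0)
    return min(gross, limit)

def roll_up(event_losses, deductible, limit):
    """Aggregate location-level losses to a portfolio total per event,
    applying policy terms at the location level."""
    portfolio = {}
    for event_id, location_losses in event_losses.items():
        portfolio[event_id] = sum(
            apply_policy_terms(loss, deductible, limit)
            for loss in location_losses
        )
    return portfolio

# Hypothetical simulated events, each with losses at two locations.
events = {
    "EQ-001": [250_000.0, 40_000.0],
    "HU-017": [900_000.0, 10_000.0],
}
print(roll_up(events, deductible=50_000.0, limit=500_000.0))
# -> {'EQ-001': 200000.0, 'HU-017': 500000.0}
```

Even in this toy version, the design choice the question probes is visible: applying terms at the location level versus the policy or account level changes the final numbers, which is why the provider's aggregation order deserves a transparent explanation.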
"What are your formats for input data? Is the data structure and import procedure transparent? If not, how am I supposed to maintain audit and internal controls for such a critical business process?"
Model output directly reflects the quality of input data. Raw data passes through many hands and conversion steps on its way into the models, and each step is an opportunity to lose fidelity. It is impossible to eliminate every conversion issue, but transparent formats and an auditable conversion process minimize the potential for error.
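One form such an audit control might take, sketched here with hypothetical field names and a hypothetical tolerance, is a reconciliation of record counts and total insured value (TIV) before and after conversion:

```python
# Illustrative sketch only: a basic audit check that an exposure-data
# conversion preserved record counts and total insured value (TIV).
# The "tiv" field name and the 0.1% tolerance are assumptions.

def audit_conversion(source_records, converted_records, tol=0.001):
    """Compare record counts and TIV before and after conversion;
    return a list of discrepancies for human review."""
    issues = []
    if len(source_records) != len(converted_records):
        issues.append(
            f"record count changed: "
            f"{len(source_records)} -> {len(converted_records)}"
        )
    tiv_before = sum(r["tiv"] for r in source_records)
    tiv_after = sum(r["tiv"] for r in converted_records)
    if abs(tiv_before - tiv_after) > tol * max(tiv_before, 1.0):
        issues.append(f"TIV drifted: {tiv_before:,.0f} -> {tiv_after:,.0f}")
    return issues

source = [{"tiv": 1_000_000.0}, {"tiv": 750_000.0}]
converted = [{"tiv": 1_000_000.0}, {"tiv": 700_000.0}]  # fidelity lost
print(audit_conversion(source, converted))
# -> ['TIV drifted: 1,750,000 -> 1,700,000']
```

A check this simple will not catch every conversion defect, but it is exactly the kind of control that becomes possible only when the vendor's input formats are documented and the import procedure is transparent.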
"What are your models leaving out? What other information should I be using to understand risk?"
Model providers freely admit that models are tools, not oracles, and a deep understanding of your business is necessary to properly integrate other sources of quantitative and qualitative information into any corporate evaluation of disaster risk, including estimated losses in the aftermath of actual events. Model providers should provide transparency by clearly stating what is included in a particular model (e.g., sub-perils, lines of business, and coverages) and, more importantly, what is not included. Model providers should also be able to provide expertise and consulting services to assist you in filling in the gaps.
Good Questions Drive Good Practices, Inside and Outside the Organization
This Perspective addressed key questions organizations should be asking of their model vendor. But don't forget to also question data managers, model operators, end users, and even the stakeholders evaluating the results. All should have good answers if they are playing their roles properly. In addition, they should be capable of performing basic checks, such as making sure that hurricane frequencies by coastal segment conform to a common sense understanding of how hurricanes behave.
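A basic check of that kind need not be elaborate. The sketch below, with hypothetical segment names and rates rather than any model's actual output, simply flags modeled hurricane landfall frequencies that invert a reviewer's common-sense ordering of coastal activity:

```python
# Illustrative sketch only: a common-sense check that modeled hurricane
# landfall frequencies by coastal segment respect an expected ordering
# (e.g., Florida and the Gulf more active than the Northeast).
# Segment names and rates below are hypothetical, not model output.

modeled_rates = {           # landfalls per year, per coastal segment
    "Florida": 0.80,
    "Gulf Coast": 0.65,
    "Southeast": 0.40,
    "Mid-Atlantic": 0.15,
    "Northeast": 0.10,
}

def check_relative_frequencies(rates, expected_order):
    """Flag any adjacent pair in the reviewer's expected ordering
    whose modeled rates are inverted."""
    flags = []
    for more_active, less_active in zip(expected_order, expected_order[1:]):
        if rates[more_active] < rates[less_active]:
            flags.append(f"{more_active} rate below {less_active}")
    return flags

order = ["Florida", "Gulf Coast", "Southeast", "Mid-Atlantic", "Northeast"]
print(check_relative_frequencies(modeled_rates, order))
# -> [] (no flags: the modeled ordering matches expectations)
```

An empty result is not proof the model is right, but a non-empty one is exactly the kind of red flag a data manager or model operator should be able to explain.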
As chief executive of a firm driven by science and technology as well as human capital, I've learned that framing and repeatedly asking the right questions to decision-makers both inside and outside the firm is often the most productive action I can take to ensure robust business processes and credible results. My advice to insurance leaders tasked with managing the complex decision analytics of catastrophe modeling is to do the same, and to insist that the answers reflect the best practices needed to provide confidence that the tools are being used properly.