Catastrophe Ratemaking Basics

December 06, 2018


Editor’s note: In the first of a series of blogs on ratemaking, risk analyst Kaitlin Reiss introduces the use case for catastrophe models in ratemaking. Ask Kaitlin your questions in the comments below!

Typically, the loss-related components of insurance rates are estimated using historical loss data. However, the use of historical data in ratemaking for low-frequency, high-severity events introduces many challenges, including a lack of credible experience for various regions or areas with minimal exposure, significant random fluctuation, the inability to account for the changing exposure landscape, and the ineffectiveness of predicting the behavior of new or future perils.

Credibility is a significant concern when risk assessments are limited to historical experience. Because catastrophic perils occur infrequently, there is relatively little historical insurance data for them, and the propensity for loss can vary widely between events. When storms such as Hurricanes Harvey and Florence set unprecedented precipitation records, historical data alone could not reflect the true loss potential.

Catastrophe models offer a long-term view of risk with tens of thousands of years of simulation, and as a result, allow for a higher level of credibility when calculating rates. Various model output metrics, including average annual loss (AAL), loss costs, tail value at risk (TVaR), and standard deviation can be used in tandem with historical loss experience to mitigate the areas of concern where historical data is not sufficient.
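These metrics can all be derived from simulated annual losses. The sketch below is illustrative, assuming model output in the form of a year loss table (one total loss per simulation year); the function name, tail probability, and toy loss distribution are assumptions, not any particular vendor's API.

```python
import numpy as np

def summarize_yearly_losses(annual_losses, tail_prob=0.01):
    """Summarize a simulated year loss table: one total loss per simulation year."""
    losses = np.asarray(annual_losses, dtype=float)
    aal = losses.mean()                       # average annual loss (AAL)
    sd = losses.std(ddof=1)                   # sample standard deviation of annual loss
    var = np.quantile(losses, 1 - tail_prob)  # value at risk at the chosen tail probability
    tvar = losses[losses >= var].mean()       # tail value at risk: mean loss at or beyond VaR
    return {"AAL": aal, "StdDev": sd, "VaR": var, "TVaR": tvar}

# Toy example: 10,000 simulated years with a heavy-tailed loss distribution
rng = np.random.default_rng(42)
sim = rng.lognormal(mean=10, sigma=2, size=10_000)
print(summarize_yearly_losses(sim, tail_prob=0.01))
```

Because TVaR averages only the worst years, it always sits at or above VaR, which in turn exceeds the AAL for a heavy-tailed peril.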

Calculating Risk Load

If a company has a set of coastal exposures and is using the pure premium or loss ratio approach to determine rates, it would first use historical data to find the average loss or loss ratio. However, catastrophe model output can be used in lieu of, or blended with, this data. Catastrophe models, which are built on historical events, apply an understanding of reasonable event parameters to a current view of exposures—this includes changes in insured value (inflation or depreciation), the addition of mitigation features to the exposure, and changes in the concentration of exposures within the portfolio.
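Blending is often done with a credibility weight. The sketch below is a minimal illustration of that idea, assuming a single weight Z applied to the historical pure premium and its complement applied to the modeled estimate; the function name and numbers are hypothetical.

```python
def blended_pure_premium(historical_pp, modeled_pp, credibility):
    """Credibility-weight historical pure premium against the modeled estimate.

    credibility (Z) is the weight, between 0 and 1, given to historical
    experience; the complement 1 - Z goes to the model-based pure premium.
    """
    if not 0.0 <= credibility <= 1.0:
        raise ValueError("credibility must be between 0 and 1")
    return credibility * historical_pp + (1.0 - credibility) * modeled_pp

# Sparse coastal experience gets low weight against the modeled estimate:
# 0.2 * 120 + 0.8 * 250 = 224.0
print(blended_pure_premium(historical_pp=120.0, modeled_pp=250.0, credibility=0.2))
```

Where experience is thin—a new territory or a rarely observed peril—Z would be set low so the modeled pure premium dominates.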

After incorporating modeled loss to find the AAL of an exposure or group of exposures, a risk load would need to be applied to reflect the additional cost the insurer accepts for the risk, as catastrophe perils pose a much higher risk than traditional insured perils given the loss magnitude and the correlation of loss between exposures in the same geographical region. Model output such as standard deviation can be used directly within commonly employed catastrophe risk load formulas, such as the method described by Rodney Kreps. You can also use metrics such as TVaR or WINVaR, which more accurately capture the tail of the catastrophe loss distribution, within modified risk load formulas.
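In its simplest form, a standard-deviation-based risk load scales the modeled standard deviation by a factor reflecting the insurer's required return on supporting capital. The sketch below is a simplified illustration in the spirit of such methods, not a full implementation of Kreps's marginal surplus approach; the multiplier value is hypothetical.

```python
def std_dev_risk_load(std_dev, multiplier):
    """Risk load proportional to the standard deviation of modeled annual loss.

    The multiplier stands in for the factor a standard-deviation-based method
    (such as Kreps's) would derive from capital and return-on-surplus targets.
    """
    return multiplier * std_dev

def loaded_pure_premium(aal, std_dev, multiplier):
    """Modeled pure premium (AAL) plus the risk load."""
    return aal + std_dev_risk_load(std_dev, multiplier)

# A volatile coastal exposure: risk load dwarfs the expected loss.
# 1000 + 0.25 * 4000 = 2000.0
print(loaded_pure_premium(aal=1000.0, std_dev=4000.0, multiplier=0.25))
```

The same structure accommodates tail-based metrics: replacing the standard deviation with TVaR in excess of the AAL yields a load driven by the worst simulated years rather than overall volatility.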

Incorporating Reinsurance Expense Load

Another opportunity for using catastrophe models within ratemaking is the ability to include reinsurance expense loads. Year and event total losses from the model can be used to reasonably estimate reinsurance layer losses. Reinsurance recoveries themselves can then be proportionally allocated back to regions, lines of business, territories, or a more granular level, assigning the cost to those policies and locations that are driving the need to purchase reinsurance.
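The mechanics can be sketched as follows, assuming a simple excess-of-loss layer applied to modeled event totals, with the recovery allocated back to segments in proportion to their share of the loss; the segment names and layer terms are illustrative.

```python
def layer_recovery(loss, attachment, limit):
    """Recovery from an excess-of-loss layer written as 'limit xs attachment'."""
    return min(max(loss - attachment, 0.0), limit)

def allocate_recoveries(segment_losses, attachment, limit):
    """Apply a reinsurance layer to an event's total modeled loss, then
    allocate the recovery back to segments (e.g., territories) pro rata."""
    total = sum(segment_losses.values())
    if total == 0:
        return {seg: 0.0 for seg in segment_losses}
    recovery = layer_recovery(total, attachment, limit)
    return {seg: recovery * loss / total for seg, loss in segment_losses.items()}

# Example: a 50M xs 25M layer against a modeled event with 60M of total loss,
# so the layer recovers 35M, split pro rata across the two territories.
event = {"Coastal": 45_000_000.0, "Inland": 15_000_000.0}
print(allocate_recoveries(event, attachment=25_000_000.0, limit=50_000_000.0))
```

Summing these allocations across all simulated events gives each segment's share of expected reinsurance recoveries, which in turn informs the reinsurance expense load in its rates.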

Although hurricanes and earthquakes have historically been the largest drivers of catastrophe loss and were the first perils around which catastrophe models were developed, this same approach can be used for a number of other perils, such as severe thunderstorm, winter storm, wildfire, and inland flood. These perils typically have a higher annual frequency of occurrence but still have the potential for extreme tail scenarios well outside the average range of loss experience, such as the severe thunderstorm activity in 2011 or the wildfire activity in 2017.

The use of catastrophe models within ratemaking has allowed insurers to become significantly more flexible in their long-term view of potential loss. A model’s thousands of simulation years and heavy validation allow for the integration of credible loss outputs that can be used in determining premiums that are reflective of the current exposure landscape.

Modeled output can be easily incorporated in place of, or blended with, historical loss data to efficiently calculate the risk for an existing portfolio, or to assess a new risk entirely. Easing the traditional limitations of relying solely on historical data has allowed the insurance industry to build and maintain portfolios that are increasingly resilient to extreme events.

If you’re interested in learning more about incorporating catastrophe models into ratemaking, a mixed team from AIR and RSM Canada has recently published a research paper titled “Incorporation of Flood and Other Catastrophe Model Results into Pricing and Underwriting,” available through the Casualty Actuarial Society.


Read the blog “Location, Location, Location: For Which Perils Does It Matter Most?” to learn how geocoding is much more important for some perils than for others.

