Actuarial pricing, capital modelling and reserving

Pricing Squad

Issue 10 -- January 2017

Happy New Year from Pricing Squad!

Pricing Squad is a newsletter for fellow pricing practitioners and actuaries in general insurance.

Today's issue shows you how to fit amazing pricing models using only Excel.

Amazing models with Excel only

The Fox and the Cat

A Fox was boasting to a Cat of its clever devices for escaping its enemies. "I have a whole bag of tricks," he said, "which contains a hundred ways of escaping my enemies."

"I have only one," said the Cat; "but I can generally manage with that."

Just at that moment they heard the cry of a pack of hounds coming towards them, and the Cat immediately scampered up a tree and hid herself in the boughs. "This is my plan," said the Cat. "What are you going to do?" The Fox thought first of one way, then of another, and while he was debating the hounds came nearer and nearer, and at last the Fox in his confusion was caught up by the hounds and soon killed by the huntsmen. Miss Puss, who had been looking on, said: "Better one safe way than a hundred on which you cannot reckon."


First things first

The following models have been easily delivered using only Excel:
  • Price-matching a telematics product. We optimised 19 pricing parameters on a batch of 1,000 market quotes.
  • Simultaneously price-matching three household covers (buildings, contents and items) from an all-covers-combined market data set. We matched a non-linear, non-multiplicative pricing model with 51 parameters using 25,000 data rows.
  • Price optimisation of a private motor account. 33 pricing parameters and 50,000 data rows.
  • Building a multiplicative elasticity model for private motor - 7 parameters and 50,000 data rows.
  • Modelling Winner's Curse by segment using 25 pricing parameters and 50,000 data rows.
  • Fitting a complex non-linear commercial fleet pricing model to loss cost data using 75 parameters.

How to build the workbook

A good Excel workbook needs to
  1. contain a big multivariate data set,
  2. present parameters clearly by rating factor,
  3. avoid the vlookup() function, which is slow on large data sets and fragile when columns move.
This will ensure mathematical accuracy, clarity and speed.

Here is how you can do it:

On the schematic, each big block is one Excel sheet. The solid red lines represent calculation flow.

Changing parameter values in the Control sheet automatically updates the parameters in their dedicated sheets, e.g. a "driver's age" sheet. These then flow to the raw data sheet, where each data row links to the correct parameter according to that row's risk details.
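In Python terms, this flow can be sketched as a dictionary of parameters standing in for a dedicated "driver's age" sheet, with each data row looking up the relativity that matches its risk details. All bands and figures below are illustrative assumptions, not values from the article:

```python
# Parameters as they would sit in a dedicated "driver's age" sheet.
age_relativity = {"17-25": 1.80, "26-40": 1.10, "41+": 0.95}

# Raw data rows: each carries the risk details used for the lookup.
rows = [
    {"age_band": "17-25", "base_premium": 300.0},
    {"age_band": "41+",   "base_premium": 250.0},
]

def row_premium(row):
    # Each row picks up the parameter for its own age band,
    # just as an Excel data row would link to the parameter sheet.
    return row["base_premium"] * age_relativity[row["age_band"]]

premiums = [row_premium(r) for r in rows]
```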

The parameters then enter a non-linear calculation for each data row. This calculation can be a log-likelihood term (for a GLM), a non-linear premium calculation, a squared-error or absolute-error term, or anything else. Non-linear calculations can also be defined separately for each parameter in its dedicated sheet.
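As one example of the kind of per-row formula a data row can carry, here is a Poisson log-likelihood term for a row's claim count given its modelled expected frequency. This is a hedged sketch of a typical choice, not the article's actual workbook formula:

```python
import math

def poisson_loglik_row(claims, expected):
    # log P(N = claims) for N ~ Poisson(expected):
    # claims * log(expected) - expected - log(claims!)
    return claims * math.log(expected) - expected - math.lgamma(claims + 1)

# Summing the row terms gives a total log-likelihood for Solver to maximise.
total = poisson_loglik_row(0, 1.0) + poisson_loglik_row(2, 1.5)
```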

Finally, an objective function is defined back in the Control sheet based on these non-linear calculations. You can then use Solver to minimise or maximise the objective function.
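The whole optimisation step can be mimicked in a few lines: an objective summing squared errors across data rows, and a crude grid search playing the role of Solver's GRG Nonlinear engine. Exposures, observed premiums and the base rate are invented for the example:

```python
exposures = [1.0, 2.0, 1.5]
observed = [110.0, 230.0, 160.0]
base_rate = 100.0  # assumed premium per unit of exposure

def objective(rel):
    # Sum of squared errors between modelled and observed premium, per row.
    return sum((base_rate * rel * e - o) ** 2
               for e, o in zip(exposures, observed))

# Sweep the relativity over a grid and keep the value minimising
# the objective -- a toy stand-in for the Solver run.
best_rel = min((r / 1000 for r in range(500, 2001)), key=objective)
```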

Why I love it

It's so simple. You don't need any specialised software, laborious data cleansing, external servers or software licences.

It frees you from having to make, and believe in, spurious assumptions about the probability distributions of claims (for instance, that they belong to the exponential family of distributions).

And because all calculations are defined as Excel formulas, you can use non-linear logic. This flexibility is impossible in simplified modelling frameworks such as GLM. Consider minimum premiums, interpolations and extrapolations of cover limits, deductibles, size discounts, fleet scoring systems and add-ons. You can obtain the best possible parameters for any of those just by using Excel. For instance, you can determine not only multiplicative relativities by segment but also the best minimum premium by segment.
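A minimum premium is a good illustration of a non-linear feature a GLM cannot express directly: a floor applied after the multiplicative rating, where both the floor and the relativity could be Solver decision variables. The figures below are illustrative assumptions:

```python
def floored_premium(base, relativity, min_premium):
    # Multiplicative premium, but never below the segment's minimum.
    return max(min_premium, base * relativity)

low = floored_premium(300.0, 0.5, 200.0)   # floor bites: 150 raised to 200
high = floored_premium(300.0, 1.2, 200.0)  # floor does not bite
```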

The ability to model non-linear features directly also removes the need for the additional model touch-ups typically required on top of GLM pricing models.

In addition, since you can set the Solver objective to maximise model fit directly for each rating factor, you will get the best-fitting model possible. No other model or method will fit your historical data so spectacularly well.


You will need a VBA macro to set up workbooks like this. It takes a few days to write such macros. For example, I invested a week to develop the first version of the tool I currently use.

Fitting models based on up to 200,000 data rows takes a few minutes. Larger data sets might need to be aggregated.
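Aggregation can be sketched as collapsing the raw data to one row per rating cell, summing exposure and claim counts so the fit runs on far fewer rows. The rating cell here is just the age band, and the rows are made-up illustrative data:

```python
from collections import defaultdict

# (age_band, exposure, claim_count) triples standing in for raw data rows.
raw_rows = [("17-25", 1.0, 2), ("17-25", 0.5, 0), ("41+", 2.0, 1)]

cells = defaultdict(lambda: {"exposure": 0.0, "claims": 0})
for band, exposure, claims in raw_rows:
    # One aggregated row per rating cell, with summed exposure and claims.
    cells[band]["exposure"] += exposure
    cells[band]["claims"] += claims
```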

Generally, your fit is better if all model functions are smooth (no "if-then" jumps), if they have a single maximum or minimum (no "sin", "cos", high degree polynomials etc.) and if they are sensibly scaled (do not use two columns in the same model if one is expressed in $ and the other in $000,000,000).
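The scaling advice amounts to a one-line transformation: bring a column expressed in $ into $000 (or similar) so it sits on a comparable magnitude to the other model inputs. Values here are invented for the example:

```python
# Sums insured in $, rescaled to $000 before entering the model.
sums_insured = [250_000.0, 1_750_000.0]
sums_insured_k = [s / 1_000 for s in sums_insured]
```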

Finally, you will need to test the statistical significance of the parameters obtained. This can be done by checking that the parameters remain consistent across random sub-samples and across different time periods.
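A hedged sketch of the sub-sample check: re-estimate a parameter (here, just a mean claim frequency) on two random halves of the data and compare the results. Stable estimates across sub-samples support the parameter's significance; the data are invented:

```python
import random

def split_estimates(values, seed=0):
    # Shuffle, split in half, and estimate the mean on each half.
    rng = random.Random(seed)
    shuffled = values[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    first, second = shuffled[:half], shuffled[half:]
    return sum(first) / len(first), sum(second) / len(second)

# 50 claim-free and 50 one-claim exposures: both halves should estimate ~0.5.
a, b = split_estimates([0.0, 1.0] * 50)
```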

How can we support you?

If you need access to pricing tools that radically simplify your work and deliver loss-ratio reductions quickly, or if you are simply looking for an actuarial contractor, get in touch.

Thank you for reading, and have a great day,
Jan Iwanik, FIA PhD

Copyright © 2017 Jan Iwanik, All rights reserved. You are receiving this email because you are subscribed to our updates. We publish data and analysis for informational and educational purposes only. You can unsubscribe from this list by emailing us.