Paper of the year: PJ de Jongh, Tertius de Wet, Kevin Panman and Helgard Raubenheimer

South African academics pioneer a quick and easy way of estimating op risk capital

L to R: Kevin Panman and Helgard Raubenheimer

OpRisk Awards 2016

Operational risk capital modellers spend most of their lives in the tail of the loss distribution. Bank regulations demand capital levels based on the 99.9% quantile, and infrequent but extremely severe loss events can dominate overall op risk losses. But modelling this part of the distribution accurately is not a trivial task.
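The setup can be illustrated with a small Monte Carlo sketch of a compound loss distribution. Everything here is an assumption for illustration – the Poisson frequency, lognormal severity and all parameter values are invented, not taken from the article or the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative assumptions: Poisson(lam) annual loss frequency and
# lognormal(mu, sigma) severity; none of these numbers are from the paper.
lam, mu, sigma = 25.0, 10.0, 2.0
n_sims = 100_000

# Each simulated year: draw a loss count, then sum that many severities.
counts = rng.poisson(lam, size=n_sims)
annual_losses = np.array([
    rng.lognormal(mu, sigma, size=n).sum() for n in counts
])

# Regulatory capital is read off at the 99.9% quantile.
capital = np.quantile(annual_losses, 0.999)
print(f"99.9% quantile estimate: {capital:,.0f}")
```

Even this toy version needs 100,000 full-year simulations for a single quantile estimate, which hints at why production-scale Monte Carlo runs are so expensive.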

Many institutions rely on estimates produced by Monte Carlo simulations, but these demand huge amounts of computational resources. Methods such as the fast Fourier transform (FFT) and Panjer recursion are quicker, but remain computationally demanding and are sensitive to their initial inputs – in effect, the user needs a near-correct estimate to begin with in order to get a good answer.
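For reference, Panjer's recursion builds up the compound distribution one probability mass at a time once the severity has been discretised. The sketch below uses a compound Poisson model with a made-up three-point severity; none of the numbers come from the article:

```python
import numpy as np

# Panjer recursion for a compound Poisson total-loss distribution.
# The severity pmf is illustrative: losses of 1, 2 or 3 monetary units.
lam = 3.0                             # assumed Poisson frequency
f = np.array([0.0, 0.5, 0.3, 0.2])    # severity pmf on {0, 1, 2, 3}

n_max = 50
g = np.zeros(n_max + 1)
g[0] = np.exp(-lam * (1 - f[0]))      # P(total loss = 0)
for n in range(1, n_max + 1):
    # Poisson case of Panjer: g_n = (lam/n) * sum_j j * f_j * g_{n-j}
    j = np.arange(1, min(n, len(f) - 1) + 1)
    g[n] = (lam / n) * np.sum(j * f[j] * g[n - j])

# The 99.9% quantile is the smallest n with cumulative probability >= 0.999.
cdf = np.cumsum(g)
q999 = int(np.searchsorted(cdf, 0.999))
```

The recursion is exact given the discretisation, but the discretisation itself and the recursion length are the "initial inputs" that have to be chosen well.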

The ideal would be a closed-form solution – a mathematical expression that contains only a few relatively simple components, and can therefore be evaluated relatively easily and quickly. So far, none exist. In 2013, Lorenzo Hernandez and his colleagues at Spanish consultancy Quantitative Risk Research and the University of Madrid developed approximate solutions based on the widely used single-loss approximation method, with a series of additional correcting factors applied.
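The first-order single-loss approximation that these corrections build on is itself almost a one-liner. The sketch below again assumes Poisson frequency and lognormal severity with illustrative parameters, and adds the standard mean correction; it is not the authors' or Hernandez's refined method:

```python
import numpy as np
from scipy.stats import lognorm

# Single-loss approximation, assuming Poisson(lam) frequency and a
# lognormal severity; all parameter values are illustrative.
lam, mu, sigma = 25.0, 10.0, 2.0
alpha = 0.999

sev = lognorm(s=sigma, scale=np.exp(mu))

# First-order SLA: the alpha-quantile of the compound sum is roughly
# the severity quantile at level 1 - (1 - alpha)/lam.
sla = sev.ppf(1 - (1 - alpha) / lam)

# Mean-corrected variant adds the expected loss of the remaining events.
sla_corrected = sla + (lam - 1) * sev.mean()
print(sla, sla_corrected)
```

The appeal is obvious: one quantile lookup and one expectation, rather than millions of simulated years.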

And earlier this year, Riaan de Jongh, Tertius de Wet, Kevin Panman and Helgard Raubenheimer, researchers at South Africa's North-West University and Stellenbosch University, published the results of their attempts to assess whether Hernandez's methods would work accurately in practice. The paper, A simulation comparison of quantile approximation techniques for compound distributions popular in operational risk, appeared in the March issue of the Journal of Operational Risk.

Riaan de Jongh

 De Jongh says he and his colleagues started to look for less time-consuming alternatives to Monte Carlo simulations after they began work on validating an op risk model for Barclays Africa. "We programmed some of the solutions and we were quite astounded, in the beginning, at how close some of these approximations were to the Monte Carlo approach," he says. "You look at the centre of the Monte Carlo distribution and the approximation comes very close to it."

The group tested several approaches – three versions of the single-loss approximation and three orders of Hernandez's perturbative approximation – against the key 99.9% quantile. All but one failed to come close to the Monte Carlo results. The exception was the second-order perturbative approximation, which came within 1% of the Monte Carlo median in every test using an infinite-mean distribution. It outperformed even the much more computationally intensive FFT approach, delivering a capital estimate not significantly different from that obtained with a Monte Carlo simulation. With finite-mean distributions, the approximation also performed well at extreme quantiles, though less well at lower ones.

"What was nice about [our] paper is that the conclusions at the end were very simple," says de Jongh. "It is not often that when we do some related study, we have such clear-cut recommendations."

Local banks have expressed an interest in using the approximations to replace Monte Carlo simulations. The advantage in speed is tremendous – the millions of simulation runs required for a Monte Carlo analysis can take days to execute, while an approximation takes a fraction of a second, says Raubenheimer.

The most obvious use for this work is to calculate regulatory capital under the advanced measurement approach (AMA) – the Basel Committee on Banking Supervision's own-models approach to op risk capital. Unfortunately, in March the committee unveiled plans to scrap the AMA and replace it with a standardised measurement approach. Nonetheless, de Jongh says the approximations have other uses, which are not limited to the banking sector.

"I would say it would still benefit banks to look at losses and their distribution, maybe not necessarily the 99.9% quantile, but the extreme quantiles – you would see how close the economic capital figure is to the regulatory capital that you have calculated in other ways," he says. "As mathematicians and as practitioners, I feel it is always important to study the subject deeply, and by modelling it you always get a deeper understanding of what you are dealing with."

Tertius de Wet

There is also the potential for approximations reached through the perturbative approach to be used elsewhere – for example, modelling claim distributions in the insurance industry. "This problem is an old problem in actuarial science," says de Jongh. "For example, claim distributions are also an example of a random sum that you get in a compound distribution and where you would also like to plot the quantiles."

He believes the next step is to build on this work to develop multipliers that would allow an extreme quantile – say, 99.9% – to be derived from a good estimate of a less extreme quantile – say, 95% – based on an underlying severity distribution. This would allow capital calculations to be based on a less extreme quantile, which would be less subject to estimation sensitivity. But the feat is challenging and remains a "work in progress", he says.
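One generic way such a multiplier can arise – shown purely to illustrate the idea, not as the authors' work-in-progress method – is extreme value theory's quantile extrapolation under a Pareto-type severity tail with index xi:

```python
# Quantile-multiplier sketch under a pure Pareto-tail assumption.
# This is a generic EVT extrapolation; xi and q95 below are hypothetical.
def tail_multiplier(p_low: float, p_high: float, xi: float) -> float:
    """Factor scaling the p_low quantile up to p_high for a Pareto(xi) tail."""
    return ((1 - p_low) / (1 - p_high)) ** xi

m = tail_multiplier(0.95, 0.999, xi=0.8)   # xi is an assumed tail index
q95 = 1_000_000.0                          # hypothetical 95% loss quantile
q999_estimate = m * q95
```

The catch de Jongh alludes to is visible in the formula: the multiplier depends entirely on the tail index, and estimating that tail from loss data is exactly the statistically delicate step.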

"I must be honest – we are not too sure about this. It works well in certain circumstances, but we still have to do work on it," says de Jongh. "What we have to estimate in the risk multiplier is actually only the tail of the severity distribution that you are dealing with – and to estimate that quantity is a bit problematical statistically."

