Taking the measure of CMS pricing

Bank of America quants propose comprehensive framework for modelling rate derivatives

Constant maturity swaps are a relatively simple form of derivative that has existed for decades. The products – in which the floating leg is reset periodically against the swap curve, and the payoff is expressed in terms of a swap rate – allow investors to take a position on a swap rate relative to a reference rate. They also have constant duration, which is attractive from a risk management perspective.
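The mechanics can be sketched in a few lines: a CMS coupon pays a par swap rate observed at reset, where the par rate is the ratio of the net notional exchange to the annuity (the present value of the fixed leg's accrual payments). The flat discount curve and annual fixed leg below are illustrative assumptions, not part of any model discussed in this article.

```python
# Toy sketch of a CMS reset: the coupon fixes to a par swap rate read off
# the curve. Flat curve and annual fixed payments are assumptions.
import math

def discount(t: float, r: float = 0.02) -> float:
    """Toy flat continuously compounded discount curve."""
    return math.exp(-r * t)

def par_swap_rate(start: float, tenor: int, r: float = 0.02) -> float:
    """Par rate of a forward-starting swap with annual fixed payments:
    (P(t0) - P(tn)) divided by the annuity (PV of the fixed leg accruals)."""
    annuity = sum(discount(start + i, r) for i in range(1, tenor + 1))
    return (discount(start, r) - discount(start + tenor, r)) / annuity

# A CMS leg resets each coupon against, say, the 10-year swap rate
cms_coupon_rate = par_swap_rate(start=1.0, tenor=10)
```

On a flat curve the par rate is the same for every start date, which is why the toy example is easy to sanity-check; on a real curve the forward swap rates differ by start, and that term-structure dependence is exactly what CMS investors trade.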

One-factor models were commonly used to price CMSs because the products were considered to be driven by a single variable: the swap rate. However, these models failed to account for CMSs’ actual dependence on multiple sets of rates and the relationship between, on the one hand, the products’ annuities and, on the other, swap rates and volatility smiles. Because they did not account for all the risks, the models had the potential to result in hedging errors.

A 2012 paper by Vladimir Piterbarg, at the time Barclays Capital’s global head of quantitative analytics, and Simon Cedervall, then a quantitative analyst at Citi, showed that when pricing CMSs, those factors needed to be taken into account.

Dominique Bang, managing director in Bank of America’s fixed income quantitative modelling team, says the paper was “the first step to a better pricing and risk management of these products”.

But there was a trade-off. The new approach relied on simplifying assumptions about the dependency structure of annuities and swap rates, and on approximations to calibrate the model to market data. “The errors caused by the approximations could be in the range of a few basis points,” says Bang. “Which may not sound like much, but when dealing with large notionals, the dollar amount can be sizeable.”

When Bang and Elias Daboussi, a senior quant in the same team, began to look into the issue, they felt little could be done to improve on Piterbarg and Cedervall’s approach. Nevertheless, in their own paper, published last month, the BofA quants came up with the idea of using a new numeraire – a reference measure for discounting the annuities. This ‘annuity due’ references swaps with coupons paid at the beginning rather than the end of each period – a principle which, the authors say, applies to countless examples in the real world, such as home rental payments made in advance. They add that their new approach improves the accuracy of the pricing and risk management of both CMSs and CMS-linked products, and allows for a comprehensive framework under which all non-path-dependent rate products can be managed consistently.
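The annuity-due principle is standard in textbook finance: moving every cash flow from the end to the start of its period means each payment is discounted one period less, so the present value of an annuity due is that of the ordinary annuity grossed up by one period's interest. A minimal sketch, with a flat per-period rate as an assumption:

```python
# Ordinary annuity (payments at period end) vs annuity due (payments at
# period start) -- the principle behind the 'annuity due' measure.
# Flat per-period rate is an illustrative assumption.

def annuity_pv(payment: float, rate: float, n_periods: int) -> float:
    """PV of an ordinary annuity: payments at the end of each period."""
    return payment * (1 - (1 + rate) ** -n_periods) / rate

def annuity_due_pv(payment: float, rate: float, n_periods: int) -> float:
    """PV of an annuity due: payments at the start of each period.
    Each cash flow is discounted one period less, so PV is (1 + r) larger."""
    return annuity_pv(payment, rate, n_periods) * (1 + rate)

# Example: 12 monthly rent payments of 1,000 at 0.5% per month, paid in
# advance (as rent usually is) versus in arrears
pv_end = annuity_pv(1000, 0.005, 12)
pv_start = annuity_due_pv(1000, 0.005, 12)
```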

Under Piterbarg and Cedervall’s model, and the approaches that came before, pricing a CMS product relies on the ability to shift away from an annuity measure, which relates to a swap payment schedule, to a payment measure, under which the CMS is actually paid. The shift is the challenging part that attracted the most attention, and for which many arbitrary assumptions have been made.

“Our paper introduces an intermediate schedule – payment at the beginning of each period – and the related measure ‘annuity due’,” says Bang. “What we achieved in the paper is twofold.

“First, we solve a longstanding problem without suggesting new arbitrary assumptions. Given prices of vanilla swaptions and a co-dependence structure, we build the joint distribution for the family of swap rates. The solution is canonical, and does not require any of the approximations usually made in the industry for such products.

“Second, we provide explicitly a one-step Monte Carlo implementation scheme, and an improved version of it with better statistical properties.”
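The construction Bang describes starts from vanilla swaption prices; the paper itself is not reproduced here, but the standard first step for a single swap rate is the Breeden–Litzenberger result: the implied (risk-neutral) density is the second derivative of the undiscounted call price with respect to strike. A toy sketch using Black (lognormal) prices and finite differences, with all parameters illustrative assumptions:

```python
# Breeden-Litzenberger: recover the implied density of a rate from option
# prices as the second strike derivative of the call price. Lognormal toy
# example; finite-difference step and parameters are assumptions.
import math

def black_call(f: float, k: float, sigma: float, t: float) -> float:
    """Undiscounted Black call price on a forward swap rate."""
    if k <= 0:
        return f - k
    d1 = (math.log(f / k) + 0.5 * sigma**2 * t) / (sigma * math.sqrt(t))
    d2 = d1 - sigma * math.sqrt(t)
    cdf = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return f * cdf(d1) - k * cdf(d2)

def implied_density(f, sigma, t, k, h=1e-5):
    """density(k) = d^2 C / dK^2, estimated by a central difference."""
    c = lambda kk: black_call(f, kk, sigma, t)
    return (c(k - h) - 2 * c(k) + c(k + h)) / h**2

# Sanity check: the recovered density integrates to roughly one over a
# wide strike range
f, sigma, t = 0.03, 0.30, 5.0
strikes = [0.0005 * i for i in range(1, 400)]
mass = sum(implied_density(f, sigma, t, k) * 0.0005 for k in strikes)
```

The paper's contribution goes further, to the joint law of a whole family of swap rates given a co-dependence structure; the sketch above only illustrates the marginal-distribution building block.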

Service with a smile

The method is designed to price quasi-vanilla products and manage the associated risks. These include CMS-linked products – such as CMS options, spread options and mid-curves – and hybrids, though Bang says the approach would not be suitable for path-dependent products. The point is to have consistency while still reproducing, by construction, the observable swaption volatility smiles and correlation structure.

Having all these products under the same modelling framework is critical. When risk comes from various products, a trader will have a tough job making sense of Greeks coming from different models. A single framework, however, allows for meaningful risk aggregation and sensible netting effects.

“I was quite impressed to see that more innovation is possible in this space,” says Piterbarg, who is now head of quantitative analytics and quantitative development at NatWest Markets. “The introduction of the annuity due measure is the most interesting contribution in this work.

“The main focus of our paper was to project vega sensitivities to the volatility grid. We derived an expression of the annuity in terms of swap rates, which is the starting point of Bang and Daboussi’s paper, and then we did some approximations with regards to distribution. They came up with a clever trick of using the annuity due measure, which allows them not to use our approximations and to incorporate explicitly full distributions of the swap rates without the lognormal assumption.”

Fabio Mercurio, global head of quant analytics at Bloomberg and the winner of Risk.net’s 2020 quant of the year award for his work on modelling interest rate derivatives under risk-free rate benchmarks, recognises that Bang and Daboussi “propose a solution that addresses the post-Libor framework, in which it turns out to be advantageous to adopt a new numeraire”.

As is often the case in quantitative finance, one has to be aware of the potential benefits and drawbacks, and the likely trade-offs. In the case of Bang and Daboussi’s model, the question is about the stability of the output it provides.

“My intuition tells me that the presence of a Monte Carlo step might interfere with the stability of the Greeks,” says Piterbarg.

Mercurio says: “While the implementation might require some attention to avoid unstable numerical results, given the complexity of the algorithm proposed, the attractive feature of this approach is that it gives the breakdown of sensitivities, which is of great importance for traders.”

Daboussi counters that, while the suggested one-step scheme belongs to the class of Monte Carlo methods, it ensures variance and noise reduction that leads to better convergence.
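The authors' specific scheme is not reproduced here, but the variance-reduction idea can be illustrated generically. One classic device in a one-step Monte Carlo is antithetic variates: pairing each Gaussian draw z with −z, which for a monotone payoff produces negatively correlated pairs and a less noisy estimator at no extra simulation cost. Toy lognormal dynamics and all parameters are assumptions:

```python
# Generic illustration of variance reduction in a one-step Monte Carlo:
# antithetic variates. Toy lognormal rate dynamics; parameters assumed.
import numpy as np

rng = np.random.default_rng(42)
f0, sigma, t, strike = 0.03, 0.30, 5.0, 0.03
n = 200_000

def terminal_rate(z):
    """One-step lognormal draw of the rate at expiry (toy dynamics)."""
    return f0 * np.exp(-0.5 * sigma**2 * t + sigma * np.sqrt(t) * z)

z = rng.standard_normal(n)
plain = np.maximum(terminal_rate(z) - strike, 0.0)
antithetic = 0.5 * (np.maximum(terminal_rate(z) - strike, 0.0)
                    + np.maximum(terminal_rate(-z) - strike, 0.0))

# Both estimators target the same price; the antithetic one is less noisy
price_plain, price_anti = plain.mean(), antithetic.mean()
```

Noisy Greeks are exactly Piterbarg's concern above: the smaller the estimator's variance, the more stable the finite-difference sensitivities computed from it.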

Bang adds that the breakdown of sensitivities is the model’s key advantage: “Risk management is by far the most important aspect, as prices are in general well observed in the markets, but risks are not. If prices are accurate but the sensitivities are not correctly allocated, traders’ hedging strategy may be flawed.”

Ultimately, having a framework that can be consistently deployed across different products of the same family means better risk management for the sell-side business. For the buy-side business, it means the ability to spot mispricings and take advantage of them.

Editing by Daniel Blackburn
