Time to move on from risk-neutral valuation?

Risk-neutral valuation could be replaced by models with a subjectivity element, writes mathematical finance head

The concept of no-arbitrage pricing via replication and risk-neutral valuation has been the cornerstone of the valuation of financial instruments since the initial success of the Black-Scholes model. Alternative valuation paradigms proposed by academics have rarely found room in practitioners’ toolboxes, due to the supposedly self-contained nature of risk-neutral valuation, which is thought to require very little subjective input.

It is true that under the risk-neutral measure, there is no need to estimate the expected return of assets. This has given derivatives valuation an objective flavour when compared with methods based on the physical measure.
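To make this concrete (a textbook illustration, not spelled out in the article): in the Black-Scholes model the asset follows $dS_t = \mu S_t\,dt + \sigma S_t\,dW_t$ under the physical measure, yet the option price is computed under the risk-neutral measure $\mathbb{Q}$, where the drift $\mu$ is replaced by the risk-free rate $r$:

$$ V_t = \mathbb{E}^{\mathbb{Q}}\!\left[ e^{-r(T-t)}\,\Phi(S_T) \,\middle|\, \mathcal{F}_t \right], \qquad dS_t = r\,S_t\,dt + \sigma\,S_t\,dW_t^{\mathbb{Q}}. $$

Only the volatility $\sigma$ and the rate $r$ enter the price; the hard-to-estimate expected return $\mu$ drops out entirely.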

However, degrees of freedom and arbitrariness affect risk-neutral valuation too. Many risk factors have no corresponding traded observables and cannot really be hedged.

Traditionally, this market incompleteness has been dealt with by heroic assumptions or proxying, via calibration shortcuts or mapping methods. For example, to price derivatives valuation adjustments (XVAs), one may need credit spread curves, but the relevant entities may have no traded liquid credit default swaps or bonds.

The same applies to recovery rates as well as wrong-way correlation, for which it is very hard to find pricing statistics for calibration. More generally, hard-to-calibrate dependence and contagion parameters can have a dramatic effect on valuation adjustments even under full collateralisation (see Brigo, Capponi and Pallavicini (2014), on gap risk for CVA). This has not stopped practitioners from calculating XVAs with risk-neutral expectations, but pricing statistics are often proxied by historical estimates, and missing inputs are mapped from other names with similar physical measure ratings or characteristics.
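As a sketch of the kind of mapping being described (the buckets and numbers below are purely illustrative, not from the article):

```python
# Illustrative proxying of a missing CDS quote: an illiquid name's
# spread is mapped from liquid peers sharing its rating and sector --
# the crude shortcut described above, not a recommended method.
liquid_5y_cds_bp = {
    # (rating, sector) -> 5y CDS spreads of liquid peers, in basis points
    ("BBB", "industrials"): [60.0, 75.0, 90.0],
    ("BB",  "industrials"): [180.0, 210.0],
}

def proxy_5y_spread(rating: str, sector: str) -> float:
    """Average the liquid peers in the same bucket; every choice here
    (bucketing, averaging, peer set) is a subjective degree of freedom."""
    peers = liquid_5y_cds_bp[(rating, sector)]
    return sum(peers) / len(peers)

print(proxy_5y_spread("BBB", "industrials"))  # 75.0bp for an unquoted BBB name
```

Every step of such a mapping injects subjective choices into what is nominally a risk-neutral calculation.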

This state of affairs is hardly confined to XVA. Consider, for example, bespoke collateralised debt obligations and the way the already dubious implied correlation (Brigo, Pallavicini and Torresetti, 2010) used to be mapped from liquid iTraxx/CDX tranches to bespoke portfolios by loose analogy, ignoring most of the specific tailor-made portfolio structure.

A possible alternative would be to price starting from the physical measure, trying to link the deal value to economic fundamentals and historical analysis.

In a sense, practitioners have always used the physical measure, albeit implicitly. When hedging and recalibrating sensitivities frequently, one is essentially using the physical measure dynamics of the pricing measure expectations, or adjusting the pricing measure to realisations of market movements, preferences or beliefs.

Since practitioners acknowledge the existence of hedging errors, they are essentially using an incomplete market method, in which they express a preference on the trade’s profit and loss under the physical measure. This in turn translates into a price that can be expressed in the usual way under a suitable pricing measure.

So practitioners are already working under the physical measure, but the approach could be made more explicit and rigorous, with explicit assumptions.
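One standard way to make this precise (the notation here is mine, not the article’s): in an incomplete market, any no-arbitrage price consistent with a preference expressed under the physical measure $\mathbb{P}$ can be rewritten as an expectation under an equivalent pricing measure $\mathbb{Q}$, whose density encodes that preference:

$$ p \;=\; \mathbb{E}^{\mathbb{P}}\!\left[ Z_T\, D(0,T)\,\Phi \right] \;=\; \mathbb{E}^{\mathbb{Q}}\!\left[ D(0,T)\,\Phi \right], \qquad \frac{d\mathbb{Q}}{d\mathbb{P}} = Z_T, $$

where $\Phi$ is the payoff, $D(0,T)$ the discount factor and $Z_T$ a positive random variable with unit $\mathbb{P}$-expectation. Choosing $Z_T$ is exactly where preferences and beliefs enter.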

More formal physical measure methods in the academic literature have been somewhat obscured in finance by the success and ubiquitous use of the Black-Scholes theory and its extensions. These include actuarial pricing in insurance and commodity markets, where the physical measure is used more widely due to the relevance of historical data for insurance, or the peculiarities of specific energy markets (Biffis, Denuit and Devolder, 2010; Biffis, Blake, Pitotti and Sun, 2016; Benth, Ekeland, Hauge and Nielsen, 2003).

An alternative approach is indifference pricing (Hodges and Neuberger, 1989; Pennanen, 2012), in which a utility function is specified for the trader or market player, who is assumed always to aim at maximising expected utility, possibly under budget and risk constraints. We then obtain a price for a new trade by imposing that the constrained maximum expected utility of the trading portfolio is the same with or without the new trade. The correct price for the new trade is the one that makes us indifferent to adding it to the portfolio.
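In symbols (a standard formulation; the notation is mine): if $X_T^{\pi}$ denotes terminal wealth under an admissible strategy $\pi$ and $C_T$ the payoff of the new trade, the indifference price $p$ solves

$$ \sup_{\pi} \mathbb{E}^{\mathbb{P}}\!\left[ U\!\left( X_T^{\pi} \right) \right] \;=\; \sup_{\pi} \mathbb{E}^{\mathbb{P}}\!\left[ U\!\left( X_T^{\pi} + p - C_T \right) \right]. $$

For exponential utility and no hedging instruments, the optimisations drop out and $p$ reduces to a certainty equivalent, which the following sketch illustrates (all numbers are made up for the example):

```python
import numpy as np

# One-period indifference-pricing sketch under exponential utility
# U(x) = -exp(-gamma * x), with no hedging and zero rates. In this
# special case the seller's indifference price has the closed form
# p = (1/gamma) * log E^P[exp(gamma * C_T)], a certainty equivalent.
rng = np.random.default_rng(0)
gamma = 0.05                                           # risk aversion (assumed)
S_T = 100.0 * np.exp(rng.normal(0.02, 0.2, 100_000))   # physical-measure scenarios
C_T = np.maximum(S_T - 100.0, 0.0)                     # call payoff being sold

p_indifference = np.log(np.mean(np.exp(gamma * C_T))) / gamma
p_expectation = np.mean(C_T)                           # plain expected payoff

print(f"expected payoff:    {p_expectation:.2f}")
print(f"indifference price: {p_indifference:.2f}")     # higher: a risk premium
```

The gap between the two numbers is the compensation the utility-maximising seller demands for the unhedgeable risk; as gamma shrinks to zero, the indifference price collapses to the plain expectation.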

In a recent working paper I co-authored (Brigo, Francischello and Pallavicini, 2017), a related framework is used to propose an initial approach to valuing the cost of capital for a trade, connected to the so-called capital valuation adjustment (KVA).

Clearly, the problem of determining the correct utility for each player is difficult, which may be one of the reasons indifference pricing methods and, more generally, utility methods have not been adopted in the industry.

However, as we have seen, even in the context of classic risk-neutral valuation, the industry has been adopting arbitrary and rough approximations to be able to proceed. The time may have finally come to willingly embrace the subjectivity behind markets’ driving forces.  

Damiano Brigo is head of the mathematical finance research group at Imperial College London, and part of the stochastic analysis research group.

Listen to Damiano speaking to Risk.net’s Mauro Cesa and Nazneen Sherif in the first of our new Quantcast podcasts.

References:

  • Brigo D, A Capponi and A Pallavicini, 2014
Arbitrage-free bilateral counterparty risk valuation under collateralization and application to credit default swaps
Mathematical Finance, vol 24, no 1, pages 125–146

  • Brigo D, A Pallavicini and R Torresetti, 2010
Credit models and the crisis: a journey into CDOs, copulas, correlations and dynamic models
Wiley, Chichester

  • Biffis E, M Denuit and P Devolder, 2010
Stochastic mortality under measure changes
Scandinavian Actuarial Journal, 2010, pages 284–311

  • Biffis E, D Blake, L Pitotti and A Sun, 2016
The cost of counterparty risk and collateralization in longevity swaps
Journal of Risk and Insurance, vol 83, pages 387–419

  • Benth FE, L Ekeland, R Hauge and BF Nielsen, 2003
A note on arbitrage-free pricing of forward contracts in energy markets
Applied Mathematical Finance, vol 10, no 4, pages 325–336

  • Hodges SD and A Neuberger, 1989
Optimal replication of contingent claims under transaction costs
Review of Futures Markets, vol 8, pages 222–239

  • Pennanen T, 2012
Introduction to convex optimization in financial markets
Mathematical Programming, vol 134, no 1, pages 157–186

  • Brigo D, M Francischello and A Pallavicini, 2017
An indifference approach to the cost of capital constraints: KVA and beyond
https://arxiv.org/abs/1708.05319
