Journal of Operational Risk

Welcome to the third issue of Volume 14 of The Journal of Operational Risk.

A few months ago, I had the opportunity to host OpRisk North America 2019 in New York. It was a pleasure to be reunited with longtime friends and to be introduced to some of the new faces that are renewing the industry. I was glad to see how motivated everyone was regarding the challenges that operational risk brings to the financial industry. Despite Basel III’s efforts to simplify operational risk measurement, we are still seeing lots of innovative ideas in the areas of measurement and assessment of operational risk (if not for regulatory capital, then at least for economic capital purposes), as well as ideas to help better manage this important risk type. During the conference, I also saw an increased focus on cyber and IT risks, as these risks are becoming more prevalent with the increased use of technology in banking. It is no surprise that we see at least one paper on these subjects in every issue of our journal, including this one.

We expect to receive an increasing number of papers on cyber and IT risks in the future, and not only on quantification but also on better ways to manage those risks. We would also like to publish more papers on important topics such as enterprise risk management and everything that encompasses this broad subject: establishing risk policies and procedures, implementing firm-wide controls, risk aggregation, revamping risk organization, etc. As I have said before, while we still hope to receive analytical papers on operational risk measurement, they are now likely to come with a focus on stress testing and actually managing those risks. These are certainly exciting times!

The Journal of Operational Risk, as the leading publication in this area, aims to be at the forefront of these discussions, and we welcome papers that will shed some light on the issues involved.

In this issue, we have three research papers and one forum paper.

RESEARCH PAPERS

In our first paper, “An investigation of cyber loss data and its links to operational risk”, Ruben D. Cohen, Jonathan Humphries, Sabrina Veau and Roger Francis note that cyber risk remains an elusive moving target due to the constantly evolving cyber-threat landscape. A lack of structured data and the systemic implications of the multifaceted effects of overlapping risk frameworks are additional factors that make cyber risk difficult to quantify. To face this challenge, Cohen et al consider several topics: offering a potential definition of cyber risk that encompasses confidentiality, integrity and availability; establishing the key components of a cyber-risk framework; proposing a taxonomy to help establish a common framework for data collection to aid quantification; and, finally, discussing key quantification challenges. The paper then focuses on quantifying the direct financial and compensatory losses emanating from cyber risks. The authors apply the dimensional analysis method, incorporated in the same manner as it is applied to operational losses, which enables the identification of any similarities and/or gross deviations between the profiles of cyber and noncyber operational losses. The authors conclude that, although there has been an increase in both the frequency and the severity of cyber losses over the past few years, there has not been a major paradigm shift in their fundamental risk profile over the same period.

In “Applying existing scenario techniques to the quantification of emerging operational risks”, our second paper, Michael Grimwade – a longtime industry practitioner – sets out techniques based on real options models, tested during his years as an operational risk manager for large global financial institutions. These techniques aim to systematically identify emerging threats, their timescales and their interrelationships (eg, feedback loops and domino effects); to quantify operational risks through structured scenario analysis processes that analyze the drivers of impacts and likelihoods; and to validate the outputs of scenario analysis through backtesting against internal and external data sources. Grimwade presents his ideas through case studies that illustrate how some emerging risks come in waves, peaking and then declining, leading to their potential overestimation, while others are yet to result in losses, leading to their potential underestimation. The techniques set out early on in the paper are modified to mitigate these different challenges. It is a very interesting read.

In the issue’s third paper, “On the selection of loss severity distributions to model operational risk”, Daniel Hadley, Harry Joe and Natalia Nolde note that accurate modeling is crucial for banks and the finance industry as a whole to prepare for potentially catastrophic losses. The traditional modeling approach is the loss distribution approach, which requires banks to group operational losses into risk categories and select a loss frequency and severity distribution for each category. The annual operational loss distribution is estimated as a compound sum of losses from all of these risk categories, and banks must set aside capital, called regulatory capital, equal to the 99.9% quantile of this estimated distribution. In practice, this approach may produce unstable regulatory capital from year to year as the selected loss severity distribution family changes. This paper presents truncation probability estimates for loss severity data and a consistent quantile scoring function on annual loss data as useful severity distribution selection criteria that may stabilize regulatory capital. In addition, the sinh-arcsinh distribution is a flexible candidate family for modeling loss severities that is easily estimated using the maximum likelihood approach. Finally, the authors recommend that loss frequencies below the minimum reporting threshold be collected so that loss severity data can be treated as censored data.
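For readers unfamiliar with the loss distribution approach, the following minimal sketch illustrates the compound-sum simulation described above. The Poisson and lognormal parameters here are purely illustrative assumptions, not values from the paper, and a more flexible severity family such as the sinh-arcsinh distribution would simply replace the lognormal draw.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative assumptions (not from the paper): one risk category,
# Poisson annual loss frequency, lognormal loss severity.
lam, mu, sigma = 25.0, 10.0, 2.0
n_years = 100_000  # number of simulated years

# Compound annual loss: for each simulated year, draw a loss count,
# then sum that many severity draws.
counts = rng.poisson(lam, size=n_years)
annual_loss = np.array([rng.lognormal(mu, sigma, size=n).sum()
                        for n in counts])

# Regulatory capital under the LDA is the 99.9% quantile of the
# estimated annual loss distribution.
regulatory_capital = np.quantile(annual_loss, 0.999)
print(f"99.9% quantile (regulatory capital): {regulatory_capital:,.0f}")
```

The instability the authors highlight can be seen by refitting the severity family on resampled loss data: small changes in the chosen family or its fitted tail parameter can move the 99.9% quantile substantially.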

FORUM SECTION

We have one paper in our forum section in this issue. In “The use of business intelligence and predictive analytics in detecting and managing occupational fraud in Nigerian banks”, Chioma N. Nwafor, Obumneme Z. Nwafor and Chris Onalo continue a Journal of Operational Risk tradition by reporting on the status of operational risk management in different parts of the world. This time the authors tackle the problem of occupational fraud, one of the most wide-reaching operational risk event types in the Nigerian banking system. This event type spans many departments, roles, processes and systems, and it causes significant financial and reputational damage to banks. As a result, fraud presents banks with a real challenge of knowing where to start. One of the main aims of this paper is to employ stochastic probability models to predict aggregate fraud severity and frequency within the Nigerian banking sector using historical data. The authors’ second objective is to describe how banks can develop and deploy business intelligence (BI) outlier-based detection models to identify fraudulent activities. As the volume of transaction data grows and the industry focuses more closely on fraud detection, BI has evolved to provide proactive, real-time insights into fraudulent behavior and activities. Nwafor et al also discuss the fraud analytics development process, since it is a central issue in real application domains.
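As a rough illustration of the outlier-based detection idea (the paper’s models are considerably more elaborate), a simple screen might flag transactions whose robust z-scores exceed a cutoff. The data and threshold below are purely hypothetical.

```python
import numpy as np

def robust_zscores(amounts: np.ndarray) -> np.ndarray:
    """Median/MAD-based z-scores, which are less distorted by the very
    outliers being hunted than mean/std-based scores would be."""
    med = np.median(amounts)
    mad = np.median(np.abs(amounts - med))
    return 0.6745 * (amounts - med) / mad  # 0.6745 scales MAD to sigma

# Hypothetical transaction amounts; flag scores beyond +/-3.5,
# a common cutoff for MAD-based outlier screens.
amounts = np.array([120.0, 95.0, 110.0, 87.0, 4_500.0, 102.0, 99.0])
flags = np.abs(robust_zscores(amounts)) > 3.5
print(amounts[flags])  # candidate transactions for fraud review
```

In practice such a screen would feed a review queue rather than a verdict, with scores computed per customer, branch or role so that "normal" is defined locally.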

Marcelo Cruz
