
This article was paid for by a contributing third party.

Strategies for effective real-time data capture and robust risk management


Risk management systems, processes and real-time data aggregation techniques are rapidly evolving across financial institutions against a backdrop of high market volatility and rapid technological development

The need to accurately calculate real-time risk exposure is core to financial markets, and has been underscored yet again by left-tail events such as the rapid rise in interest rates and recent bank failures.

Platforms continue to improve stress-testing capabilities to understand how portfolio positions are affected if the market trends in a specific direction. Nevertheless, risk managers must constantly assess their current infrastructures and move to new ones to effectively capture, manage and protect data.

In a recent Risk Live panel session sponsored by KWA Analytics, experts discussed the challenges and opportunities behind real-time data for risk management, the potential of cloud computing and digital transformation and effective resource allocation for optimal data management.

This article explores the key themes that emerged from the discussion.

The panel

  • Ram Meenakshisundaram, Senior vice-president, quantitative services, KWA Analytics
  • Jean Carlos Alonso, Executive director, Santander
  • Steve Boras, Executive vice-president, Citizens Bank 


Real-time data integration challenges

Efficient integration of real-time data continues to be a challenge for financial institutions, especially as they seek to adapt and make informed decisions amid the complexity and evolving risks of financial markets.

Assessing limitations in existing infrastructure and moving to more agile platforms is paramount for data management and to accurately calculate real-time risk exposures. This is also key for staying ahead of regulatory demands and priorities, especially during periods of heightened market volatility.

Jean Carlos Alonso, executive director at Santander, highlighted three challenges: the current state of volatility, the quantity of available data and the pressure to test the adequacy of that data. “We have all learnt from the past that, in an environment like this, data is key.

“From a risk management perspective, we’re striving to tackle the adequacy of our models to handle more real-time data, addressing regulatory requirements and providing clients with real-time pricing, covering trading strategies for high-frequency trading across buy- and sell-side firms.”

Integrating legacy system capabilities with new technologies, including artificial intelligence, remains a challenge for organisations, particularly as regulators continue to make necessary changes to enhance controls over new risks brought to the fore by these technologies.

Ram Meenakshisundaram, KWA Analytics

Ram Meenakshisundaram, senior vice-president for quantitative services at KWA Analytics, said KWA observes that data models are fragmented across traditional legacy systems and on-premises systems alike. “Sometimes we have witnessed that even the same security – a bond security, for example – is implemented multiple ways in multiple systems including within the firm’s in-house system,” he said.

Leveraging digital transformation for clients has been instrumental in addressing these challenges. “Through digital transformation for our clients, we have been building out security masters by mining the data and trying to find the right parameters for these securities, even if it means going down to the prospectus,” he said.
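The reconciliation problem described above can be sketched in code. The snippet below is a minimal, illustrative example of building a security master from fragmented records: the field names, formats and values are hypothetical, standing in for how three different systems might each store the same bond.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class SecurityMaster:
    """Canonical ('golden') bond record - one copy per ISIN."""
    isin: str
    coupon: float          # decimal: 0.0425 means 4.25%
    maturity: str          # ISO 8601 date
    payments_per_year: int

# The same bond as three hypothetical systems might hold it.
legacy  = {"id": "XS1234567890", "cpn": "4.25", "mat": "15/06/2030", "freq": "SA"}
vendor  = {"isin": "XS1234567890", "coupon_pct": 4.25, "maturity": "2030-06-15", "pay_freq": 2}
inhouse = {"sec_id": "XS1234567890", "coupon": 0.0425, "maturity_date": "20300615",
           "frequency": "semi-annual"}

def from_legacy(r):
    # Legacy system: coupon as a percent string, DD/MM/YYYY dates, frequency codes.
    day, month, year = r["mat"].split("/")
    return SecurityMaster(r["id"], float(r["cpn"]) / 100.0,
                          f"{year}-{month}-{day}", {"A": 1, "SA": 2, "Q": 4}[r["freq"]])

def from_vendor(r):
    # Vendor feed: percent coupon, ISO dates, explicit payment count.
    return SecurityMaster(r["isin"], r["coupon_pct"] / 100.0, r["maturity"], r["pay_freq"])

def from_inhouse(r):
    # In-house system: decimal coupon, compact YYYYMMDD dates, spelled-out frequency.
    m = r["maturity_date"]
    return SecurityMaster(r["sec_id"], r["coupon"], f"{m[:4]}-{m[4:6]}-{m[6:]}",
                          {"annual": 1, "semi-annual": 2, "quarterly": 4}[r["frequency"]])

# All three normalisers should converge on a single canonical record.
golden = {from_legacy(legacy), from_vendor(vendor), from_inhouse(inhouse)}
assert len(golden) == 1
print(golden.pop())
```

Each adapter maps one system's conventions onto the shared canonical schema; when the normalised records disagree, that disagreement is exactly the data-quality break a security-master build surfaces.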

Real-time analytics built out on-premises or on cloud-based solutions across various architectures can provide a cohesive environment for models to cover data across diverse asset classes such as fixed income, real estate or private equity.
 

The power of the cloud  

Moving beyond siloed technology solutions to a more robust scalable platform across the organisation is essential for enhancing real-time data integration and data management efforts.

The urgency to exploit the potential of cloud computing has therefore intensified for risk managers seeking effective solutions. “We live in an era of big data where we’re able to generate and consume a lot of data, but interpretation is very difficult,” said Meenakshisundaram.

He added that organisations are now encouraged to look beyond traditional client–server models to cloud-based solutions, which offer far greater compute capacity and server capability, increasing throughput to calculate critical analytics in real time.

“Cloud computing is useful for regulatory reporting, XVA [valuation adjustment]-type calculations and FRTB [Fundamental Review of the Trading Book]-related reporting, since it enhances data quality and has also proven more cost-effective than investing in legacy systems,” said Meenakshisundaram.

Cloud-based solutions can be extremely helpful in segmenting dashboards into static and dynamic views, and in powering them with functionality to compute sensitivity or risk analysis in real time. They can also build interactivity into dashboards, allowing risk managers to conduct scenario analysis and calculate risk across portfolios in near real time.
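The kind of interactive sensitivity calculation such a dashboard would recompute on each slider move can be illustrated with a toy bump-and-revalue example. The portfolio, the flat-yield bond pricer and the positions below are all hypothetical simplifications, not a production risk model.

```python
# Hypothetical fixed-rate bond positions: (notional, annual coupon, whole years to maturity).
portfolio = [(1_000_000, 0.0425, 7), (2_500_000, 0.0390, 2), (500_000, 0.0510, 10)]

def bond_pv(notional, coupon, years, yield_):
    """PV of annual coupons plus principal, discounted at a flat yield (toy model)."""
    pv = sum(notional * coupon / (1 + yield_) ** t for t in range(1, years + 1))
    return pv + notional / (1 + yield_) ** years

def portfolio_pv(yield_):
    return sum(bond_pv(*pos, yield_) for pos in portfolio)

def dv01(yield_, bump=0.0001):
    """Bump-and-revalue sensitivity to a 1bp parallel shift in the flat yield."""
    return portfolio_pv(yield_ + bump) - portfolio_pv(yield_)

base = 0.045
print(f"base PV:       {portfolio_pv(base):,.0f}")
print(f"PV change, 1bp up: {dv01(base):,.0f}")

# An interactive scenario slider simply re-runs portfolio_pv at the shifted yield.
for shift_bp in (-50, 0, 50):
    print(f"{shift_bp:+4d}bp -> {portfolio_pv(base + shift_bp / 10_000):,.0f}")
```

The point of moving this to elastic cloud compute is that the same bump-and-revalue loop, run across thousands of positions and scenarios, can still return inside an interactive dashboard's refresh budget.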
 

Data capture and security 

Reflecting on the pain points when capturing data, Alonso pointed out that the ability to create high volumes of data may not always be a good thing: “We don’t have scarcity of data any more – we have too much data now, and we need to take responsible decisions with that data.”

He added that data quality, as well as quantity, is a concern. “In the end, you might have a great model that has been validated and tested but, if the data is not good, then the arc of the model is going to be terrible.”

Steve Boras, executive vice-president at Citizens Bank, also reflected on the overabundance of available data, but warned of the challenges of evaluating that data and displaying it visually in a manner that prompts action.

He cautioned that data security is also a key risk when handling high volumes of data. “We need to be aware of ‘bad actors’. In the past, the main priority was to ensure all systems spoke to each other and data was accessible, but now firms need to be careful the data isn’t being compromised.”
 

Moving to T+1 and beyond 

One of the main regulatory priorities facing trading risk management now is preparing for the forthcoming changeover from the current T+2 settlement system to T+1 in the US and Canada in May 2024.

The Securities Industry and Financial Markets Association, the Investment Company Institute and the Depository Trust and Clearing Corporation are taking steps to accelerate the US securities settlement cycle from trade date plus two days (T+2) to trade date plus one day (T+1).  

The same industry group previously led the effort that shortened the settlement cycle from T+3 to T+2 about five years ago. 

Shortening the settlement cycle to one day is expected to improve market resilience by further reducing the risk that exists while a trade is being finalised, shortening the timeframe between executing and settling a securities trade and reducing the level of margin that market participants must post to offset settlement risk.
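The mechanics of the change are simple to state in code. The sketch below rolls a trade date forward by one or two business days; it is illustrative only, skipping weekends but ignoring the exchange-holiday calendars a production settlement engine would also apply.

```python
from datetime import date, timedelta

def settlement_date(trade_date, lag_days):
    """Roll forward `lag_days` business days, skipping weekends.
    (A production calendar would also skip exchange holidays.)"""
    d = trade_date
    remaining = lag_days
    while remaining > 0:
        d += timedelta(days=1)
        if d.weekday() < 5:  # Monday-Friday
            remaining -= 1
    return d

trade = date(2024, 5, 30)  # a Thursday
print("T+2:", settlement_date(trade, 2))  # 2024-06-03 (Monday)
print("T+1:", settlement_date(trade, 1))  # 2024-05-31 (Friday)
```

For a Thursday trade, T+1 pulls settlement from the following Monday back to Friday: a full weekend of counterparty exposure, and the margin posted against it, disappears from the cycle.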

The transition to T+1 will undoubtedly impact back- and middle-office risk management, the panel said: organisations will need to adapt the market and trade data set up within existing systems, prepare their infrastructure for the transition and conduct new model validations.

Reviewing the history of settlement cycle adjustments, Meenakshisundaram said it could be prudent to prepare not only for T+1, but for a potential switch to T+0. “I have an inkling, based on the speed of adapting to T+1 and introducing the required infrastructure, that we will also move to a T+0 environment in the next five to 10 years.”
 

Conclusion

Navigating heightened market volatility alongside high volumes of data while enhancing real-time data integration is an ongoing challenge as organisations grapple with legacy systems and siloed data management technology.

In addressing this, it is crucial to align new technologies with evolving regulatory priorities, ensuring agility to pivot and harness next-generation infrastructure for more effective real-time data aggregation, especially in a persistently volatile environment.

Cloud computing is fast proving to be a pivotal solution to elevating risk reporting, meeting regulatory demands and propelling real-time data integration efforts across an organisation swiftly and cost-effectively.
