Embracing the Known in FRTB: Why Banks Need to Step Away from the Data Pool and Start with the Familiar

By: Charlie Browne, Head of Market & Risk Data Solutions, GoldenSource.

The Fundamental Review of the Trading Book (FRTB) is coming, and it has sent firms into a spin over how to source the data required to prove risk factor modellability. This is the first time banks will be obliged to do so, a mammoth undertaking that has seen many turn to data pooling, where banks, data vendors, exchanges and trade repositories combine their data to evidence that a sufficient number of transactions has previously taken place.
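To make the modellability hurdle concrete: under the January 2016 Basel text, a risk factor is modellable only if it has at least 24 real price observations over the trailing twelve months, with no more than one month between any two consecutive observations. A minimal sketch of that test follows; the function name and the 31-day approximation of "one month" are illustrative assumptions, not part of the rule text.

```python
from datetime import date, timedelta

def is_modellable(observation_dates, as_of):
    """Sketch of the original FRTB test (BCBS, January 2016): at least
    24 real price observations in the trailing 12 months, with no more
    than one month between consecutive observations. 'One month' is
    approximated here as 31 calendar days (an assumption)."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) < 24:
        return False
    return all((b - a).days <= 31 for a, b in zip(obs, obs[1:]))

# Hypothetical example: 26 fortnightly observations pass the test.
dates = [date(2017, 1, 2) + timedelta(days=14 * i) for i in range(26)]
print(is_modellable(dates, date(2018, 1, 1)))  # True
```

Thin the series out, or leave a six-week gap, and the factor falls into the non-modellable bucket; that is precisely the shortfall data pooling is meant to plug.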

It’s a convincing proposition: banks simply do not have enough of their own data. Add to this the fact that data is very expensive, and that most firms are keen to consolidate costs after several years of heavy regulatory demands, and the initial attraction is clear.

The problem is that firms, at such an early stage of preparations, are getting bogged down in the many intricacies and unknowns of the data pool concept. Would a single vendor become a one-stop shop, or would banks be reluctant to rely on a single source and instead spread the risk across multiple vendors? Then there is the question of who will be responsible for determining whether a risk factor is modellable, and whether the data pool itself is prepared to face questioning from regulators down the line.

Instead of getting stuck on the unknowns of risk factor modellability and data pooling, firms need to take a step back and see the bigger picture of a much wider-reaching set of rules. FRTB was designed to address the shortcomings of Basel 2.5, which failed to resolve key structural deficiencies in the market risk framework. Ultimately, the intent of this regulation is much bigger than risk factor modellability: firms need to undertake a fundamental review of their data strategy.

They can begin by getting the right data processes in place at the outset of FRTB preparations. This means having accurate, accessible market and risk datasets, and the right systems to run and interpret all of the calculations. By starting with such a data-centric approach, firms can ensure they are ready for major challenges around time-series cleansing, instrument lineage and single identifiers, to name but a few; the sketch below gives a flavour of the first of these. And they might be pleasantly surprised by the benefits that fall out of the right FRTB strategy.
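As an illustration of what "time-series cleansing" involves in practice, here is a minimal, hypothetical pass over a daily price series using pandas: it de-duplicates dates and strips extreme return outliers. The function name and the z-score threshold are placeholders for illustration, not any vendor's prescribed method.

```python
import numpy as np
import pandas as pd

def clean_price_series(raw: pd.Series, z_threshold: float = 4.0) -> pd.Series:
    """Illustrative cleansing pass for a daily price series: sort by date,
    keep one observation per date, and drop extreme return outliers.
    The 4-sigma threshold is an arbitrary placeholder, not a standard."""
    s = raw.sort_index()
    s = s[~s.index.duplicated(keep="last")]  # latest tick wins for each date
    log_returns = np.log(s).diff()
    z_scores = (log_returns - log_returns.mean()) / log_returns.std()
    # The first point has no return; fillna(0) keeps it in the series.
    return s[z_scores.abs().fillna(0.0) <= z_threshold]
```

A real pipeline would add stale-price detection, gap-filling and an audit trail on every correction; the point is that such controls belong in the core data architecture, not in a one-off FRTB workaround.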

Put simply, if you get your data strategy right for FRTB, you will automatically address the data requirements of a host of other regulations, such as BCBS 239, Prudential Valuations and CCAR. This is a massive opportunity for firms to evaluate their entire data infrastructure and take a broader approach to regulation, rather than addressing different directives in silos.

As with any new regulation, the temptation with FRTB is for banks to focus largely on the aspects that are completely new and unknown. This is why the conversation around data pools as a solution to non-modellable risk factors has become so prominent. Firms that pour too much time and resource into this single aspect could be missing a trick. In many ways, FRTB is a catalyst for compliance teams to take a step back, take stock, and put together a comprehensive data strategy that covers multiple regulatory requirements and future-proofs them for years to come.
