DTCC FICC Releases Tools to Help Firms Address Incoming SEC Central Clearing Mandate (16 July 2024)

The Fixed Income Clearing Corporation (FICC), a subsidiary of the Depository Trust and Clearing Corporation (DTCC), has launched two new publicly available tools to help participants navigate the financial obligations that come with membership in a clearing system.

The facilities are aimed at helping firms address the post-trade implications of a Securities and Exchange Commission (SEC) rulemaking that mandates central clearing for a wide range of U.S. Treasury (UST) securities transactions, including cash trades, repurchase agreements (repos) and reverse repos.

This new rule will have a significant impact on UST post-trade operations for all participants that currently clear and settle their trades on a bilateral basis. These participants will now have to find an appropriate way to connect with a central clearing system and make the necessary changes in their clearing and settlement technology.

The UST market sees daily transactions averaging over $700 billion in cash and $4.5 trillion in financing, making it vital for U.S. government funding, monetary policy, and as a safe haven for global investors. The market has grown rapidly, and currently 87% of this trading activity is cleared bilaterally.

Several liquidity events over the past decade highlighted vulnerabilities in the Treasury market, notably the systemic risk posed by the failure of a firm outside central clearing. The SEC’s final rule, adopted in December 2023, aims to expand central clearing to mitigate such counterparty and systemic risks.

The new rule seeks to transition a substantial portion of the daily US$4.9 trillion Treasury market activity to central clearing through a central counterparty (CCP). Currently, the only authorised CCP for the UST market is FICC, although other CCPs, among them the London Clearing House (LCH), have expressed interest.

Tools of the Trade

The first of the new FICC tools, a Capped Contingency Liquidity Facility (CCLF) Calculator, is designed to increase transparency into the financial obligations associated with membership in the FICC Government Securities Division (GSD).

The CCLF is a critical risk management facility designed to provide FICC with additional liquidity resources to meet cash settlement obligations in the event of a default by the largest netting members (see DTCC Risk Management Tools). By allowing firms to estimate their potential CCLF obligations, the calculator aids in better liquidity planning and risk management. This can make FICC membership more attractive and manageable for a broader range of market participants, including smaller institutions and buy-side firms.

The calculator helps firms anticipate and plan for the liquidity commitments required under the new SEC clearing mandates. By providing upfront attestations regarding their ability to meet CCLF obligations, firms can ensure they are prepared to comply with the expanded central clearing requirements for U.S. Treasury securities.
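
FICC publishes its actual CCLF methodology in its rulebook; purely to illustrate the kind of arithmetic such a calculator performs, here is a minimal sketch assuming a simplified pro-rata allocation. The facility size, allocation rule and figures are all hypothetical:

```python
# Hypothetical illustration of a capped contingency liquidity obligation.
# FICC's actual CCLF methodology is defined in its rulebook and is more
# involved; the simple pro-rata allocation below is an assumption.

def cclf_obligation(member_peak_need: float,
                    aggregate_peak_need: float,
                    facility_size: float) -> float:
    """Estimate a member's CCLF commitment as its pro-rata share of the
    total facility, based on its peak daily settlement exposure."""
    return (member_peak_need / aggregate_peak_need) * facility_size

# A member with 2% of aggregate peak need, against a hypothetical $70bn
# facility, would commit roughly $1.4bn of contingent liquidity.
print(f"${cclf_obligation(2.0, 100.0, 70e9):,.0f}")
```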

The second is a Value at Risk (VaR) calculator from DTCC to help market participants evaluate potential margin and clearing fund obligations associated with joining GSD. With U.S. Treasury Clearing activity through FICC projected to increase by US$4 trillion daily following the expanded clearing mandate in 2025 and 2026, the VaR calculator will be essential for firms to accurately determine their VaR and margin obligations for simulated portfolios.

Tim Hulse, Managing Director of Financial Risk & Governance at DTCC, emphasized that VaR is a key risk management concept and a primary component of GSD’s Clearing Fund requirements. The calculator uses historical data, volatility, and confidence levels to estimate VaR, thus enhancing market transparency. It allows market participants to calculate potential margin obligations for given positions and market values using FICC’s VaR methodology.
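
FICC’s VaR methodology itself is not public in code form, but the concept Hulse describes, estimating a loss quantile from historical data at a stated confidence level, can be sketched in a few lines. The portfolio P&L below is simulated purely for illustration:

```python
import numpy as np

def historical_var(daily_pnl: np.ndarray, confidence: float = 0.99) -> float:
    """Historical-simulation VaR: the loss exceeded on only
    (1 - confidence) of historical days."""
    return -np.quantile(daily_pnl, 1.0 - confidence)

# Simulated daily P&L (in $m) for a hypothetical Treasury portfolio.
rng = np.random.default_rng(seed=42)
daily_pnl = rng.normal(loc=0.0, scale=2.5, size=1000)

print(f"99% one-day VaR: ${historical_var(daily_pnl):.2f}m")
```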

Hulse highlighted the urgency of evaluating firms’ risk exposure with the expansion of U.S. Treasury Clearing, noting that the VaR calculator offers increased transparency into these obligations.

These tools are public and not restricted to member firms. This means that as firms consider their optimal approach to accessing central clearing for compliance with the new clearing rules, these risk tools can provide the necessary transparency and support as firms evaluate the different types of membership and models with GSD.

The SEC has introduced several measures to make FICC access more inclusive. FICC offers multiple membership models, including Netting Membership, Agented Clearing, Sponsored Membership, and Centrally Cleared Institutional Triparty (CCIT) Membership, catering to a wide range of market participants from large banks to hedge funds. The SEC has provided temporary regulatory relief to address custody and diversification concerns for registered funds.

CCIT membership primarily benefits institutional cash lenders such as corporations, asset managers, insurance companies, sovereign wealth funds, pension funds, municipalities, and State treasuries. It allows these entities to engage in tri-party repo transactions with enhanced risk management and operational efficiency provided by FICC. The central clearing of these transactions helps reduce counterparty risk, ensure the completion of trades, and potentially offer balance sheet netting and capital relief for participants.

The Securities Industry and Financial Markets Association (SIFMA) is actively coordinating multiple work streams that involve both buy-side and sell-side members. These efforts aim to accelerate the necessary transitions for the clearing mandates. Key aspects include engaging with the SEC and other regulatory agencies to address market access issues, particularly for registered funds and margin transfers, which are crucial for ensuring a smooth transition to central clearing.

Developing an operations timeline with key milestones is another critical task. This timeline will guide the transition to full central clearing by June 2026 for repos. Addressing issues related to market plumbing and connectivity is also vital to support the increase from 13% to 100% clearing. This involves ensuring that all participants can effectively connect to and use the central clearing infrastructure.

Regular communication with market participants is planned to keep them informed about the status of various strategies and overall progress towards the clearing deadlines. SIFMA will also engage in regular discussions with the SEC and other agencies to ensure they are aware of the progress and any potential needs for timeline adjustments or phased rollouts.

Legal and enforceability issues will be addressed by obtaining netting enforceability opinions in relevant jurisdictions to support large-scale clearing. This step is closely tied to the development of market standard documentation. Additionally, new documentation approaches that leverage modern communication methods will be evaluated to increase efficiency.

Stakeholder engagement is essential to confirm the status of various strategies and ensure alignment with the clearing deadlines. SIFMA plans to reach out to market participants regularly to keep them informed and engaged. This will help ensure that all participants are on track to meet the clearing mandates.

Lastly, future planning includes preparing for additional publications and podcasts to keep the membership and broader public informed about ongoing efforts around Treasury clearing. This will ensure that everyone remains updated on the progress and any developments related to the central clearing mandate.

Duco Unveils AI-Powered Reconciliation Product for Unstructured Data (9 July 2024)

Duco, a data management automation specialist and recent A-Team Group RegTech Insight Awards winner, has launched an artificial intelligence-powered end-to-end reconciliation capability for unstructured data.

The Adaptive Intelligent Document Processing product will enable financial institutions to automate the extraction of unstructured data for ingestion into their systems. The London-based company said this will let market participants automate a choke point that is often handled through error-prone manual processes.

Duco’s AI can be trained on clients’ specific documents, learning how to interpret layout and text in order to replicate data gathering procedures with ever-greater accuracy. It will work within Duco’s SaaS-based, no-code platform.

The company won the award for Best Transaction Reporting Solution in A-Team Group’s RegTech Insight Awards Europe 2024 in May.

Managing unstructured data has become a key goal of capital markets participants as they take on new use cases, such as private market access and sustainability reporting. These domains are largely built on datasets that lack the structure of the reference, pricing and other data formats with which they must be amalgamated in firms’ systems.

“Our integrated platform strategy will unlock significant value for our clients,” said Duco chief executive Michael Chin. “We’re solving a huge problem for the industry, one that clients have repeatedly told us lacks a robust and efficient solution on the market. They can now ingest, transform, normalise, enrich and reconcile structured and unstructured data in Duco, automating data processing throughout its lifecycle.”

Managing Cognitive Dissonance in Regulatory Compliance with Corlytics (9 July 2024)

The past 18 months have been a time of significant growth for RegTech consolidator Corlytics. RegTech Insight recently spoke with founder and CEO John Byrne to delve into the Corlytics backstory and learn more about the company’s development.

Corlytics is Byrne’s fourth company. He describes how, after the 2008 financial crisis, experiences at his prior company shaped the insights and innovation that would become Corlytics.

“If you look back at the early 2000s, banking was about the P&L but after 2008, banking and the capital markets became about the balance sheet and risk. Compliance and operations practitioners were seeing risk in lots of different places that they’d never seen before.”

This shift in the perception of critical success factors revealed the importance of understanding and managing the settlement risks of complex financial instruments. Regulators globally began looking deeper into the activities of banks and financial service companies, particularly those considered to be systemically important financial institutions (SIFIs).

With an extensive background in fund accounting and post-trade operations, Byrne recognised a growing gap between the understanding of how regulations should be interpreted versus their operational implementation, and a new venture was conceived.

Corlytics launched in late 2013 and Byrne’s aim was to bridge that gap by treating regulation as a class of risk requiring careful management. By risk-ranking regulations and updates into a clear set of obligations, firms could use this to shape and maintain policies that reflect the latest regulatory expectations.

Cognitive Dissonance

Byrne describes the emergence of a “cognitive dissonance” in the financial sector, where “the lawyers could understand the regulation but couldn’t implement them, and the people implementing the regulations didn’t fully understand them and the resulting exposures.”

To address this, Corlytics adopted an alternative approach to regulatory compliance. As Byrne explains, “I wanted to look at regulation as a class of risk, rather than just something that had to be done. In many parts of banking and post trade, people take a risk-based approach to credit risk, market risk and counterparty risk. And I felt we should take a risk-based approach to legal and regulatory risk, hence the name Corlytics (compliance risk analytics).”

Corlytics’ foundation was also rooted in Byrne’s desire to combine expertise from different fields, and, like his previous company, he chose to start Corlytics in a university setting, as a campus-based company. This setting fostered an interdisciplinary collaboration with PhDs in law and data science, aimed at building a robust business capable of tackling the complexities of modern regulatory compliance.

Byrne’s previous experience in operationalizing various aspects of banking and post-trade processes, such as fund accounting and corporate actions, provided a strong basis for Corlytics’ mission. In his words, “I wanted to bridge the knowing-doing loop, ensuring that regulations weren’t just understood but effectively implemented.”

Growth Strategy

Last year the company acquired regulatory lifecycle platform ING Sparq and policy management platform Clausematch. Earlier this year, specialist growth investor Verdane took a majority equity stake in the company and has committed to accelerating both organic growth and M&A.

In May the company acquired a RegTech platform from Deloitte UK, adding considerable breadth and domain expertise to further Corlytics’ capabilities, from interpreting regulatory change to mapping and validating policies and implementing controls.

Corlytics has established strong relationships with 12 of the top 50 SIFIs, as well as a strong presence among non-bank payment processors. Byrne points out that “most of the top 10 payment companies in the world are not banks, but technology companies.” These include giants like PayPal, Amazon, and Google. Corlytics has secured about 50% of the market share in this space.

Regulatory Coverage

In line with the global growth in financial markets and the evolution of novel asset classes, the number of regulators and regulatory authorities global firms have to deal with has grown substantially. According to Byrne, “a typical Corlytics client might have 900 regulators and regulatory authorities to deal with,” underlining the scale and complexity of the current regulatory environment.

At the same time, the scope and depth of regulatory scrutiny continues to increase. In the UK, the Financial Conduct Authority (FCA) has introduced the Senior Managers and Certification Regime (SMCR), which requires senior managers to have statements of responsibilities that clearly outline their regulatory obligations. These managers are permitted to delegate certain responsibilities to other individuals within the firm, provided they ensure that these delegations are appropriate and properly overseen.

This is having organizational impacts as Byrne has observed, “if you look at the senior persons regime, it’s very typical now within an enterprise, not just to organize regulations by business units, but actually to start organizing regulations, policies and controls by ‘accountable executive’.”

This has huge implications for technology, since accountable executives must now be able to demonstrate that the controls they supervise reflect the latest version of the regulations and that these are clearly defined in the latest version of their policies.

Data Science

Corlytics keeps an open mind on the adoption of new technologies, but the primary criterion for selecting the latest AI and ML techniques is model accuracy. “We try to work to a level of accuracy of 99% or greater because if a firm is going to automate compliance, it needs very high levels of accuracy. Human accuracy is about 98%, so setting a target above that level ensures you’re automating to a high standard,” explains Byrne.

Corlytics combines extensive backtesting on historical data with regulatory subject matter expertise to validate model accuracy.

One consequence of prioritising high accuracy is the need for detailed examination of use cases, in particular when considering advanced AI techniques such as GenAI and LLMs. Corlytics’ approach is to use GenAI in combination with other techniques rather than on its own. Byrne sees the value-add of these techniques as a new search technology, particularly for the higher-volume, lower-risk use cases, e.g. ‘can I accept that gift?’, or ‘does this comply with the expense policy?’

Byrne continues, “but for a more complex, high-risk use case – e.g., a swaps trader asking, ‘can I put on this trade?’ – we might use something else.”

GenAI and LLMs become extremely expensive in compute and storage costs compared with traditional AI when deployed at scale. There is also a growing awareness of the carbon footprint these technologies generate, and Byrne cautions against falling into the trap of “using a sledgehammer to crack a nut.”

Regulatory Convergence

The convergence of events on the regulatory calendar, with regulators adopting a big-bang approach across multiple jurisdictions, is creating severe stress on global firms’ governance, risk and compliance (GRC) functions. In some cases, firms are being forced to consider whether it makes economic sense to remain in certain markets.

The impact of MiFID II in 2018 was the kiss of death for the stockbroking business for all but the biggest players, and as Byrne notes, “there are no mid-sized institutional brokers anymore in London. I would say that this (regulatory convergence) is favouring the bigger incumbents, and the regulators need to be careful about creating barriers to entry, which is what’s currently happening.”

Regulatory harmonization is a worthy goal but it’s hard enough getting alignment across the regulators within a single jurisdiction, let alone globally. In the meantime, it will be up to the RegTech sector to take the lead as Corlytics has demonstrated with two significant projects.

One of Corlytics’ early projects, making the FCA Handbook machine-readable, was a major step in bridging the gap between text-based regulatory content and its implementation by covered entities. Corlytics created the taxonomy (a mechanism for classifying and categorising information) for the handbook, which is structured into sourcebooks and manuals covering the various sectors and compliance aspects, including conduct standards, prudential standards and reporting requirements.

Byrne recounts his experience in creating a regulated subsidiary at his previous firm and being confronted by the original version of the handbook. “If you were to print it out on double-sided paper, it would stand about seven feet tall.”

Each section is methodically organized into modules, sub-modules, and chapters for easy navigation. The handbook’s machine-readable features include XML and JSON formats, enabling automated compliance checks and integrations with RegTech solutions. Byrne recalls the FCA CEO at the time describing the initiative as “the democratisation of the handbook.” The project went live in 2017.
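
The FCA’s actual machine-readable schema is not reproduced in the article; the hypothetical JSON fragment below simply illustrates the kind of automated processing that becomes possible once provisions are structured data rather than prose:

```python
import json

# Hypothetical fragment -- the FCA's real machine-readable schema differs;
# this only illustrates programmatic access to structured provisions.
handbook_fragment = """
{
  "sourcebook": "COBS",
  "chapter": "11.2A",
  "title": "Best execution",
  "provisions": [
    {"ref": "COBS 11.2A.2",  "type": "rule",     "topics": ["conduct", "execution"]},
    {"ref": "COBS 11.2A.20", "type": "guidance", "topics": ["execution"]}
  ]
}
"""

doc = json.loads(handbook_fragment)

# Pull out only the binding rules, e.g. to seed a compliance-check pipeline.
binding_rules = [p["ref"] for p in doc["provisions"] if p["type"] == "rule"]
print(binding_rules)  # ['COBS 11.2A.2']
```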

Corlytics completed a similar project at the Financial Industry Regulatory Authority (FINRA) on the FIRST Rulebook, which went live in 2022. With many small firms among its members, FINRA wanted to make sure these smaller players could get value from the website, recalls Byrne. “So, we created the taxonomy and redesigned all of the documents, making them easy to tag and search. Both FINRA and the FCA have a competition mandate, so creating a level playing field for both large and smaller firms is important.”

There are indications that other regulatory authorities are starting to embrace the idea of making their regulations machine-readable, but for now, the FCA and FINRA are the thought leaders in this space, and Corlytics’ innovation helped make that happen.

Investment Firms Embrace Generative AI: A Boon for Monitoring and Compliance (9 July 2024)

By Osvaldo Berrios, SME, Compliance, NICE Actimize.

The financial services industry is undergoing a transformative shift, with artificial intelligence (AI) playing a central role. Investment firms are starting to explore the potential of Generative AI (GenAI) to enhance their business dealings, particularly in the areas of monitoring, surveillance and regulatory compliance.

Monitoring and Surveillance

One of the primary areas where GenAI provides value to investment firms is anomaly detection. GenAI can be trained on historical data to identify patterns of normal advisor activity, against which aberrant activity can then be detected. This allows firms to detect potential red flags, such as unusual trading patterns or suspicious communication with clients, much faster than traditional methods.
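
The article does not name the models NICE Actimize uses, so as a stand-in, here is a minimal sketch of the general pattern, learning a baseline of normal activity and flagging deviations, using scikit-learn’s IsolationForest on invented advisor-activity features:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Invented features per advisor-day: trade count, average ticket size ($k),
# and number of after-hours client messages.
rng = np.random.default_rng(seed=7)
normal_activity = rng.normal(loc=[20, 50, 1], scale=[5, 10, 1], size=(500, 3))

# Learn the shape of "normal" advisor behaviour from history.
model = IsolationForest(contamination=0.01, random_state=7)
model.fit(normal_activity)

# Score a new observation: heavy trading, outsized tickets, late-night messaging.
suspect = np.array([[95.0, 400.0, 12.0]])
print(model.predict(suspect))  # [-1] means flagged for compliance review
```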

By generating realistic hypothetical scenarios, GenAI can help firms test and refine their surveillance processes. This can be particularly valuable in areas like fraud detection and market manipulation. GenAI can automate the creation of reports on advisor activity and potential compliance issues. This frees up human compliance staff to focus on more complex investigations.

Compliance with Regulations

Regulatory document generation is another key role played by GenAI techniques. GenAI can be used to generate regulatory reports and other compliance documents, saving firms significant time and resources. And since regulatory landscapes are constantly evolving, GenAI can be trained to stay updated on new regulations and identify potential compliance risks associated with new investment products or strategies.

GenAI can also personalize compliance training for advisors based on their specific risk profiles and areas of expertise.

Challenges and Considerations

While GenAI offers exciting possibilities, implementing it effectively requires addressing some key challenges. The effectiveness of GenAI models is highly dependent on the quality and quantity of data used for training. Biased datasets can lead to biased AI models, potentially amplifying existing inequalities in the financial system. Understanding how GenAI models arrive at their conclusions is crucial. Firms need to ensure these models are transparent and explainable to maintain trust and mitigate potential regulatory concerns.

GenAI is a powerful tool, but it should not replace human expertise. Firms still need experienced compliance professionals to interpret AI outputs and make informed decisions.

Negative Aspects

Is job displacement an issue today? Automation through GenAI may lead to job losses in compliance departments. This necessitates retraining and upskilling existing staff to adapt to new workflows. There may also be a potential for misuse. Like any powerful technology, GenAI could be used for malicious purposes such as generating fraudulent documents or manipulating markets. Robust security measures are crucial to mitigate these risks.

The Road Ahead

GenAI holds immense potential for investment firms to enhance their monitoring, surveillance, and compliance capabilities. However, successful implementation requires careful consideration of data quality, bias, explainability, and the role of human oversight. As the technology matures and regulatory frameworks adapt, GenAI is poised to revolutionize how investment firms manage their business dealings and navigate the ever-changing regulatory landscape.

For more information on NICE Actimize’s applications for capital markets, see: https://www.niceactimize.com/financial-markets-compliance/.

Navigating the MiFIR Refit in 2024 (1 July 2024)

The MiFIR Refit came into force in May to overhaul the European financial landscape with its focus on transparency and data integrity. Its ban on Payment for Order Flow aims to remove any vestiges of conflict of interest, while the consolidated tape is set to provide a comprehensive view of market data in a standardized format that the market can readily decipher.

Financial institutions now face the challenge of updating their systems, policies, processes and procedures to meet these new regulatory demands, proving once again that in the world of global finance, change is the only constant.

The EU’s Markets in Financial Instruments Regulation, along with the Markets in Financial Instruments Directive (MiFID II), aims to increase transparency across the region’s financial markets and standardize regulatory disclosures for investment firms. MiFIR specifically relates to trade reporting, market transparency, and the obligations of trading venues and systematic internalisers.

The MiFIR Refit introduces several key changes and updates to the existing framework, aimed at enhancing transparency, improving data quality, and optimising market operations. The implementation is being phased in over a two-year period.

The regulation became official with its publication in the Official Journal on April 14 and entered into force on May 4.

One of the immediate and most debated impacts of this regulation is the prohibition on brokers accepting Payment for Order Flow. This ban is designed to eliminate conflicts of interest, ensuring brokers act in the best interests of their clients.

Another immediate change is the elimination of annual best execution reports for execution venues. This is being replaced by a new consolidated tape system, which will provide comprehensive market data, including the best bid and offer information.

By May of 2025, several significant milestones must be achieved:

  • ESMA will complete its 12-month assessment of the inclusion of Alternative Investment Fund Managers (AIFMs) and management companies in the scope of transaction reporting.
  • Trading venues and systematic internalisers (SIs) must comply with real-time data access requirements to ensure all market participants have timely access to crucial trading information.
  • Financial institutions must be ready to adopt standardized reporting formats.
  • Banks and trading venues involved in commodity derivatives must comply with enhanced disclosure requirements, aimed at increasing transparency and oversight in commodity derivatives trading, addressing speculative activities and improving market stability.

The development and implementation of Regulatory Technical Standards (RTS) by ESMA is another critical aspect of the MiFIR Refit. These standards, which will manage trading halts, price collars, and other market structure enhancements, are expected to be developed and implemented by November 4, 2025.

The consolidated tape system (CTS) is a major structural change and is targeted to be fully operational by May 2026. The initial setup and framework for data submission by contributors and the selection of consolidated tape providers (CTPs) will occur over a two-year period, ensuring a smooth transition to the new system. CTPs will provide a unified source of trade information by asset class.

In summary, the MiFIR Refit introduces a structured implementation schedule with key milestones designed to enhance market transparency, data quality, and operational efficiency. Financial institutions and market participants must adhere to these timelines to comply with the new regulatory framework.

Enhanced Regulatory Oversight

The scope of transaction reporting under MiFIR is expanded to potentially include Alternative Investment Fund Managers (AIFMs) and management companies. ESMA will assess this inclusion over the next 12 months.

Investment firms can now act as designated publishing entities for specific financial instruments, improving the clarity and responsibility of transaction reporting. ESMA will maintain a public register of these entities.

The prohibition of Payment for Order Flow (PFOF) bars brokers from receiving fees, commissions, or non-monetary benefits from third parties for order execution or forwarding. This aims to eliminate conflicts of interest and ensure that brokers act in the best interests of their clients.

While the prohibition took effect in May, Member States may exempt firms under their jurisdiction from this prohibition until June 30, 2026. Regulatory authorities will monitor brokers’ activities and impose penalties for non-compliance. Brokers must also provide clear and transparent disclosures about their order execution policies.

The introduction of a consolidated tape for each asset class will provide necessary market data, including best bid and offer information, replacing the need for separate reports. As a result, the requirement for execution venues to publish annual best execution reports has been permanently suspended.

Pending Standards

ESMA is consulting on three new regulatory technical standards (RTSs) under MiFIR to enhance market transparency and data quality. The first standard focuses on pre- and post-trade transparency for non-equity instruments such as bonds, structured finance products, and emissions allowances. This standard aims to ensure timely and clear trade information for stakeholders while balancing the need for real-time transparency with the ability to defer publication when necessary.

The second standard mandates that pre- and post-trade data be made available on a reasonable commercial basis (RCB). This is to ensure that market data is accessible, fair, and non-discriminatory. The consultation includes discussions on the cost-based nature of fees and the applicable reasonable margin, aiming to make this data affordable for users while maintaining fair access.

The third standard addresses the obligation to provide high-quality instrument reference data suitable for both transaction reporting and transparency purposes. The proposed amendments aim to align this data with other relevant reporting frameworks and international standards, thereby improving data quality and consistency across the board.

At this time, feedback from stakeholders is still being collected, and ESMA will publish a final report and submit the draft technical standards to the European Commission by the end of the fourth quarter of 2024. This review process is crucial for ensuring that the technical standards effectively support the regulatory objectives of MiFIR.

Market Structure Enhancements

ESMA is developing RTSs to manage trading halts, price collars, and other market structure enhancements, ensuring better market stability during volatility. These are planned to be rolled out by November 4, 2025.

Enhanced disclosure requirements for commodity derivatives are introduced to curb speculative activities and improve market oversight.

The Double Volume Cap (DVC) mechanism under MiFIR has undergone significant changes aimed at enhancing market transparency and simplifying the regulatory landscape. The updated MiFIR now introduces a single volume cap set at 7% for trading under the reference price waiver. This replaces the previous double volume cap system, which had separate thresholds for individual venues and the entire EU market. By consolidating the thresholds into a single 7% cap, the regulation aims to reduce complexity and ensure a more straightforward approach to monitoring and controlling dark trading activities.
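
As a toy illustration of the monitoring this implies (ESMA defines the actual calculation windows, data sources and suspension mechanics; the figures below are invented):

```python
def breaches_single_volume_cap(waiver_volume: float,
                               total_eu_volume: float,
                               cap: float = 0.07) -> bool:
    """Flag an instrument whose trading under the reference price waiver
    exceeds the single EU-wide cap (7% under the MiFIR Refit)."""
    return (waiver_volume / total_eu_volume) > cap

# Invented figures: 8.1m shares traded under the waiver out of 100m EU-wide.
print(breaches_single_volume_cap(8_100_000, 100_000_000))  # True -> 8.1% > 7%
```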

Changes to Systematic Internalisers’ (SIs) quoting obligations will require technology updates to ensure compliance with new minimum quote size requirements and facilitate better pricing transparency. Equity SIs must now make public firm quotes based on a minimum size determined by regulatory technical standards (RTS). Non-equity SIs are no longer obligated to publish firm quotes but may do so voluntarily.

Improved Data Quality and Transparency

The regulation requires real-time publication of data to ensure all market participants have timely access to the same information, crucial for making informed trading decisions and maintaining a fair market.

Trading venues and SIs must ensure the accuracy, completeness, and consistency of their data, covering transaction details, order book data, and post-trade information. The introduction of standardized reporting formats is designed to create a more transparent and cohesive market environment.

MiFIR Refit enhances the scope and consistency of transaction reporting by introducing new data fields and aligning reporting standards across EMIR, SFTR, and MiFIR. This standardization facilitates easier comparison and consolidation of data across different platforms.

Technology Impacts

The consolidated tape is a significant component of the MiFIR Refit, aiming to aggregate trade data from multiple sources into a single, unified view for each asset class. This initiative is designed to enhance market transparency, reduce information asymmetry, and improve the quality of market data available to investors.

As of now, the groundwork for the consolidated tape initiative, including the legislative framework and initial criteria for CTP selection, is in place.

The consolidated tape system is expected to be fully operational by May 4, 2026. This timeline allows for the necessary steps to be completed, including the selection and approval of CTPs, the setup of data submission frameworks, and the establishment of robust data aggregation and dissemination systems.

Financial institutions will need to update their reporting systems for real-time processing and to accommodate new data fields and harmonized reporting standards across EMIR, SFTR, and MiFIR.

The MiFIR Refit and MiFID II updates represent significant steps towards a more transparent, resilient, and competitive financial market environment in the European Union. Financial institutions must adapt to these changes by enhancing their governance frameworks, streamlining reporting workflows, improving data management practices, and updating their technology infrastructure to comply with new regulatory requirements. These efforts are intended to provide a more efficient and investor-friendly market landscape.

n-Tier – Bringing Order to Regulatory Data Chaos (25 June 2024)

Navigating the complex world of regulatory data management is no easy task. But the challenges posed by the need to meet the concurrent demands of many new regulations and updates to existing ones should come as no surprise. Certainly, the regulators’ stance is clear: Firms are expected to comply; no excuses.

According to Peter Gargone, Founder and CEO of n-Tier, “This situation has been coming for a really long time, and if you go back and look at the older regs from around 2008 and the Flash Crash, you will see that regulators have been ramping this up for years.”

Gargone argues that compliance with newer regulations will be challenging for firms if they don’t have in place people who truly understand the data requirements and business flows. And the concurrency of regulatory updates becomes a massive time and resource constraint for firms across the board, he says.

“The concurrency of the requirements is challenging for firms because each regulation demands very specialised skills – interpreting the regulations, sourcing and validating the data, and the technology to comply efficiently. You can’t just rely on your operational groups like T+1, which was inherently operational in nature.”

Getting Governance Right

Gargone acknowledges that regulators’ expectations are high, especially when it comes to governance processes around reporting. “There’s an ingrained expectation from regulators around what you do to ensure your reporting is correct,” he says.

As a result, the checking processes and controls firms must now have in place are no longer optional: “If you put in a reporting framework in the US and your annual exam reveals that you have failed to put controls and checks around it, you will get into trouble. Regulators expect you to demonstrate these controls and checks.”

From n-Tier’s perspective, a control framework and a comprehensive set of checks on regulatory data are fundamental requirements for delivering a complete regulatory reporting service.

“We’re not just spitting out reports,” says Gargone. “We’re focused on a holistic process that encompasses data controls and governance because it’s what the regulators now expect to see. This is a lot harder than the reporting itself because you need a lot more data – 2x more in many cases. In the US, for Consolidated Audit Trail (CAT) and Customer Account Information System (CAIS), for example, you’re looking at trillions of data points a day. That’s way beyond what most firms can handle as a platform or service.”

Workflow

The complexity of regulatory requirements demands seamless workflows that can handle large volumes of data efficiently. Automated workflows integrated with regulatory reporting systems help minimise manual intervention, reduce the risk of errors, and ensure timely submissions. Gargone continues:

“We’re seeing the way firms are planning for these changes coming this December and January of next year. This has created an environment where they’re flat out; the book of work is fully booked up, and for anything new, it’s like trying to get a reservation in a three- or four-star Michelin restaurant. You might as well call back in a year.”

The n-Tier platform is designed to operate across regulatory jurisdictions and markets but “that’s not the norm” according to Gargone.

“The norm is either a specialised vendor for each segment and each reg individually or custom-built frameworks and toolkits for each reg, where everything’s a little bit different. But we’re seeing a lot of pushbacks against that.”

Gargone continues, “When you look at this from a global perspective, those variations add complexity and cost firms more money. So, we see a move to centralising data governance and reg reporting within our platform, across regs and around the world because of the flexibility we have by default. This is part of our core design and one of our strengths.”

The company has built a strong regulatory team of people with deep experience at major firms. These are former practitioners who understand regulations across the different jurisdictions and markets.

Gargone continues, “This team has been very instrumental in designing enhanced frameworks for making sure the data and accuracy of the reporting are correct. Anybody could spit out a report – many of these firms can say, ‘You know, you look at this from the outside, the numbers are correct,’ and you might assume, ‘Okay, that’s going to meet this regulatory requirement’. But what exactly does that mean?

“You spit a report out and send it to the regulator but that doesn’t make it right. It doesn’t mean it’s accurate. It doesn’t mean it’s complete.

“Then it becomes a ‘game’ of how long it takes the regulators to figure out you haven’t fulfilled the requirement. And then how much risk has your firm acquired following a process you haven’t designed properly?

“So, we see a divergence in the market between firms that say they ‘do reporting’ and firms like n-Tier that actually offer a comprehensive suite of functionality and expertise where we actually care about the data quality.”

n-Tier’s regulatory reporting and trade surveillance platforms provide comprehensive visibility and searchability across all regulatory reporting requirements, helping firms manage and monitor trade data effectively.

Regulatory Data

Data management is crucial in this environment. Firms need the ability to aggregate, validate, and reconcile data from multiple sources. Advanced data management solutions like those offered by n-Tier integrate disparate data sets, perform continuous validation, and provide comprehensive exception management capabilities. These solutions are designed to aggregate regulatory reporting data from different sources while meeting reporting obligations for validation and research, supporting reporting for regulations such as CAT, CAIS, TRACE/MSRB, and more.

“Complexity doesn’t come just from the fact that there are new and overlapping regs,” continues Gargone. “It also comes from the fact that the data sets the regulators are asking for today don’t normally sit together in a regulatory model in the banks.

“If you look back at a brokerage workflow and the reporting from 10 years ago, it used to be much more normal that the data would come from one system and have one owner. It belonged in a business function, or a line function, and data was isolated for that line function. So, when doing some kind of risk for a line of business, they basically had all the data.”

Gargone goes on to describe what regulators are expecting to see in this new environment. “They are looking for nuances in the data, where much of that data is now coming from different parts of the organisation which don’t normally talk to each other and aren’t necessarily in sync, in a timely manner, on what that data represents.

“If you look at CAIS, regulators are looking at reference data en masse and have turned that reference data into a regulatory reporting requirement. Previously, correction processes were based on a three-day or four-day correction cycle where you could correct it when you got to it. But now it must be corrected immediately.

“If you don’t have a process around this, where data is sourced from multiple systems in inconsistent formats along with versions of it coming off the master copy, it creates a huge workflow challenge that’s even more difficult than just generating the data. It’s mind-boggling, and that’s why we built our software.”

The company started in 2000 with the core of the n-Tier platform being built as a data platform, not a regulatory platform. Today, n-Tier is a large-scale, high-volume, completely configurable engine with a no-code interface for regulatory data management.

According to Gargone, some of the most critical work is figuring out whether the data feeding the regulatory report is right or wrong in the first place. “To do that, you have to be able to compare against different sources of that data to figure out what’s right or wrong. And, even within a single record, your data points may not map directly versus another related data point coming from a different source. And you may have to compare that against three or four different sources to figure out if it’s right or wrong.”

But that’s not quite the whole story as Gargone continues “This is where human assistance is important, because once you get to the point of figuring out within the guardrail framework, is it right or wrong – it then becomes a different question – ‘What is right in this context? Are they the same? If they’re not the same, are they the right values? Where did it break down?’ The exciting part about this is having the data at scale to do the integrity checking in one place, along with a tech stack that can actually get through that volume of data. This framework is very hard to build out, and that’s where we’re at now.”
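
n-Tier’s engine is proprietary and no-code, so the sketch below reflects none of its internals; it simply illustrates the cross-source comparison Gargone describes, reconciling one trade attribute across three hypothetical internal systems and surfacing the exceptions:

```python
import pandas as pd

# The same trades as reported by three internal systems (invented data).
front_office  = pd.DataFrame({"trade_id": [1, 2, 3], "qty": [100, 250, 75]})
books_records = pd.DataFrame({"trade_id": [1, 2, 3], "qty": [100, 250, 80]})
reg_extract   = pd.DataFrame({"trade_id": [1, 2, 3], "qty": [100, 205, 75]})

merged = (front_office.rename(columns={"qty": "fo"})
          .merge(books_records.rename(columns={"qty": "books"}), on="trade_id")
          .merge(reg_extract.rename(columns={"qty": "reg"}), on="trade_id"))

# Exceptions: any trade where the three sources disagree on quantity.
disagrees = ~(merged["fo"].eq(merged["books"]) & merged["fo"].eq(merged["reg"]))
print(merged[disagrees])  # trades 2 and 3 need investigation before reporting
```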

Emerging Technologies

Solutions such as advanced analytics, machine learning, and AI can help identify patterns, predict compliance risks, and automate regulatory reporting processes. n-Tier’s platform, with its no-code environment, allows practitioners to configure datasets and data controls easily, ensuring that processes remain adaptable to evolving requirements. This flexibility and scalability are vital for maintaining compliance in a dynamic regulatory environment.

Gargone is cautiously optimistic about emerging technologies like Generative AI (GenAI) making a difference in regulatory data and reporting. For example, on regulatory horizon scanning, “That’s great if you can get the regulations machine-readable, but how far will that get you? I know some firms do the aggregation, but the terminology for risk data is vastly different.”

Gargone stresses the importance of context throughout the validation process, where things that sound the same often mean different things across markets. “You need everything to be taken in context, and that insight is something our staff have built up over their careers in this industry.”

n-Tier is beginning to leverage these technologies in correction frameworks and similar repetitive tasks. “I think as we progress through this and get more into next-gen stuff, which we’re looking at different variations of, I think the value prop for that becomes better and better.”

Closing Takeaways

Given the current state of the industry and n-Tier’s depth of experience, we asked Gargone for his top three messages for the Compliance community:

“Top of the house is don’t underestimate the regulator’s ability to focus on and find problems in your systems and processes. Moreover, they’re going to continue getting better at this. Because if you take a lackadaisical approach to it and you think, ‘They’re not going to know if your data reporting is inaccurate,’ you’re just playing with fire.”

Gargone reminds us that regulators are focusing heavily on internal controls; they come with expertise and tools, and they will uncover discrepancies in data and in process. “So, that’s our top line. If I were looking at it from the practitioner side, I wouldn’t feel comfortable until that was taken care of.”

The next consideration is the controls themselves and the need for a holistic approach across jurisdictions, with structures in place to ensure some independence between the software and processes (e.g. maker/checker) around those controls and the places where the actual data flow processes live. Gargone makes clear that “a single framework with some built-in checks from the same people that did the reporting is not a good idea.”

Gargone’s final takeaway is: “You have to pick a good partner. We see a lot of ‘try and build yourself’ at this point, but it’s very hard. There’s a stack of functionality, which has taken us a very long time to build.

“You should look for a good partner and you should look for something that’s flexible enough where you’re not going to have 50 solutions. The fewer solutions and the more common processes you can have as a firm, the better you’re going to be at implementing standards and controls to make sure you don’t make mistakes.

“You really need a solid partner – someone who fully understands the requirements, knows the regs. And that’s where we sit.”

Regulatory Reporting: Best Practices in 2024 and Beyond (25 June 2024)

Regulatory reporting can often feel like an endless and expensive grind. Achieving reporting excellence demands robust data governance, seamless automated data collection, standardized reporting formats, a centralized system, and a proactive approach to regulatory changes.

While these requirements are well-understood, they are hard to implement. But emerging AI-powered solutions are beginning to show efficiency gains in compliance use-cases, with the promise of making the regulatory data management and reporting process more efficient.

To explore the current landscape of regulatory reporting and identify key challenges and practical solutions, A-Team is hosting its Best Practices in Regulatory Reporting webinar on July 16.

In this webinar, we’ll delve into next-generation best practices and innovative technologies, including domain trade data, AI, and machine learning. Our experts will discuss actionable insights on implementation, ensuring you walk away with practical strategies.

You’ll hear from Jehangir Abdulla, Head of Back Office Development at Schonfeld Strategic Advisors LLC.  

Jehangir will be joined by Unmesh Bhide, Director, Securitized Products Valuations at LSEG Data & Analytics, and Joshua Beaton, Head of Non-Financial Regulatory Reporting (NFRR) at Wells Fargo.

Finally, Paul Rennison, Director, Corporate Strategy at deltaconX, will be on hand to share his 25 years of experience working for the likes of the London Stock Exchange, Trayport, FIS and now the Swiss regulatory transaction reporting specialist deltaconX. Speaking with RegTech Insight, Rennison had this message for prospective attendees:

“I think being able to report, manage and track internally up to executive level has been really, really difficult. And I think if you’ve done this alone, i.e. you’ve not used a technology provider who has multiple other clients and experiences, the current low levels of transparency have created unease and uncertainty about whether you are complying. Regardless that this is a market-wide problem, not being able to get shared validation of your experiences has made the whole thing far more damaging. I think it is important for people to know that what they are experiencing isn’t unique and it will get better, but the experience has been worse for some and that is not a great outcome.”

Don’t miss out on this opportunity to hear about best practices for regulatory reporting and opportunities to unlock significant operational and business benefits.

Register now to discover:

  • The current state of regulatory reporting
  • The necessity of adopting new approaches
  • The latest technologies, services, and solutions
  • Practical guidance for seamless implementation
  • The operational and business advantages of modernized regulatory reporting

Generative AI Poised for Leading Role as Regulatory Data Burden Grows (18 June 2024)

Amidst the hype around Generative AI (GenAI) and Large Language Models (LLMs), practitioners are beginning to realise that these emerging technologies can make a positive impact on the collection and validation of regulatory data.

The categories and scope of regulatory data requirements have expanded considerably in response to rapid market developments and growing regulatory scrutiny. It is no longer enough to present a set of numbers. Regulators want to know the origins of the underlying data, how that data was selected, how it was vetted and the lineage back through transformations to a certified provisioning point or other auditable record. Above all, regulators want to see evidence of a robust, principles-based approach to regulatory data management.

Faced with this increasingly onerous regulatory data environment, data managers are assessing how the new breed of AI technologies can make a difference.

It’s worth considering the variety of data types that can be drawn upon to fulfill financial institutions’ regulatory reporting responsibilities. Regulatory data can be considered any data that is reported directly, or contributes via transformation(s), to the information disclosed in regulatory filings. This boils down to a number of categories:

Trade Data

Trade reporting involves providing detailed information on trading activities across the financial markets, such as equities, derivatives, fixed income securities and newer alternative and crypto markets. This data ensures transparency and helps regulators monitor market activity. Under MiFID II, ESMA requires firms to report trades to Approved Reporting Mechanisms (ARMs) within a specified timeframe. In the U.S., the SEC’s Consolidated Audit Trail (CAT) requires broker-dealers to report comprehensive trade data to facilitate market oversight and analysis. Similarly, the U.S. Commodity Futures Trading Commission (CFTC) requires firms to report swap transactions to Swap Data Repositories (SDRs), ensuring transparency in the derivatives market.

Audit Data

Audit trails are comprehensive logs that provide a traceable history of transaction status changes and changes to data, ensuring accountability and transparency. These logs are essential for regulatory investigations and compliance verification. The SEC mandates that firms maintain detailed audit trails for all transactions as part of their recordkeeping requirements. Similarly, the CFTC requires firms to maintain audit trails for all futures and options trades to ensure transparency and compliance with regulatory standards.
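
One common way to make such logs trustworthy is a hash-chained, append-only trail in which each entry commits to its predecessor, so any retroactive edit breaks the chain. The sketch below is a conceptual illustration of that idea, not a format any regulator prescribes.

```python
import hashlib
import json
import time

def append_entry(trail: list, event: dict) -> None:
    """Append an event, chaining it to the previous entry's hash."""
    prev_hash = trail[-1]["hash"] if trail else "0" * 64
    record = {"ts": time.time(), "event": event, "prev_hash": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()
    ).hexdigest()
    trail.append(record)

def verify(trail: list) -> bool:
    """Recompute every hash and link; False means the trail was altered."""
    for i, rec in enumerate(trail):
        body = {k: v for k, v in rec.items() if k != "hash"}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()
        ).hexdigest()
        if digest != rec["hash"]:
            return False
        if i and rec["prev_hash"] != trail[i - 1]["hash"]:
            return False
    return True

trail = []
append_entry(trail, {"trade_id": "T-0001", "status": "NEW"})
append_entry(trail, {"trade_id": "T-0001", "status": "SETTLED"})
assert verify(trail)
```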

Customer Data

Know Your Customer (KYC) data involves collecting and verifying the identity of clients to prevent money laundering, terrorist financing, and other financial crimes. This includes personal identification information, financial status, and transaction history. The Financial Conduct Authority (FCA) mandates strict KYC procedures as part of its Anti-Money Laundering (AML) regulations. In the U.S., the Securities and Exchange Commission (SEC) requires broker-dealers to adhere to the Customer Identification Program (CIP) rules under the Patriot Act, ensuring proper identification and verification of their clients.

Risk Data

Risk data encompasses information related to an institution’s exposure to various types of risk, such as credit, market, operational, and liquidity risks. Regulators use this data to assess the resilience of financial institutions and the broader financial system. The Basel Committee on Banking Supervision (BCBS) outlines principles for effective risk data aggregation and reporting in BCBS 239. Compliance was originally targeted for 2016, and the fact that the market has yet to fully comply with BCBS 239 underscores the data challenges that remain. But more on BCBS 239 later.

Compliance Data

Compliance data includes records that demonstrate an institution’s adherence to regulatory requirements, such as AML measures, sanctions compliance, and tax reporting. This data ensures that institutions are operating within the legal and regulatory frameworks set by authorities. For example, FINRA in the U.S. requires firms to maintain comprehensive records of their compliance activities and report any suspicious activities through Suspicious Activity Reports (SARs). Similarly, ESMA requires investment firms to comply with the Market Abuse Regulation (MAR) by reporting any instances of market manipulation or insider trading.

Operational Data

Operational data pertains to information about an institution’s internal processes, governance structures, and internal audits. This data helps regulators assess the effectiveness of an institution’s internal controls and governance. The FCA’s Senior Managers and Certification Regime (SM&CR) requires firms to maintain detailed records of their governance structures and the roles and responsibilities of senior managers. The Federal Reserve also mandates that banks submit reports on their operational risk management and internal control systems as part of their regulatory filings.

Performance Data

Performance data includes financial metrics and reports such as profit and loss statements, balance sheets, and capital adequacy ratios. This data is crucial for assessing the financial health and stability of institutions. The SEC requires publicly traded companies to submit quarterly and annual financial statements as part of their regulatory filings. In Singapore, the Monetary Authority of Singapore (MAS) mandates that banks provide regular updates on their financial performance, including capital adequacy and liquidity coverage ratios, to ensure they maintain sufficient capital buffers.

Incident Reports

Incident reporting includes information on any incidents or breaches, such as cybersecurity incidents, fraud, or operational failures. This data is critical for regulators to understand the impact of such events and to take appropriate action. The UK’s FCA requires firms to report significant operational incidents, including IT failures, under its incident reporting rules. FINRA requires firms to file Suspicious Activity Reports (SARs) when incidents of financial crimes like money laundering or insider trading are suspected. MAS also has stringent requirements for reporting cybersecurity incidents, ensuring that financial institutions promptly notify the regulator of any significant breaches.

Marketing, Corporate and Communications Data

Information communicated in advertising, promotional and marketing materials is subject to regulatory oversight to ensure that it is not misleading and does not make unrealistic promises or guarantees. The company’s annual report, 10-K and quarterly filings are all subject to regulatory scrutiny. This category of regulatory data includes a considerable amount of text-based information, including statements by the officers and board, auditors’ reports and the statement of financial condition.

Clearly, this set of data types represents a wide spectrum of characteristics that needs to be embraced by any regulatory data management approach. Practitioners are recognising that GenAI and LLMs can be deployed differently from traditional regression, clustering and early NLP models, allowing them to address the entire regulatory data spectrum.

AI for Regulatory Data Management

Whilst AI has been used in some way at each stage of the regulatory data life cycle, from sourcing and collection through transformation, reporting and archival, Generative AI (GenAI) and Large Language Models (LLMs) represent significant advancements over previous generations of AI, offering enhanced capabilities in several key areas.

These advancements are already delivering substantial performance improvements in several regulatory data use cases by providing more sophisticated, context-aware, and efficient solutions. The main capabilities of GenAI and LLMs that set them apart from their predecessors are contextual understanding; normalisation and transformation; and lineage and transparency.

Contextual Understanding

Unlike earlier AI models, which often struggled with context and nuance, GenAI and LLMs excel in understanding and generating content based on context. This ability allows them to perform complex tasks such as analysing and summarizing vast quantities of text-based information, generating coherent narratives, and understanding nuanced queries.

This deep contextual understanding is a crucial step up in capability for applications like natural language processing (NLP) and conversational AI, where understanding the subtleties of language is essential.

Firms are already seeing substantial improvements in productivity and efficiency in text-heavy use cases such as scanning for and interpreting regulatory changes, streamlining KYC and onboarding, and screening for AML violations and exposure to Politically Exposed Persons (PEPs).

Other use cases are content oriented, drawing on GenAI and LLMs’ ability to generate well-formatted ‘boilerplate’ regulatory narratives for Suspicious Activity Reports (SARs), 10-Qs and similar filings. Another text-based use case is horizon scanning for changes in regulatory text, translating and interpreting their impact and highlighting any required policy changes. AI-powered language translation has also reached a level of accuracy sufficient for compliance demands, with firms seeing dramatic improvements in accuracy and productivity.

Normalization and Transformation

GenAI and LLMs bring significant improvements in data normalization and transformation processes. They can be trained to accurately map and convert data between different formats, ensuring consistency and integrity across diverse datasets. This capability is essential for applications requiring standardized and aggregated data for regulatory compliance and analysis.
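
As a minimal sketch of what such a mapping looks like in practice, whether hand-written or LLM-generated, the snippet below normalises two hypothetical vendor formats onto one canonical schema; all field names are invented for illustration.

```python
# Hypothetical source-to-canonical field mappings.
VENDOR_A_MAP = {"TradeDate": "trade_date", "Qty": "quantity", "Px": "price"}
VENDOR_B_MAP = {"dt": "trade_date", "size": "quantity", "prc": "price"}

def normalise(record: dict, field_map: dict) -> dict:
    """Rename source fields to canonical names and enforce canonical types."""
    out = {canonical: record[src] for src, canonical in field_map.items()}
    out["quantity"] = float(out["quantity"])
    out["price"] = float(out["price"])
    return out

a = normalise({"TradeDate": "2024-06-18", "Qty": "100", "Px": "99.5"}, VENDOR_A_MAP)
b = normalise({"dt": "2024-06-18", "size": 250, "prc": 100.1}, VENDOR_B_MAP)
assert a["trade_date"] == b["trade_date"]  # both records now share one schema
```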

Lineage and Transparency

Maintaining clear and accurate data lineage is crucial for regulatory compliance and data governance. GenAI and LLMs provide robust capabilities for tracking and documenting the history of data transformations and movements. This transparency ensures that organizations can demonstrate compliance and maintain high standards of data governance.

GenAI and LLMs can offer substantial improvements over previous AI generations by providing enhanced contextual understanding, efficient data processing, improved accuracy, advanced data normalization, and comprehensive data lineage capabilities. But these improvements come at a cost: GenAI and LLMs are resource intensive and should be deployed carefully. Use cases that reduce repetitive manual effort, such as reviewing large volumes of text, or where sampling methods can be replaced by comprehensive scans, are ripe for GenAI.

AI-Enabled BCBS 239

The Basel Committee on Banking Supervision (BCBS) has outlined principles for effective risk data aggregation and risk reporting – see BCBS 239. These principles, while initially focused on risk data, can be adapted to apply more broadly to all regulatory data. Additionally, GenAI technologies can accelerate compliance with these principles by enhancing data quality, accuracy, and management.

Principle 1: Governance

Strong governance frameworks should be established for all types of regulatory data, not just risk data. This includes setting clear policies, procedures, and accountability for data management across the organization.

GenAI can enhance governance by automating the documentation of data governance policies, ensuring consistent application across different types of data. GenAI can also assist in monitoring compliance with these policies in real-time, providing alerts and recommendations when deviations occur.

Principle 2: Data Architecture and IT Infrastructure

Data architecture and IT infrastructure should support comprehensive data management capabilities, ensuring that all regulatory data is accurately captured, stored, and processed, even during times of stress or crisis.

GenAI can help design and maintain a robust data architecture by optimizing data storage and retrieval processes, reducing redundancies, and enhancing data integration from multiple sources. This ensures data availability and reliability across different regulatory requirements. BCG highlights that AI can bring efficiency and accuracy to data management tasks that traditionally required significant manual effort – see The Solution to Data Management’s GenAI Problem? More GenAI.

Principle 3: Accuracy and Integrity

All regulatory data should be accurate and reliable, pre-processed and/or aggregated largely through automated processes to minimize errors. This principle ensures that data used for compliance and reporting is dependable.

GenAI agents can automate data validation and error correction, significantly enhancing the accuracy and integrity of regulatory data. These models can detect anomalies and inconsistencies in real-time, reducing the likelihood of errors in compliance reporting. GenAI can streamline data cleaning processes by generating code for parsing, formatting, and identifying data quality issues.
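
For illustration, here is the style of rule-based data-quality check the paragraph above describes GenAI generating; the rule names and patterns are hypothetical.

```python
import re

# Hypothetical validation rules: each maps a rule name to a predicate.
RULES = {
    "isin_format": lambda r: bool(
        re.fullmatch(r"[A-Z]{2}[A-Z0-9]{9}\d", r.get("isin", ""))
    ),
    "positive_qty": lambda r: isinstance(r.get("quantity"), (int, float))
    and r["quantity"] > 0,
    "price_present": lambda r: r.get("price") is not None,
}

def validate(record: dict) -> list:
    """Return the names of the rules the record violates."""
    return [name for name, check in RULES.items() if not check(record)]

bad = {"isin": "US03783310", "quantity": -5, "price": None}
print(validate(bad))  # ['isin_format', 'positive_qty', 'price_present']
```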

Principle 4: Completeness

Regulatory data management should capture and aggregate all material data across the organization. This includes data from various business lines, legal entities, and other relevant groupings to identify and report exposures, concentrations, and emerging risks.

GenAI can enhance data completeness by automating the aggregation of data from diverse sources, ensuring no critical information is omitted. These technologies can continuously monitor data inputs to verify that all necessary data is captured and integrated accurately.

Principle 5: Data Lineage and Transparency

Maintaining clear and accurate data lineage is essential for all regulatory data, enabling organizations to trace the origins, movements, and transformations of data throughout its lifecycle.

GenAI can generate detailed data lineage reports automatically, providing transparency and traceability of data processes. These reports help organizations demonstrate compliance with data governance standards to regulators. The ability to document and audit data transformations effectively ensures that organizations can respond to regulatory inquiries with confidence and clarity.
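
A minimal sketch of the machine-readable lineage metadata such reports could be generated from; the schema is hypothetical rather than any recognised lineage standard.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class LineageStep:
    """One transformation in a dataset's history."""
    step: str    # e.g. "fx_conversion"
    inputs: list # upstream dataset identifiers
    output: str  # produced dataset identifier
    logic: str   # human-readable description of the transform
    ts: str = field(default_factory=lambda: datetime.now(timezone.utc).isoformat())

history = [
    LineageStep("load", ["vendor_feed_2024-06-18.csv"], "raw_trades",
                "ingest vendor file"),
    LineageStep("fx_conversion", ["raw_trades", "ecb_fx_rates"], "trades_eur",
                "convert notionals to EUR"),
]

# Walk the chain back from the reported dataset to its sources.
for s in reversed(history):
    print(f"{s.output} <- {s.inputs} ({s.logic}, {s.ts})")
```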

Back to Principles

The industry has struggled to fully implement BCBS 239, and we’re well past the original 2016 target date. But GenAI and LLM technologies offer real potential for the industry to make significant progress on BCBS 239 compliance and on regulatory data management in general.

By leveraging these technologies, organizations can build robust data management practices that keep pace with, and even anticipate, changes in regulatory requirements while improving overall operational efficiency. Best practice for standards follows a principles-based approach, and so should regulatory data management if GenAI is to deliver on its potential.

FRTB Compliance – New Rules and Data Challenges for Global Banks https://a-teaminsight.com/blog/frtb-compliance-new-rules-and-data-challenges-for-global-banks/?brand=rti Mon, 10 Jun 2024 20:24:20 +0000

The Fundamental Review of the Trading Book (FRTB) was developed by the Basel Committee on Banking Supervision (BCBS) as part of the Basel III framework to address several key shortcomings identified in the market risk regulatory framework that existed under Basel II.5.

FRTB was finalized in January 2016 and initially scheduled for implementation by January 2019. However, following feedback from market participants and regulators, the implementation was delayed. The new target date for compliance was revised to January 2023, with full adoption expected by January 2025. The majority of regulatory jurisdictions are targeting this implementation date, except for the United States, which will start its phased roll-out in July 2025 with anticipated completion by 2028.

In this article we’ll examine the impact of FRTB on firms’ Governance, Risk and Compliance (GRC) frameworks and workflows, and the data management challenges it creates.

One of the primary objectives of FRTB is to enhance the risk sensitivity of the capital framework for market risk. The existing framework under Basel II.5 was criticized for its inability to adequately capture certain risk exposures, particularly during periods of market stress.

Another objective is limiting opportunities for regulatory arbitrage. This refers to the practice of exploiting differences between regulations to gain an advantage, often leading to risk being transferred in ways that are not transparent or adequately capitalized. FRTB aims to reduce opportunities for such arbitrage by providing clearer criteria for the boundary between the trading book and banking book, ensuring that similar risks are treated consistently regardless of where they are booked.

By improving risk sensitivity and reducing arbitrage, FRTB is expected to lead to an increase in capital requirements for market risk. This ensures that banks hold sufficient capital to cover potential losses, thus enhancing the overall resilience of the banking sector. This increase in capital is necessary to address the underestimation of risks observed under the previous framework and to restore confidence in the capital adequacy of banks.

The New Rules and Data Impacts

The FRTB imposes strict requirements to ensure a clear and clean separation between trading book and banking book transactions. This separation is crucial to mitigate regulatory arbitrage and accurately assess and manage the risks associated with each book. The additional data requirements necessary to maintain this separation include the following (a minimal tagging sketch appears after the list):

  • Classify each transaction clearly as either trading book or banking book based on its intent and characteristics. This involves detailed tagging of transactions with metadata that indicates their book classification.
  • Maintain comprehensive transaction-level data including trade date, settlement date, instrument type, and purpose of the trade to support the classification.
  • Establish and document policies and criteria for classifying transactions into trading or banking books. This documentation should include the rationale for classification decisions and be reviewed regularly.
  • Maintain robust audit trails to track the decision-making process for classifying transactions. This includes records of approvals, changes, and reviews to ensure transparency and accountability.
  • Use consistent data formats and standards across systems to ensure data integrity and facilitate aggregation and reporting.
  • Implement regular data reconciliation processes to ensure that data across trading and banking books are accurate and up to date.
  • Implement data quality controls such as validation checks, error detection mechanisms, and data cleansing procedures to maintain high-quality data.
  • Establish a single, authoritative source of data (golden source) to ensure consistency across different systems and reports.
  • Ensure that data related to trading book transactions are updated in real-time or near real-time to reflect intraday trading activities accurately.
  • Implement continuous monitoring systems to detect any discrepancies or anomalies in the classification and reporting of transactions.
  • Develop comprehensive reporting frameworks that meet regulatory requirements for both trading and banking books. Reports should include detailed breakdowns of positions, risk exposures, and capital requirements.
  • Generate internal reports to support management and oversight functions, providing insights into the risk profile and performance of both books.
  • Collect and maintain data on risk factors relevant to both trading and banking books, ensuring that these are properly attributed and segregated.
  • Ensure availability and accuracy of market and reference data used for pricing, risk assessment, and capital calculation purposes.
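
As flagged above, here is a minimal sketch of intent-based book tagging; the attribute names and the toy boundary rule are illustrative only, since the actual FRTB boundary criteria are considerably richer.

```python
from dataclasses import dataclass

@dataclass
class Transaction:
    trade_id: str
    instrument: str
    intent: str  # e.g. "SHORT_TERM_RESALE", "HOLD_TO_MATURITY"
    desk: str

def classify_book(txn: Transaction) -> str:
    """Toy intent-based boundary rule; real FRTB criteria are far richer."""
    trading_intents = {"SHORT_TERM_RESALE", "MARKET_MAKING",
                       "HEDGE_OF_TRADING_BOOK"}
    return "TRADING_BOOK" if txn.intent in trading_intents else "BANKING_BOOK"

txn = Transaction("T-0002", "UST 10Y", "SHORT_TERM_RESALE", "RATES")
print(txn.trade_id, "->", classify_book(txn))  # T-0002 -> TRADING_BOOK
```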

The new rules under FRTB impose stricter requirements for the use of internal models, including rigorous validation processes and backtesting. This is designed to ensure that models used to calculate capital requirements are reliable and accurately reflect the risk exposures.

By standardizing the methodologies and criteria for calculating market risk capital requirements, FRTB helps ensure that the risk-based capital ratios are comparable across banks globally. This consistency is crucial for maintaining a level playing field and for stakeholders to accurately assess and compare the risk profiles of different institutions.

The New Modelling Approaches

The revised Standardized Approach (SA) and Internal Models Approach (IMA) under the FRTB offer distinct methodologies for calculating market risk capital requirements, each with unique data demands and implications.

The SA is more prescriptive and designed to be universally applicable across all banks. It introduces a Sensitivities-Based Method (SBM), which calculates risk based on specific sensitivities (Delta, Vega, and Curvature) across seven defined risk classes. This approach relies heavily on standardized risk weights and correlations provided by regulators, necessitating accurate and granular data from banks’ pricing models to derive capital requirements. The SA capital charge is composed of three main components: the SBM, the Default Risk Charge (DRC), and the Residual Risk Add-On (RRAO), each of which has its own data requirements for precise calculation of risk exposures.
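
The simplified sketch below illustrates the aggregation idea behind the sensitivities-based calculation: risk-weight each sensitivity, then combine the weighted sensitivities through a correlation matrix. The risk weights and correlations shown are placeholders, not the regulatory calibrations.

```python
import numpy as np

sensitivities = np.array([120_000.0, -40_000.0, 75_000.0])  # delta per risk factor
risk_weights = np.array([0.015, 0.017, 0.012])              # placeholder RWs
rho = np.array([[1.00, 0.75, 0.50],
                [0.75, 1.00, 0.60],
                [0.50, 0.60, 1.00]])                        # placeholder correlations

ws = risk_weights * sensitivities                 # weighted sensitivities
bucket_charge = np.sqrt(max(0.0, ws @ rho @ ws))  # within-bucket aggregation
print(f"Delta risk charge for the bucket: {bucket_charge:,.0f}")
```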

In contrast, the IMA allows banks to use their internal risk models, subject to regulatory approval and ongoing performance testing. This approach shifts from the traditional Value at Risk (VaR) method to an Expected Shortfall (ES) methodology, which better captures the risk of extreme market movements and tail events. The IMA requires banks to perform daily profit and loss attribution tests and backtesting at the trading desk level, demanding a higher granularity of data on individual trades and risk factors.
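
As a rough illustration of the attribution test, the sketch below compares simulated hypothetical P&L (front office) with risk-theoretical P&L (risk model) using the rank-correlation and distributional metrics the Basel PLA test is built on; the data and scale are arbitrary, and the regulatory zone thresholds are omitted.

```python
import numpy as np
from scipy.stats import spearmanr, ks_2samp

rng = np.random.default_rng(7)
hpl = rng.normal(0, 1e5, 250)         # simulated hypothetical P&L, 250 days
rtpl = hpl + rng.normal(0, 2e4, 250)  # risk model tracks HPL with noise

corr, _ = spearmanr(hpl, rtpl)        # rank correlation of the two series
ks_stat, _ = ks_2samp(hpl, rtpl)      # distributional distance

# FRTB's PLA test assigns desks to green/amber/red zones by applying
# thresholds to metrics like these.
print(f"Spearman correlation: {corr:.3f}, KS statistic: {ks_stat:.3f}")
```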

Additionally, the IMA requires comprehensive historical data to model the expected shortfall accurately and to manage non-modellable risk factors (NMRFs), which necessitates frequent data observations to ensure robustness and compliance. NMRFs are subject to standardized charges if they do not meet data availability criteria.

The key difference in data requirements between the SA and IMA lies in the level of granularity and the frequency of data needed. The SA relies on standardized regulatory inputs and is thus less data-intensive in terms of internal calculations. However, it still requires precise input data to apply the prescribed risk weights and correlations effectively. The IMA, on the other hand, demands a much more detailed and continuous flow of data, including high-frequency observations and extensive historical datasets, to validate internal models and meet stringent regulatory standards.

These differences underscore the need for robust data management systems and advanced analytics capabilities, particularly for banks opting for the IMA, which imposes more stringent data demands to ensure model accuracy and regulatory compliance.

Expected Shortfall (ES) vs Value at Risk (VaR)

Expected Shortfall (ES), also known as Conditional Value-at-Risk (CVaR), is a risk measure used to assess the risk of extreme losses in a portfolio. Unlike Value-at-Risk (VaR), which only provides the potential loss at a certain confidence level, ES gives an average of the losses that occur beyond the VaR threshold.

ES is determined by selecting a confidence level (say 97.5% or 99%), calculating VaR, which represents the threshold loss value, identifying all the losses that exceed the VaR threshold, and then averaging those tail losses. This average represents the Expected Shortfall.
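
A minimal sketch of that procedure on simulated, heavy-tailed P&L; the 97.5% confidence level follows the FRTB convention, while the distribution and scale are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(42)
pnl = rng.standard_t(df=4, size=100_000) * 1e6  # heavy-tailed simulated daily P&L

alpha = 0.975
losses = -pnl                                   # losses as positive numbers
var = np.quantile(losses, alpha)                # VaR: the 97.5th-percentile loss
es = losses[losses >= var].mean()               # ES: average loss beyond VaR

print(f"VaR(97.5%) = {var:,.0f}")
print(f"ES (97.5%) = {es:,.0f}")                # ES >= VaR by construction
```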

The data requirements for calculating Expected Shortfall (ES) differ from those for Value-at-Risk (VaR) in several key ways:

VaR requires historical data to calculate the loss distribution up to a specified quantile (e.g., the worst 1% of losses for a 99% confidence level).

ES, on the other hand, requires additional data to assess the distribution of losses beyond the VaR threshold. This means not only identifying the worst losses but also calculating the average of these extreme losses, which demands a more detailed loss distribution analysis.

VaR focuses on the quantile threshold and does not consider the magnitude of losses beyond this point, whilst ES needs granular data on all losses in the tail beyond the VaR threshold. Accurate ES calculation depends on having sufficient data points in the tail to reliably estimate the average loss.

VaR can be estimated using simpler models such as historical simulation, variance-covariance, or Monte Carlo simulation. ES requires more sophisticated modelling techniques to accurately capture the tail behaviour of loss distributions. This includes advanced statistical methods and more complex simulation techniques to ensure the tail losses are well understood and averaged correctly.

VaR backtesting involves comparing the VaR estimates to actual losses to see how often actual losses exceed VaR. ES validation is more challenging because it requires confirming that the average of the tail losses is accurate, which involves deeper statistical analysis and validation against observed tail losses.

FRTB Progress – a Tale of Two Continents

In Europe, the FRTB framework is being integrated into the Capital Requirements Regulation (CRR) and Capital Requirements Directive (CRD) packages. The European Banking Authority (EBA) has provided detailed guidelines and timelines, with initial reporting requirements using the Standardized Approach (SA) starting back in 2021 and a clear path towards full compliance by 2025.

This timeline aligns with the UK’s implementation plan, where the Prudential Regulation Authority (PRA) is working towards the same 2025 deadline.

European banks have been active in their preparations, with major institutions like BNP Paribas, Deutsche Bank, and Intesa Sanpaolo already applying for internal model approach (IMA) approvals from the European Central Bank (ECB).

Japan, Canada, and Switzerland have finalized their domestic rules, with Canada and Japan bringing these rules into force by mid-2024. Australia’s implementation is set for January 2025. In the United States, while detailed rulemaking is still pending, regulators have indicated a phased approach, with significant movement expected following the Advance Notice of Proposed Rulemaking (ANPR) later this year.

Many banks in the Asia Pacific region are favouring the IMA approach, reflecting their commitment to adopting more sophisticated and risk-sensitive models.

In contrast, the U.S. approach has been more cautious, with regulators taking additional time to assess the potential impacts on the domestic banking sector. While European banks are moving towards more standardized and prescriptive regulatory frameworks, U.S. regulators are considering a more flexible approach that takes into account the unique characteristics of the U.S. financial markets and the need for a balanced regulatory burden.

There has also been strong resistance to any additional capital requirements from some of the strongest voices in the industry. Jamie Dimon, Chairman and CEO of JPMorgan Chase, has been vocal in his criticism of the Basel III Endgame, including its implications for the Fundamental Review of the Trading Book (FRTB). In his remarks to the Senate Banking Committee in December 2023, Dimon highlighted several concerns regarding the new regulatory framework.

Dimon emphasized that the Basel III Endgame, which includes FRTB, could lead to a significant increase in capital requirements for banks. He argued that this could have harmful effects on the banking sector by reducing lending capacity and increasing costs for consumers. Specifically, Dimon pointed out that the proposal would raise capital requirements for large banks by 20-25%, which he believes could stifle economic growth and innovation within the financial industry.

He also highlighted the complexity and operational challenges associated with implementing the FRTB framework. Dimon noted that the increased data and technological demands required to comply with FRTB would place a substantial burden on banks, particularly in terms of upgrading their risk management systems and ensuring data quality.

In summary, Jamie Dimon has expressed significant concerns about the potential negative impacts of the Basel III Endgame and FRTB on the banking industry, emphasizing the increased capital requirements and operational complexities that could arise from these regulations.

The U.S. implementation of FRTB will begin on July 1, 2025, with a phased approach culminating in full compliance by July 1, 2028. This timeline is part of the broader Basel III “endgame” reforms aimed at enhancing the robustness of the financial system by addressing shortcomings in the current market risk framework. The U.S. regulatory agencies, including the Federal Reserve, the Office of the Comptroller of the Currency (OCC), and the Federal Deposit Insurance Corporation (FDIC), have proposed these changes to align with international standards while considering the unique aspects of the U.S. banking system. More details can be found here – Interagency Overview of the Notice of Proposed Rulemaking for Amendments to the Regulatory Capital Rule.

Staggered Timelines Raise New Concerns

The staggered implementation dates of the FRTB between the United States and other regions, particularly Europe, present several challenges for financial institutions operating across multiple jurisdictions. These challenges include regulatory arbitrage, operational complexities, competitive disparities, and difficulties in achieving consistent risk management practices.

Different implementation timelines can create opportunities for regulatory arbitrage, where firms exploit the differences in regulations to gain a competitive advantage. For instance, banks might shift trading activities to jurisdictions with less stringent or delayed regulations to benefit from lower capital requirements temporarily. This could undermine the global financial stability that FRTB aims to enhance by ensuring consistent risk management standards worldwide.

Financial institutions with global operations will need to manage and comply with different regulatory timelines, which can be operationally challenging. This involves maintaining multiple sets of risk management systems, reporting frameworks, and compliance protocols to meet the varying requirements. The need for dual reporting and parallel systems increases the operational burden and can lead to inefficiencies and higher costs.

Banks in regions where FRTB is implemented earlier, such as Europe, may face higher capital requirements and stricter risk management standards before their U.S. counterparts. This could place European banks at a competitive disadvantage, as they would need to allocate more capital to cover market risks sooner than U.S. banks. The disparity in implementation could affect the pricing of financial products and the competitive landscape of global financial markets.

Achieving consistent risk management practices across different jurisdictions becomes more challenging with staggered implementation dates. Global banks need to ensure that their risk management frameworks are robust enough to comply with the most stringent standards while managing the transition in regions with delayed implementation. This inconsistency can lead to fragmented risk management practices and potential gaps in risk coverage, impacting the overall effectiveness of FRTB.

Coordinating with multiple regulatory bodies across different timelines requires effective communication and strategic planning. Financial institutions must stay abreast of regulatory updates, interpret diverse regulatory expectations, and engage in proactive dialogue with regulators to ensure compliance. The lack of synchronized implementation can lead to confusion and increased regulatory scrutiny, complicating the compliance landscape for global banks.

Despite these challenges, the implementation of FRTB is designed to address critical deficiencies in the previous market risk framework by enhancing risk sensitivity, reducing regulatory arbitrage, improving model governance, increasing capital requirements, and promoting consistency across jurisdictions, thereby strengthening the stability and resilience of the global banking system.

UK’s Debut SDR Rules Raise Data Management Concern https://a-teaminsight.com/blog/uks-debut-sdr-rules-raise-data-management-concern/?brand=rti Mon, 03 Jun 2024 15:00:52 +0000

The UK’s newly implemented sustainability disclosure requirements (SDR) have placed additional data management burdens on financial institutions that operate in the UK.

The country’s first such framework, created by the Financial Conduct Authority (FCA), is aimed at preventing greenwashing and fostering trust in British sustainability markets. It’s designed to protect the interests of investors by enshrining strict rules on how financial products can be advertised, marketed and labelled, and seeks to ensure such information is “fair, clear, and not misleading”.

Critics, however, have pointed to several potential pitfalls that face institutions as they put processes in place to comply with the new SDR. Because the FCA requires that all claims must be backed by robust and credible data, many of the new challenges are likely to be borne by firms’ data teams.

New Classifications

Under the SDR, asset managers – and later portfolio managers – will be expected to provide greater transparency into the sustainability claims attached to their funds and provide data to demonstrate the ESG performance of the funds’ component companies.

Institutions and companies in scope will be asked to voluntarily categorise their investment products according to the concentration of sustainability-linked assets within them. There are four labels – “Sustainability Focus”, “Sustainability Improvers”, “Sustainability Impact” and “Sustainability Mixed Goals” – each reflecting a different sustainable investment approach.

This reflects but differs from the European Union’s Sustainable Finance Disclosure Regulation (SFDR), under which asset managers are compelled to classify their products according to a similar range of categories.

Among several other SDR requirements, asset managers will be asked to provide entity- and product-level disclosures and adhere to new fund naming rules, which restrict the use of terms the regulator considers vague, including “ESG” and “sustainability”.

Effective Strategy

While the SDR has been welcomed as a good first step by campaigners for stronger and more transparent sustainability markets in the UK, its implementation could prove tricky. Among the challenges institutions face is the code’s apparent incompatibility with other similar regulations that firms would face overseas. Some observers have complained that the SDR’s fund sustainability categories don’t easily match the Articles 6, 8 and 9 classifications of the SFDR.

This is where data managers will be of critical importance.

“As with all regulations, financial institutions must ensure they have an effective data management strategy in place from now, enabling systems to efficiently collect and aggregate ESG risk-related data to evidence sustainability claims both internally and externally,” GoldenSource head of ESG, connections and regulatory affairs Volker Lainer told Data Management Insight.

“Now, much higher levels of scrutiny are needed on the underlying methodologies and calculations involved in determining ESG scores. Firms that prioritise this will find themselves in a much stronger position as and when the next stages of the UK’s SDR are implemented.”

Data Doubts

The FCA announced the details of the SDR in November last year, stressing at the time the importance of data management to compliance. Firms in scope should “have in place appropriate resources, governance, and organisational arrangements, commensurate with the delivery of the sustainability objective”, it said.

“This includes ensuring there is adequate knowledge and understanding of the product’s assets and that there is a high standard of diligence in the selection of any data or other information used (including when third-party ESG data or ratings providers are used) to inform investment decisions for the product,” it said.

Legal experts questioned whether the UK’s financial industry would be able to fully comply. In a report published in April, international law firm Baker McKenzie asked whether firms would be able to keep up with the data requirements expected of the regulation, and questioned whether the data would even be available.

Careful Consideration

While gaps in ESG data still exist, A-Team Group’s ESG Data and Tech Summit London heard that the data record is improving, with many more vendors providing ever more granular datasets. Market figures caution, however, that the data imperative of the SDR should still be carefully considered.

“With more specific product labelling rules set to apply from July, UK firms must brace themselves for these ongoing changes to better navigate the complexity jungle. It is clear data and regulatory content mapping is the key differentiator for service providers here – relying on trusted vendors that can provide quality, accurate data and content in pre-established delivery formats,” said Martina Macpherson, head of ESG product strategy and management in the Financial Information division at SIX.

“This is the only way firms can back up their sustainability credentials, meaning they will be better placed to meet new regulatory requirements and prepare for those to come later this year.”
