Regulatory Compliance - A-Team
https://a-teaminsight.com/category/regulatory-compliance/

DMI Webinar Preview: How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business (Tue, 09 Jul 2024)
https://a-teaminsight.com/blog/dmi-webinar-preview-how-to-maximise-the-use-of-data-standards-and-identifiers-beyond-compliance-and-in-the-interests-of-the-business/?brand=dmi

Data must be consistent, accurate and interoperable to ensure financial institutions can use it in their investment, risk, regulatory compliance and other processes. Without those attributes, they won’t achieve the efficiencies, surface the insights, action decisions or realise the many other benefits of digitalisation.

Identifiers and standards ensure those attributes can be met. The challenge facing institutions, however, is that such standards often conflict or simply don't exist. At the most fundamental level, for instance, company names may not be represented identically across datasets, meaning any analytics or other process that includes that data could be skewed.
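
A minimal illustration of the point, using hypothetical records and made-up LEI-style codes: a name-based join between two datasets silently loses matches when the same counterparty is spelled differently, while a shared identifier keeps the datasets aligned.

```python
import pandas as pd

# Hypothetical illustration: the same counterparty appears under different
# name spellings in two internal datasets, so a name-based join silently
# drops records, while a shared identifier (here an LEI-style code) does not.
trades = pd.DataFrame({
    "counterparty_name": ["Acme Holdings PLC", "Beta Capital Ltd"],
    "lei": ["529900EXAMPLE0000001", "529900EXAMPLE0000002"],
    "notional": [10_000_000, 5_000_000],
})
ratings = pd.DataFrame({
    "counterparty_name": ["ACME Holdings plc", "Beta Capital Limited"],
    "lei": ["529900EXAMPLE0000001", "529900EXAMPLE0000002"],
    "rating": ["A", "BBB"],
})

by_name = trades.merge(ratings, on="counterparty_name", how="inner")
by_lei = trades.merge(ratings, on="lei", how="inner")

print(len(by_name))  # 0 matches - analytics built on this join would be skewed
print(len(by_lei))   # 2 matches - the identifier keeps the datasets aligned
```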

When identifiers and standards do align, however, they offer value beyond the advantages that come with clear categorisation. These benefits will form an important part of the conversation in A-Team Group Data Management Insight’s next webinar, entitled “How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business”.

Industry Leaders

The webinar will see leading figures from the sector delve into the importance of identifiers and standards as well as provide context about their uses and benefits. On the panel will be: Alexandre Kech, chief executive of the Global Legal Entity Identifier Foundation (GLEIF); Robert Muller, director and senior group manager, technology product owner, at BNY; Emma Kalliomaki, managing director at Derivatives Service Bureau (DSB); and, Laura Stanley, director of entity data and symbology at LSEG.

“Identifiers and standards play a critical role in data management,” GLEIF’s Kech tells DMI. “They facilitate clear identification and categorisation of data, enabling efficient data integration, sharing, and analysis.”

Without them, financial institutions, corporates and other legal entities would struggle with several challenges, he says.

Among those pain points are data inconsistency, which results from different systems using different naming conventions and leads to difficulties in data reconciliation and integration, and operational inefficiency, where manual processes used to verify and match data increase the risk of errors and push up operational costs.

Additionally, Kech said, compliance risks stemming from fragmented and inconsistent data would prevent regulatory requirements from being met effectively, while limited transparency would make it difficult to trace transactions and entities accurately, potentially hindering risk management and auditing processes.

In essence, this would erode trust and reliability in the data, said DSB’s Kalliomaki.

“That is fundamental for firms to fulfil a lot of functions, but regulatory reporting is one that comes with great consequences if not undertaken properly,” she tells DMI.

“When it comes to having data standards, everyone is very aware that to better manage your data, to better assure the quality of your data, to ensure consistency, alignment and harmonisation with your counterparties, and to mitigate the number of omissions and errors you may have, having standards is much more effective from a data management standpoint.”

Growing Need

“The amount of data that financial services firms are engaging with in their financial instrument processes is growing exponentially. Therefore, the need for data standards and identifiers is growing alongside this,” said Stanley at LSEG, which supports a number of identifiers, enabling delivery of a firm’s existing and evolving use cases.

LSEG issues proprietary identifiers such as SEDOL and RIC, acts as a National Numbering Agency for UK ISIN codes, is a globally accredited Local Operating Unit for LEI codes, and recognises the importance of standards across the ecosystem and beyond regulation.

“At LSEG we acknowledge the potential of data when shared. The PermID is fully open and acts as the connective tissue that enables us to identify different objects of information and stitch data sets together.”

More Than Compliance

With robust identifiers and standards in place, the full value of data can be extracted. Among the benefits expected to be discussed in the webinar are:

  • Improved decision-making and analysis
  • Lower costs from reducing the need for manual data processing and reconciliation and from accelerating transaction processing
  • Innovation driven by seamless data exchange between different systems and organisations
  • Enhanced business agility and competitiveness that comes from providing reliable data for strategic planning and risk management.

“I see financial institutions using data standards and identifiers – beyond compliance – to a great extent,” says BNY’s Muller. “There are a number of best practices firms can employ, for instance strategy, design and education, to ensure standards and identifiers deliver value through associated business cases.”

With regulatory demands likely to increase over time, the need for common identifiers and standards is expected to grow in importance and lead to harmonisation across borders.

“As a broader community, we all have to be willing to look at the greater good rather than commercialisation or IP-related aspects,” says Kalliomaki. “That harmonisation of us working together collaboratively is key.”

  • A-Team Group’s How to Maximise the use of Data Standards and Identifiers Beyond Compliance and in the Interests of the Business webinar will be held on July 18 at 10am ET / 3pm BST / 4pm CET. Click here to join the discussion.

Duco Unveils AI-Powered Reconciliation Product for Unstructured Data (Tue, 09 Jul 2024)
https://a-teaminsight.com/blog/duco-unveils-ai-powered-reconciliation-product-for-unstructured-data/?brand=dmi

Duco, a data management automation specialist and recent A-Team Group RegTech Insight Awards winner, has launched an artificial intelligence-powered end-to-end reconciliation capability for unstructured data.

The Adaptive Intelligent Document Processing product will enable financial institutions to automate the extraction of unstructured data for ingestion into their systems. The London-based company said this will let market participants automate a choke-point that is often solved through error-prone manual processes.

Duco’s AI can be trained on clients’ specific documents, learning how to interpret layout and text in order to replicate data gathering procedures with ever-greater accuracy. It will work within Duco’s SaaS-based, no-code platform.
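
Duco's internal workings are not described in detail; as a generic sketch of the reconciliation step that follows extraction (hypothetical records and field names, not Duco's API), comparing the fields pulled from a document against the record already held downstream flags the mismatches that would otherwise be caught manually.

```python
# Hypothetical sketch of the reconciliation step that follows extraction:
# fields pulled from an unstructured document are compared with the record
# already held in a downstream system, and mismatches are flagged as breaks.
from decimal import Decimal

extracted = {"trade_id": "T-1001", "notional": Decimal("5000000"), "currency": "USD"}
system_record = {"trade_id": "T-1001", "notional": Decimal("5000000"), "currency": "GBP"}

def reconcile(extracted: dict, system: dict, fields: list[str]) -> list[str]:
    """Return the list of fields on which the two records disagree."""
    return [f for f in fields if extracted.get(f) != system.get(f)]

breaks = reconcile(extracted, system_record, ["notional", "currency"])
print(breaks)  # ['currency'] - this exception would be routed for review
```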

The company won the award for Best Transaction Reporting Solution in A-Team Group’s RegTech Insight Awards Europe 2024 in May.

Managing unstructured data has become a key goal of capital markets participants as they take on new use cases, such as private market access and sustainability reporting. These domains are largely built on datasets that lack the structure of the reference, pricing and other data formats with which they must be amalgamated in firms’ systems.

“Our integrated platform strategy will unlock significant value for our clients,” said Duco chief executive Michael Chin. “We’re solving a huge problem for the industry, one that clients have repeatedly told us lacks a robust and efficient solution on the market. They can now ingest, transform, normalise, enrich and reconcile structured and unstructured data in Duco, automating data processing throughout its lifecycle.”

Building Future Growth Around a Foundational Data Core: SIX’s Marion Leslie (Wed, 03 Jul 2024)
https://a-teaminsight.com/blog/building-future-growth-around-a-foundational-data-core-sixs-marion-leslie/?brand=dmi

There’s a neat symmetry in speaking to Marion Leslie, head of financial information at SIX, after one of the busiest six months in the company’s recent history.

SIX, a global data aggregator and operator of exchanges in its native Switzerland, as well as in Spain, has released a flurry of new data products since January, including a suite of ESG tools and two global equities index families that herald a plan to become a one-stop-shop for ETFs.

According to Leslie, the frenetic pace of partnerships, product releases and enhancements this year is just the tip of the iceberg. The Zurich-based, bank-owned organisation has more to come, all built around a trove of data and data capabilities it has built up over more than 90 years of operations.

At heart, it remains a global pricing reference data provider – that’s the “base data” that SIX “is built on”, says Leslie. But the company is putting in place ambitious plans to leverage that core data competency to meet the increasingly complex demands and use cases of financial institutions.

“I believe that the fundamental data set – having really good-quality reference data and pricing data – allows us to create new value-added services and insights to our clients, and that remains the same whether we’re talking about GenAI or good old fashioned master reference,” Leslie tells Data Management Insight from SIX’s offices in London. “Unless you’ve got those basics you can’t really make sensible decisions, let alone produce reliable analytics.”

Expansion Plans

Leslie says SIX sees its USP as the ability to leverage that core data product to create applications for a multiplicity of use cases. Already it is using its fundamental datasets as the backbone of regulatory, corporate actions, tax, sanctions and ESG products for its banking clients.

A slew of recent acquisitions, investments and partnerships have been similarly guided by SIX’s programme of creating services that can tap into its core offering. The purchase of ULTUMUS in 2021 and the deepening of a long-standing association with BITA earlier this year were part of a plan to forge the company’s ETF-servicing business, each deal enhancing SIX’s indexing capabilities.

In ESG too, it has been aggressively striking deals to help burnish a slate of new sustainability offerings. Products unveiled in the past year by ESG product strategy and management head Martina MacPherson all benefit from supply deals struck with vendors including Sustainalytics, MSCI, Inrate and the CDP, as well as new partnerships with companies including Greenomy. Among the ESG products launched recently is an SME assessment tool, which MacPherson said will bring thousands of smaller companies into the ESG data ecosystem, into which banks and investors might otherwise have had no visibility.

Working Data

SIX’s ESG provisions illustrate what Leslie describes as the company’s dedication to making data work for companies.

“Organisations need to figure out how they’re going to incorporate data and how they’re going to make it relevant,” she says. “Well, the only way you can make it relevant is if it’s got something to hook on to, and that’s where you get back to those fundamental data sets.”

Leslie explains that one of the driving forces behind the company’s vigorous expansion plans is the changing demands for data among banks. No longer can any part of the industry rely on end-of-day pricing data, or monthly and quarterly reports. Ditto for risk managers and compliance teams.

The consequence has been a shift in the workloads of the front-, middle- and back-offices. No longer is research the preserve of middle-office teams, Leslie offers as an example; the front office needs those insights quicker, so it has made sense for banks to embed data access and functionality within asset managers’ own analytical workflows.

“Asset managers see that the speed of data is increasing all the time and so the buy side, which was perhaps in the past much more built around end-of-day or less immediate requirements, is moving much more into real-time and intraday needs,” she says. “That requires, therefore, real-time market data, and that is expected by regulators, it’s expected by customers, and it’s therefore expected by market participants.”

AI Challenge

Jokingly, Leslie likens data operations to raising a child: it needs constant attention and feeding to grow and thrive. The analogy holds for banks’ data management needs, too; they are constantly changing and growing, influenced by internal needs and external innovations. That’s exemplified by the race to integrate artificial intelligence (AI) into processes and workflows.

Recent SIX research found that more than nine out of 10 asset managers expect to be using AI within the next three years and that half already do. Driven by its own clients’ need to understand what AI will mean to them, SIX has begun looking at how it can enhance its products with the various forms of AI available.

It has taken a structured approach to the programme and is looking at where AI can help clients improve efficiency and productivity; examining how it can improve customer experience and support; and, testing how it can be incorporated into products. For the latter, SIX is experimenting with off-the-shelf GenAI technology to identify aberrations in trading patterns within a market abuse solution.
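
The article does not explain how such aberrations are detected; as a purely illustrative baseline (a simple statistical outlier check, not the GenAI approach SIX is testing, with hypothetical trade sizes and thresholds), the sketch below shows what "flagging an aberration in trading patterns" can mean in practice.

```python
import statistics

# Illustrative baseline only, not the GenAI approach SIX describes: flag
# trades whose size sits far outside an account's typical range, using a
# median-based score that is robust to the outliers it is trying to find.
def flag_aberrations(trade_sizes: list[float], threshold: float = 3.5) -> list[int]:
    """Return indices of trades with a modified z-score above `threshold`."""
    median = statistics.median(trade_sizes)
    mad = statistics.median([abs(x - median) for x in trade_sizes])
    if mad == 0:
        return []
    return [i for i, x in enumerate(trade_sizes)
            if 0.6745 * abs(x - median) / mad > threshold]

history = [100, 110, 95, 105, 98, 102, 5_000]  # hypothetical trade sizes
print(flag_aberrations(history))  # [6] - the last trade is flagged for review
```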

On this subject, too, Leslie stresses that SIX can only think about such an evolution because it is confident that it has a solid foundational data offering.

“Our role is to make sure that we’re providing data that is fit for purpose and enables our clients to do business in a competitive way,” she says. “So that will include, as it always has, providing trusted, reliable data that the client knows is fit for purpose and on which they can make decisions. And that’s as true if it’s going to an AI model as if it’s going into a client digital wealth platform or portfolio reporting or risk solution.”

Values Align

Leslie took up her latest role at SIX in 2020 and is also a member of the board of the SIX-owned Grupo BME, Spain’s stock exchange, having previously held roles at LSEG and Thomson Reuters.

She is proud to be part of an organisation whose stakeholders are banks – about 120 of them – and not shareholders “trying to race to hit a quarter result”. She feels a very strong alignment with its values, too.

“It’s an organisation whose purpose is to enable the smooth functioning of the economy and has consistency and trust at the very core,” she says. “When half the world is voting this year, this stuff’s important, and when we’re talking about AI, or we’re talking about market failures then the thing that brings trust and progress is the data that sits behind it. To be a trusted provider in this day-and-age is a critical service.”

AI Startup BlueFlame Raises $5m for Alternative Markets Data Platform (Tue, 02 Jul 2024)
https://a-teaminsight.com/blog/ai-startup-blueflame-raises-5m-for-alternative-markets-data-platform/?brand=dmi

BlueFlame AI, a start-up that harnesses artificial intelligence to help alternative market participants streamline their operational, regulatory and clerical processes, has raised US$5 million in a Series A funding round.

The cash injection, which will be used to further develop BlueFlame’s AI platform, values the company at $50 million, the New York- and London-based company said.

BlueFlame’s platform enables private market clients to use generative AI and natural language processing to source deals and streamline communications. It can also accelerate the review and surfacing of insights from PDFs and other unstructured data sources, such as confidential information memoranda, when negotiating deals.

The company is among a number of data services providers to launch products designed to help limited and general partners better use data in the running of funds that draw about a third of all institutional investment capital.

“AI is now a ‘must-have’ tool that alternative investment managers recognise is critical to streamline their operations, improve efficiencies and help them deliver cutting-edge strategies,” said BlueFlame chief executive Raj Bakhru, who helped form the company last year. “The value AI can deliver is clear and our investors understand the challenges of bringing structured and unstructured data together through AI tools while meeting compliance, security, and regulatory requirements.”

According to a statement, BlueFlame was founded by former cybersecurity, financial technology and governance, risk and compliance specialists. It has a 20-strong workforce and serves clients that have “hundreds of billions” of dollars-worth of assets under management, including private equity, hedge fund and wealth managers.

Practicalities of Implementing GenAI in Capital Markets (Wed, 26 Jun 2024)
https://a-teaminsight.com/blog/practicalities-of-implementing-genai-in-capital-markets/?brand=dmi

Following the opening keynote of A-Team Group’s AI in Capital Markets Summit (AICMS), a panel of expert speakers focused on the practicalities of implementing GenAI. The panel agreed that industry hype is waning and there is enthusiasm for GenAI with firms beginning to develop use cases, although one speaker noted: “People understand the risks and costs involved, but they were initially underestimated, I would say dramatically in some cases.”

The panel was moderated by Nicola Poole, formerly at Citi, and joined by Dara Sosulski, head of AI and model management markets and securities services at HSBC; Dr. Paul Dongha, group head of data and AI ethics at Lloyds Banking Group; Fatima Abukar, data, algorithms and AI ethics lead at the Financial Conduct Authority (FCA); Nathan Marlor, head of data and AI at Version 1; and Vahe Andonians, founder, chief product officer and chief technology officer at Cognaize.

Considering the use of GenAI, an early audience poll question asked to what extent organisations are committed to GenAI applications. Some 46% said they are testing GenAI apps, 24% are using one or two apps, and 20% are using a number of apps. Nine percent are researching GenAI and 2% say there is nothing in the technology for them.

Value of GenAI applications

A second poll questioned which GenAI applications would be of most value to a delegate’s organisation. In this case, 53% of respondents cited predictive analytics, 39% risk assessment, 39% KYC automation, 28% fraud detection and 19% portfolio management.

The panel shared their own use cases, with one member experimenting with GenAI to produce programming code and creating an internal chat box for data migration, as well as scanning data to surface information that can be categorised, sorted, filtered and summarised to create ‘kind of conversational extracts that can be used.’

All agreed that GenAI produces some low hanging fruit, particularly in operational activities such as KYC automation, but that the technology is too young for many applications, leading firms to build capability internally before unleashing GenAI apps for customers as there is still work to do around issues such as risk integration and ensuring copyright and data protection are not compromised. One speaker said: “There is a lot of experimentation and some research to do before we’re confident that we can use this at scale.” Another added: “There are just not enough skilled people to allow us to push hard, even if we wanted to. There’s a real pinch point in terms of skills here.”

Risks of adopting GenAI

Turning to risk, a third audience poll asked the audience what it considered to be the biggest risk around adopting GenAI. Here data quality was a clear leader, followed by lack of explainability, hallucinations, data privacy and potential misuse. Considering these results, a speaker commented: “We’ve already got existing policies and governance frameworks to manage traditional AI. We should be using those to better effect, perhaps in response to people identifying data quality as one of the key risks.”

The benefits of AI and GenAI include personalisation that can deliver better products to consumers and improve the way in which they interact with technology. From a regulatory perspective, the technologies are focused on reducing financial crime and money laundering, and on strengthening enforcement against fraudulent activity.

On the downside, the challenges that come with AI technologies are many and include ethical risk and bias, which needs to be addressed and mitigated. One speaker explained: “We have a data science lifecycle. At the beginning of this we have a piece around the ethical risk of problem conception. Throughout the lifecycle stages our data scientists, machine learning engineers and future engineers have access to python libraries so that when they test models, things like bias and fairness are surfaced. We can then see and remediate any issues during the development phase so that by the time models come to validation and risk management we can demonstrate all the good stuff we’ve done.” This leads, at least in the short term, to the need for a human element to verify and quality-assure GenAI models in their infancy.
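
The speaker does not name the libraries involved; as a minimal, hypothetical example of the kind of fairness check that can be surfaced during model testing, the snippet below computes a demographic parity difference (the gap in favourable-outcome rates between groups) with plain NumPy.

```python
import numpy as np

# Hypothetical fairness check of the kind surfaced during model testing:
# demographic parity difference, i.e. the gap between groups in the rate of
# favourable model outcomes. A large gap is a prompt to investigate bias.
def demographic_parity_difference(y_pred: np.ndarray, group: np.ndarray) -> float:
    rates = [y_pred[group == g].mean() for g in np.unique(group)]
    return float(max(rates) - min(rates))

y_pred = np.array([1, 0, 1, 1, 0, 0, 1, 0])                 # model decisions (1 = approve)
group = np.array(["a", "a", "a", "a", "b", "b", "b", "b"])  # protected attribute

gap = demographic_parity_difference(y_pred, group)
print(f"demographic parity difference: {gap:.2f}")  # 0.50 here - worth remediating
```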

Getting skills right

Skills were also discussed, with one panel member saying: “We are living in a constantly more complex world, no organisation can claim that all its workforce has the skill set necessary for AI and GenAI, but ultimately I am hopeful that we are going to create more jobs than we are going to destroy, although the shift is not going to be easy.” Another said: “In compliance, we will be able to move people away from being data and document gatherers and assessors of data in a manual way to understand risk models, have a better capability and play a more interesting part.”

Taking a step back and a final look at the potential of GenAI, a speaker concluded: “Figuring out how to make safe products that we can offer to our customers is the only way we have a chance of reaching any sort of utopian conclusion. We must chart the right course for society and for people at work, because we’re all going to be affected by generative AI.”

Mastering Data Compliance: Strategies to Overcome the Data Mountain (Mon, 24 Jun 2024)
https://a-teaminsight.com/blog/mastering-data-compliance-strategies-to-overcome-the-data-mountain/?brand=dmi

By Adam Quirke, Business Development Lead, Financial Services, InterSystems UK & Ireland.

With the financial sector’s regulation landscape continuously evolving, compliance officers at financial institutions play a crucial role in maintaining integrity and trust. They maintain compliance despite a growing burden of responsibilities and the increasingly acute problem of legacy technology, which lacks agility, requires significant maintenance efforts, introduces operational inefficiencies and may delay implementing required regulatory changes.

This reliance on outdated systems is becoming a serious threat as regulators become more demanding. Manual processes remain common at a time when new regulation demands more detail and greater frequency of reports. Multiple disparate data sources often struggle to connect and work together to manage the growing mountain of data required for analysis, insight and reporting in this sensitive area.

By leveraging modern data technology, compliance teams can significantly enhance efficiency and avoid continuing to toil away on lengthy and complex processes that eat up unnecessary amounts of their time. A survey of 375 asset management companies around the world commissioned by InterSystems found two-thirds of firms employ between six and nine people just to process data.

Data management challenges have intensified

Asset managers need to think hard about new approaches to data. The survey found that 44% of respondents see improving responses to regulators’ requests as one of their major data management challenges, while 54% state that data errors between disparate sources are their number one driver for investing in data management.

Many firms also struggle to obtain data that is current. Only 38% use data less than a day old for reporting, with processing unstructured information from a wide variety of sources, errors, and manual methods all contributing to the problem.

Firms need to innovate and adopt a smarter architecture

Innovation is now essential for compliance. Data fabric architecture is rising in popularity as it is one of the most effective approaches to deliver accurate, harmonised data for reporting in near real time. As an architectural layer, it simplifies complex data infrastructures without replacing systems, maximising current investments in technology. The data fabric layer sits on top of a firm’s infrastructure, delivering a unified version of data from high-volume internal and external sources without time-consuming manual processes or complicated wrangling. It streamlines compliance work and supplies the kind of timely and trusted data compliance officers need for reporting.
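
Implementations of the data fabric pattern vary by vendor; as a minimal, hypothetical sketch of the idea (in-memory stand-ins for a database and an API feed, invented field names), the snippet below registers two sources and harmonises them into one view only when queried, leaving the underlying systems untouched.

```python
import pandas as pd

# Hypothetical sketch of a data fabric layer: the sources stay where they are
# (here stubbed as in-memory frames standing in for a database and an API),
# and the layer exposes one harmonised view by mapping each source's fields
# to a common schema only when a query is made.
def load_positions() -> pd.DataFrame:      # stand-in for an internal database
    return pd.DataFrame([{"instr_id": "GB00EXAMPLE1", "qty": 250}])

def load_risk() -> pd.DataFrame:           # stand-in for an external API feed
    return pd.DataFrame([{"ISIN": "GB00EXAMPLE1", "exposure": 1_000_000}])

SOURCES = {
    "positions": (load_positions, {"instr_id": "isin", "qty": "quantity"}),
    "risk":      (load_risk,      {"ISIN": "isin", "exposure": "exposure"}),
}

def unified_view(names: list[str]) -> pd.DataFrame:
    """Pull the requested sources, rename to the common schema, join on isin."""
    frames = []
    for name in names:
        loader, mapping = SOURCES[name]
        frames.append(loader().rename(columns=mapping)[list(mapping.values())])
    view = frames[0]
    for frame in frames[1:]:
        view = view.merge(frame, on="isin", how="outer")
    return view

print(unified_view(["positions", "risk"]))
```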

The truth is that without greater automation in compliance processes, and more efficient data management, costs will rise. Recruitment of skilled and expensive staff is becoming necessary to cope successfully with the burdens of compliance. Thomson Reuters’ 2023 Cost of Compliance Report, which reviewed 1,374 regulators in 190 countries, found that a third of respondents with compliance-related responsibilities at financial institutions in the UK, EU and US expect compliance teams to grow and overall costs to increase. Many also expect compliance to become steadily more involved in cyber resilience, corporate governance and the setting of risk appetite.

Coping with these extra demands is difficult when firms have legacy systems and applications. These create enormous complexities and prevent organisations from accessing and processing the clean, standardised data needed for regulatory reporting. Harmonising data and rendering it usable and meaningful can be time-consuming and costly when the data firms require is spread across spreadsheets, data warehouses, data marts or data lakes.

Significant regulatory changes will demand more reporting

A quick scan of the regulatory horizon shows why such innovation is necessary. The entire financial services world faces a tide of new rules. EMIR 3.0 and Basel 3.1 will introduce more than 80 new data-field requirements. For Basel 3.1, the Prudential Regulation Authority (PRA) plans to introduce 19 new COREP (Common Reporting) templates and revise 12 of those already in use. Last year’s wide-ranging UK Financial Services and Markets Act also introduced significant updates to the regulatory reporting framework (and gave the Bank of England and PRA expanded enforcement powers).

The UK and EU are not alone in developing regulations. The US Securities and Exchange Commission (SEC) has adopted final Private Fund Adviser Rules. These rules include five regulations and are backed up with a large volume of summary materials.

Environmental, social and governance (ESG) obligations are also an increasingly strong factor in regulation. As of May this year, for example, UK Financial Conduct Authority (FCA) regulated firms must review product types and disclosures in the context of ESG to eliminate greenwashing. These changes will increase workloads substantially.

The benefits go beyond compliance

When an asset management firm transforms its data architecture, the gains extend beyond the compliance function. The firm can build machine learning (ML) models to inject greater efficiency into risk management and to streamline back- and middle-office processes. It can also more easily meet bespoke client requirements and provide rapid and detailed insight into performance, fees and charges, changing market conditions and competitors’ activities.

By investing in smart data fabric, which takes the data fabric approach a step further by embedding a wide range of analytics capabilities, asset managers can enable compliance teams to excel at a time when their firm may well be struggling with low margins and sluggish growth.

The compliance function needs agility and new technology just as much as line-of-business teams, which is why asset managers should think seriously about smart data fabric. It will give them the compliance capabilities they need to streamline onerous new regulatory requirements and achieve higher margins and greater internal efficiency.

Datactics Enhances Augmented Data Quality Solution with Magic Wand and Rule Wizard (Wed, 19 Jun 2024)
https://a-teaminsight.com/blog/datactics-enhances-augmented-data-quality-solution-with-magic-wand-and-rule-wizard/?brand=dmi

Datactics has enhanced the Augmented Data Quality Solution (ADQ) it brought to market in November 2023 with the addition of an AI magic wand, Snowflake connectivity and an SQL rule wizard in ADQ v1.4. The company is also working towards the release of ADQ v1.5 that will include generative AI (GenAI) rules and predictive remediation suggestions based on machine learning (ML).

ADQ was born out of market interest in a data quality solution designed not only for dedicated data quality experts and data stewards, but also for non-tech users who could write their own data quality rules. A user-friendly experience, automation and a reduction in manual processes were also top of mind.

Kieran Seaward, head of sales and business development at Datactics, explains: “Customers said their challenges with data quality were the time it took to stand up solutions and enable users to manage data quality across various use cases. There were also motivational challenges around tasks associated with data ownership and data quality. We took all this on board and built ADQ.”

ADQ v1.4

ADQ made a strong start, and v1.4 is also a response to customer interest, this time in automation, reduced manual intervention, improved data profiling and exception management, increased connectivity, predictive data quality analytics, and more.

Accelerating automation, ADQ v1.4 offers enhanced out-of-the-box data quality rules that ease the burden for non-tech users. The AI magic wand includes reworked AI and ML features and an icon showing where users can benefit from Datactics ML in ADQ. Data quality process automation also accelerates the assignment of issues to nominated data users.

Increased connectivity features the ability to configure a Snowflake connection straight through the ADQ user interface, eliminating the need to set this up in the backend. The company is working on additional integrations as it moves towards v1.5.

Predictive data quality analytics monitor data quality and alert data stewards of breaks and other issues. Stewards can then view the problems and ADQ v1.4 will suggest solutions. Based on a breakage table of historical data from data quality rules, ADQ v1.4 can also predict why data quality will fail in the future. Seaward comments: “Data quality is usually reactive but now we can put preventative processes in place. Predictive data quality is very safe to use as the ML does not change the data, instead providing helpful suggestions based on pattern recognition.”
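
Datactics has not published its model; as a hypothetical sketch of predicting failures from a historical breakage table, the snippet below trains a simple scikit-learn classifier on invented features and scores the probability that today's feed will break.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Hypothetical sketch: learn from a historical breakage table which conditions
# tend to precede a data quality failure, then score today's feed. Features and
# data are invented for illustration; Datactics' own model is not public.
# Columns: [rows received, % null key fields, hours since last refresh]
X = np.array([
    [10_000, 0.1, 2], [9_800, 0.2, 3], [10_200, 0.1, 2],   # healthy runs
    [1_200, 5.0, 30], [900, 7.5, 26], [1_500, 4.2, 28],    # runs that broke
])
y = np.array([0, 0, 0, 1, 1, 1])  # 1 = a data quality rule failed downstream

model = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, y)

todays_feed = np.array([[1_100, 6.0, 29]])
print(model.predict_proba(todays_feed)[0][1])  # probability of a break, close to 1 here
```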

The SQL rule wizard allows data quality authors to build SQL rules in ADQ, performing data quality checks in-situ to optimise processing time.
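
As a hypothetical illustration of the kind of rule such a wizard might produce (not ADQ's actual output), the snippet below expresses a completeness check as SQL and runs it where the data sits, using SQLite only to keep the example self-contained.

```python
import sqlite3

# Hypothetical data quality rule expressed as SQL and run where the data sits,
# so records never have to be extracted just to be checked. SQLite is used
# here purely to keep the example self-contained.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (trade_id TEXT, isin TEXT, notional REAL)")
conn.executemany("INSERT INTO trades VALUES (?, ?, ?)", [
    ("T-1", "GB00EXAMPLE1", 1_000_000),
    ("T-2", None, 500_000),             # missing ISIN should be caught
    ("T-3", "GB00EXAMPLE2", None),      # missing notional should be caught
])

rule = """
SELECT trade_id,
       CASE WHEN isin IS NULL THEN 'missing isin'
            ELSE 'missing notional' END AS failure
FROM trades
WHERE isin IS NULL OR notional IS NULL
"""
print(conn.execute(rule).fetchall())  # [('T-2', 'missing isin'), ('T-3', 'missing notional')]
```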

ADQ v1.5

Moving on to ADQ v1.5 and the integration of GenAI, users will be able to query the model, write a rule for business logic specific to their domain, and test the rule to see if it produces the desired results. Datactics is currently using OpenAI’s ChatGPT to explore the potential of GenAI, but acknowledges that financial institutions are likely to have their own take on LLMs and will point its solution at these internal models.

Other developments include a data readiness solution including preconfigured rules that can check data quality and allow any remedial action before regulatory data submissions are made for regulations including EMIR Refit, MiFID III and MiFIR II, and the US Data Transparency Act and SEC rule 10c-1.

Criticality rules that will help data stewards prioritise data problems and solutions are also being prototyped, along with improved dashboards and permissioning. As at the start, next-stage development will continue to make ADQ more friendly for business users.

UK’s Debut SDR Rules Raise Data Management Concern (Mon, 03 Jun 2024)
https://a-teaminsight.com/blog/uks-debut-sdr-rules-raise-data-management-concern/?brand=dmi

The UK’s newly implemented sustainability disclosure requirements (SDR) have placed additional data management burdens on financial institutions that operate in the UK.

The country’s first such framework, created by the Financial Conduct Authority (FCA), is aimed at preventing greenwashing and fostering trust in British sustainability markets. It’s designed to protect the interests of investors by enshrining strict rules on how financial products can be advertised, marketed and labelled, and seeks to ensure such information is “fair, clear, and not misleading”.

Critics, however, have pointed to several potential pitfalls that face institutions as they put processes in place to comply with the new SDR. Because the FCA requires that all claims must be backed by robust and credible data, many of the new challenges are likely to be borne by firms’ data teams.

New Classifications

Under the SDR, asset managers – and later portfolio managers – will be expected to provide greater transparency into the sustainability claims attached to their funds and provide data to demonstrate the ESG performance of the funds’ component companies.

Institutions and companies in scope will be asked to voluntarily categorise their investment products according to the concentration of sustainability-linked assets within them. There are four categories of declining levels of sustainability, ranging from “Sustainability Focus” to “Sustainability Mixed Goals”.

This reflects but differs from the European Union’s Sustainable Finance Disclosure Regulation (SFDR), in which asset managers are compelled to classify their products according to a similar range of categories.

Among several other SDR requirements, asset managers will be asked to provide entity- and product-level disclosures and adhere to new fund naming regulations – which forbid the use of descriptions that it terms as “vague”, including “ESG” and “sustainability”.

Effective Strategy

While the SDR has been welcomed as a good first step by campaigners for stronger and more transparent sustainability markets in the UK, its implementation could prove tricky. Among the challenges institutions face is the code’s apparent incompatibility with other similar regulations that firms would face overseas. Some observers have complained that the SDR’s fund sustainability categories don’t easily match the Articles 6, 8 and 9 classifications of the SFDR.

This is where data managers will be of critical importance.

“As with all regulations, financial institutions must ensure they have an effective data management strategy in place from now, enabling systems to efficiently collect and aggregate ESG risk-related data to evidence sustainability claims both internally and externally,” GoldenSource head of ESG, connections and regulatory affairs Volker Lainer told Data Management Insight.

“Now, much higher levels of scrutiny are needed on the underlying methodologies and calculations involved in determining ESG scores. Firms that prioritise this will find themselves in a much stronger position as and when the next stages of the UK’s SDR are implemented.”

Data Doubts

The FCA announced the details of the SDR in November last year, stressing at the time the importance of data management to compliance. Firms in scope should “have in place appropriate resources, governance, and organisational arrangements, commensurate with the delivery of the sustainability objective”, it said.

“This includes ensuring there is adequate knowledge and understanding of the product’s assets and that there is a high standard of diligence in the selection of any data or other information used (including when third-party ESG data or ratings providers are used) to inform investment decisions for the product,” it said.

Legal experts questioned whether the UK’s financial industry would be able to fully comply. In a report published in April, international law firm Baker McKenzie asked whether firms would be able to keep up with the data requirements expected of the regulation, and questioned whether the data would even be available.

Careful Consideration

While gaps in ESG data still exist, A-Team Group’s ESG Data and Tech Summit London heard that the data record is improving, with many more vendors providing ever more granular datasets. Market figures caution, however, that the data imperative of the SDR should still be carefully considered.

“With more specific product labelling rules set to apply from July, UK firms must brace themselves for these ongoing changes to better navigate the complexity jungle. It is clear data and regulatory content mapping is the key differentiator for service providers here – relying on trusted vendors that can provide quality, accurate data and content in pre-established delivery formats,” said Martina Macpherson, head of ESG product strategy and management in the Financial Information division at SIX.

“This is the only way firms can back up their sustainability credentials, meaning they will be better placed to meet new regulatory requirements and prepare for those to come later this year.”

The New Frontier of Outsourced Data Management: S&P Global Market Intelligence Report (Mon, 03 Jun 2024)
https://a-teaminsight.com/blog/the-new-frontier-of-outsourced-data-management-sp-global-market-intelligence-report/?brand=dmi

Digitalisation has taken financial institutions along a prosperous path of better understanding, management and utilisation of the data that their activities generate. But technological evolution and the changed economic environment have placed a new set of challenges onto their shoulders.

Institutions are now grappling with how they can take their digital programme further, especially given that the rising demand for data-management expertise has made it difficult to find the talent to put plans into action. The answer lies in outsourcing data management capabilities to a partner that can take a holistic view of an organisation’s data estate and processes, argues S&P Global Market Intelligence.

In a report published by A-Team Group, the company says that a new generation of third-party data provision is called for, one that can offer the technology and the data feeds to accelerate the digitalisation of institutions as well as the know-how to execute their programmes.

“Today, institutions’ needs are more nuanced and sophisticated. In this new marketplace, the service providers that will prosper are those that can offer data management and analytics skills alongside trusted, robust data sources and underpinned by scalable technology,” the report states. “Not only that, but these solutions must also be configurable to the new investment and risk-management use cases.”

Evolving Strategies

The S&P Global Market Intelligence report, entitled “The Evolution of Outsourcing Data Operations for ESG and Private Assets”, argues that established outsourcing strategies have tended to be focused on providing solutions to specific challenges.

The new alternative is a strategy such as that taken by S&P Global Market Intelligence’s cloud-based Data Management as a Service offering. This solution considers institutions’ broader needs – from sourcing through to distribution and monitoring – and, importantly, it is scalable.

“This solution can be seen as a one-stop-shop in which institutions leverage all the opportunities of software, data and third-party competencies via the cloud, to fully extract the value inherent in their data and scale their operations,” the report states.

S&P Global Market Intelligence illustrates how its solution can help in this scenario through the lens of two new use cases that such organisations are increasingly having to tackle: private market investment and integration of ESG data and processes.

The report argues that both domains present distinct, novel data challenges that can be addressed through the same Data Management as a Service approach.

The report also offers insights into how:

  • The new trading environment is placing novel data challenges on institutions
  • Cloud solutions are helping institutions overcome new data management pressures
  • Tight data talent markets are impacting institutions
  • Data Management as a Service brings together tools and skills that enable professionals to tailor individual solutions to specific challenges.

Download the full report here.

Better Data, Better Business: Combat Identity-Related Fraud with the LEI (Mon, 03 Jun 2024)
https://a-teaminsight.com/blog/better-data-better-business-combat-identity-related-fraud-with-the-lei/?brand=dmi

By Clare Rowley, Head of Business Operations at the Global Legal Entity Identifier Foundation (GLEIF).

The global economy is wrestling with never-before-seen levels of identity-related fraud. Cybercrime costs in the US reached an estimated $320 billion as of 2023, according to Statista, an increase of more than $300 billion since 2017. According to the latest estimates, this dynamic will continue in the coming years, with cybercrime costs reaching approximately $1.82 trillion by 2028. This increase in digital crime is causing substantial financial damage globally and destroying vital trust between counterparty organisations, particularly those operating across borders and legal jurisdictions.

In a world facing unprecedented digital crime, secure, reliable, and globally recognised organisational identities are a vital prerequisite and a foundation for prospering global trade. Data quality is the bedrock of trust and compliance in the international business sphere. Yet, for this data to deliver on its potential, it must be trustworthy, easily accessible, and accurate.

The Legal Entity Identifier (LEI)

This is where the LEI comes in. The LEI is the only global solution providing organisations with reliable data to unambiguously identify companies and corporate structures worldwide. As a universal ISO identification standard and a code that connects entities to crucial reference information, including ownership structure, the LEI tackles data reconciliation problems across borders and promotes an interoperable identity standard. With more than 2.5 million entities and 400,000 relationships, the openly available LEI dataset provides crucial information about the names, locations, and legal forms of subsidiaries, parents, and company holdings. LEI data helps businesses understand who they’re dealing with.
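
The LEI dataset can also be queried programmatically through GLEIF's public LEI Records API; the sketch below uses that endpoint, but the exact response field paths should be treated as assumptions to verify against the current API documentation, and the example LEI can be swapped for any entity of interest.

```python
import requests

# Look up an entity's reference data from GLEIF's public LEI Records API.
# The endpoint follows GLEIF's published JSON:API format; field paths below
# are assumptions to confirm against the current API documentation.
LEI = "506700GE1G29325QX363"  # example LEI (believed to be GLEIF's own); replace as needed
resp = requests.get(f"https://api.gleif.org/api/v1/lei-records/{LEI}", timeout=10)
resp.raise_for_status()

entity = resp.json()["data"]["attributes"]["entity"]
print(entity["legalName"]["name"])        # registered legal name
print(entity["legalAddress"]["country"])  # country of the legal address
```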

The Global LEI System creates a never-before-seen level of transparency in party identification. It lays the groundwork for more informed business decisions, fosters growth, encourages collaborations, and deters financial crimes. By addressing inconsistencies in identifying entities, connecting a greater range of datasets, and capturing entity relationships and ownership structures, the LEI can support improved risk management and enable enhanced monitoring, reporting, and analytics.

A deep understanding of corporate legal complexity to identify any challenges

The power of the LEI is underpinned by the quality and precision of its data. A new initiative, the Policy Conformity Flag, designed to encourage all LEI-holding entities to declare and maintain all requested data in the LEI record, is poised to further drive transparency on the completeness and accuracy of reference data within the LEI record. LEIs help businesses identify entities across borders and jurisdictions quickly and easily, making global trade safer and more transparent.

The Policy Conformity Flag is a compelling opportunity to encourage and promote enhanced transparency in transactions: current and complete reporting by legal entities makes open, standardised, high-quality legal entity reference data available to users and maximises transparency and trust among market participants.

Entities with current and complete information demonstrate their unwavering commitment to transparency by enabling closer monitoring of transaction data and supporting greater clarity in their ownership structures. Their conforming status also signals to partners and other organisations that their LEI can reliably streamline due diligence checks, onboarding, and other counterparty processes, making them easier to do business with.

An up-to-date LEI helps ensure compliance with international regulations

A legal entity with an LEI also benefits from being fast-tracked to regulatory compliance. A diverse and growing number of regulatory frameworks have already mandated its use globally. Offering a clear-cut and accessible profile of each entity can shortcut myriad due diligence processes, many of which are routinely hampered by basic questions like ‘Who is who?’ and ‘Who owns whom?’

For businesses, simplifying and streamlining risk management, compliance, Know Your Customer and Know Your Business processes and client relationship management will result in a faster and smoother path to growth.

LEI to facilitate faster and simpler transactions and partnerships

The effectiveness of LEI data is dependent on its timeliness and accuracy, making the LEI renewal process crucial. Delayed renewals can lead to many issues, including data reliability concerns and the potential for non-compliance penalties.

The quality of data is the bedrock of trust and compliance in the international business sphere. This call for exemplary quality in organisational data is not merely a play for greater compliance. It is a gateway to untapped global growth. It invites legal entities everywhere to unite for an open, transparent, and trustworthy business environment. Because better data means better business.
