Evolving with the Market: Technology Strategies for Modern Sell Side Firms (22 July 2024)

When making strategic decisions regarding trading technology, sell-side firms such as investment banks and brokers face some difficult choices. Their technology platforms must do more than just meet their internal needs, such as accessing liquidity on multiple trading venues, managing diverse asset classes, facilitating high touch and low touch order flow, providing their sales traders with efficient workflows, and ensuring compliance and security needs are met. This essential functionality and connectivity is a given. Beyond satisfying these fundamental requirements, however, the technology also needs to accommodate the constantly changing needs of their buy-side clients, whether hedge funds, asset managers, or other investment firms.

The buy-side landscape is never static; it continually evolves. While the pursuit of alpha remains a constant, and cost and risk optimisation are ever-present concerns, buy-side firms today operate amidst an ever more complex array of tools, applications, and data sources.

This presents an opportunity as well as a challenge for the sell-side. Firms that can help their clients streamline and enhance workflows to reduce manual intervention, minimise errors, and accelerate trading and investment decisions, while also lowering operational costs and enabling fast, profitable responses to market opportunities that generate alpha, can gain a serious competitive advantage.

Servicing the buy side ecosystem

With multi-asset strategies becoming more and more common, firms increasingly look to their sell-side providers to facilitate trading across a diverse range of instruments and asset classes through a single interface – whether UI or API – and to handle complex orders involving multiple instrument types, such as structured trades or multi-asset baskets, across different trading venues and diverse markets.

There is also an increasing emphasis on data-driven decision-making. While systematic and quantitative traders have always relied on data and models, fundamental and ‘quantamental’ firms are increasingly relying on data-driven insights to drive – or at least support – their investment and trading strategies. Firms now seek from their sell-side providers not only market data, analytics, and research, but also well-documented open APIs that allow them to seamlessly integrate such data into their proprietary models to inform and execute their trading strategies.

“The real challenge for the sell-side is adopting a technology strategy that balances their own internal needs with the ever-changing needs of their clients, one that effectively serves both,” observes Medan Gabbay, Chief Revenue Officer of multi-asset trading solutions vendor Quod Financial. “The buy side have their own technological ecosystem, made up of Portfolio Management Systems, Order Management Systems, applications for creating and managing trading strategies, various types of analytics tools, spreadsheet-based models, and a wide range of other systems they use in their day-to-day trading activities across the front, middle and back office. Forward-looking sell-side firms understand that a key part of their role is to facilitate this ecosystem, by using their technology to help clients trade their chosen markets in the way they want to trade them, as well as providing the necessary analytics and data in a format that helps them identify trading opportunities and manage their investment strategies.”

The key question for the sell side is how to achieve the technological agility that will enable them to respond to the changing demands of their clients.

Moving beyond the buy-build debate

Several options exist. There are a number of well-established vendors who sell ‘off-the-shelf’ trading platforms, which can address many of the sell side’s needs. These platforms provide a range of essential features such as liquidity access, connectivity, order and execution management, analytics, and market data handling. However, while such off-the-shelf systems are generally adequate for day one requirements, they often lack the flexibility to rapidly adapt to changing customer needs and the dynamic nature of the markets. Firms relying solely on these platforms might therefore find themselves constantly behind the curve, limited as they are by their vendor’s upgrade and development cycles.

At the other end of the spectrum, firms may opt to build their own bespoke platforms tailored to their specific requirements. While this offers maximum control over the design and development process, it’s an expensive and complex undertaking, and one that is out of reach for most firms other than tier one banks with substantial technology budgets.

A third option, that of buy and build, is becoming increasingly popular amongst forward-looking firms. Vendor platforms built on modern, scalable, and adaptable technology can be quickly deployed to meet a firm’s immediate needs and then adapted, customised and expanded as requirements evolve.

This type of approach offers various benefits, according to Gabbay. “Platforms built on this type of architecture are highly interoperable, easily integrating with other systems on both the front end – through desktop widgets for example – and the back end, through APIs. They are also much more scalable, capable of being deployed on hosted services including Cloud, on-premise, or a hybrid of the two, which leads to improved performance and better customisation to the client’s specific infrastructure requirements. Additionally, being built around a component-based architecture, they offer flexibility and allow for rapid customisation, as individual modules can be created and adapted to suit specific customer requirements, new areas of functionality, evolving business processes, or changing regulatory and market structures.”

Gabbay points out that trading platforms architected in this way can also be more easily integrated with clients’ trading desks. This level of integration benefits both the client – for example through more efficient and transparent trade execution and real-time order/position monitoring – and the sell-side firm itself, by providing a better understanding and greater visibility of their clients’ activities and workflow.

For sell-side firms with limited resources, or those that believe their resources can be better invested in creating IP and not rebuilding existing technology, this approach can offer the best of both worlds – the rapid implementation and comprehensive functionality of a vendor platform, together with the flexibility, adaptability, scalability and capacity for integration of a custom-built solution. By adopting such an approach, a firm can distinguish itself from competitors who use generic or outdated vendor platforms, and compete more effectively with larger tier one banks that have developed their own solutions.

Artificial intelligence and machine learning

Another strategic choice for the sell-side is how to make best use of Artificial Intelligence (AI) and Machine Learning (ML). Although neither is new in capital markets, interest in AI has exploded since OpenAI introduced ChatGPT in November 2022. Since then, firms have identified a wide range of applications for Generative AI (GenAI) and the use of Large Language Models (LLMs).

One area where GenAI can add significant value in modern, component-based trading platforms, explains Gabbay, is its ability to accelerate the development and testing lifecycle, by automating coding processes and influencing all disciplines involved in defining, building, testing, operating, and supporting complex requirements. This allows firms to bring new functionality to market much more quickly than was previously possible.

“GenAI can generate test scenarios automatically by analysing the code base and understanding the purpose of different components,” he says. “It can then identify potential test cases, simulate different scenarios, and generate test data, thereby eliminating the need for manual test scenario creation. Additionally, by leveraging ML and AI algorithms, it can simulate user interactions, input test data, and validate the expected outputs. This automation reduces the reliance on manual testing, speeds up the testing process, saves time and effort, and improves overall efficiency.”
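
To make the pattern concrete, the sketch below shows one way an LLM could be asked to propose test cases for a small order-validation function and have those cases executed automatically. It is a minimal illustration, not Quod Financial's implementation: the llm_complete helper is a hypothetical stand-in for whatever GenAI completion API a firm actually uses (here it returns a canned response so the example runs end to end), and the validate_order component is invented.

```python
import inspect
import json

def validate_order(order: dict) -> str:
    """Toy order-validation component standing in for the 'code base' under test."""
    if order["qty"] <= 0:
        return "REJECT: non-positive quantity"
    if order["type"] == "LIMIT" and order.get("price") is None:
        return "REJECT: limit order requires a price"
    return "ACCEPT"

def llm_complete(prompt: str) -> str:
    # Hypothetical stand-in for a GenAI completion call. A real implementation
    # would send the prompt to the firm's chosen model; a canned JSON response
    # keeps this sketch self-contained and runnable.
    return json.dumps([
        {"order": {"qty": 0, "type": "MARKET"}, "expected": "REJECT: non-positive quantity"},
        {"order": {"qty": 100, "type": "LIMIT"}, "expected": "REJECT: limit order requires a price"},
        {"order": {"qty": 100, "type": "LIMIT", "price": 101.5}, "expected": "ACCEPT"},
    ])

def generate_and_run_tests(component) -> list:
    """Ask the model for test cases derived from the component's source, then execute them."""
    prompt = ("Return a JSON list of test cases ({'order': ..., 'expected': ...}) "
              "covering the edge cases of this function:\n" + inspect.getsource(component))
    failures = []
    for case in json.loads(llm_complete(prompt)):
        actual = component(case["order"])
        if actual != case["expected"]:
            failures.append((case, actual))
    return failures

print(generate_and_run_tests(validate_order))  # [] when every generated case passes
```

In a real pipeline the generated cases would feed an existing test framework rather than a bare loop, and a reviewer would still vet them, but the division of labour is the same: the model proposes scenarios and data, the harness executes and reports.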

Outside of GenAI, modern trading platforms can also utilise ML within algorithmic trading, identifying and exploiting patterns in trade execution by analysing market conditions, liquidity, and order book dynamics. By scrutinising vast amounts of historical and real-time order book data to identify patterns and trends, ML-trained algorithms can determine the optimal timing, price, and quantity for executing trades, thus minimising transaction costs and market impact.

ML is also being increasingly used to develop intelligent Algo Wheels. These allow firms to analyse their incoming flow so that the right execution strategies and order routing destinations can be automatically chosen and optimised based on current market conditions and client-specific requirements.
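
The sketch below illustrates the basic mechanics of such a wheel under a simplifying assumption that is not drawn from the article: each strategy's execution quality is tracked as a running slippage score, and better-scoring strategies receive proportionally more of the incoming flow. Production algo wheels condition the choice on far richer features (order size, liquidity, market conditions, client constraints) and typically use trained ML models rather than a single score.

```python
import random
from collections import defaultdict

class SimpleAlgoWheel:
    """Minimal algo wheel sketch: routes incoming orders to execution strategies
    with probabilities weighted by a running slippage score (lower is better)."""

    def __init__(self, strategies):
        self.slippage = {name: 1.0 for name in strategies}  # start all strategies neutral
        self.flow = defaultdict(int)                         # orders routed per strategy

    def choose(self, order: dict) -> str:
        # Invert the scores so lower-slippage strategies get more weight, while every
        # strategy keeps some flow so its performance can still be measured. A real
        # wheel would also condition on the order's own characteristics.
        names = list(self.slippage)
        weights = [1.0 / max(self.slippage[name], 0.1) for name in names]
        choice = random.choices(names, weights=weights, k=1)[0]
        self.flow[choice] += 1
        return choice

    def record_fill(self, strategy: str, slippage_bps: float, alpha: float = 0.2) -> None:
        # Exponentially weighted update of the strategy's average slippage in bps.
        self.slippage[strategy] = (1 - alpha) * self.slippage[strategy] + alpha * slippage_bps

wheel = SimpleAlgoWheel(["VWAP", "POV", "LiquiditySeeker"])
strategy = wheel.choose({"symbol": "ABC", "qty": 50_000})
wheel.record_fill(strategy, slippage_bps=2.3)  # feed measured execution quality back in
```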

Primary considerations for the sell side

Given the numerous challenges that sell-side firms face from a trading technology perspective, and the various choices they have available, what are the key considerations they need to take into account when evaluating trading platforms?

First of all, support and training are vital aspects of any technology implementation. Even the most intuitive platforms require a period of adaptation, and comprehensive training is crucial to maximise their potential. Vendors should provide robust support services to assist with both onboarding and continuous usage. This includes not only technical support but also strategic guidance to help teams leverage the platform’s full capabilities. Adequate training and support ensure that any investment in trading technology yields the maximum possible return.

Interoperability is another key factor. “A new trading platform should integrate seamlessly with existing systems to avoid operational disruptions,” advises Gabbay. “Ensuring smooth interoperability minimises the risk of data silos and ensures that all parts of your trading ecosystem can communicate effectively. This not only streamlines operations but also enhances data accuracy and decision-making processes.” Platforms that fail to integrate well can lead to significant headaches, requiring additional resources to bridge gaps between systems and potentially leading to costly errors.

Scalability is also essential for any trading platform. As trading volumes increase and new asset classes are added, the platform must scale efficiently to handle these changes. Scalability includes the ability to automate processes and manage higher trading volumes without performance degradation. “A scalable platform supports business growth by ensuring that system performance remains robust even as demands increase,” says Gabbay. “This scalability is not just about handling volume but also about expanding capabilities and accommodating new functionalities as trading strategies evolve.”

Flexibility around customisation is also important, according to Gabbay. “The platform should be capable of swiftly adapting to evolving workflows without causing bottlenecks,” he says. “Your technology shouldn’t become an obstacle, but a facilitator of change. Customisable platforms ensure that you can tailor the tools to meet specific trading needs.”

Key success factors

It’s clear that the dynamic nature of the buy-side presents both challenges and opportunities for sell-side firms. To stay competitive, banks and brokers need a technology strategy that balances their internal needs with the ever-evolving demands of their clients. Whether choosing off-the-shelf platforms, bespoke solutions, or a hybrid approach, sell-side firms should prioritise agility, integration, and scalability in their technology stack.

Additionally, the strategic use of AI and ML can significantly enhance trading efficiency and decision-making processes. By embracing these advanced technologies and maintaining a flexible, client-centric approach, sell-side firms can not only meet the complex requirements of today’s market but also position themselves for sustained success in the future.

Robust support and training, seamless interoperability, and the ability to scale and customise are also critical factors that will determine the sell-side’s ability to capitalise on market opportunities and deliver superior value to their clients.

Euronext Launches EWIN Microwave Link Between London & Bergamo, Halving Order Transmission Latency (18 July 2024)

Euronext, the pan-European exchange and market infrastructure group, has launched the Euronext Wireless Network (EWIN), making it the first exchange in Europe to offer ‘plug & play’ order entry via microwave technology and significantly enhancing the speed of order transmission between London, UK, and Bergamo, Italy.

The launch of EWIN represents a significant technological advancement for the exchange group. Developed in collaboration with McKay Brothers, the independent microwave network provider, and leveraging the faster transmission speeds of microwave technology, EWIN offers a direct and highly efficient communication pathway, reducing the time required to send orders from London Equinix LD4 to Euronext’s Optiq matching engine in Bergamo IT3 to under four milliseconds, around half the latency of existing fibre links. EWIN is also designed to ensure seamless and efficient order handling, offering 100% resilience, thanks to its full fibre back-up.

Major financial firms Goldman Sachs and Morgan Stanley have already adopted the new technology.

“After establishing our new IT3 data centre in Bergamo, near Milan, we realised from a few large tier-one brokers that they were interested in exploring the performance benefits of microwave technologies,” explains Nicolas Rivard, Global Head of Cash Equity and Data Services at Euronext, in conversation with TradingTech Insight. “Although microwave networks have been around for some time and are relatively established for certain participants, it is a costly and complex technology with a high barrier to entry. Typically, you cannot buy a small amount of bandwidth, which makes the solution expensive. Additionally, there is a technical aspect because you need to develop IT capabilities to route your orders through the microwave. By default, if you buy bandwidth from a microwave provider, it’s not plug-and-play; you need to develop your protocol into the technology.”

Euronext has worked closely with McKay Brothers to address these challenges, says Rivard. “To lower the barrier to entry in terms of cost, we have purchased a bulk of bandwidth and are offering it to clients in slices, starting from 1 Mbps upwards. This means that clients can try it for six months at 1 Mbps for example, and then scale up as needed, rather than committing to a costly solution from the outset. And to address the technical complexities, the solution we’ve developed together with McKay Brothers allows clients to use the microwave link as if it were any other standard connectivity, making it very plug-and-play.”

The microwave route, provided by McKay Brothers, has been operational for two years, since Euronext went live on IT3 in Bergamo in June 2022. But this is the first time McKay’s technology has been used to underpin an exchange’s own solution.

“The development and design of this service has been quite new compared to our usual offerings,” says Stéphane Tyc, Co-Founder of McKay Brothers. “Typically, when a client purchases microwave bandwidth, they need to undertake significant internal development to integrate with the network. However, Euronext’s end clients don’t need to perform any additional integration work; they simply need to set up logical access to Euronext’s matching engine, a process they are already familiar with. And then they can benefit from a fast network that competes with the microwave products used by market makers. The important thing here is that firms who want to use this link can now just go direct to the exchange to access it, without having to put in place dedicated technology.”

Given that microwave networks are susceptible to weather and other atmospheric conditions, how does Euronext ensure resiliency? “We have two routes, one microwave and one fibre, and they work seamlessly together,” says Rivard. “We have ensured, with McKay and our internal IT team, that every order gets sent twice, once via microwave and once via fibre. The first order that reaches the IT3 datacentre is processed, and the other is blocked by the system. This guarantees 100% redundancy, increasing the overall availability of the service.”
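
In effect, this is first-arrival-wins de-duplication at the receiving gateway: the order identifier seen first is processed, and any later copy of the same identifier is blocked. The sketch below illustrates the idea; the field names and gateway logic are assumptions made for illustration, not a description of Euronext's actual implementation.

```python
class DualPathGateway:
    """First-arrival-wins de-duplication for orders sent over two routes.

    Illustrative sketch only: it assumes each order carries a unique client
    order ID that is identical on both the microwave and fibre copies.
    """

    def __init__(self):
        self.seen_order_ids = set()

    def on_order(self, order: dict, route: str) -> str:
        order_id = order["client_order_id"]
        if order_id in self.seen_order_ids:
            # The copy arriving via the slower route is discarded.
            return f"BLOCKED duplicate {order_id} via {route}"
        self.seen_order_ids.add(order_id)
        # The first copy to arrive (usually the microwave one) is processed.
        return f"ACCEPTED {order_id} via {route}"

gateway = DualPathGateway()
order = {"client_order_id": "ORD-1001", "side": "BUY", "qty": 500}
print(gateway.on_order(order, route="microwave"))  # ACCEPTED ORD-1001 via microwave
print(gateway.on_order(order, route="fibre"))      # BLOCKED duplicate ORD-1001 via fibre
```

A production gateway would bound or time-window the set of seen identifiers and handle the case where only one route delivers, but the fail-over property is the same: if the microwave path degrades in bad weather, the fibre copy simply wins the race instead.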

The link is now operational, with Morgan Stanley and Goldman Sachs having gone live on day one, 10th July. “The technology has delivered on its promise so far, with latency below four milliseconds and very stable performance,” says Rivard. “Clients are currently only sending specific order types via EWIN to improve certain latency sensitive execution strategies, such as IOC (immediate or cancel) and other aggressive orders. The number of packets going through the microwave is what we expected. And of course, this is just the beginning.”

Both Euronext and McKay Brothers talk of this new service as a way of further democratising the market, bridging the gap between prop trading firms/market makers and banks/brokers. So will it be rolled out to other European centres?

“First, we need to make sure it works from London, to prove that it has an impact and is beneficial for our clients. That will take a few months to confirm,” says Rivard. “But we already have clients interested in having the same service from other locations and asset classes in Europe.”

Liquidnet and Boltzbit Collaborate, Utilising GenAI to Accelerate Bond Primary Markets Workflow by 90% (17 July 2024)

Liquidnet, the technology-driven agency execution specialist, has partnered with AI startup Boltzbit to enhance its fixed income primary markets workflow, using generative AI (GenAI) technology to cut the time required to process unstructured deal data and prepare bonds for trading by 90%. The collaboration accelerates the processing and display of newly announced bond deals by leveraging Boltzbit’s advanced AI machine learning solutions and custom workflow model.

By integrating Boltzbit’s AI technology, Liquidnet can now offer members and partner syndicate banks faster access to trading and data distribution, processing and displaying bond deals at a rate significantly faster than its previous parsing technology. This ensures that bonds are quickly available through the company’s deal announcement dashboard and new issue order book.

“This partnership improves the speed at which we can process messages, create, and then send structured data directly to our clients, which in turn allows them to quickly populate their OMS and prepare for trading,” says Mark Russell, Head of Fixed Income Strategy at Liquidnet, in conversation with TradingTech Insight. “The quicker we can do this, the better it is for those clients. Beyond this, the clients of our new issue Trading Platform (grey market) benefit as we are able to launch the new bonds on the screen earlier, giving those clients earlier access and more time to trade.  More trading time on our visible trading platform means more transparent data points, which is very useful for the syndicates and issuers as they get a view as to what is going on in the market.

“Structuring the bond data is not done in a single step; during the bond creation process we need to interpret the market chat, the back-and-forth messaging, that drives the final structure of the bond,” explains Russell. “Our system needs to be able to capture and update any changes to the metadata, such as coupons, issuers, benchmark, maturity, etc., that describe the bond and feed those changes into the trading platform and other information platforms.”

He continues: “We’ve automated this process extensively with our partners at Boltzbit, creating a tool that handles the heavy lifting of structuring this data into a comprehensible bond format. Our partnership with Boltzbit is focused on speeding up and enhancing accuracy, bypassing traditional parsing tools and leveraging artificial intelligence instead.”

Boltzbit’s GenAI technology utilises the data captured from messages exchanged across various mechanisms and channels to create a large language model (LLM) that transforms the information into a structured and usable format.
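
As a simplified illustration of what “structured and usable” means here, the sketch below reduces deal messages to a bond record and lets later messages revise fields such as the coupon or size. Everything in it is invented for illustration: the message format, the field list, and the regex-based extract_terms stand-in, which takes the place of the LLM that, in Boltzbit's pipeline, interprets free-form market chat rather than tidy key-value text.

```python
import re

# Fields that describe the bond and feed the trading platform (illustrative subset).
BOND_FIELDS = ("issuer", "coupon", "maturity", "benchmark", "size")

def extract_terms(message: str) -> dict:
    """Toy extraction step standing in for the GenAI model: pull 'field: value'
    pairs out of a deal message. A real pipeline interprets free-form chat."""
    terms = {}
    for field in BOND_FIELDS:
        match = re.search(rf"{field}\s*[:=]\s*([^,;]+)", message, re.IGNORECASE)
        if match:
            terms[field] = match.group(1).strip()
    return terms

def apply_update(bond_record: dict, message: str) -> dict:
    """Merge newly announced or revised terms into the working bond record."""
    bond_record.update(extract_terms(message))
    return bond_record

bond = {}
apply_update(bond, "New deal. Issuer: Acme Corp, Maturity: 2031, Benchmark: MS, Size: 500m")
apply_update(bond, "Update: coupon: 4.25%, size: 750m")  # later chat revises the terms
print(bond)
# {'issuer': 'Acme Corp', 'maturity': '2031', 'benchmark': 'MS', 'size': '750m', 'coupon': '4.25%'}
```

The structured record produced at each step is what can then be pushed to clients' OMSs and to the new issue order book without manual re-keying.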

“This process might seem simple, but it was actually extremely challenging,” explains Dr Yichuan Zhang, CEO and co-founder at Boltzbit. “Firstly, it involves very complex business processes. It’s not just about parsing one email; understanding the context of the conversations and the associated business processes is essential. Secondly, this is a highly specific solution, requiring the model to be extremely accurate and to follow the precise logic of the business flow around new issues. Finally, the solution needed to be highly secure and deployed in a way that allowed Liquidnet full control.”

Since the launch of its primary markets offering in 2022, Liquidnet has achieved record trading volumes in its new issue order book and increased participation from over 35 European syndicate banks, highlighting the company’s commitment to modernising primary markets and delivering substantial value to clients and the industry.

In addition to partnering with Boltzbit, Liquidnet has previously collaborated with NowCM and BondAuction, reinforcing its dedication to fostering efficiencies and connectivity for investors, banks, and issuers through strategic partnerships.

Alveo and Gresham Merge to Offer Data Services at ‘Significant’ Scale (17 July 2024)

Data management software and services providers Alveo and Gresham Technologies have merged in a deal that the newly augmented company says will offer clients data automation and optimisation at “significant” scale.

The new business, which will be known as Gresham, will be based in London, with former Gresham Technologies chief executive Ian Manocha continuing in that role at the combined company and Mark Hepsworth, who headed Alveo, taking the chair’s position.

The combined company marries Gresham Technologies’ transaction control and reconciliations, data aggregation, connectivity solutions and regulatory reporting capabilities with Alveo’s enterprise data management for market, reference and ESG data.

The range of data automation and process solutions it can offer will reduce the total cost of ownership of clients’ data, Gresham said.

“The combination of the two firms accelerates our journey to bring digital integrity, agility, operational efficiency and data confidence to financial markets globally,” said Manocha. “It creates a comprehensive set of solutions for data automation, operational efficiency, data management, analytics and risk mitigation for financial and corporate clients globally.”

The terms of the deal were not disclosed but Alveo’s majority owner, technology-focused private equity firm STG, backed the merger.

London-based Alveo was founded in 1991 as Asset Control, one of the first third-party enterprise data management service providers. It changed its name in 2020 after becoming a cloud-native, managed-service provider.

Gresham Technologies began life as Gresham Computing offering real-time transaction control and enterprise data integrity solutions.

Hepsworth said the newly enlarged company will be able to meet the increasing data demands of clients.

“We can now offer clients greater scale and a wider range of solutions that will simplify their operations and enable them to manage data more effectively,” he said.

DTCC FICC Releases Tools to Help Firms Address Incoming SEC Central Clearing Mandate (16 July 2024)

The Fixed Income Clearing Corporation (FICC), a subsidiary of the Depository Trust and Clearing Corporation (DTCC), has launched two new publicly available tools to help participants navigate the financial obligations that come with membership in a clearing system.

The facilities are aimed at helping firms address the post-trade implications of a Securities and Exchange Commission (SEC) rulemaking, adopted in December 2023, that mandates central clearing for a wide range of U.S. Treasury (UST) securities transactions including cash, repurchase agreements (repos) and reverse repos.

This new rule will have a significant impact on UST post-trade operations for all participants that currently clear and settle their trades on a bilateral basis. These participants will now have to find an appropriate way to connect with a central clearing system and make the necessary changes in their clearing and settlement technology.

The UST market sees daily transactions averaging over $700 billion in cash and $4.5 trillion in financing, making it vital to U.S. government funding and monetary policy, and a safe haven for global investors. The market has grown rapidly and disproportionately, and currently 87% of this trading activity is cleared bilaterally.

Several liquidity events over the past decade have highlighted vulnerabilities in the Treasury market, notably the systemic risk of a non-participant failing. The SEC’s final rule, adopted in December 2023, aims to expand central clearing to mitigate such counterparty and systemic risks.

The new rule seeks to transition a substantial portion of the daily US $4.9 trillion treasury market activity to central clearing through a central counterparty (CCP). Currently, the only authorised CCP for the UST market is FICC. However, other CCPs have expressed interest, among them London Clearing House (LCH).

Tools of the Trade

The first of the new FICC tools, a Capped Contingency Liquidity Facility (CCLF) Calculator, is designed to increase the transparency into the financial obligations associated with membership in the FICC Government Securities Division (GSD).

The CCLF is a critical risk management facility designed to provide FICC with additional liquidity resources to meet cash settlement obligations in the event of a default by the largest netting members (see DTCC Risk Management Tools). By allowing firms to estimate their potential CCLF obligations, the calculator aids in better liquidity planning and risk management. This can make FICC membership more attractive and manageable for a broader range of market participants, including smaller institutions and buy-side firms.

The calculator helps firms anticipate and plan for the liquidity commitments required under the new SEC clearing mandates. By providing upfront attestations regarding their ability to meet CCLF obligations, firms can ensure they are prepared to comply with the expanded central clearing requirements for U.S. Treasury securities.

The second is a Value at Risk (VaR) calculator from DTCC to help market participants evaluate potential margin and clearing fund obligations associated with joining GSD. With U.S. Treasury Clearing activity through FICC projected to increase by US$4 trillion daily following the expanded clearing mandate in 2025 and 2026, the VaR calculator will be essential for firms to accurately determine their VaR and margin obligations for simulated portfolios.

Tim Hulse, Managing Director of Financial Risk & Governance at DTCC, emphasized that VaR is a key risk management concept and a primary component of GSD’s Clearing Fund requirements. The calculator uses historical data, volatility, and confidence levels to estimate VaR, thus enhancing market transparency. It allows market participants to calculate potential margin obligations for given positions and market values using FICC’s VaR methodology.
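
To show the general shape of such a calculation, the sketch below computes a plain historical-simulation VaR for a two-position portfolio from daily returns at a 99% confidence level. It is a generic textbook formulation with invented sample data, not FICC's proprietary VaR methodology, which layers additional components onto a figure of this kind.

```python
import numpy as np

def historical_var(position_values, daily_returns, confidence=0.99):
    """Historical-simulation VaR: the portfolio loss not exceeded with the given
    confidence, estimated from historical daily returns.

    position_values: market value of each position (array of length n)
    daily_returns:   matrix of historical daily returns, shape (days, n)
    """
    position_values = np.asarray(position_values, dtype=float)
    daily_returns = np.asarray(daily_returns, dtype=float)
    # P&L the portfolio would have shown on each historical day.
    daily_pnl = daily_returns @ position_values
    # VaR is the loss at the (1 - confidence) percentile of the P&L distribution.
    return -np.percentile(daily_pnl, 100 * (1 - confidence))

# Invented example: two Treasury positions of $40m and $60m market value,
# with 250 days of simulated daily returns at roughly 0.2% daily volatility.
rng = np.random.default_rng(42)
returns = rng.normal(loc=0.0, scale=0.002, size=(250, 2))
var_99 = historical_var([40_000_000, 60_000_000], returns, confidence=0.99)
print(f"1-day 99% VaR: ${var_99:,.0f}")
```

A margin or clearing fund estimate would then be derived from a loss figure of this kind; the point of the public calculator is to make that relationship transparent to prospective members before they commit to a membership model.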

Hulse highlighted the urgency of evaluating firms’ risk exposure with the expansion of U.S. Treasury Clearing, noting that the VaR calculator offers increased transparency into these obligations.

These tools are public and not restricted to member firms. This means that, as firms consider their optimal approach to accessing central clearing for compliance with the new clearing rules, these risk tools can provide the necessary transparency and support as they evaluate the different types of membership and models with GSD.

The SEC has introduced several measures to make FICC access more inclusive. FICC offers multiple membership models, including Netting Membership, Agented Clearing, Sponsored Membership, and Centrally Cleared Institutional Triparty (CCIT) Membership, catering to a wide range of market participants from large banks to hedge funds. The SEC has provided temporary regulatory relief to address custody and diversification concerns for registered funds.

CCIT membership primarily benefits institutional cash lenders such as corporations, asset managers, insurance companies, sovereign wealth funds, pension funds, municipalities, and State treasuries. It allows these entities to engage in tri-party repo transactions with enhanced risk management and operational efficiency provided by FICC. The central clearing of these transactions helps reduce counterparty risk, ensure the completion of trades, and potentially offer balance sheet netting and capital relief for participants.

The Securities Industry and Financial Markets Association (SIFMA) is actively coordinating multiple work streams that involve both buy-side and sell-side members. These efforts aim to accelerate the necessary transitions for the clearing mandates. Key aspects include engaging with the SEC and other regulatory agencies to address market access issues, particularly for registered funds and margin transfers, which are crucial for ensuring a smooth transition to central clearing.

Developing an operations timeline with key milestones is another critical task. This timeline will guide the transition to full central clearing by June 2026 for repos. Addressing issues related to market plumbing and connectivity is also vital to support the increase from 13% to 100% clearing. This involves ensuring that all participants can effectively connect to and use the central clearing infrastructure.

Regular communication with market participants is planned to keep them informed about progress and strategies for meeting the clearing deadlines. This will include updates on the status of various strategies and the overall progress towards the deadlines. SIFMA will also engage in regular discussions with the SEC and other agencies to ensure they are aware of the progress and any potential needs for timeline adjustments or phased rollouts.

Legal and enforceability issues will be addressed by obtaining netting enforceability opinions in relevant jurisdictions to support large-scale clearing. This step is closely tied to the development of market standard documentation. Additionally, new documentation approaches that leverage modern communication methods will be evaluated to increase efficiency.

Stakeholder engagement is essential to confirm the status of various strategies and ensure alignment with the clearing deadlines. SIFMA plans to reach out to market participants regularly to keep them informed and engaged. This will help ensure that all participants are on track to meet the clearing mandates.

Lastly, future planning includes preparing for additional publications and podcasts to keep the membership and broader public informed about ongoing efforts around Treasury clearing. This will ensure that everyone remains updated on the progress and any developments related to the central clearing mandate.

DORA: Preparing the Pathway to Enhanced Operational Resilience (16 July 2024)

By David Turmaine, Head of International at Broadridge Consulting Services, and Maria Siano, Head of International Strategy at Broadridge.

Today’s digital world is increasingly complex, characterised by interconnected systems and data that is stored, and widely shared, online. Looking through a financial services lens, cyber threats and incidents are becoming more sophisticated, posing significant risks to financial stability and security.

The number of attack vectors has multiplied in line with the growing reliance on technology and associated spike in remote and decentralised working since the pandemic. A recent survey by the BCI, the global body for resilience professionals, revealed three-quarters of respondents had seen a rise in attempted breaches over the last year, with nearly 40% the victim of a successful cyber-attack.

The system modernisation and digitalisation journey that firms around the world are now undertaking, often to align with market developments such as the shortening of the settlement cycle to T+1, is filled with risks – which has led to a heightened regulatory focus on cybersecurity and operational resilience.

Against this backdrop, the EU’s Digital Operational Resilience Act (DORA) has come into force and in-scope firms – such as banks, investment firms, and designated fintechs – must be compliant from January 17, 2025.

DORA seeks to establish a clearer foundation for security and operational resilience in the financial services sector, while also aligning with other EU measures on cybersecurity and data. It is the most comprehensive resilience regulation yet seen in this space, but the thinking is mirrored in other jurisdictions around the world, with regulators increasingly demanding that financial institutions bolster their operational resilience.

Japan, for example, has introduced the Economic Security Promotion Act (ESPA), whilst the Australian Prudential Regulation Authority (APRA) has published a new Prudential Standard (CPS 230 Operational Risk Management) that will direct how regulated entities manage operational risks, resilience, and business continuity. In July 2023, the US Securities and Exchange Commission (SEC) adopted rules requiring registrants to disclose material cybersecurity incidents.

What are the main components of DORA?

DORA is the most in-depth regulation to date aimed at strengthening cybersecurity amongst financial institutions.

It is seen as a means of compelling more firms to work internally, and with their third-party information and communications technology (ICT) service providers, to improve their threat assessments, cyber incident management, and overall resilience. It is also a positive step towards a more harmonised EU framework that will enhance the digital operational resilience of financial services across the region whilst preventing widespread contagion that could undermine the financial stability of the bloc.

DORA is structured around five pillars, which cover governance, resiliency, incident management, and reporting. A common thread is the protection of data as it passes through both a financial institution and then the ecosystem around it, such as vendors.

The first pillar is ICT risk management, which mandates firms to implement robust risk management practices for their systems to prevent cyber-attacks and disruptions. They must also develop and maintain effective recovery and continuity plans to ensure the uninterrupted provision of critical financial services in the event of a cyber incident.

The second pillar is incident management, with DORA requiring entities to establish and maintain robust mechanisms for identifying, classifying, and recording incidents. Additionally, financial institutions will be required to report significant incidents to regulators within a tight timeframe to ensure timely responses and coordination.

The third pillar is digital operational resilience testing, and here we see some of the newer demands that firms must now quickly familiarise themselves with. Firms must conduct regular resilience testing to verify the effectiveness of their digital resilience strategies, and this includes advanced threat-led penetration testing at least every three years to address higher levels of risk exposure. Test results will need to be sent to the regulator for validation and approval.

The fourth pillar relates to third party risk management and oversight. Recognising that the digital operations of many organisations are closely intertwined with third party providers, DORA puts an emphasis on managing the risks associated with these external partners. Firms will be expected to conduct enhanced due diligence on their providers and include provisions in their contracts to ensure they also comply with strict digital resilience standards.

The final pillar outlines the importance of sharing information and intelligence about cyber threats and vulnerabilities amongst organisations. By creating a more collaborative environment, the hope is firms can tap into a wealth of knowledge and experiences, building their capacity to predict and address challenges. This collective understanding can foster the creation of effective policies and proactive strategies, ultimately improving the digital resilience of individual organisations and the financial industry as a whole.

The key steps to building operational resilience

DORA will place further pressure on firms to implement better cybersecurity measures and bolster their operational resilience in the coming years, but it is already front of mind for many in the financial services industry.

Broadridge’s 2024 Digital Transformation & Next-Gen Technology Study highlighted that in the next two years, financial firms will boost their investments in cybersecurity by nearly a third (28%). Furthermore, cybersecurity is the top capability that executives expect from their technology vendors, outpacing their ability to deliver projects on time and on budget.

As we look towards the DORA compliance date next January, what steps should firms be taking to build up their operational resilience?

It is crucial to assess existing business practices and processes, and identify the gaps, when it comes to meeting the DORA requirements. This will enable firms to create a robust roadmap for compliance whilst implementing stronger ICT risk management practices.

The first thing for firms to do is to ensure they fully digest and understand the regulation, and how it impacts their business model. They can then correlate that against what is already in place for their operational resiliency. Firms then need to identify their risk factors and map them against DORA, as well as their existing enterprise risk framework.

These steps will allow firms to effectively carry out their remediation planning. Resiliency in the past has typically been quite inward looking, with firms focusing on ensuring their own house is in order. DORA shifts the dial, mandating firms to extend this externally across third-party vendors and strategic partners, analysing the critical paths for critical functions, whether that is trade data, settlement data, or any other element.

Firms will need a complete line of sight so they can take an informed risk decision on each of their current resiliency stances and provisions in order to make sure they are compliant with DORA.

For larger firms, their size will make it more difficult to locate the risks. They will often have hundreds of internal applications and platforms they will need to dissect to understand the interdependencies and find the critical paths that hold the data. They will also need to ascertain the risks across their vendor community.

For smaller firms, the challenge will be finding the right people to guide this, who can do it alongside their day job. They may struggle to get this project shaped and delivered on time. And they should not underestimate the resources needed to do a thorough analysis and then implement the changes DORA requires. They will also need to effectively ensure ongoing regulatory compliance, which can be costly.

Continuous improvement is an objective of DORA. Some elements of the regulation are prescriptive in terms of duration and frequency – such as annual testing of all critical ICT systems, and the advanced threat-led penetration testing every three years. But it will also be important for firms to make sure they refer back to the regulation and remain compliant whenever they change their IT footprint by acquiring new technology, which potentially introduces new vulnerabilities.

Unlocking new benefits

Whilst the journey towards DORA compliance is complex, it is also one that can unlock significant benefits for ambitious financial services firms.

This includes improved cyber defences; DORA will help financial institutions to enhance their cybersecurity measures and protect their critical systems and data from increasingly sophisticated cyber threats.

By improving long-term operational resilience, DORA can also help to reduce the financial impact of cyber incidents and other disruptions, ultimately saving organisations from costly recovery efforts.

Financial firms can instil greater confidence amongst their customers and stakeholders by demonstrating their ongoing commitment to safeguarding digital assets and services. And, perhaps most importantly, given the increased interconnectivity of firms, DORA can drive greater resiliency across financial markets as a whole. It can help to safeguard the stability of the whole, as well as its parts.

Informatica Sees a Future of AI-Focused Innovation Releases (15 July 2024)

Informatica has had a busy 2024, announcing major new innovations and partnerships as it brings artificial intelligence to the fore of its cloud-based data management offering.

Last month the California-based company deepened its association with Databricks, providing the full range of its AI-powered Intelligent Data Management Cloud capabilities within Databricks’ Data Intelligence Platform. The expanded partnership will enable joint customers to deploy enterprise-grade GenAI applications at scale, based on a foundation of high-quality, trusted data and metadata. That followed the unveiling of a similar association with Snowflake. It was also selected by Microsoft as the Independent Software Vendor (ISV) design partner for the software behemoth’s new data fabric product.

The frequency of the rollouts in recent months has been dictated by the rapidity with which Informatica’s financial institution clients are seizing on the potential of AI. Many are struggling to bring the technology into their legacy systems, while others have a vision of what they want to do with it but not the capability to implement it.

With the market also heavily weighted towards capitalising on the growing generative AI space, Informatica group vice president and head of EMEA North sales Greg Hanson said new developments and enhancements are on the cards for the near future.

“The critical foundational layer for companies is to get their data management right and if you look at the current state of most large organisations, their integration and their data management looks a bit like spaghetti,” Hanson tells Data Management Insight.

“They realise, though, that they have to pay attention to this strategic data management capability because it’s almost as fundamental as the machinery that manufacturers use to make cars.”

Rapid Change

Hanson says that the pace of innovation at Informatica is the fastest he’s seen in his two decades at the company because its clients understand the operational benefits to be gained from implementing AI-based data management processes. This “unstoppable trend towards AI” is being driven by board-level demand, especially within financial services, a sector he describes as being at the “bleeding edge” of technological adoption.

Many have had their appetites whetted by AI’s ability to streamline and improve the low-hanging fruit challenges they face, such as creating unique customer experiences and engagements. To embed and extend those AI-powered capabilities across their entire organisation, however, will take more effort, says Hanson.

“Their ability to harness data and exploit AI’s potential is going to be the difference between the winners and losers in the market,” he says. But the drive to get results quickly may lure firms towards rash decisions that could create more problems later.

“They need to think strategically about data management, but they can start small and focus on a small use case and an outcome that they can deliver quickly, then grow from there.”

Make it Simple

Among Informatica’s clients across 100 countries are banks such as Santander and Banco ABC Brasil, US mortgage underwriting giant Freddie Mac, insurer AXA XL and online payments provider PayPal. Among the services it’s providing such institutions are broad cost reduction by the optimisation of reference data operations and the simplification of their broader data processes.

This latter point is key to helping clients better use their data, says Hanson. Arguing that without good data inputs, AI’s outputs will be “garbage out at an accelerated pace”, he says that many companies have overcomplicated data setups that are hampering their adoption of the technology. By having separate tools to manage each element of their data management setup – including data access, quality, governance and mastering capabilities – large firms are strangling their ability to make AI work for them.

“But now complexity is out and simplicity is in,” Hanson says. “As companies modernise to take advantage of AI, they need to simplify their stacks.”

Enter GenAI

Informatica is helping that simplification through a variety of solutions including its own GenAI-powered technology for data management, CLAIRE GPT – the name being a contraction of “cloud AI for real-time execution”. The technology began life simply as CLAIRE seven years ago. Last year, however, it was boosted with the inclusion of GenAI technology, enabling clients to better control their data management processes through conversational prompts and deep-data interrogation.

Comparing the new iteration to Microsoft’s Copilot, Hanson says CLAIRE GPT now offers clients greater capabilities to simplify and accelerate how they consume, process, manage and analyse data. Adding to its firepower is CLAIRE GPT’s ability to let individual clients call on the combined metadata of Informatica’s 5,000-plus clients to provide them with smarter outputs.

While almost all of Informatica’s offerings are embedded with its new GenAI technology, the next step will be to ensure the company’s entire range of products benefits from it.

“Data management is complex and costly for many companies and it massively impacts the ability of the company to release new products, deliver new services and create more pleasing customer experiences,” he says.

“Our job with GenAI as the fundamental platform foundation is to offer more comprehensive services around that foundational layer of data management, and more automation and productivity around the end-to-end data management journey.”

Financial Firms Have Widest Data Security Perception Gap: Survey (15 July 2024)

The financial services sector has the widest gap between perceptions about its data security and its vulnerability to data attacks.

A survey by data security provider Dasera found that 73% of institutions questioned said they had high levels of confidence in their ability to fend off ransomware attacks, data breaches and other unauthorised uses of data. Nevertheless, records of attacks showed that those firms were among the worst affected in 2023.

“The significant number of breaches contradicts high confidence in their security strategy, suggesting overconfidence in their security posture,” the report, entitled The State of Data Risk Management 2024, stated. “The sector remains a prime target for cyberattacks due to valuable data, indicating a gap between perceived effectiveness and actual vulnerability.”

The report compared the perceptions of companies in a range of high-profile data-focused sectors, including healthcare and government, with statistics on data breaches compiled by a variety of organisations and studies. These include the Verizon Data Breach Security Report, Kroll’s Data Breach Outlook Report and the Identity Theft Resource Centre.

Record Year

The Dasera survey said the combined conclusions of those studies showed that 2023 was a “record-breaking year” for breaches.

According to Verizon, the financial services industry suffered 477 data security incidents in 2023, compared with 380 for IT firms and 433 in the healthcare sector. Only government bodies suffered more, at 582. Kroll found that financial firms accounted for the largest proportion of attacks, at 27%.

Two-thirds of breaches originated externally, with the balance coming from internal “threat actors”. Financial services firms were also among the least protected against attacks from within their own systems.

The report found that 77% of breaches within the sector came from basic web application attacks, miscellaneous errors and system intrusions.

“The survey underscores the importance of adopting integrated and automated data security strategies to address these challenges,” the Dasera report stated. “Reliance on outdated, manual processes and slow adoption of automated systems contribute to current vulnerabilities. Organisations must prioritise modern, proactive approaches, including regular audits, strategic use of technology, and external consulting, to effectively navigate the evolving landscape of data risk.”

French Election Reminds Asset Managers to Expect the Unexpected (15 July 2024)

The post French Election Reminds Asset Managers to Expect the Unexpected appeared first on A-Team.

]]>
By Sam Idle, Solutions Consultant at Clearwater Analytics.

The latest results of the surprising snap French election are a timely reminder for asset managers to always expect the unexpected. The knock-on effects on their investments can create a metaphorical line of anxious investors at the door, with a million questions about how their portfolios have been impacted.

Going into the run-off, Marine Le Pen’s Rassemblement National (RN) was widely expected to have a reasonable chance of gaining a majority. Instead, the leftist Nouveau Front Populaire bloc won the most seats in this strangest of elections, with the RN coming in third. While the market reaction was not as severe as it could have been, it still had an impact on the spread between the French sovereign long-term borrowing rate and its German equivalent – a barometer of market sentiment towards French fiscal fortunes.

A major market event is often when an investment manager’s reporting and client servicing capabilities are tested to the limit. In the wake of election results, investors are desperate for clarity on their portfolios and how they have been impacted. They are often on the phone to their asset managers, requesting information on risk exposures and price impacts, alongside a plethora of other bespoke inquiries.

Outdated Data Systems

This does not happen only when market-affecting events occur; it is part of a general trend of investors becoming more demanding of the people managing their money, perhaps reflecting the far easier access to news they have in 2024 than they did 20 years ago.

As the news cycle intensifies, asset managers are struggling with outdated systems. Reliance on legacy infrastructure, combined with the piecemeal addition of new products, has made managing the growing volume and variety of data increasingly difficult. This information often isn’t centralised, and with thousands of clients with varying servicing requirements, there is always a tendency to focus on repeatable client reports. There is also an assumption that requests all arrive through the same channel, but this often isn’t the case. Without a central repository, the data provided to clients will often differ depending on what information a particular team has access to. When bespoke inquiries come in, they are incredibly difficult to deal with effectively, and this is felt even more starkly if the incumbent data architectures do not interact with a modern reporting solution.

Modern Architectures

While the fall-out from the French election does not seem to have been too severe on markets, it is a timely reminder, nonetheless. Client engagement is a key differentiator in an age where performance is increasingly squeezed by passive investing through exchange traded funds (ETFs) – it is important that clients remember who was able to put their minds at ease rapidly in the aftermath of surprise elections or other market-shaking events.

When you consider that the accounts generally requiring the most bespoke treatment are the largest, the ones that drive the majority of a firm’s revenue, it becomes clear why asset managers need to expect the unexpected and prepare themselves with modern, interactive data architectures and reporting solutions.

The post French Election Reminds Asset Managers to Expect the Unexpected appeared first on A-Team.

]]>
QUODD Enhances QX Digital Platform with S&P Global Bond Data Integration https://a-teaminsight.com/blog/quodd-enhances-qx-digital-platform-with-sp-global-bond-data-integration/?brand=ati Thu, 11 Jul 2024 08:22:59 +0000 https://a-teaminsight.com/?p=69217 QUODD, the market data on-demand provider, has upgraded its QX Digital Platform to incorporate comprehensive bond data from S&P Global Market Intelligence, reinforcing its end-of-day global pricing and reference data service for wealth management clients through its QX Automate API. QUODD’s QX Digital Platform gives customers access to market data functionality and content for front,...

The post QUODD Enhances QX Digital Platform with S&P Global Bond Data Integration appeared first on A-Team.

]]>
QUODD, the market data on-demand provider, has upgraded its QX Digital Platform to incorporate comprehensive bond data from S&P Global Market Intelligence, reinforcing its end-of-day global pricing and reference data service for wealth management clients through its QX Automate API.

QUODD’s QX Digital Platform gives customers access to market data functionality and content for front, middle and back-office workflows. S&P Global Market Intelligence now supplies the Platform with independent pricing and liquidity data for bonds, offering advanced security look-up and query capabilities using pre-defined or custom templates. The transaction data analysed and aggregated to generate this pricing content encompasses nearly three million corporate and sovereign bonds, municipal bonds, and securitised products.

Integrating S&P Global Market Intelligence’s bond pricing and reference data with global equities and funds through QUODD is designed to enhance the QX Digital Platform’s display capabilities and connectivity for downstream wealth management users. The integration allows users to optimise their market data consumption, extract more value from their market data spend, reduce costs without compromising quality, and improve workflow efficiency. It supports daily pricing, reference data, and corporate actions while automating data usage entitlements for customised workflows.

“We have incorporated access to S&P pricing and reference data into our extensive content catalogue, which is a mix of proprietary and third-party data sets,” Bob Ward, CEO of QUODD, explains to TradingTech Insight. “This collectively amounts to 150 data sources and 250 billion data points in our data lake. We have made all this data available via several access points. Users can access this data as individual datasets via several communication methods (QX Marketplace), they can access digitally online and view and extract on demand (QX Digital), and now they can programmatically access multi-asset class data into third-party applications (QX Automate).”
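To give a flavour of what this kind of programmatic, on-demand access typically looks like in practice, the minimal sketch below shows a client pulling end-of-day evaluated bond prices over a generic REST interface. The host, endpoint path, authentication scheme and field names are illustrative assumptions only and are not taken from QUODD’s QX Automate documentation.

```python
# Illustrative only: the host, endpoint, auth scheme and field names below are
# assumptions for demonstration purposes, not QUODD's actual QX Automate API.
import requests

BASE_URL = "https://api.example-marketdata.com/v1"  # hypothetical host
API_KEY = "YOUR_API_KEY"                            # hypothetical credential


def fetch_eod_bond_prices(isins, as_of_date):
    """Request end-of-day evaluated prices for a list of bond ISINs."""
    response = requests.post(
        f"{BASE_URL}/prices/end-of-day",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={"identifiers": isins, "idType": "ISIN", "date": as_of_date},
        timeout=30,
    )
    response.raise_for_status()
    # Hypothetical response shape, e.g. a list of
    # {"isin": ..., "evaluatedPrice": ..., "currency": ...} records.
    return response.json()


if __name__ == "__main__":
    for record in fetch_eod_bond_prices(["XS1234567890"], "2024-06-28"):
        print(record)
```

In a workflow like the new asset setup described below, a client system would call an interface of this sort on demand, rather than waiting for a scheduled file delivery.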

QUODD has now signed numerous clients across multiple market segments with similar workflow concerns. The key drivers are timeliness, simplicity, and easy accessibility, as Ward outlines in the following use cases:

New issues research – New debt instruments are released into the market daily, and firms need the pertinent terms and conditions to classify them correctly in their systems. The QX Digital Platform is tied into the real-time S&P bond reference data API to retrieve those details as soon as S&P makes them available.

New asset setup – Banks price assets based on the issues that their clients hold. “A current customer told us this week that they set up 850 new assets in their system in June alone,” says Ward. “They are constantly accessing QX Automate to pull the data they need to properly set up those securities in their system based on asset type, sometimes multiple times a day. This gives them the timing and flexibility to retrieve data at any time to meet their client’s pricing needs.”

Price challenges – The Price Challenge process via the QX Digital Platform is supported by the integrated S&P Price Viewer tool, as Ward explains: “This tool gives our customers direct access to the S&P bond evaluators for price challenges. As bond pricing varies by provider, prices can differ, and customers need to confirm the most accurate evaluation. Price challenges are affirmed or updated usually within a few hours. The S&P pricing methodologies are transparent to all of our customers.”

Security master maintenance – “Most of our customers use QX not only for pricing but for global security master maintenance using our Corporate Actions solutions,” says Ward. “Many parameters can be set, such as Voluntary vs. Mandatory Date parameters, based on Effective Date or Announcement Date, or the ability to hone in on specific events that affect things like reorganizations, which affect shares and price. Maintenance tasks like identifier changes, name changes, M&A, etc., are all important in maintaining a security master.”

In addition, the platform has embedded proprietary calculators such as an Accrual & Amortization tool that provides the requisite buy & sell tickets on certain fixed income instruments, leveraging content from S&P to meet and exceed the functionality available in the legacy terminals.
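For context, the arithmetic behind an accrual calculation on a plain-vanilla bond ticket is straightforward. The short example below computes accrued interest using a 30/360 (bond basis) day count; it is a generic illustration of the kind of calculation such a tool performs, not a representation of QUODD’s proprietary Accrual & Amortization logic.

```python
# Generic illustration of accrued interest on a bond trade ticket using a
# 30/360 (bond basis) day count; not QUODD's Accrual & Amortization tool.
from datetime import date


def days_30_360(start: date, end: date) -> int:
    """Day count between two dates under the 30/360 (bond basis) convention."""
    d1 = min(start.day, 30)
    d2 = min(end.day, 30) if d1 == 30 else end.day
    return (end.year - start.year) * 360 + (end.month - start.month) * 30 + (d2 - d1)


def accrued_interest(face: float, annual_coupon_rate: float,
                     last_coupon: date, settlement: date) -> float:
    """Accrued interest from the last coupon date to the settlement date."""
    accrual_fraction = days_30_360(last_coupon, settlement) / 360
    return face * annual_coupon_rate * accrual_fraction


# Example: 1,000,000 face, 5% annual coupon, settling 45 (30/360) days after
# the last coupon date -> 1,000,000 * 0.05 * 45/360 = 6,250 accrued.
print(accrued_interest(1_000_000, 0.05, date(2024, 5, 15), date(2024, 6, 30)))
```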

Modern technologies and delivery models have been integrated into the QX Digital Platform to meet the new need for data on demand, and these innovations keep QUODD ahead of its competitors, says Ward. “Most providers today have layers and silos of technology, leading to increased inefficiencies and lower quality. QUODD is designed from the ground up for the future, and with our cloud-native platform, we can deliver our content into customisable client workflows that are turnkey, scalable, and cost-effective. Building from this platform allows us to meet customer needs today with very low switching costs while opening new options for even more advanced integrations as their digital strategy continues to evolve.”

Ward states that the two defining characteristics of the technology are a cloud-native platform that is purpose-built to power the full breadth of market data apps and APIs and the ability for companies to manage their preferred consumption model and frequency of data updates. “By virtue of technology reducing the friction of integrating and onboarding new sources of data on a self-service, on-demand, and connected basis, the QX Automate module delivers a superior experience, enabling customisation and integration at the same time; and because we are cloud-native with a modern tech stack, we can build faster and respond to customer requirements with more agility and transparency,” he says.

In terms of market data spend, Ward points out that under QUODD’s pricing model, clients only get charged for what they use and the frequency of that use. “This pricing approach, combined with a single integration point for a client’s entire security master, coupled with the improved workflow for the employees (no more swivel chairing), provides a very good value for our clients,” he says.

Looking ahead, QUODD has a number of customer-driven projects in the pipeline, including leveraging AI to help build third-party adapters at a quicker pace, and using AI to expand the company’s proprietary data sets.

The post QUODD Enhances QX Digital Platform with S&P Global Bond Data Integration appeared first on A-Team.

]]>