Why Sell-Side Data Modernization Must Be an Institution-Wide Mandate

October 1, 2025
Last Updated: September 30, 2025
Read Time: 6 minutes
Author: Ted O’Connor
Data & Governance
Sell Side

The sell-side has been moving with more urgency in the past two years to digitally transform operational systems and data management. Within the financial services sector, big banks in particular have shown waves of resistance to digitization, most of them saddled with deeply rooted mainframe architecture that is far from easy to upgrade with point SaaS solutions or enterprise data platforms. Tech teams find it painful to retrofit old systems, whether acquired through M&A or built internally without standardization. Moreover, winning buy-in for complete data and tech modernization is not easy, given the high initial investment. To their credit, investment banks have weathered disintermediating headwinds like competition from fintechs and nonbanks, as well as the recent regulatory super cycle and high cost-income ratios.

U.S. banks have been opening their wallets in recent years, with technology at the top of their shopping lists and 34% overall growth in spending since 2019.i Still, frustrations with modernization abound: underwhelming results, difficulty measuring ROI, and an inability to scale the tech with the business, among other things. Why are banking CTOs and Chief Data Officers (CDOs) now shifting their technology focus from specific sell-side functions to broader, institution-wide data management modernization?

Should sell-side data modernization be organization-wide?  

In Celent’s recent survey of sell-siders, 70% of respondents said they expected to see significant changes to business revenue models, while 68% plan to replace or significantly upgrade one or more critical systems this year.ii Indeed, the world of investment banking is changing before our eyes. Already, the private credit explosion, the popularity of innovative vehicles like secondaries, products like structured credit vehicles, and business models such as multi-manager platform hedge funds have sent tech buyers shopping for platforms built for the needs of an expanding capital markets environment. The transition to daily reporting for products like interval funds, coupled with higher volumes, puts immense strain on existing fund administrators and technology. The fundamental pillar of data modernization is the standardization and centralization of a firm’s data. And that, by definition, calls for banks to think at an institution-wide level.

Investment banks welcome the average Joe and Jill – and their data   

New products and customer segments are introducing layers of complexity, compounding the necessity for centralized, scalable investment data infrastructure. When the U.S. opened the doors of private credit to affluent retail investors, it began changing the nature of these investor cohorts, raising questions about liquidity and suitability for less savvy investors.iii

Further, banks are introducing new product wrappers to capture retail inflows, and they will have to adjust their risk modeling, liquidity simulations, and valuation practices accordingly. As retail investing in alternative assets becomes more mainstream, transaction volumes will skyrocket, pushing older, non-cloud data platforms to their limits and causing bottlenecks and errors.

Blockchain extends business hours, adds more retail investors 

Policymakers also put the crypto pedal to the metal this summer, with the SEC opening compliant avenues for digital asset trading. The coming increase in blockchain trading moves the industry closer to 24/7 (or, realistically, 24/5) market operations, unrestricted by 9-to-5 banking hours. The automation of trade capture and settlement operations to enable 24/5 trading, while nothing to panic about, is really one spoke in the unstoppable data transformation wheel. It is imperative that sell-siders stop changing only the spokes and start changing the wheels.

AI forces the data modernization issue 

Now, the AI imperative has stoked the flames of a previously slow-burning, industry-wide sell-side data transformation. Banks were dabbling in AI digital assistants and large language model (LLM) tools in 2024; in 2025, they are awash in them – or at least in the development of them – urgently needing to “shed technical debt so they can realize the promise of becoming an AI-powered bank.”iv

CDOs are flocking to AI tools to automate reporting and disclosures and to improve the ingestion, normalization, and analysis of vast amounts of structured, semi-structured, and unstructured data. But before AI agents can take on operational tasks like processing trade captures and resolving the data quality problems that prevent trades from being correctly entered into the system, they must be fed precise, high-quality data. It’s a chicken-and-egg scenario in which the number of eggs is seemingly endless.

KPMG’s 2025 Banking Survey corroborates what we have been hearing from clients and prospects. The top blockers to banks’ data modernization efforts are data quality (89%) and the complexity of integration with legacy systems (81%). The results also indicate that only a small majority of banks (64%) have installed foundational data capabilities.v Just over a year ago, Deloitte reported that most banking leaders (88%) said their data is available but comes from different sources, is duplicated, and is not integrated, aka not centralized or harmonized.vi Since then, banks report that more centralized data is accessible, but the quality remains lacking.

Their world is exploding with data, and middle- and back-office processes are creaking on top of quaking operational models. See our previous article to find out more about how sell-side institutions can prepare their data systems for AI.

Why risk porous risk management? 

A sell-side organization’s inability to deal with the increasing volume, velocity, and variety of financial data hits the top line, the reputation, and the bottom line. Managing credit, operational, liquidity, and compliance risks is perilous with bad data governance and management.

For example, if a brokerage has a hedge fund client that hits a rough patch at a bank, it needs to be able to monitor that client’s collateral requirements in real time. Staff will be examining implied and realized volatility metrics, leverage ratios that signal balance sheet stress, margin call triggers, and asset class haircuts, among other data. With imprecise, missing, or untimely data, collateral calls go unmet, the brokerage becomes the bagholder, and the brokers scramble to unwind positions. Fire sales and unwanted regulatory attention follow.
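To make that concrete, here is a minimal Python sketch of the kind of collateral check such monitoring automates. The asset classes, haircuts, leverage threshold, and surcharge are hypothetical illustrations, not any firm’s actual risk methodology.

```python
# A minimal, illustrative collateral check. Asset classes, haircuts,
# the leverage threshold, and the surcharge are hypothetical values
# for demonstration, not any firm's actual risk methodology.
from dataclasses import dataclass

@dataclass
class Position:
    asset_class: str      # e.g., "equity", "corporate_bond"
    market_value: float   # current marked value of the position

# Hypothetical haircuts by asset class; unknown classes get a punitive default
HAIRCUTS = {"equity": 0.15, "corporate_bond": 0.08, "treasury": 0.02}

def required_collateral(positions: list[Position], leverage_ratio: float) -> float:
    """Sum haircut-adjusted exposures, scaled up when leverage signals stress."""
    base = sum(p.market_value * HAIRCUTS.get(p.asset_class, 0.25) for p in positions)
    surcharge = 1.25 if leverage_ratio > 5.0 else 1.0  # assumed stress rule
    return base * surcharge

def margin_call_amount(posted: float, positions: list[Position], leverage_ratio: float) -> float:
    """A positive result means the client must post more collateral."""
    return max(required_collateral(positions, leverage_ratio) - posted, 0.0)

book = [Position("equity", 40_000_000), Position("corporate_bond", 25_000_000)]
print(f"Call amount: ${margin_call_amount(6_000_000, book, 5.8):,.0f}")
# -> Call amount: $4,000,000
```

In production, these inputs stream in from market data and margin systems; a stale or missing field silently understates the call amount, which is precisely the failure mode described above.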

And we are only talking about one subset of risk management. Hundreds of regulatory reports still have to be filed on a daily, weekly, and monthly basis. Simply put, banking professionals need access to accurate, reliable data – and they need it now. 

A Virtuous Cycle

Currently, some financial institutions are in a negative loop: they have limited discretionary capacity for tech spending but determine they need to build certain solutions themselves, often because vendors’ offerings don’t meet their needs ... Some other institutions have created a virtuous cycle. They use a value-focused approach and ensure cross-functional collaboration among the C-suite (CEO, CFO, CIO, business unit heads) to ensure value realization beyond the CIO’s office.  

- McKinsey 

Fragmentation is the enemy of enterprise data modernization 

Data is core to banking and capital markets, spanning transactions, operations, hedges, client activity, sources of collateral, risk management, and more. Investment banking leaders that still regard financial data as a means to an end are missing the point. Data has become one of a bank's most valuable assets. What is really at stake is the quality of your data: from top to bottom, end to end, back to front of the institution.

The challenge looks different depending on where you sit in the bank, but the root cause is often the same: fragmented data. Those that recognize quality data as a strategic asset are now thinking more holistically about revamping their institution’s data stack. Silos between business lines, legacy systems, and inconsistent reporting make a single source of data truth, and the operational automation it enables, an absolute necessity. Data silos hinder a holistic, firm-wide view and make it difficult to answer high-level questions for leadership. Further, banks' biggest operational migraines flare up when data flows poorly between departments, buy-siders, or counterparties. Normalizing data to feed downstream systems, including those for reconciliation, regulatory reporting, and AI applications, is wholly reliant on data quality.

All of this is tough to achieve because the data is voluminous and complicated. It starts with a data architecture that enables centralization, followed closely by automated pipelines and connectors that properly ingest, model, and normalize investment data from myriad sources. Automated data quality, validation, and governance keep the bank’s information correct and timely and preserve its integrity as conditions change day to day. Data lineage tools make data flows observable, so audit trails exist to monitor the impact of downstream changes, providing another layer of oversight. Seeing change through an organization-wide lens is key to achieving automation across the bank’s functions.
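As a simple illustration of the ingest, normalize, and validate flow described above, consider the following Python sketch. The vendor names, field mappings, and validation rules are hypothetical placeholders; real pipelines apply far richer schemas and rule sets.

```python
# A toy sketch of the ingest -> normalize -> validate flow. Vendor names,
# field mappings, and rules are hypothetical placeholders.
from datetime import datetime, timezone

# Hypothetical mappings from source-specific field names to a canonical schema
FIELD_MAP = {
    "vendor_a": {"TradeDt": "trade_date", "Qty": "quantity", "Px": "price"},
    "vendor_b": {"trade_date": "trade_date", "qty": "quantity", "price": "price"},
}

def normalize(record: dict, source: str) -> dict:
    """Rename source-specific fields to the canonical schema and tag lineage."""
    mapping = FIELD_MAP[source]
    out = {mapping[k]: v for k, v in record.items() if k in mapping}
    out["_lineage"] = {"source": source,
                       "ingested_at": datetime.now(timezone.utc).isoformat()}
    return out

def validate(record: dict) -> list[str]:
    """Return data quality violations; an empty list means the record is clean."""
    errors = []
    if "trade_date" not in record:
        errors.append("trade_date is missing")
    if record.get("quantity", 0) <= 0:
        errors.append("quantity must be positive")
    if record.get("price") is None:
        errors.append("price is missing")
    return errors

raw = {"TradeDt": "2025-09-30", "Qty": 1000, "Px": 101.25}
clean = normalize(raw, "vendor_a")
print(validate(clean) or "clean")  # -> clean
```

Even a toy example like this shows why lineage tagging matters: when a downstream report breaks, the audit trail points back to the source and ingestion time.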

Sell-side modern data stack to cushion market shocks 

Uncertainty remains the theme of the year, if not of the entire 2020s. Trade wars have hampered growth, and the Federal Reserve is signaling possible rate cuts in Q4. Meanwhile, U.S. banks are looking to take advantage of a regulatory ebb, marked by delays and reversals of major rules. In this article, we looked at banks’ data needs from a defensive point of view: preventing problems. In my next piece, I will look at sell-side enterprise data management on offense, from a returns- and revenue-generating perspective.

See our whitepaper, Solving the Data Problem in the Banking Sector, which breaks down the root causes of the industry’s long-standing data problem and offers a practical framework for solving it, starting with a revenue-first mindset.

5 Key Takeaways  

Q1: Why are sell-side banks investing in broader data modernization now? 

A1: Legacy systems can't support AI, 24/5 trading, or rising retail flows. Banks are modernizing data infrastructure to stay competitive and compliant. 

Q2: What’s driving the shift from siloed tech upgrades to institution-wide data strategy? 

A2: Disconnected systems limit scalability, increase risk, and block AI deployment. Unified data strategies offer centralized, accurate, and real-time insights. 

Q3: How are retail inflows impacting sell-side operations? 

A3: Hybrid investor bases require segmented reporting and compliance. Banks must adjust data, risk models, and infrastructure to serve retail and institutional clients. 

Q4: What role does AI play in accelerating modernization? 

A4: AI needs clean, timely, and integrated data. Many banks face a “chicken-and-egg” problem: AI tools can’t deliver value until data foundations are fixed.

Q5: What makes quality data so hard to achieve in banking? 

A5: Fragmentation, legacy tech, and inconsistent governance impede data reliability. Automated ingestion, normalization, and quality controls are key to progress. 


Authored By

Ted O’Connor

Ted is a Senior Vice President focused on Business Development at Arcesium. In this role, Ted works with leading financial institutions in the capital markets to optimize data, technology, and operational needs.
