From Neighborhoods to City: Unifying Data Across Sell-Side Silos

March 23, 2026
Read Time: 7 minutes
Authored by: Ted O’Connor
Operations & Growth
Sell Side

Transformation at sell-side institutions has evolved from isolated program offices to a C-suite and board-level priority. But this same desire to centralize and elevate transformation often runs up against a stark reality. The data needed to stitch the transformation together across an entire organization remains in the hands of isolated areas of the business.

These data platforms are often isolated by line of business or organized by specific asset class. On the one hand, each line of business has its own strategy, infrastructure, and workflows, often with its own P&L. On the other hand, this independence creates fragmentation, making transformation harder and slower to achieve.

Achieving truly transformative modernization can resemble turning a collection of insular neighborhoods into a functioning metropolis. This reality on the ground forces transformation leaders to act like urban planners for 21st-century global megacities. 

Why sell-side data still looks like a patchwork of neighborhoods

Much of the data diversity among sell-side players comes from their history. Large organizations cover a broad spectrum of asset classes, accumulated over years and even decades through a mix of organic growth, strategic expansion into new geographies or businesses, and acquisitions. Each wave of growth brought its own technology with it.

This technology includes a high proportion of on-premise servers and mainframe-based technologies. Older neighborhoods are part of the city core, while new neighborhoods sprout up on the periphery, with technology that reflects their heritage. In total, the institution spans the range from traditional data centers to public cloud, as shown in a study by cloud vendor Nutanix.i

A separate IBM study showed that 43 of the world’s top 50 banks rely on mainframes as their core computing platform.ii But there is continued investment in cloud. From 2023 to 2025, 87% of companies increased investment in the cloud, according to LSEG research.iii None of these technologies is inherently superior to any other. Some workloads are simply better suited to certain architecture patterns. The devil is in the details.

For transformation, a critical implication of this mosaic of technologies is the accumulation of hundreds of different approaches to data. Each neighborhood produces and exports its data without normalization across systems, and without shared reference data to make connections. This diversity impedes comprehensive cross-firm views and the efficiencies that come with them.

The problem with fragmentation goes beyond the idea of “tech debt” caused by maintaining aging systems. Costs absolutely must come down to compete with digital-first disruptors and the changing economics of sell-side services. But competition also calls for the ability to support new products and meet buy-side demand for a broader range of asset classes, including digital assets.

A bigger problem is the inconsistency in the workflows that run the firm. Regulatory and management reporting, intraday funding decisions, compliance monitoring, and client reporting all require consistent, timely views of positions, trades, and reference data. Over time, parallel versions of truth pile higher, increasing cost, slowing change, and making reconciliation harder.

Shared utilities and services

A city with a patchwork of separate utility providers and public transportation systems would be hard to live in or run. Yet sell-side data often comes from many systems and providers that do not easily integrate. Most desks insist their data is unique. Some of these differences are real, such as product terms, calendars, conventions, margining, and settlement mechanics. Each desk is profitable and competitive because of its advantages in how it prices risk, structures products, and executes in its markets.

But these unique traits sit on top of a foundation of commonalities. Across asset classes from equities to FX to repos and structuring, the same backbone repeats, such as instrument definition and reference data, trade capture and enrichment, lifecycle state changes, counterparty and account data, valuations, positions, and downstream reporting obligations.
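That shared backbone can be pictured as a small set of canonical types. The sketch below is illustrative only; the type and field names are assumptions, not the data model of any actual firm:

```python
from dataclasses import dataclass
from datetime import date
from enum import Enum

class LifecycleState(Enum):
    # A minimal shared vocabulary of trade states every desk can map into
    NEW = "new"
    AMENDED = "amended"
    SETTLED = "settled"
    CANCELLED = "cancelled"

@dataclass(frozen=True)
class Instrument:
    # Canonical instrument definition: one identifier scheme firmwide
    instrument_id: str   # internal ID, mapped to external IDs like ISIN/CUSIP
    asset_class: str     # "equity", "fx", "repo", ...
    currency: str

@dataclass
class Trade:
    # Canonical trade capture: desks enrich this, but the shape is shared
    trade_id: str
    instrument: Instrument
    counterparty_id: str  # points into shared counterparty reference data
    quantity: float
    price: float
    trade_date: date
    state: LifecycleState = LifecycleState.NEW
```

The point of keeping the set this small is that every downstream function, from valuations to reporting, can consume one shape regardless of which desk produced the trade.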

A transformation leader’s job is to separate “true nuance” from “superficial variation,” then build a unified system that handles common needs once, rather than re-implementing them desk by desk.

At the same time, the answer is not to modernize everything and move everyone to the cloud from the top down. These top-down pushes tend to stumble over internal roadblocks. Business units that generate billions of dollars in profit may not recognize that anything is broken or needs to change.

The better answer is to create a layer that ingests data from every source system, every external data provider, and every counterparty. That approach buys time for transformation and even extends the return on investment from the underlying technologies. It also lowers the need to make unrealistic demands or overcome steep local resistance, and it keeps your citizens confident that they can access data for their own analytics and queries.
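One way to picture such an ingestion layer is as a registry of per-source normalizers that all emit the same canonical record, so normalization happens once at the boundary. This is a minimal sketch under that assumption; the source-system names and field names are made up:

```python
from typing import Callable, Dict

# Each source system registers a normalizer that maps its native payload
# into one canonical record shape; downstream consumers see only that shape.
NORMALIZERS: Dict[str, Callable[[dict], dict]] = {}

def register(source: str):
    """Decorator that registers a normalizer for a named source system."""
    def wrap(fn: Callable[[dict], dict]) -> Callable[[dict], dict]:
        NORMALIZERS[source] = fn
        return fn
    return wrap

@register("equities_oms")          # hypothetical equities order-management system
def norm_equities(raw: dict) -> dict:
    return {"trade_id": raw["TradeRef"], "qty": raw["Qty"], "px": raw["Px"]}

@register("fx_platform")           # hypothetical FX trading platform
def norm_fx(raw: dict) -> dict:
    return {"trade_id": raw["deal_id"], "qty": raw["notional"], "px": raw["rate"]}

def ingest(source: str, raw: dict) -> dict:
    # One entry point, many sources: each payload leaves in canonical form
    return NORMALIZERS[source](raw)
```

The source systems keep running untouched; only the thin adapter per source needs to know its quirks, which is what makes the layer politically and technically cheaper than a rebuild.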

Zoning and building codes

The goal is to define a small set of firmwide primitives that every neighborhood can plug into for consistency and operational streamlining. These primitives replace hundreds of divergent technologies and data models, each with its own approach to what defines a security or a trade, how to represent an execution, or which lifecycle states and events matter, so that downstream functions can understand the data without translation.

Crucially, those shared standards do not dictate how a desk runs its business. The goal is to give each business unit room to determine where true nuance and intricacy live. Doing so shifts the conversation to tangible questions around which definitions must be shared so data can move cleanly across regions, asset classes, and control functions without re-implementing the same basics repeatedly.

Governance: solving the “mayor” problem

Practically speaking, total convergence and lockstep alignment are a nirvana that most firms will never reach, especially when it means people must invest in change that does not seem to benefit them directly. When people believe what they already have works, when they look through the lens of “we have always done it this way,” or when they feel a deep attachment to technology they built themselves, change is that much harder.

Chief transformation officers need three things to build credibility and soften these forms of resistance. First, they need a clear plan and business case. Second, they need a top-down mandate. And third, they need to recognize that mandates from the top still need local buy-in to succeed, across senior management and business units.

In addition, transformation does not end at a particular milestone. Markets evolve, products change, and whatever looks complete in a three-year plan will be overtaken by new requirements, new asset classes, and new operating demands. Adaptability therefore matters as much as architecture: the firm must stay current, absorb change, and remain coherent as strategies shift.

With a layer that creates standards on top of existing systems as they age and are replaced naturally, core activities like regulatory reporting, funding decisions, compliance oversight, management reporting, and client reporting can draw from a consistent set of normalized data. They do not need to be rebuilt around system-by-system definitions or follow every upgrade. For example, if the firm replaces a trading system, the work is to map the new system’s outputs into the ecosystem once, rather than remapping those outputs into every downstream report, control, and workflow.
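The trading-system replacement scenario can be sketched in a few lines. The field names on both sides are hypothetical; the point is that the new system's outputs are mapped once, while downstream consumers read only canonical fields and never change:

```python
def map_new_oms(raw: dict) -> dict:
    # One-time mapping of the replacement system's output into the firm's
    # canonical record; "execId", "secId", etc. are invented field names
    return {
        "trade_id": raw["execId"],
        "instrument_id": raw["secId"],
        "quantity": raw["filledQty"],
        "price": raw["avgPx"],
    }

def regulatory_report(trades: list) -> list:
    # A downstream consumer: it reads only canonical fields, so it is
    # untouched when the upstream trading system is swapped out
    return [(t["trade_id"], t["quantity"] * t["price"]) for t in trades]
```

Without the canonical layer, the same swap would mean remapping the new system's fields into every report, control, and workflow individually.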

Once definitions are shared and data is normalized, the focus of transformation can come back around to the machinery that moves data across regions and platforms, enforces runtime controls, and delivers it fast enough to be operationally meaningful. That is where architecture, rather than debates about location or build versus buy, determines outcomes.

Authored By

Ted O’Connor

Ted is a Senior Vice President focused on Business Development at Arcesium. In this role, Ted works with leading financial institutions in the capital markets to optimize data, technology, and operational needs.
