Why MCP Servers Are the Missing Link for Scaling AI in Asset Management

February 6, 2026
Read Time: 5 minutes
Authors: Eashwar Viswanathan
Innovation & Tech
Inst'l Asset Managers

Now that you are well-versed in large language models (LLMs) and agentic AI in asset management, allow me to introduce you to another acronym you will grow tired of hearing: MCP. A model context protocol (MCP) server creates a layer that lets different datasets talk to an LLM, providing critical context regardless of the central agent used. If your firm doesn't need MCP yet, it likely will in the near future.

Right now, asset managers are becoming adept at using AI for research-related tasks and executing certain investment operations. But as AI infiltrates business functions throughout firms, they face challenges connecting multiple systems. Integrating workflows, LLMs, BI tools, and datasets currently requires piecemeal, wildly inefficient one-off connections.

The early adopters of AI are poised to become the early adopters of MCP servers. Here is why MCP servers will be critical layers of AI infrastructure for asset managers, enabling greater adoption and scalability of AI.

The AI agents are coming

Ultimately, we will have AI agents working with each other across functions, making agent-to-agent protocols a premium innovation in the near future. As of now, management firms have achieved a good level of competency in deploying AI for market research and alternative data analysis, including tasks like analyzing earnings calls, portfolio construction, and risk analysis. They are actively launching AI tools for more operational functions in data quality management and reconciliation. In AIMA's recent survey of worldwide fund managers, 58% expect wider front-office integration of AI this year, and 90% of the investors surveyed believe Gen AI would positively impact the performance of at least some fund managers' portfolios over the next three years.[i] However, firms face a major challenge as they transition AI strategies from pilots to enterprise-wide implementation.

Siloed AI agents don't talk

In investment management, CTOs are chasing an AI dream: for their managers to be able to type in a seemingly simple request like, “How is my growth portfolio doing?” and receive a magic answer in the form of a chart, a phrase, or a paragraph. AI models are not standardized, and they cannot easily communicate or collaborate across the organization's data infrastructure. The future may be end-to-end AI workflows, but for now AI agents operate in silos and cannot access multiple datasets simultaneously.

AI agents need the wiring plus the context. Without it, you can ask about a growth portfolio, but the agents won't know whether that's the name of a fund, a trading book, or a legal entity. The surrounding systems are what supply that context to the answer. It's in the name: model context protocol. Firms considering rolling out AI agents into their tech infrastructure should treat MCP as an integral part of these initiatives.
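To make the ambiguity concrete, here is a minimal sketch of the kind of context layer described above: an entity catalog that an agent consults before acting on a phrase like "growth portfolio." All names and fields are hypothetical illustrations, not the actual MCP specification or SDK.

```python
# Hypothetical sketch: how a context layer might disambiguate a phrase
# like "growth portfolio" before an agent acts on it.
# Names and fields are illustrative; this is not the real MCP SDK.

ENTITY_CATALOG = [
    {"name": "Growth Portfolio", "kind": "fund", "id": "FUND-001"},
    {"name": "Growth Portfolio", "kind": "trading_book", "id": "BOOK-117"},
]

def resolve_entity(phrase: str) -> list[dict]:
    """Return every catalog entity matching the phrase, with its kind."""
    phrase = phrase.lower()
    return [e for e in ENTITY_CATALOG if e["name"].lower() == phrase]

matches = resolve_entity("growth portfolio")
# Two matches: the agent (or the user) must choose fund vs. trading book.
```

Without this shared catalog, each agent would guess at the meaning independently; with it, every agent resolves the same phrase the same way.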

MCP servers drive data interoperability in finance

Most asset management firms are by now keenly aware of the necessity of data interoperability and having a centralized, scalable data management infrastructure in place to take full advantage of the strategic asset that data has become. Once again, we are talking about the need to scale, this time in firms’ AI strategies.

From Brittle Systems to Interoperable Agents

“The proliferation of autonomous agents across distributed systems, cloud environments, and edge networks has introduced significant challenges to interoperability, scalability, and system resilience. Traditional architectures, which relied heavily on P2P Application Programming Interfaces (APIs), rigid integration contracts, or monolithic orchestration layers, have become increasingly brittle in the face of dynamic, heterogeneous agent ecosystems... Against this backdrop, there is a clear motivation to develop a standardized protocol that enables agents—regardless of vendor, domain, or deployment environment—to seamlessly discover, interact, and collaborate in a secure, stateful, and extensible manner.” — Partha Pratim Ray, TechRxiv[ii]

MCP servers enable AI agents to dynamically use multiple tools and coordinate execution across complex, multi-tool workflows without inflexible, hard-coded logic.[iii] MCP servers are cousins of APIs but not fully parallel. APIs grew up alongside cloud computing in the 2010s and are now the engines of numerous digital ecosystems like Stripe, Twilio, and Slack, which have built their entire business models around providing APIs.[iv] MCP servers provide a standardized output similar to an API, meaning responses across different agents are standardized. However, unlike traditional APIs that return specific data, MCP servers focus more on standardizing workflows. With MCP, a firm exposes its risk, portfolio accounting, and market data systems through one standardized protocol; any LLM or agent can access them through that single interface.
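The pattern above can be sketched in a few lines: several back-office systems registered as named "tools" behind one dispatch function, each returning the same response envelope. This is a conceptual illustration only; the tool names, stub values, and envelope fields are invented for the example and do not come from the MCP specification.

```python
import json

# Conceptual sketch: risk and accounting systems exposed as named tools
# behind a single protocol, with a standardized response envelope.
# Tool names and values are invented for illustration.

TOOLS = {}

def tool(name):
    """Register a function under a tool name."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("risk.var")
def portfolio_var(portfolio_id: str) -> dict:
    return {"portfolio": portfolio_id, "var_95": 1.2e6}  # stub value

@tool("accounting.nav")
def portfolio_nav(portfolio_id: str) -> dict:
    return {"portfolio": portfolio_id, "nav": 250.0e6}   # stub value

def call_tool(name: str, **kwargs) -> str:
    """Single entry point any agent uses, regardless of vendor."""
    result = TOOLS[name](**kwargs)
    return json.dumps({"tool": name, "ok": True, "result": result})
```

Any agent that speaks the one protocol can call `risk.var` or `accounting.nav` identically; adding a new system means registering one more tool, not wiring a new integration per agent.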

Are MCP servers the new APIs?

Early adopters of AI in asset management are moving ahead of the competition in this latest tech arms race. However, the spoils will go to those that leverage MCP — moving from pilot to scaling AI with more agility. MCP enables systems to normalize and standardize data across multiple vendors like S&P, Moody's, Fitch, and Bloomberg, who provide data in different formats and scales. This apples-to-apples comparison is critical for current use cases like research and formulating investment theses.

Another curious weakness of the current AI innovation phase is that LLMs stink at math and are not adept at processing structured data (i.e., numbers and database tables). Most cloud-based data ingestion and normalization systems have the opposite problem and cannot process unstructured data in the form of, for example, satellite data, earnings call transcripts, and securitized loan tapes. MCP servers act as engines that generalize information across integrations, supplying the meaning, context, and surrounding detail for structured datasets flowing through organizations' AI agents. MCP connectivity gives managers flexibility, efficiency, improved decision-making, and cost reduction. Then their LLMs can churn out those magical answers to simple but important questions. MCP servers are not the new APIs, but they will be just as formative in this phase of digital transformation.
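The division of labor described above can be made concrete: the deterministic system does the arithmetic, and the LLM only narrates the structured result it receives. The figures and field names below are made up for illustration.

```python
# Sketch of the division of labor: a tool computes the number
# deterministically; the LLM never does the arithmetic itself.
# Figures and field names are illustrative assumptions.

def period_return(start_nav: float, end_nav: float, flows: float) -> float:
    """Flow-adjusted period return, computed by the system, not the LLM."""
    return (end_nav - start_nav - flows) / start_nav

result = {
    "metric": "period_return",
    "value": round(period_return(100.0, 108.0, 3.0), 4),  # 0.05
}
# The agent receives this structured payload and only phrases the answer,
# e.g. "Your growth portfolio returned 5.0% this period, net of flows."
```

Keeping the calculation on the tool side sidesteps the LLM's weakness at math while still letting it deliver the conversational answer.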

MCP unlocks the art of the possible

MCP servers are integration capabilities for AI agents, and they offer enormous flexibility. They enable firms with multiple AI vendors and agents to build their own agents and use them as part of a bigger strategy around externally sourced AI models like Anthropic's or OpenAI's. This means that if a firm wants to use a vendor's data within a larger internal system of agents it is building, MCP connectors will enable that value transfer. This technology infrastructure will allow new, more advanced AI use cases beyond current productivity tools, market research, and back-office operational tasks.

Critically, these servers prevent the agony of a series of one-off integrations that would otherwise need to be executed each time a new agent is introduced into the workflow. Firms fully expect to advance with AI agent use cases, moving from investment ops tasks all the way up to alpha-generating functions including business intelligence. Building and connecting complex, customized dashboards in tools like Power BI or Looker by directing an AI agent requires an MCP server to be viable. Then agents can help build dynamic dashboards swiftly, transforming reporting cycles and removing the need for managers and analysts to receive specialized BI training. Moreover, MCP-enabled AI can replace specific, non-workflow-driven point solutions such as basic PDF extractors or systems that analyze financial statements to pull out specific insights. AI can then be used for the actual report creation, while the underlying system handles the workflow aspects.
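The integration savings can be put on the back of an envelope: point-to-point wiring needs one connector per agent-system pair, while a shared protocol needs only one adapter per agent or system. The counts below are illustrative, not drawn from any particular firm.

```python
# Back-of-envelope sketch of the integration math behind the claim above:
# point-to-point wiring grows multiplicatively, a shared protocol additively.
# Agent and system counts are illustrative.

def point_to_point(agents: int, systems: int) -> int:
    """One bespoke connector for every agent-system pair."""
    return agents * systems

def via_shared_protocol(agents: int, systems: int) -> int:
    """One adapter per agent plus one per system."""
    return agents + systems

# With 5 agents and 8 systems: 40 one-off connectors vs. 13 adapters,
# and each new agent adds 1 adapter instead of 8 new connectors.
```

This is why the gap widens as agent counts grow: the point-to-point cost compounds with every new agent, while the protocol cost stays linear.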

MCP readiness to take AI from pilot to enterprise-wide

Deploying an MCP server requires a certain level of technical acumen, but it is not considered technically difficult or dependent on managed services. More importantly, the firm’s chief data officer or CTO should ensure the data infrastructure is modernized and scalable, with proactive data strategy and governance, thoughtful system integration design, and comprehensive application permissions. The MCP server acts as a universal adapter, making enterprise data readily understandable and usable by any external AI application, akin to a standard socket allowing any compliant appliance to draw power effortlessly. In the next few months, the adoption of MCP will be a pivotal factor in competitive advantage in capital markets, as the tools will in effect multiply the multiplier.

Authored By

Eashwar Viswanathan

Eashwar leads product management for the institutional asset management segment at Arcesium.


Sources:

[i] Alternative Investment Management Association (AIMA), September 16, 2025. https://www.aima.org/article/press-release-front-office-gen-ai-adoption-shifts-from-if-to-when-for-leading-fund-managers-aima-research-finds.html

[ii] Partha Pratim Ray. A Review on Agent-to-Agent Protocol: Concept, State-of-the-art, Challenges and Future Directions, May 01, 2025. https://www.techrxiv.org/doi/full/10.36227/techrxiv.174612014.42157096

[iii] Boston Consulting Group, August 1, 2025. https://www.bcg.com/publications/2025/put-ai-to-work-faster-using-model-context-protocol

[iv] DEV, October 3, 2024. https://dev.to/keploy/the-history-of-apis-evolution-of-application-programming-interfaces-55p2
