“Data is a precious thing and will last longer than the systems themselves,” Tim Berners-Lee.
Sir Timothy Berners-Lee, the English computer scientist best known as the inventor of the World Wide Web, made this observation while building the Web on top of the Internet’s existing infrastructure. His enduring statement suggests that data has value beyond its use in any particular system.
Thirty-some-odd years later, with an investment industry awash in data, Berners-Lee’s statement has never felt more relevant. A multitude of data sources has created an influx of information, and the numbers are only growing.
IDC’s 2023 forecast estimates that the world will generate a staggering 291 zettabytes of digital data in 2027.
Data may well be “the new oil,” but when it is locked up in legacy systems, many of which are too delicate to upgrade, let alone replace, extracting value from that “oil” can be challenging.
As datasets explode, firms must be ready to capture, clean, store, and intelligently use them. However, as many in our industry are finding out, you can’t build a data science practice on top of an antiquated data warehouse – and certainly not atop an accounting system.
A purpose-built data platform is an essential foundation for managing data from beginning to end. But where do you start?
Architect a Data-First Approach
A unified strategy centered around data can enable firms to differentiate themselves based on powerful insights.
Firms need quick access to their data. Yet most systems in an organization’s stack are point solutions for specific tasks that don’t support the data needs of the firm’s broader ecosystem. In fact, firms often find their data held hostage in siloed systems, inaccessible to the tools and analysts that need it most.
For data to flow smoothly, data pipelines must be infused with domain awareness, i.e., knowledge of the subject matter. However, just as datasets are not generic, every organization has unique, context-specific goals.
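As an illustration only — the class, rules, and function names below are hypothetical and not drawn from any Arcesium product — a “domain-infused” pipeline step might encode subject-matter rules directly in the ingestion path, rather than treating rows as generic records to be cleaned up later:

```python
from dataclasses import dataclass

@dataclass
class Trade:
    """A minimal stand-in for a domain record flowing through the pipeline."""
    symbol: str
    quantity: int
    price: float

def validate_trade(trade: Trade) -> list[str]:
    """Apply subject-matter rules; return a list of violations (empty if clean)."""
    errors = []
    if not trade.symbol:
        errors.append("missing symbol")
    if trade.quantity == 0:
        errors.append("zero quantity")
    if trade.price <= 0:
        errors.append("non-positive price")
    return errors

def run_pipeline(trades: list[Trade]):
    """Split incoming records into clean rows and rejected rows with reasons."""
    clean, rejected = [], []
    for t in trades:
        errs = validate_trade(t)
        if errs:
            rejected.append((t, errs))  # quarantined with an audit trail
        else:
            clean.append(t)
    return clean, rejected
```

The point of the sketch is the placement of the rules: domain checks live inside the pipeline itself, so bad records are quarantined with reasons at ingestion rather than discovered downstream by an analyst.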
Asset managers and financial services organizations with separate strategies for data and operating systems face a bigger challenge: Should they use established financial enterprise data management (EDM) frameworks designed for on-prem deployments, or build from scratch using generic cloud and database technologies? The tradeoff leaves much to be desired. That’s why we pursued a third path with our clients: coupling domain-infused data models with thoroughly modern architecture and infrastructure.
As organizations work to decode their data, another key question they must consider is who is using the data. The often-buzzed-about themes of “democratization of data” and so-called “citizen developers” have practical implications for many organizations.
Multiple employee personas across an organization rely on similar, but not perfectly overlapping, data to get their jobs done. Yet accessing complete and useful information is not always simple.
We know from first-hand experience that developing and maintaining a data strategy that works for all parts of an organization requires integrating new sources of data into existing frameworks.
Organizations must also seamlessly incorporate that data into research, investment decisions, and non-investment processes.
As organizations break down silos, the historical boundaries between the front, middle, and back office grow increasingly arbitrary. Public and private market investments have converged, and fund wrappers have decoupled from investment strategies. As a result of these convergence trends, data no longer flows in a linear fashion.
Workflows have become asynchronous “workwebs,” with global creators and consumers of overlapping datasets working both individually and collaboratively. The handoff is becoming extinct.
To scale and differentiate their organization, firms must be ready to revamp dated processes and enable their teams to work together seamlessly. The ability to collaborate is the part of the data process that has become a game-changer in how organizations leverage insights in their decision-making.
Build a Robust Data Framework
Thoughtfully architected frameworks are de rigueur if firms are to free themselves from the constraints of legacy systems, even as those same systems remain operational for years to come.
As firms work to manage the data explosion, understanding every part of the data process and implementing the tools necessary to use their data will set their organization apart. Organizations that have a deep understanding of their data sources and usage — and build flexible processes on top of thoughtfully designed data frameworks — will have a competitive advantage.
In today’s competitive landscape, successful companies adjust their investment programs in real time based on insights from their data. Their competitors remain stuck at the starting gate, figuring out whether they even have the right data before spending time reactively and haphazardly cleaning and organizing it.
David Nable, Head of Client and Partner Development
Ready to change the way you see your data? Learn how Arcesium’s Aquata Data Platform can help your firm make critical decisions from a synchronized source of data.
This blog post is made available for personal informational purposes only. It does not constitute legal, tax, or investment advice and should not be treated as such. Nothing on our blog constitutes an offer to contract or acceptance of contract terms you may offer to us. We contract solely by definitive written agreement reviewed and approved by counsel. Any views or opinions represented in this blog belong solely to the author(s) and do not represent those of Arcesium LLC, its affiliates, or any other individuals, institutions, or organizations associated therewith. Arcesium LLC and its affiliates do not represent, warrant, or guarantee the availability, accuracy, or completeness of the information contained in this blog and shall not be liable for any losses, injuries, or damages resulting from the display or use of such information.