Private equity has moved from in-house Excel sheets to outsourced services and software point solutions, but modern data infrastructures that bring it all together can deliver the benefits of scale as the industry grows, says Cesar Estrada of Arcesium
Arcesium’s Cesar Estrada was featured as a keynote interviewee for an article in the April issue of Private Equity International, which was released as the Tech-Enabled Investing special report.
Why is data such an important strategic issue for private equity firms today?
Private equity has experienced extraordinary growth in recent years, and that growth is unlikely to slow. Private markets returns continue to outperform those of the public markets; private markets assets under management account for close to 14 percent of the whole asset management industry’s AUM but nearly half of its revenues.
Competition is, therefore, intensifying. The need to make better decisions more quickly to remain competitive and scale up puts data at the heart of the strategic agenda. In addition, firms face an increasingly data-hungry set of stakeholders, from LPs to regulators, all of whom are accelerating the need to leverage data effectively.
What does modern data infrastructure look like, and how does it differ from what has typically been the norm for these asset classes?
Historically, everything has been done internally in a proprietary fashion that relies on Microsoft Excel and carries significant key-person dependency. That situation hasn’t disappeared entirely, but today, outsourcing is more commonly pursued. Firms are adopting a range of software point solutions to solve their accounting, portfolio monitoring, and investor relations needs, but that siloed approach is not sustainable. Something more holistic is required. Such modern data management tools exist in other industries, but adoption has been slow in private markets.
Why do you think the private markets industry has been slow to adopt modern data management tools?
I think it is partly to do with a lack of standardisation across the industry. Replacement decisions are tough to make. The industry is also very dynamic, with managers constantly coming up with new strategies. However, the main reason that adoption hasn’t taken off in a more meaningful way is that there hasn’t been a modern data infrastructure tool on the market that is domain-aware and focused on the needs of the private markets and alternative asset managers.
Data management disciplines common in the public markets are now taking hold in the private arena, and data management domain expertise is critical to helping managers develop a unified data fabric that provides transparency across business functions and their needs. It is also essential in allowing LPs to see data holistically across all their holdings, rather than static documents that address each of those holdings in turn.
Even those that have spent time and resources on this seem to have had mixed results so far. Why is that?
Larger houses tend to be further along in their data journey, and some of those firms have attempted to create data warehouses – or data lakes – with the support of consultants. However, the opportunity cost involved in dedicating internal private markets expertise to these programmes typically turns out to be too high to prove sustainable, which is why we have seen mixed results at the larger end of the industry.
When it comes to mid-market houses, meanwhile, it is simply the case that these firms don’t have the capacity or ability to do this on their own, and there hasn’t been an appropriate domain-aware partner out there. That is why, while everyone seems to be talking about the importance of data these days, in reality, most remain focused on individual point solutions rather than a holistic data fabric.
What value can investing in the creation of that data hub bring to private markets firms?
Many of these data capabilities have been well proven on the consumer side and are table stakes now, even on the B2B side in other sectors. Inevitably, private equity firms that have invested in this innovation through their portfolio companies and witnessed the results first-hand are now seeking to unlock more value from their own data to drive better decision-making.
Some of the persona-based use cases that we see include the common problem of a firm ending up with multiple administrators across regions, product types, and asset classes – a side effect of their growth and success. Having a unified data set with a workflow overlay to conduct oversight of multiple vendors and multiple sources of information can be a valuable weapon in their arsenal.
Fundraising is another important use case. We often hear of investor relations professionals on the road who rely on an offline and manually intensive method of slicing and dicing returns to speak to a track record. This can be done in a much more elegant and efficient way using a holistic data approach.
A final example I would offer involves investor enquiries, particularly enquiries made by those large investors that often overlap across multiple products. They may be invested in buyout, real estate, and infrastructure funds. They may be co-invested directly into some deals. They may have a separately managed account. Those investors want to access a holistic view of all their investments, including capital balances, commitments, and returns. From their perspective, what seems like a simple request can sometimes end up being a two-week exercise for the manager, and it simply doesn’t have to be that way.
What should CTOs and other business leaders at private markets firms consider when investing in improving their data infrastructure?
Building out a domain-aware data stack for private markets involves a diverse and ever-increasing set of choices, and managers don’t have to go it alone. They don’t need to customise a generic warehouse, and they also don’t have to make a huge, multi-year investment anymore. Having said that, there are some things that should be kept in mind when considering investing in data infrastructure.
First, data should be treated as an asset, just as people are deemed a critical resource. That means someone needs to be accountable for managing that asset holistically. Next, be aware that there is plenty of low-hanging fruit and you don’t need to disrupt your current infrastructure to embark on this.
Yes, this type of programme might allow for decommissioning of unnecessary legacy infrastructure at some point down the road, but the unified data hub can coexist with legacy infrastructure and help you unlock significant value from your data.
Furthermore, don’t try to do everything in one go. Think about meaningful use cases one by one, rather than attempting to address the needs of all your constituents immediately. Be aware that cloud-native platforms are winning these days, and most legacy tech stacks were not designed that way. Fusion teams made up of IT and business staff can collaborate to drive innovation; too often, business and IT remain separate, which is not conducive to rapid and integrated digitisation. Finally, consider partnering with a private markets domain-aware data management infrastructure firm that can become an extension of your own team.