Measuring the Significance of Data

May 16, 2024
Read Time: 5 minutes
Data Management

By now, you’ve probably read countless headlines calling data the new gold, new oil, currency, or some other pithy phrase trying to capture the significance of data.

Whichever metaphor resonates most, data has become invaluable for organizations across industries, and financial institutions are no exception. The proliferation of available data has been exponential, creating new opportunities and challenges for providers, consumers, and facilitators of data. Providers are constantly finding ways to bring new data to market, consumers are constantly evaluating ways to use data to their advantage, and those offering solutions to manage and process data have been scrambling to keep pace.

The value of data lies not just in the sheer volume available, but in the actionable insights and enterprise value it offers those able to harness it. Organizations with effective data management are best positioned to capitalize on these advantages. While few deny the value of data, the challenge remains how to measure and value the impact of data and data initiatives.

RELATED READING: Building a Robust Data and Analytics Infrastructure

Data valuation frameworks

While several academic frameworks for data valuation are emerging, we took a business outcome-based approach in defining the four categories below. These frameworks apply not only to a firm’s first-party data, but also to external data licensed from third parties.

Revenue generating

Firms that productize and license data to external consumers, generating concrete top-line revenue, arguably have the most straightforward valuation framework. A key consideration for valuation is whether the firm is harnessing its own first-party data or redistributing third-party assets, with most firms seeking to build a proprietary content moat over the long term. While few firms with an investment returns focus operate this way, market structure utilities such as prime brokers or clearinghouses see revenue-generating data initiatives as strategically important business lines that are core, or adjacent to core.

Revenue driving

When data is an input component of a firm’s product, whether that product is an investment vehicle or a SaaS offering, an enterprise value-based framework is the right way to think about measurement. The International Valuation Standards Council (IVSC) suggests that when intangible assets such as technology are the value drivers of a stream of cash flows, a with-and-without or relief-from-royalty valuation may be appropriate. Valuation frameworks for revenue-driving data initiatives are still nascent and not yet widely incorporated into financial models, but institutional investors do have pragmatic ways to solve for measurement today: PnL attribution to licensed datasets in an investment vehicle scenario, or customer usage and even cost-of-goods-sold attribution in a SaaS scenario.
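The relief-from-royalty idea can be illustrated in a few lines: value the data asset as the present value of the royalty payments the firm avoids by owning it rather than licensing it. This is a simplified sketch, not the IVSC’s prescribed calculation; the revenue forecast, royalty rate, tax rate, and discount rate below are all hypothetical inputs.

```python
# Relief-from-royalty sketch: value an intangible (e.g., a dataset) as
# the present value of avoided after-tax royalty payments.
# All numeric inputs below are illustrative assumptions, not benchmarks.

def relief_from_royalty(revenues, royalty_rate, tax_rate, discount_rate):
    """Discounted sum of after-tax royalties avoided over the forecast."""
    value = 0.0
    for year, revenue in enumerate(revenues, start=1):
        royalty_saved = revenue * royalty_rate * (1 - tax_rate)
        value += royalty_saved / (1 + discount_rate) ** year
    return value

# Hypothetical five-year revenue forecast attributable to the data asset ($MM).
forecast = [10.0, 12.0, 14.0, 15.0, 15.0]
value = relief_from_royalty(forecast, royalty_rate=0.05,
                            tax_rate=0.25, discount_rate=0.10)
print(f"Indicative asset value: ${value:.2f}MM")
```

The same shape of calculation supports a with-and-without comparison: run the cash-flow model with and without the data asset and discount the difference.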

Operational efficiency measurement

When data is used not as a revenue generator or driver but as a contributor to operational efficiency or process automation, the right measurement framework is expense reduction tracking. While this use case sometimes attracts less enthusiastic sponsorship, the impact can still be significant in optimizing cost centers for the firm, whether the base unit of measure is people hours, technology license consumption, or dollar cost savings. In fact, the IVSC also suggests outlays for data or other intangible assets can be notionally “capitalized” in a valuation to restate margins or invested capital.
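Expense reduction tracking can start as simply as converting the base unit of measure into dollars. A minimal sketch, where the process names, hours saved, and blended rates are all hypothetical:

```python
# Expense reduction tracking sketch: convert people-hours saved by
# automation into annualized dollar savings. Figures are illustrative.

from dataclasses import dataclass

@dataclass
class AutomationSaving:
    process: str
    hours_saved_per_month: float
    blended_hourly_rate: float  # fully loaded people cost, $/hour

    @property
    def annual_dollar_savings(self) -> float:
        return self.hours_saved_per_month * 12 * self.blended_hourly_rate

savings = [
    AutomationSaving("reconciliation", 120, 85.0),
    AutomationSaving("reference-data cleanup", 40, 85.0),
]
total = sum(s.annual_dollar_savings for s in savings)
print(f"${total:,.0f} annualized")
```

Technology license consumption can be folded into the same structure by adding line items priced per seat or per unit of compute.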

Pure R&D

Pure R&D covers the category of “we have data that we know we will benefit from in the future but are still identifying production use cases.” Typically treated as an R&D expense, projects in this category must have a defined time or spend runway to transition into the revenue-generating, revenue-driving, or operational efficiency measurement frameworks. If a project hasn’t reached production-grade status by the end of its runway, it should either be sunset or, if there is continued conviction in its future value, persisted in a low-cost way.

The majority of investment firms’ data initiatives fall into either the revenue-driving or operational efficiency measurement categories.

RELATED READING: Build, Buy, Partner: A Framework for Optimizing Time to Value for Data Initiatives

Attribution is key

In any scenario where the value of data to the business is being measured, from revenue-generating to operational, attribution is key. The two key components of a trustworthy attribution process are tooling that covers the entire data value chain and strong data management foundations.

The value of data, and the ease of attributing it, increases each time it is processed toward actionable insights.

A strong data management foundation that centralizes all data and workflows for unified downstream consumption underpins the entire value chain. Tracking access, governance, and lineage for every consumer your data value chain services, from internal users to external clients and third-party platforms, is paramount for trustworthy attribution.
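The access tracking described above can be sketched as a simple attribution log that records which consumer touched which dataset, and for what purpose. Dataset and consumer names here are illustrative, and a production system would persist this in a governed store rather than an in-memory list.

```python
# Attribution-oriented access logging sketch: every downstream read is
# recorded against its dataset and consumer, so usage (and eventually
# value) can be attributed per dataset. All names are illustrative.

from collections import defaultdict
from datetime import datetime, timezone

access_log = []

def record_access(dataset: str, consumer: str, purpose: str):
    access_log.append({
        "dataset": dataset,
        "consumer": consumer,  # internal user, external client, or platform
        "purpose": purpose,    # e.g., "backtest", "client-report"
        "at": datetime.now(timezone.utc).isoformat(),
    })

def usage_by_dataset():
    """Aggregate raw access events into per-dataset usage counts."""
    counts = defaultdict(int)
    for entry in access_log:
        counts[entry["dataset"]] += 1
    return dict(counts)

record_access("vendor-pricing", "alpha-research", "backtest")
record_access("vendor-pricing", "client-portal", "client-report")
record_access("first-party-flows", "alpha-research", "backtest")
print(usage_by_dataset())
```

Joining such a log against cost or PnL data is what turns raw lineage into the attribution feedback loop discussed below.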

Trustworthy attribution processes form key feedback loops that inform budgeting for data initiatives. It’s common for firms to take a hands-off approach to mature programs: as long as the firm’s use cases for first-party or licensed data are generating value, the budget doesn’t get questioned. However, a hands-off approach suits neither launching a new data initiative, where conviction is forecasted but value needs to be regularly validated, nor recalibrating a more mature initiative where attributed value or growth may be diminishing. In addition to internal attribution processes, external comparables, whether supplier rates for data licensing or peers’ budgets for data initiatives, shouldn’t be forgotten as key inputs to the data budgeting process.

Takeaways to maximize return on investment

It’s only a matter of time before data valuation frameworks become a standard part of generally accepted accounting principles. Still, many investors are in the process of investing in the data management foundations necessary to answer the first-order question of what their current data inventory is.

A domain-aware data platform purpose-built for the organization can address the second-order questions of:

  • How are we using data to create value for the organization?
  • Which is the right valuation framework to measure the impact of our data initiatives?
  • Where can we harness data to accelerate revenue generation?

Arcesium is a trusted partner for firms looking to leverage a configurable SaaS platform as the foundation for their data initiatives.

Want to learn more about how we can help with data management for revenue, operational excellence, or innovation-focused outcomes? Reach out; we love talking about data.


Vera Shulgina, Vice President of Product Management

Vera is Vice President of Product Management at Arcesium. She is responsible for the firm’s data strategy with a focus on driving value for Arcesium clients through data solutions and data partner integrations.
