As distributed ledgers have gained relevance and found their first real applications, data teams remain largely on the sidelines, and this, industry experts say, could eventually prove problematic as the space matures and grows more complex. At a bare minimum, distributed ledger participants must be able to interface with a ledger securely and successfully, oversee their activities, and translate ledger events into the range of relevant (and often function-aligned) systems running in-house. Firms will also want domain expertise in place as more asset classes adopt distributed ledgers, potentially creating opportunities for insight to be gleaned through Big Data applications and other analytics on the back end. And, of course, they’ll need to meet the same principles and standards that many chief data officers (CDOs) are enacting throughout the enterprise.
In practical terms, much of this design work happens within a ledger’s shared data layer, which is itself no small task. The more DLT touch-points floating around an investment bank’s or asset manager’s periphery, the more immediate the need for CDOs and data teams to create structure and wield influence over these activities.
The Old Recipe
In 2017, the potential applications for distributed ledgers run the gamut and are well-rehearsed: for now, smart contracts for syndicated lending, trade finance, collateralized debt instruments, catastrophe bonds and some smaller IPOs; for later, post-trade utilities, central clearing and even exchanges. Many of these initiatives are still managed by consortiums or partnerships, usually running on one of four major ledgers: Ethereum, R3’s Corda, JPMorgan’s Quorum and Hyperledger, the Linux Foundation project seeded by Digital Asset Holdings (DAH). This, as one source says, has been the standard operating model: risk hedged and mutualized externally, while internally most work is hosted in an innovation lab or similar carve-out, away from firms’ core production environments.
This separation allows the bank or buy-side firm to continue operating while it pursues stealth-mode experimentation and develops solutions; meanwhile, the lab itself serves as a responsive center of excellence, nimble and flexible in the face of new developments in the space. As Eric Piscini, global blockchain leader at Deloitte, tells IDM, many senior data governance leaders and CDOs have been mostly content with that arrangement. “The CDOs we talk to are usually so busy with other challenges, they just don’t see blockchain as being on the front burner,” he explains. “The exceptions are those in the middle of a larger settlement or reconciliation transformation of their own, who are considering leapfrogging into it.”
Others are certainly listening as well. Matt Shaw, head of the enterprise data practice at Synechron, suggests that DLT will eventually converge with the Big Data projects prized by CDOs: data lakes and warehouses, typically backed by on-demand cloud scaling. “Today, that convergence is more luck than judgment,” he offers. “Broadly, there is still a need for firmer understanding of the data layer architecture. But once they have that, they can better explore DLT along with newly arriving frontiers and possible overlap with artificial intelligence (AI) and analytics.”
One senior data governance head at a large US-based asset manager shares that sentiment, confirming that he has stayed outside the firm’s ledger fray to focus on other priorities. But, he says, this is a conscious choice, and one almost perfectly emblematic of data management precepts.
“There are many dimensions of data alignment implications and industry maturation required in our data management disciplines to support blockchain optimism. Legacy data infrastructures notwithstanding, we need to demonstrate major advances in common data standards, guaranteed transaction resiliency and common data identification frameworks to reach for a distributed yet shared common ledger architecture,” he tells IDM. “I’ve not been involved with blockchain evaluation, and am unfamiliar with our projects underway so far, but I would suggest any data angle in the discussion would need to touch on those necessary advancements as prerequisites to establish a base framework that is open, operable and scalable for the industry using any blockchain platform.”
The Right Batter: Interoperability
Most sources see the rubber meeting the road a few years from now, when ledgers become sufficiently customary yet technically divergent to the point where they must be able to link together. Last fall, Ethereum co-founder Vitalik Buterin described this moment in a paper, writing, “Use cases involving chain interoperability are among the few that may take the longest to come to fruition, simply because the set of dependencies is so large. Such a use case requires (i) some chain A that is mature enough to build off of, (ii) some chain B that is mature enough to build off of, and (iii) some application or need that cannot be served by implementing it on a single blockchain.”
We may already be seeing a preview of this in action. Sources see capital markets ledgers reaching that kind of impasse soon, following what one called the stubborn “stalemate scenario” that has built up among the main DLT developers over 2017. In practice, it is something of a game theory problem. On one hand, ledgers function best when they are self-contained and light on their feet, and the short history of investment in capital markets DLT has incentivized a parallelism of ledger types rather than consolidation. On the other hand, the underlying dependencies among financial instruments and markets, and the non-linear nature of investment operations (particularly post-trade), essentially guarantee that ledgers must be able to communicate. Something has to give.
Little surprise, then, that interoperability is seen as the next “big solve.” Major sell-side firms, Barclays and JPMorgan among them, have publicly and independently jumped at what amounts to fertile new ground for intermediation. What is also interesting, multiple sources agree, is that chain interoperability exposes a reality about DLT that has been there all along: for all the innate benefits ledgers can bring, such as surety and speed of settlement, ledger availability and integrity, improved data quality is not one of them. Instead, it has to be built. This fact is less obvious when a chain is developed in isolation; on a standalone private ledger, one governing a particular collateralized loan obligation (CLO) syndicate, for instance, data quality is easier to manage. But for those with far bigger ledger dreams, the new problem requires a trip back to a familiar well.
“Collectively, the industry is looking at data standards now, and considering which message formats will be adopted for data transfer between DLT solutions,” explains Clarke Thompson, solution architect at R3. “Bodies including Swift and ISDA (the International Swaps and Derivatives Association) have recognized the need for new standard data hierarchies and taxonomies, and in addition to emerging understanding of a need for common standards—both within a single DLT platform and across multiple ledgers—there is an opportunity to use multi-party data cleansing activities to solve for data quality problems in the reference data space and also generate new sources of reference data.”
Baking a Better Cake
Greater focus on ledgers’ shared data layer may present serious engineering challenges. But focusing on interoperability also offers significant upside from a ledger development and assessment standpoint, as well as opportunities for external providers to fill informational gaps at greater scale.
For one thing, the sociology governing the DLT space has typically favored tactical, asset class-aligned implementations, and some suggest this explains the fragmentation of consortiums over the past couple of years. A greater focus on data usage could provide new vectors of inquiry among like-minded groups. “Firms are now evaluating the balance between privacy, anonymity, commercialization opportunities and the benefits of open-source distribution, and certain participants will have different agendas and different ends,” says R3’s Thompson. “Some projects have considered relevant data standards, and we have advocated their use wherever possible. In one such project, participants try to optimize data quality by allowing multiple institutions to attest to the validity of data schema.”
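Thompson does not detail the mechanics of that attestation, but a minimal sketch of the idea might look like the following, in which each institution signs the hash of a proposed schema and the schema is accepted only once a quorum of valid signatures exists. Everything here, from the participant names to the quorum rule to the use of HMAC as a stand-in for real asymmetric signatures, is an illustrative assumption rather than R3’s design.

```python
import hashlib
import hmac
import json

# Illustrative sketch only: each institution "attests" to a proposed data
# schema by signing its canonical hash. HMAC with a per-institution secret
# stands in for a real digital signature (e.g. ECDSA) so the example needs
# nothing beyond the standard library.

def schema_hash(schema: dict) -> bytes:
    """Canonicalize the schema (sorted keys, fixed separators) and hash it."""
    canonical = json.dumps(schema, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).digest()

def attest(schema: dict, institution_key: bytes) -> bytes:
    """One institution's attestation over the schema hash."""
    return hmac.new(institution_key, schema_hash(schema), hashlib.sha256).digest()

def quorum_reached(schema: dict, attestations: dict, keys: dict, quorum: int) -> bool:
    """Accept the schema once at least `quorum` valid attestations exist."""
    valid = sum(
        1
        for name, sig in attestations.items()
        if name in keys and hmac.compare_digest(sig, attest(schema, keys[name]))
    )
    return valid >= quorum

# Hypothetical participants agreeing on a counterparty data schema.
keys = {"bank_a": b"key-a", "bank_b": b"key-b", "fund_c": b"key-c"}
schema = {"entity": "string", "lei": "string", "domicile": "iso-3166"}
attestations = {name: attest(schema, key) for name, key in keys.items()}
print(quorum_reached(schema, attestations, keys, quorum=2))  # True
```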
Flexibility in the data layer also allows data teams to reconfigure ledgers more effectively over time, says Deloitte’s Piscini. These “Lego blocks” include new wallets and digital signatures, as well as different types of cross-ledger linking mechanisms such as notaries, relays and hash-locking. Another potential area for improvement is deeper integration of the ledger’s compliance layer, along with observables (actions on the ledger) and the development of cryptography that anticipates more sophisticated privacy protections in the future. These benefits have often been heralded, Shaw argues, but usually as a matter of intrinsic design rather than deliberate engineering. The areas align well with CDO priorities, he says, and going forward a data-prioritized approach will also enable greater use of APIs to pull in new third-party data provisions.
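Of those linking mechanisms, hash-locking is the simplest to sketch: the same secret preimage unlocks value on two ledgers, so neither leg of a cross-ledger transfer can settle without enabling the other. The toy class below models only the claim-or-refund decision logic of a hashed timelock; in practice this logic lives in each chain’s contract or script layer, and the field names and time model here are assumptions for illustration.

```python
import hashlib
import time

class HashLock:
    """Toy hashed timelock: value unlocks for the recipient with the secret
    preimage before a deadline, or refunds to the sender afterward. Real
    HTLCs live in chain script/contract code; this models only the logic."""

    def __init__(self, secret_hash, sender, recipient, deadline):
        self.secret_hash = secret_hash  # sha256 hash of the shared secret
        self.sender = sender
        self.recipient = recipient
        self.deadline = deadline        # epoch seconds
        self.settled_to = None

    def claim(self, preimage, now):
        """Recipient claims by revealing the preimage before the deadline."""
        if (self.settled_to is None and now < self.deadline
                and hashlib.sha256(preimage).digest() == self.secret_hash):
            self.settled_to = self.recipient
            return True
        return False

    def refund(self, now):
        """Sender reclaims once the deadline passes unclaimed."""
        if self.settled_to is None and now >= self.deadline:
            self.settled_to = self.sender
            return True
        return False

# Two locks on two ledgers share one hash: revealing the preimage to claim
# on one chain hands the counterparty the means to claim on the other,
# which is what links the two transfers.
secret = b"swap-secret"
lock = HashLock(hashlib.sha256(secret).digest(), "bank_a", "bank_b",
                deadline=time.time() + 3600)
print(lock.claim(secret, now=time.time()))  # True
```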
“Know Your Customer (KYC) compliance is a very thorny challenge, and this is one area where managers can use the ledger to be more strategic. Looking at the several KYC utilities currently out there, and firms filling in as many as 270 fields for counterparty identification, as an industry we devote a lot of effort and cost just to gain consensus on KYC requirements. And even then, firms will still augment those internally with their own values,” Shaw explains.
“Ledgers should allow firms and KYC service providers to side-step much of that work—to go from a mode of description to identification. The same can be said for supplying global timestamps, shared evaluation services, and relevant observables supplied for smart contracts. These are great spots to take away some operational pain. Doing so would bring institutional ledgers closer to innovations we see in the startup space like [former Barclays CEO] Antony Jenkins’ 10X digital banking offering, that really focus on optimizing ledgers’ data layers and deploy APIs to focus on user experience in a rigorous way,” Shaw adds.
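Shaw’s shift from description to identification can be made concrete: rather than each firm re-keying hundreds of fields, a fully described KYC record is canonicalized and hashed once, and that hash becomes a shared identifier other participants can reference. The sketch below is one assumed way such an identifier might be derived, not any utility’s actual scheme, and every value in it is invented.

```python
import hashlib
import json

def kyc_identifier(record: dict) -> str:
    """Derive a content-addressed identifier for a KYC record: canonical
    JSON (sorted keys, fixed separators) hashed with SHA-256. Any firm
    holding the identical record derives the identical identifier, so a
    counterparty can be referenced instead of re-described field by field."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

# A heavily truncated stand-in for a ~270-field counterparty record.
record = {
    "legal_name": "Example Capital LLP",   # all values here are invented
    "lei": "5493001KJTIIGC8Y1R12",
    "domicile": "GB",
    "regulatory_status": "FCA-authorised",
}
print(kyc_identifier(record))  # stable across firms holding the same record
```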
Ready to Serve?
The next wave of ledger innovation will ask more fundamental questions than the first, which, understandably, sought to persuade capital markets to innovate using a greenfield technology. The stakes will be raised to fundamentally test DLT architecture, and indeed to challenge the notion that a ledger can be the single (and often open) source of truth for transactions (the “incontrovertible representation of a trade fact,” as R3’s Thompson puts it) while simultaneously supporting the bells, whistles and controls that banks and buy-side giants would like to hang on it.
It’s a prospect that will surely make some blockchain purists wince, but a necessary one if the market for DLT is to flourish in a regulated environment. One particular puzzle will, in effect, be a Pandora’s Box problem. Deloitte’s Piscini believes firms should be preparing for “the fabric of any transaction” to be exposed to a ledger in the future. But bringing DLTs into enterprise data management will inevitably require wrenching decisions about what should be pulled into the mix and what shouldn’t, Shaw suggests.
“Trade reporting and compliance will be completely on-chain, given that it is one of the primary selling points to regulators, but you can foresee digital asset and account issuance, and even a market data consolidated tape running via a ledger as well,” Shaw says. “Conversely, most firms will be far more careful about proprietary valuation models and asset classification, and we’re working in consortium right now on a project that would preserve KYC client risk-rating and relationship analysis off-chain, while moving identification on chain, as well.”
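The split Shaw describes amounts, at bottom, to a data-classification policy. As a purely hypothetical sketch (the field taxonomy and routing rule below are assumptions, not the consortium’s actual design), a client record might be partitioned so identification fields travel on-chain while risk ratings and relationship analysis stay in local stores, anchored on the ledger only by a hash commitment:

```python
import hashlib
import json

# Hypothetical policy mapping data classes to their home.
ON_CHAIN_FIELDS = {"legal_name", "lei", "domicile"}       # identification
OFF_CHAIN_FIELDS = {"risk_rating", "relationship_notes"}  # kept in-house

def partition(record: dict):
    """Split a client record per the policy, anchoring the off-chain part
    on the ledger as a hash commitment so its integrity is provable
    without disclosing its contents."""
    on_chain = {k: v for k, v in record.items() if k in ON_CHAIN_FIELDS}
    off_chain = {k: v for k, v in record.items() if k in OFF_CHAIN_FIELDS}
    blob = json.dumps(off_chain, sort_keys=True).encode("utf-8")
    on_chain["off_chain_commitment"] = hashlib.sha256(blob).hexdigest()
    return on_chain, off_chain

on_chain, off_chain = partition({
    "legal_name": "Example Capital LLP",  # invented values throughout
    "lei": "5493001KJTIIGC8Y1R12",
    "domicile": "GB",
    "risk_rating": "medium",
    "relationship_notes": "reviewed 2017-06",
})
print(on_chain)   # published to the ledger
print(off_chain)  # retained locally
```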
To navigate those choices, firms will need to shift their thinking from post-facto, backward mapping to forward planning: from slinging chains around the lab to laying future bedrock.
“My impression is we think we have a solution with blockchain capability and a public ledger, and we are currently exploring where such a solution could solve problems, but it does not appear that an ‘integrated’ approach with data management leadership is occurring yet,” says the US asset manager’s data governance executive. “This could be simply because we’re experimenting with siloed use cases to determine if the blockchain is viable, and what benefit such a data structure could yield. But it does run the risk of having to deal with data alignment and integration later—and that’s typically not a good practice. Better practice would be to have a more unified approach with data versus platform and function at the heart, likely starting with a use case that demonstrates ledger value.”
Interestingly, Ethereum’s Buterin arrives at the same conclusion about deliberate change from a very different perspective. “Building interoperable applications is something that should be done on a basis of responding to actual need,” he wrote in his paper. “While the underlying technology can certainly be developed in the abstract, it would likely be a waste of resources to try to build applications that ‘connect the chains together’ without a clear underlying use case-driven reason for why such a thing would be useful.”
But, then again, ledgers and data management have never really been so far apart as they seem, certainly not where patience is concerned. Shaw offers a rare, if accurate, comparison: Legal Entity Identifiers (LEIs) and bitcoin are just about the same age, and even with broad industry support and participation, and regulatory pressure for adoption, LEIs still haven’t achieved critical mass.
Without any regulatory impetus for adoption, firms will have to decide which business cases are core ingredients to be baked into their DLT strategy, and which are merely the icing on the cake.