Technology Takes Aim at Post-Trade Black Holes

The influence of regulation and new technology is prompting a hard look at how post-trade processes can be improved, and perhaps even replaced entirely.


  • As regulation moves from formation to implementation, market participants are beginning to examine how post-trade processes need to be reformed.
  • New technology is proving to be key to this, in particular how the data from trade reporting can be used more widely, both internally and externally.
  • Distributed-ledger technology also provides a glimpse of how post-trade workflows may change. However, any significant change is likely to be years off, given legal uncertainty around developing technology and regulatory concerns.
  • Industry associations are actively examining this topic, and forming workgroups to determine the best path forward.

In March 2014, as Microsoft prepared to end support for its aging Windows XP operating system (OS) the following month, automated teller machine (ATM) manufacturers began to warn that 95 percent of US units still operated on the platform. The financial industry was roundly pilloried in the media for using an old and soon-to-be-retired program to handle everyday financial transactions—XP was released in 2001. But in the capital markets, particularly in the post-trade space, some of the processes and technologies in place today have their origins even further back.

Some believe that the entire structure of the post-trade market across all asset classes is overdue for change. As the industry begins to move past a period of frantic rulemaking and explores new technologies that simply weren’t around when workflows were first devised, this may be an opportune time to do that, they argue. It might even be time to say goodbye to some parts of it entirely—particularly those that make little sense in the modern world.

“When you buy IBM on the NYSE, do you need to then go into the NYSE platform and press a button to affirm that you bought this trade? Today that’s the process. It takes 10 manual steps to affirm a trade on the incumbent platform,” says Zohar Hod, CEO of truePTS and former head of sales at the data services business of NYSE owner Intercontinental Exchange.

Others are even more direct in their criticism.

“We’ve patched and taped this system together so many times over the past 30 years that nobody really knows how much it’s been costing us—and if it’s costing us, then it’s costing our clients,” says a New York-based technology director at a US investment manager. 

Blame Game

Some, but not all, of the blame can be laid at the feet of regulatory reform, and the reaction of banks to that on a global basis. With banks sent reeling by the financial crisis and the subsequent regulatory response, compliance-related activities became a driver of technology development. Indeed, consultancy Protiviti found in its 2016 IT Technology Trends Survey that compliance technology spend accounted for around 12 percent of the total IT budget on average at financial services firms that responded.

This has resulted in patchwork implementations designed to address specific rules—for instance, the revised Markets in Financial Instruments Directive (Mifid II), derivatives reporting under the European Market Infrastructure Regulation (EMIR), or the electronic trading mandates in the Dodd–Frank Act that gave birth to swap execution facilities (SEFs). These projects are rarely complementary, and have a tendency to add complexity to sprawling internal and external processes.

“You often see institutions building tactical solutions—maybe one department does something, another does something else, the team focused on US regulations does something for Dodd–Frank and then the Mifid team does their own thing,” says Keith Tippell, head of sales and business development for Europe at Droit Financial Technologies. “They’re completely distinct in their efforts, which means logically, you can’t be optimal.”

But the problem runs deeper than just the past few years. Much of the workflow for post-trade processes, particularly in derivatives markets, was designed long before mechanisms such as central clearing were widespread, or common communication protocols such as FIX and Financial Products Markup Language (FpML) were fully developed.

As such, many of the operations required to push a trade through its post-execution lifecycle remain highly manual. This isn’t just inconvenient—in some cases, it’s distinctly problematic. “When SwapsWire was first built in the late 1990s, for instance, it was built for confirmations,” says truePTS’ Hod. “On that same pipe, since then, three or four different mandates such as Mifid or Dodd–Frank have passed, reporting requirements have increased, the timing of those requirements has shortened, and yet [on the NYSE] you still have a 10-step manual process to affirm a trade.”

With all of these issues and an increasing resource burden on the part of financial institutions, something has shifted in the market’s attitude. In September 2016, the International Swaps and Derivatives Association (Isda) published a whitepaper titled The Future of Derivatives Processing and Market Infrastructure, which is scathing in its assessment of how trade processing is currently conducted in financial markets. “The complexity inherent in the new derivatives ecosystem is now putting derivatives participants under considerable strain,” the report states, adding that “this now needs to be addressed.”

Black Holes

Trade reporting has emerged as a key area where, market participants believe, the correct application of technology could be broadened to other areas of post-trade processing that are currently siloed. It’s fair to say that global reporting regimes, a key pillar of the 2009 G20 agreement that kick-started the reform of derivatives markets, haven’t been as successful as hoped. The persistently poor quality of trade reports has been a constant thorn in the side of regulators, some of which have begun to fine participants and trade repositories for perceived failings. Notably, in May 2016, the European Securities and Markets Authority (Esma) fined the Depository Trust & Clearing Corporation (DTCC) €64,000 ($75,000) for failing to provide timely access to data in its European trade repository.

“Reporting, especially in Europe, has been a black hole for the longest time,” says a London-based data manager at a UK bank. “It’s one of those things spawned by regulation that sucks in time, money and people, but from which precious little light emerges.”

Steps have been taken to improve this of late, however. In February 2017, the Committee on Payments and Market Infrastructures (CPMI) and the International Organization of Securities Commissions (Iosco) issued long-awaited guidance on the construction of unique transaction identifiers, a key aspect of reporting; unclear rules about how these identifiers should be generated have been widely blamed for low matching rates between the two sides of a trade.

Esma, the direct regulator of European trade repositories, which handle around 400 million reports per week, also highlighted improvements in data quality as one of the core objectives of its 2017 work program. Mifid II and the Market Abuse Regulation (MAR) further increase trade reporting responsibilities and incorporate identifiers into reports. Some see the benefits this could bring to post-trade processes in the middle and back offices, by streamlining currently fragmented and siloed activities, many of which consume the same data in different ways.

“We’re working with our colleagues in the surveillance department to automate that capability off the back of the trades that users are reporting,” says Mark Husler, CEO of the London Stock Exchange Group-owned reporting platform UnaVista. “If you think about the data attributes that firms are reporting across asset classes—derivatives, equities, bonds and so on—and our capabilities in this area, we’re effectively developing a platform that enables users to not just send the data to regulators in order to comply, but actually use the application as a surveillance system.”

The firm plans to use this first to assist with MAR compliance in Europe, and then offer it to North American clients later in 2017. Other technology firms have noticed this void, and are attempting to step into it. Nex Group, for example, formed from the post-trade and electronic markets businesses of inter-dealer broker Icap in 2016, is basing much of its strategy on linking together various parts of the trade lifecycle, from pre- to post-trade, across business lines that used to operate separately. But in terms of technologies that have the potential to shake up the post-trade space, few have received as much attention as distributed-ledger technology (DLT) and its predecessor, blockchain.

Biggest Investor

The financial services industry has been one of DLT’s biggest investors—consultancy KPMG estimated in June 2017 that over $1 billion had been invested in blockchain firms and projects by banks, asset managers and others globally. Regulators and industry insiders alike have cited the transformative potential of DLT as a single, purportedly inviolable golden record of transactions. In particular, the technology is seen as likely to have its biggest impact on post-trade activities, from settlement, where real-time settling of trades could become possible, through to reconciliations, the need for which could be drastically reduced in a market based on the technology.

“DLT will essentially eliminate the need for reconciliation as there will need to be consensus or validation of the trade at the time of execution. Billions of dollars in post-trade processing costs could be eliminated,” said Terry Roche, head of fintech research at Tabb Group, in a July 2017 research note.

A report, Banking on Blockchain: A Value Analysis for Investment Banks, published by Accenture and McLagan in January 2017, highlights the potential for reforming market structure through DLT, and estimates that by 2025, the implementation and adoption of mature blockchain systems could save the industry around $12 billion annually through the removal of current inefficiencies and legacy systems.

“Everyone and their dog is looking at blockchain,” says the New York-based technology director. “I don’t think we’ll all be moving to a distributed-ledger-based market in the next five years, but it’s encouraging that people are starting to think about this from a structural perspective. It starts the conversation.”

While still largely in its infancy through industry consortia and in-house pilot schemes, some applications of blockchain in post-trade processing are starting to take shape. In January 2017, the DTCC announced that it would embark on a project to “re-platform” its Trade Information Warehouse utility to use DLT. The utility, which processes 98 percent of trade lifecycle events for the global credit derivatives market, will be moved to a blockchain-based platform developed by IBM, distributed-ledger consortium R3 and vendor Axoni. The DTCC has also partnered with another major financial services-focused blockchain group, Digital Asset Holdings, to explore ways in which US Treasury, agency and agency mortgage-backed repo transactions can be cleared and settled through a single platform.

There may even be applications in areas such as trade, transaction and client reporting for buy-side firms, particularly if blockchain adoption becomes widespread and regulators are able to access those systems. However, experts caution that this will require the industry to fully embrace a digital future and abandon its continued reliance on manual processes and physical documents.

“In terms of trade processing, if we grow up about dematerialization and actually do it, then we get the benefit of eliminating a number of components in the post-trade process,” says Ian Hunt, a consultant who works with fund managers such as M&G Investments on projects that handle record-keeping and DLT. “For regulators and clients there are also benefits—if we have a blockchain and we can give permissioned access to that blockchain to the regulator, then instead of us having to write extracts and reports specifically for them, they can go and get what they want from the ledger. And because it’s immutable, we can’t manipulate it, perpetrate fraud, or mislead them.”

Gray Area

The incorporation of new technology also comes with challenges. On a purely legal level, developments such as blockchain and the growth in smart contracts—software that can automatically mimic post-trade lifecycle events such as coupon payments and margin calls—exist in a gray area, despite efforts by Isda and others to create legal documents governing their operation. 

There are also regulatory concerns around the potential impact this technology could have on financial stability. In February 2017, CPMI released an “analytical framework” discussing the application of DLT, in which it highlighted certain legal questions around settlement finality, but also queried whether the widespread use of smart contracts could introduce pro-cyclical effects, owing to the automation of post-trade processes that are currently governed by humans.

“DLT could also have negative implications. For example, in a possible future configuration with many automated contract tools, macroeconomic conditions could automatically trigger margin calls across financial market infrastructures, leading to severe liquidity demand across the financial system and creating a systemic event,” the report states. The CPMI called for more research into understanding how these tools and technologies would be correlated.
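The pro-cyclicality the CPMI describes can be made concrete with a toy model. The following sketch (in Python, with entirely hypothetical names and numbers, not drawn from any real smart-contract platform) shows how many contracts sharing one automated margin rule would all fire on the same market-wide shock, aggregating into a single large liquidity demand:

```python
# Hypothetical sketch of automated margin-call logic applied uniformly
# across many contracts. One market-wide price shock triggers every
# contract simultaneously, producing the aggregated liquidity demand
# (pro-cyclical effect) that the CPMI report warns about.

from dataclasses import dataclass


@dataclass
class MarginContract:
    notional: float      # exposure covered by the contract
    collateral: float    # collateral currently posted
    margin_ratio: float  # required collateral as a fraction of notional

    def margin_call(self, price_move: float) -> float:
        """Additional collateral demanded after a price move.

        price_move is the fractional change in the value of posted
        collateral (e.g. -0.15 for a 15% fall).
        """
        post_move = self.collateral * (1.0 + price_move)
        required = self.notional * self.margin_ratio
        return max(0.0, required - post_move)


# Many contracts share the same automated rule...
contracts = [MarginContract(notional=100e6, collateral=10e6, margin_ratio=0.1)
             for _ in range(50)]

# ...so one macro shock fires every contract at once.
shock = -0.15  # collateral values fall 15% across the market
total_demand = sum(c.margin_call(shock) for c in contracts)
print(f"Simultaneous liquidity demand: ${total_demand / 1e6:.0f}m")
```

With humans in the loop, calls would be staggered and negotiated; with uniform automation, the demand lands everywhere at once, which is the correlation effect the CPMI wants studied.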

There is also work needed to change the mindset of an industry long used to the way things work. “People keep on asking me, ‘Zohar, what are you doing to help us make better reconciliations?’ And my answer is: I should reduce the need for reconciliations, not help you do better reconciliations,” says truePTS’ Hod.

However, despite positive moves in this area, changes appear to be a long way off. The Isda report cites a general lack of sophistication among the industry in some areas, perhaps most damningly that “some firms and infrastructures still rely on fax for some of their business communication and instruction.”

But the trade body has put its money where its mouth is. Following the publication of its 2016 whitepaper, it announced that it would work on defining processes and procedures in trading to a standard, machine-readable format, known as the common domain model, or CDM. “The system as it stands is creaky, over-complicated and outdated, increasing cost and compliance burdens for all market participants,” says Scott O’Malia, CEO of Isda and a former commissioner at the US Commodity Futures Trading Commission. “New technologies can alleviate many of these problems, but first we need a reform of current standards and practices.”
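To see why a common, machine-readable model of trade events matters, consider the following purely illustrative sketch (this is not the actual Isda CDM schema; the event type, field names and identifier are invented). If both counterparties serialize the same lifecycle event from the same shared model, their records match byte for byte and there is nothing left to reconcile:

```python
# Purely illustrative: NOT the actual Isda CDM schema. A toy example of
# a standardized, machine-readable representation of a trade lifecycle
# event. Both counterparties producing records from the same model get
# identical canonical output, removing reconciliation mismatches.

import json
from dataclasses import dataclass, asdict


@dataclass(frozen=True)
class CouponPayment:      # hypothetical event type
    trade_id: str         # shared identifier for the trade
    payment_date: str     # ISO 8601 date
    currency: str
    amount: float


event = CouponPayment(trade_id="UTI-2017-0001",
                      payment_date="2017-09-20",
                      currency="USD",
                      amount=125000.0)

# Canonical serialization: sorted keys mean both sides emit the
# identical record for the identical event.
record = json.dumps(asdict(event), sort_keys=True)
print(record)
```

Today, each firm holds its own bespoke representation of the same event, so matching them requires the translation and reconciliation layers that a common domain model aims to eliminate.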

No Small Feat

In practice, this is no small feat. Each institution, by nature of the absence of such standardization in the past, has its own way of doing things that often does not mirror practices at its contemporaries. Market participants are also not known for their ability to agree on what should be fairly basic common elements to trading, even in relatively homogeneous markets such as the US—the years-long wrangle over the legal entity identifier, and the 20 years it took to shorten the settlement cycle in equities to two days from three, are just two examples of this inertia. “It sounds simple, but it’s a huge task,” says O’Malia. “Getting everyone to agree on a set of definitions, and encouraging widespread adoption, will be challenging.”

Blockchain will become a part of that change, experts say, although it will not provide the fuel for the entire process. What people are looking at now, 10 years after one of the worst financial crises in living memory began, is how the very structure of the capital markets may need to be torn down and rebuilt to meet the expectations, requirements, and rigors of the modern age.

“That will require global effort and coordination on a scale we haven’t seen before,” says a London-based senior technologist at a European bank. “But there’s no doubt that it has to change. My view is that it will happen a little at a time, here and there, until we wake up one morning and realize that the world we live in now isn’t the same as the one we did 20 or 30 years ago. That’s how fundamental change occurs in markets—just look at how electronic trading is still being discussed like it’s this new thing, even though Nasdaq launched in 1971.”

Accomplishing this will also require a shift in the mindset of how the industry views technology, moving from its current perspective of it being something that simply augments existing processes through to one where it forms the foundation of them. Or, as the Isda report states: “The answer is not just about speed and the replacement of existing processes with faster solutions. It is about reviewing and possibly re-engineering the whole post-trade process.” 
