Managing Data Supply Chains For Golden Copy

John Bottega, consultant and senior advisor, EDM Council

Because golden copy data inherently comes from combining or comparing multiple data sources, the links in the data supply chain are taking on greater importance in efforts to produce it, according to industry data management executives.

"To say that you can just get data off the shelf that is fit for golden copy for any firm is not true at the moment," says Igor Lobanov, enterprise architect at Legal & General Investment Management in London. "Standardization of what different firms want, at different times and with different integration technologies may be cost-prohibitive and, frankly, against the interests of data vendors," he says. "Because of that, data management is always a halfway house. Some data issues can be resolved by the vendors, but way too many still require in-house technology and data management capability."

In the supply chain, the data vendor may be committed to providing certain data and capable of following through on delivering that data, but producing golden copy can become difficult if internal data management and data distribution infrastructure at the receiving firm is inadequate, explains Lobanov. "Then you're limited in what you can achieve, especially with data supporting some mission critical capability of the business," he says.

Also, working with an intermediary such as a data aggregation service or an outsourced data management service to manage the data supply chain can introduce complications, according to Lobanov. "They have to be transparent about the levels of service," he says. "The technical solution used to deliver the data may also be a limiting factor. Things to consider are additional latency, quality guarantees and mechanisms for corrections of data already delivered. You should be sure that both contractual obligations and technical infrastructure complement each other and support your data management objectives."

Identifiers such as the legal entity identifier (LEI) can prove useful when tracking data through a data supply chain, observes Stephen Engdahl, senior vice president of product strategy at GoldenSource. In addition, he says, automating the data supply chain is possible, but it requires the integration of data processing systems and firms must take care that automation does not lead to duplication of processes.
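
As a rough illustration of the identifier-based tracking Engdahl describes, the sketch below shows how records tagged with an LEI and their originating feed can be consolidated without duplicating processing. It is a minimal sketch in Python; the field names, feed names and merge rule are assumptions for illustration, not a description of any vendor's system.

```python
from dataclasses import dataclass
from datetime import date

# Hypothetical record shape: each inbound item carries the legal entity
# identifier (LEI) of the entity it describes and the feed that delivered it.
@dataclass(frozen=True)
class EntityRecord:
    lei: str          # ISO 17442 legal entity identifier
    attribute: str    # e.g. "legal_name", "country_of_domicile"
    value: str
    source: str       # which link in the supply chain delivered the record
    as_of: date

def consolidate(records):
    """Keep one value per (LEI, attribute), preferring the most recent
    delivery, so the same fact arriving twice is not processed twice."""
    golden = {}
    for rec in records:
        key = (rec.lei, rec.attribute)
        if key not in golden or rec.as_of > golden[key].as_of:
            golden[key] = rec
    return golden

# Made-up LEI and values, purely for demonstration.
feed = [
    EntityRecord("529900EXAMPLE0000001", "legal_name", "Example Corp", "vendor_a", date(2024, 1, 3)),
    EntityRecord("529900EXAMPLE0000001", "legal_name", "Example Corporation", "vendor_b", date(2024, 2, 1)),
]
for (lei, attr), rec in consolidate(feed).items():
    print(lei, attr, "->", rec.value, f"(from {rec.source})")
```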

"For a truly efficient supply chain there needs to be a collaborative approach that suits all parties while allowing for competitive differentiation and innovation," says Engdahl. "Industry utilities have proved successful elsewhere and could provide an obvious answer for data management-specifically in separation of commoditized parts of the process, including the creation of raw data, manufacturing and capturing of events, aggregation of data feeds and management of quality and distribution."

However, in the reference data sphere many are adopting data supply chain solutions specific to their customers on a managed services basis, rather than as a mutualized utility, says Engdahl.

 

Centralized or Federated?

The data supply chain has evolved in recent years as firms have become more data-centric, according to Rick Aiere, a New York-based data management consultant. Regulation has encouraged higher data quality by requiring it, he says, and good data governance is necessary to achieve that quality.

Becoming data-centric is often characterized by a consolidation approach under an executive serving as chief data officer or a similar role. However, 'golden sources' are also emerging as a counterpoint to golden copy derived from multiple sources, says Aiere. "There are external sources of information, such as market data, reference data, client reference data and legal entities," he says. "Now, there are more standardized guidelines based on what the actual source is and how that data is brought into an organization. Everything has context; data is meaningless without it. So, in a given context, how is data captured and represented?"

Internal distribution, as Lobanov noted, requires firms to know who is responsible for maintaining data as it circulates, and who is responsible for ensuring quality and governance, says Aiere. The role of chief data officer (CDO) has appeared in an increasing number of firms in the past few years, and has become more clearly defined to meet these requirements and respond to regulation, he says. However, CDOs are more appropriate to a consolidated data environment than a federated one, explains Aiere.

A federated model, with requirements for how business entity data is viewed in context, has to be efficient in providing data to exchanges, to entities within the firm, and to external consumers. Whether such a model is better or worse at distributing data to those parties depends on the departments generating the data and the type of data that then has to be brought together after distribution, says Aiere, noting the subjective nature of assessing a federated model.

Virtual cataloging or consolidation of data from a federated model can be used to respond to requests for golden copy information, adds Aiere. "Depending on the size of the organization, the capacity, the grouping and the relevance, if one organization requires consolidated data but they can maintain the details, you don't need to propagate that and have a federated golden copy throughout the organization or have a consolidated golden copy. You can have reasonable golden copies and then just exchange information in consolidation or aggregation."
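
One way to picture the exchange of information "in consolidation or aggregation" that Aiere describes is sketched below: each department keeps its own golden copy of the attributes it owns, and a consolidated view is assembled only when a request requires it, rather than being propagated throughout the organization. The department names and attributes are invented for illustration.

```python
# Illustrative only: departmental golden copies, consolidated on demand.
front_office = {"XS0000000001": {"price_source": "vendor_a", "last_price": 101.25}}
back_office = {"XS0000000001": {"settlement_ccy": "USD", "settlement_cycle": "T+2"}}
client_reference = {"XS0000000001": {"restricted_for_clients": []}}

def consolidated_view(instrument_id, *departmental_copies):
    """Aggregate the departmental golden copies for one instrument when a
    consumer asks for a consolidated view, without maintaining a single
    firm-wide consolidated store."""
    view = {}
    for copy in departmental_copies:
        view.update(copy.get(instrument_id, {}))
    return view

print(consolidated_view("XS0000000001", front_office, back_office, client_reference))
```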

Also, golden copy components may come from different departments such as front offices, back offices and client references, notes Aiere. "Given the context of the golden copy definition, how they are structured varies from organization to organization, front to back," he says.

The Sum of its Providers

Regulation, in particular the BCBS 239 risk data aggregation rules, has affected the handling of data supply chains, according to John Bottega, senior advisor and consultant at the EDM Council. "One of the tenets of risk data aggregation is that firms have a standard data lineage: having provenance over their data, knowing where it comes from, and understanding how it flows through the transactional process chain and the data supply chain," he says. "A lot of that speaks to the creation of data on the transaction side, then all the way through from front to middle office, to compliance and to back-office clearance and settlement, and then where it goes into regulatory reporting."
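
The lineage Bottega refers to can be thought of as metadata that accompanies each data item from the transaction that created it through to regulatory reporting. A minimal sketch, with hypothetical system and stage names, might look like this:

```python
from datetime import datetime, timezone

# Minimal lineage sketch: every hop in the supply chain appends a step, so a
# reported figure can be traced back to the transaction and feed it came from.
def record_step(lineage, system, action):
    lineage.append({
        "system": system,
        "action": action,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return lineage

lineage = []
record_step(lineage, "front_office_oms", "trade captured")
record_step(lineage, "middle_office", "enriched with reference data")
record_step(lineage, "compliance", "checked against limits")
record_step(lineage, "back_office", "cleared and settled")
record_step(lineage, "regulatory_reporting", "included in risk data aggregation")

for step in lineage:
    print(f'{step["timestamp"]}  {step["system"]:<22} {step["action"]}')
```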

Market data, benchmarks, indices and pricing can all feed into the data supply chain that produces reference data, observes Bottega. The supply chain involves steps such as acquisition, process cleansing, maintenance, distribution and consumption, he says. The final destination can be accounting, reporting or other functions.

Although Bottega does not define the choice of data providers as an element of the data supply chain, evaluating these service providers can be a link in it, he explains. "I have processes around exception management and exception handling. I might create, for example, a set of rules around what is an acceptable tolerance in the change in the price of a bond, and based on that tolerance, whether it's worth being looked into and researched. It could be valid, but my tolerance level is such that it's not a valid change. It might be due to an error. This is how firms are ensuring that the data that goes into their systems is of high quality."

Assembling golden copy data, however, does require quality sources, and some sources may be better at certain types of securities than others, says Bottega. Also, rules engines for data processing can be set at certain tolerance levels of accuracy, he explains. Data that is accurate enough for trading may not be accurate enough for clearance and settlement. If data is good enough to be golden copy, it can be included directly; if not, it can be used in part and complemented with exception reports. Data operations teams then have to correct those records, says Bottega.
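
Bottega's tolerance example lends itself to a simple rule: flag a record for investigation when the day-on-day change in a bond's price exceeds a threshold, with a tighter threshold where the data feeds clearance and settlement than where it only supports trading. The thresholds and field names below are invented for illustration and are not industry standards.

```python
# Illustrative tolerance check; the limits here are made up, not prescribed.
TOLERANCES = {
    "trading": 0.05,      # 5% day-on-day move accepted without review
    "settlement": 0.01,   # tighter 1% tolerance for clearance and settlement
}

def check_price(prev_price, new_price, purpose):
    """Return None if the change is within tolerance for this purpose,
    otherwise an exception record for the data operations team to research."""
    change = abs(new_price - prev_price) / prev_price
    limit = TOLERANCES[purpose]
    if change <= limit:
        return None
    return {
        "purpose": purpose,
        "change": round(change, 4),
        "limit": limit,
        "action": "route to exception queue for research and correction",
    }

print(check_price(100.0, 103.0, "trading"))     # within 5%: None
print(check_price(100.0, 103.0, "settlement"))  # breaches 1%: exception record
```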
