Compliance Demands Raise Pressure to Link Data and Redesign Models
Risk operations professionals aim to accelerate changes to prepare for 2018 deadlines
The data operations units and high-level data executives of financial firms face regulatory compliance demands that must become standard operating procedure in about 18 months. These demands dictate a need for greater insight into firms' risks, according to executives from such firms and a service provider who spoke on a webcast titled "Managing Risk and Compliance Through the Flow of Data" on September 28.
Data departments and professionals can expect a multi-year process to implement the necessary measures, particularly when contending with the increased complexity of data needed for compliance, the executives said. They will also need to examine the provenance and completeness of their data, the resources available for the work involved, and the linkages between data and models that must be established.
"Since we have months to prepare, hopefully we are ready, but the effort that is made is really material," said Antonello Russo, head of risk for beta, equity and commodity strategies for Europe, the Middle East and Africa in the risk and quantitative analysis group at BlackRock in London.
The Basel Committee on Banking Supervision's Fundamental Review of the Trading Book and its BCBS 239 risk data aggregation principles, along with the European Commission's revised Markets in Financial Instruments Directive, are the new regulations driving these data issues.
Collectively, said Ken Krupa, enterprise chief technology officer at MarkLogic in New York, these regulations "have increased the complexity of what we need to do to provide the transparency needed by an order of magnitude. You can't expect the same technologies, techniques, people and processes to apply to solve the [compliance] problem."
As a result, just 14% of respondents to an informal poll conducted during the webcast said their current risk data transparency efforts are sufficient for regulatory compliance (figure 1).
Russo said comprehensive data is often the focus of efforts to derive compliance value from data, but that is difficult without "timeliness, governance and structure, which has to be in place." Establishing a common data set can prevent unnecessary work such as reconciling information between systems, which is the aim of the regulations' standardization and harmonization provisions, he said.
Data mechanisms that are reactive need to be made proactive, said Roberto Maranca, chief data officer at GE Capital in London. "Data quality, master data, taxonomy—anything that helps you put in place and embed a cultural change in order to use and think about your data better—will probably be a must," he said.
Resources and Models
A quarter of respondents to an online poll said their firms have enough resources to address risk and compliance data issues (figure 2). But among available data operations methods and mechanisms, enterprise data management (EDM) systems are a prerequisite for meeting staffing and resource needs, Maranca added.
"That doesn't happen without shockwaves and repercussions in the rest of the company," he said. "If you create something new that needs to harmonize with the rest of what you have, it must be at the company's regular pace. Just dropping 200 people into your environment can be even worse than not having [enough] people to get additional requirements under control."
Since EDM is a "cross-cutting function," as MarkLogic's Krupa said, it is hard to implement without asking for a new set of tools and finding a new way to manage data. In turn, Krupa explained, implementation of data management plans must be incremental. "The notion of having a cross-cutting technology initiative that starts with the first 12 to 18 months of people sitting in a room drawing models doesn't cut it anymore," he said. "In 18 months, a lot of these regulatory requirements will be implemented or close to implemented and in effect. You must be able to harmonize as you go along.
"It's a DNA change—including management creating new roles, like the chief data officer, and lines of business recognizing they are all participating in an enterprise function," Krupa added.
If firms can successfully manage multiple models, as outlined in BCBS 239 guidelines, by establishing the right taxonomies and architectures, and leveraging metadata, they can gain data management agility without losing control, he said.
Links and Standards
With the right data management models, architecture and systems, linking data between silos can derive more value from compliance efforts, the executives said. Responses to another poll question indicated that such linking is the area of risk data management that needs the most work (figure 3).
"The implementation of a common data set across functions, areas and asset classes is discussed frequently," said Russo. "It's a good way to address trade monitoring and compliance. It's really not ideal to get different data and have to reconcile it."
Firms need not choose between linking and scrubbing data to remove anomalies, according to Krupa. "Over time, as you get a better handle on the provenance of data and achieve more visibility—moving it from complex code that no one can interpret into the layer where you can ask how the data got that way—your need to scrub and reconcile will go down," he said.
Maranca elaborated on the importance of visibility, calling it "a first big step to control or optimize data." "If you achieve the right visibility, it's not just solving or helping the risk management or compliance issues. It's making you a better company," he said.
Contrasting views concerning the importance of standards to support linkages emerged in the discussion. Krupa described an "open-world assumption" in which knowledge is always incomplete. "That's what we have to embrace. But because we can now manage that imperfection, the new knowledge confirms our assumptions or further informs us," he said. "That's the approach we have to take."
Using ontologies and data dictionaries to standardize data should be the first step, said Maranca. "It's tough, but worth it to try to come up with as much of a common view of the business terms we use in our firms."