Regulators 2.0: Embrace Tech or Be Overwhelmed by Data

To handle the massive increases in data volumes expected from Mifid II regulation and the US Consolidated Audit Trail, regulators are examining their own data governance and technology use, report Joanne Faulkner and Kirsten Hyde.

“Regulators are preparing and embracing the massive change that is coming with new pieces of legislation and the new datasets that will become available to them,” said Olga Petrenko, market integrity senior officer at pan-European regulator the European Securities and Markets Authority (Esma), speaking at Inside Data Management’s European Financial Information Summit in London in September.

Since the implementation of the original Markets in Financial Instruments Directive (Mifid) in 2007, the regulatory community has been privy to much more granular data and datasets, Petrenko said. “The question going forward is how do you make sure this data is used most efficiently, and how do you make sure the cost to the public sector does not skyrocket when trying to make use of this data.”


Regulators are now asking themselves the same questions as chief data officers within financial firms, Petrenko said, adding that these conversations are a departure for the regulatory community and were simply not happening just two years ago. Regulators are now in discussions with national competent authorities on “the best way to organize the data—making management understand what we’re talking about, the process of organizing the data management framework within the institution, [and] whether data strategy comes before or after business.”

Until recently, legislation was conceived, discussed and negotiated in silos within different departments of the European Commission, she said, and legislation would often go into the implementation stage without any communication between different departments working on similar rules.

‘Educational Leap’

Creating the Mifid II regulation represented a “big educational leap” that has resulted in a series of IT challenges for Esma, such as building the Financial Instruments Reference Data System (FIRDS), which collects reference data submitted to local supervisors as part of Mifid transaction reporting. “This is the first ‘delegated project’ we have undertaken, where the task of data collection from the trading venues was delegated by 22 of the 27 National Competent Authorities (NCAs) that have signed the agreement for full connectivity—meaning Esma is directly connected to trading venues in 22 jurisdictions, and is directly collecting data according to the same standard and format, in the same XML message,” Petrenko said.
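The benefit of collecting data in one standard format is that a single parser can handle submissions from any connected venue. The sketch below uses a hypothetical, heavily simplified record (the actual FIRDS messages follow a far richer ISO 20022 XML schema, and none of the tag names or values here are real) to illustrate the idea:

```python
import xml.etree.ElementTree as ET

# Hypothetical, heavily simplified instrument reference report.
# The real FIRDS messages are ISO 20022 XML and far more detailed.
SAMPLE = """
<InstrumentReport>
  <Instrument>
    <Isin>DE0001234567</Isin>
    <FullName>Example 2% Bond 2027</FullName>
    <TradingVenue>XFRA</TradingVenue>
  </Instrument>
  <Instrument>
    <Isin>FR0007654321</Isin>
    <FullName>Example Equity</FullName>
    <TradingVenue>XPAR</TradingVenue>
  </Instrument>
</InstrumentReport>
"""

def parse_instruments(xml_text):
    """Extract (ISIN, trading venue) pairs from a common-format report."""
    root = ET.fromstring(xml_text)
    return [
        (inst.findtext("Isin"), inst.findtext("TradingVenue"))
        for inst in root.iter("Instrument")
    ]

print(parse_instruments(SAMPLE))
# → [('DE0001234567', 'XFRA'), ('FR0007654321', 'XPAR')]
```

Because every jurisdiction submits the same message shape, the same few lines of parsing logic serve all 22 connections.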

Esma’s desire to process large amounts of data in a centralized manner means it will be able to ensure the quality of the data as it “allows you to benchmark data coming from different trading venues. It allows you to identify problems and issues in a swift and smooth manner. It allows you to see whether they’re jurisdiction- or trading venue-specific,” she said, adding that data quality and its assurance will be the top priority for regulators in Europe once Mifid II goes live. “You will see quite a significant shift of organizational focus in 2018 in terms of what regulators do on data quality, what expectations they have, how it’s being implemented, and what data management processes they will be expected to apply. It will shape the compliance agenda of supervised entities.”
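The benchmarking Petrenko describes can be sketched in a few lines. Assuming a hypothetical feed of per-venue submission statistics (all names and figures below are invented for illustration), centralized collection makes it straightforward to compare rejection rates across venues and flag outliers:

```python
from statistics import mean

# Hypothetical sample: (jurisdiction, venue, records_received, records_rejected)
SUBMISSIONS = [
    ("DE", "VENUE_A", 10_000, 40),
    ("DE", "VENUE_B", 8_000, 32),
    ("FR", "VENUE_C", 12_000, 900),   # far noisier than its peers
    ("FR", "VENUE_D", 9_000, 45),
]

def rejection_rates(rows):
    """Per-venue rejection rate (rejected records / received records)."""
    return {venue: rejected / received for _, venue, received, rejected in rows}

def flag_outliers(rows, factor=3.0):
    """Flag venues whose rejection rate exceeds `factor` times the mean rate."""
    rates = rejection_rates(rows)
    avg = mean(rates.values())
    return sorted(v for v, r in rates.items() if r > factor * avg)

print(flag_outliers(SUBMISSIONS))
# → ['VENUE_C']
```

Grouping the same statistics by jurisdiction instead of venue would answer Petrenko’s second question: whether a data-quality problem is venue-specific or jurisdiction-wide.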

‘Embracing’ New Tech

Regulators are also keen to show they are able to embrace new technology. Some are looking to regtech to ease the regulatory burden for capital markets firms, while others have identified ways to keep up with the evolving markets they are monitoring.


Patrick Hogan, head of section in the banking supervision data division at the European Central Bank (ECB), speaking on a panel at EFIS, said regtech is not just for banks that need support to meet regulatory requirements, but also for supervisors who need to keep abreast of the entities they are supervising.

“Regtech and fintech are getting increasing attention from the regulators. At this stage, I believe a lot of it is to understand what is happening and why… It brings a lot of new challenges in understanding what exactly the innovative products are delivering,” Hogan said. 

To keep up with the markets that they are supervising, regulators have no choice but to embrace technology. Hogan said the ECB is often asked why so much regulatory data is needed from banks in the first place. 

“We need that information to be able to verify that the reporting entities are in compliance with the regulatory requirements…. We need data that can allow a supervisor to understand the risks and vulnerabilities of the reporting entities. Hence we need timely and accurate data. At the macroprudential level—or the financial stability level—you need to have trust that the data in an aggregated form truly reflects the state of the industry at that time,” he said, adding that embracing regtech can “provide a basis to allow the supervisor to check compliance through the systems, and also have trust that when they’re using that data themselves, it comes from sound regulatory decisions.”

Mifid II and the accompanying Markets in Financial Instruments Regulation (Mifir) are strengthening the ability of NCAs to perform their surveillance duties by putting a wider scope of data at their disposal, said Franck Lasry, transversal project manager in the market surveillance division of French regulator Autorité des Marchés Financiers (AMF), speaking at the same event. While big data and machine learning might not be particularly “new” to firms, “it is [new] for us. It’s a major leap forward to be able to adapt to the market,” he said.


Lasry said the AMF’s previous system was more than 15 years old and had “several programs which have not been designed well,” whereas Mifid II provided an opportunity for the regulator to start over from scratch. Earlier this year, the AMF set up its ICY system—a monitoring system that allows the regulator to exploit large volumes of data to detect market abuse—and will gradually roll it out to start receiving Mifid II data from January 3, 2018.

Lasry also said the AMF is in the process of setting up a data governance team. “We have several divisions which are dealing with the same information… built in silos. Now is the time to make it more… so the data will be the same across divisions. We are setting up this organization for the long term.”

UK regulator the Financial Conduct Authority (FCA) is also expected to “provide further details, plans and next steps” on its use of machine learning. In a speech earlier this summer, Nick Cook, the FCA’s head of data and information operations, said that while the regulator is “still learning” the best application of new technologies such as machine learning to support regulatory compliance, it is keen to be involved in discussions to come up with global data standards to help with the automation of regulation, as well as an “industry framework for how to assess and manage IT risk and how to consider and manage the risk of deploying regtech solutions into their infrastructure.”

Digitization is also on the agenda. The FCA has partnered with regtech vendor Corlytics to apply a central, common taxonomy to all regulations to make its handbook taggable and machine readable. The regulator has started by improving the searchability of topics within the Conduct of Business Sourcebook, and will shortly release tagging for other sourcebooks. According to Cook, the long-term aim is to have rules that can be “fully and unambiguously interpreted by machines,” and to strip down some of the “enormous” costs associated with interpreting regulatory requirements.
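A toy sketch illustrates what tagging a rulebook against a common taxonomy involves. The taxonomy below is invented for illustration (it is not the FCA’s or Corlytics’ actual taxonomy), but it shows the basic step of mapping rule text to machine-readable tags that search and compliance tools can then consume:

```python
# Hypothetical taxonomy mapping keyword phrases to hierarchical tags.
# Invented for illustration; not the FCA/Corlytics taxonomy.
TAXONOMY = {
    "best execution": "conduct/best-execution",
    "client money": "conduct/client-assets",
    "transaction report": "reporting/transaction-reporting",
}

def tag_rule(rule_text):
    """Return the taxonomy tags whose keyword phrases appear in the rule text."""
    text = rule_text.lower()
    return sorted(tag for keyword, tag in TAXONOMY.items() if keyword in text)

rule = ("A firm must take all sufficient steps to obtain the best possible "
        "result for its clients (best execution) when executing orders.")
print(tag_rule(rule))
# → ['conduct/best-execution']
```

Keyword matching is only the crudest starting point; Cook’s stated long-term aim, rules that can be “fully and unambiguously interpreted by machines,” would require far richer semantic structure than tags alone.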

Them and US

It’s not just European regulators that are preparing for an exponential growth in the amount of data that will be entering their systems, along with the challenges that this will bring, and the need to embrace new technologies. 

In the US, self-regulatory organization the Financial Industry Regulatory Authority (Finra)—which oversees the trading activity of some 3,800 broker-dealers and monitors an average of 37 billion trades, orders and related market events each day—has been developing machine-learning software to enhance and accelerate the computer programs it uses to sift through massive amounts of data to detect potential misconduct (see related story, this issue). The regulator has already moved its surveillance systems to Amazon Web Services’ cloud platform, enabling it to more economically tackle surges in data intake.


“Having the storage and computing power to process this data efficiently is essential, but only one of many challenges we must address to make the data useful. For example, we also have to undertake extensive work to ‘normalize’ data received from different sources and in different formats into a structure in which like data elements are consistently identified so that the data can be analyzed effectively,” said Finra president and chief executive Robert Cook in a speech last month.
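The normalization work Cook describes can be sketched simply. Assuming two hypothetical sources that report the same trade fields under different names (the source names and field names below are invented, not Finra’s actual feeds), a field-mapping step makes like data elements consistently identified:

```python
# Hypothetical field mappings for two sources that report the same events
# in different shapes. Invented for illustration only.
FIELD_MAPS = {
    "source_a": {"sym": "symbol", "qty": "quantity", "px": "price"},
    "source_b": {"ticker": "symbol", "shares": "quantity", "trade_price": "price"},
}

def normalize(record, source):
    """Rename source-specific fields to the canonical schema."""
    mapping = FIELD_MAPS[source]
    return {mapping.get(key, key): value for key, value in record.items()}

a = normalize({"sym": "XYZ", "qty": 100, "px": 10.5}, "source_a")
b = normalize({"ticker": "XYZ", "shares": 100, "trade_price": 10.5}, "source_b")
assert a == b  # like data elements are now consistently identified
```

Renaming fields is the easy part; in practice the same step also has to reconcile differing units, timestamps, identifiers and conventions before cross-source analysis is meaningful.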

While Finra has embraced new technologies to enhance its market surveillance infrastructure and worked with exchanges to aggregate data and coordinate cross-market oversight, it and the other self-regulatory organizations are bracing themselves for a new set of challenges when the new US Consolidated Audit Trail (CAT) system comes into operation next month.

After years of proposals, delays and intra-industry fights over how it would be financed, US financial regulator the Securities and Exchange Commission (SEC) last year approved a plan for the creation of the CAT, a single central repository that will collect and track data on all trading activity in the US equity and options markets, and will be accessible by the SEC, Finra and US securities exchanges. 

The CAT will require the submission of much more data from market participants than is currently aggregated, including information related to listed options and orders originated by market makers. Beyond trading data, personal information about individuals sending trades to their brokers will also be required for submission. The SEC says this granular data will allow regulators to more quickly identify insider trading and other violations. 


Although clouded in controversy of late—with senior politicians urging the SEC to postpone CAT’s debut in the wake of a cybersecurity incident involving the SEC’s Edgar filings system—the current schedule, at the time of going to press, calls for Finra and the securities exchanges (collectively, the SROs) to begin submitting data to the repository on November 15. Broker-dealers will follow with their orders and transactions over the next two years. As Scott Bauguess, deputy chief economist and deputy director of the SEC’s Division of Economic and Risk Analysis (Dera), noted in a speech at the OpRisk North America conference in June: “This will result in data about market transactions on an unprecedented scale.”

However, implementing the SEC-approved plan is a complex process, and a number of issues are coming to the forefront now that the CAT is on the brink of becoming reality. 

“While today Finra and the various exchanges each have access to information that the others do not, once the CAT is implemented all SROs and the SEC will have access to all CAT information. The exchanges will not need Finra’s data—and Finra will not need the exchanges’ data—to oversee the markets. That raises the question of who will do what types of market oversight in the post-CAT world,” said Cook in his speech last month. “What happens when any one regulator spots something that requires follow-up? If everyone is reviewing the same activity, will 20 active exchanges, Finra and the SEC all separately be responsible for investigating instances of suspicious behavior and then bringing disciplinary actions if warranted?” he added. 

“While the CAT has the potential to improve our oversight systems, it also has the potential to impose unnecessary burdens on our markets and the industry if we do not pay close attention to avoiding supervisory fragmentation or duplication. So it is important that the SEC, Finra and the exchanges work together to assess and define what market oversight should look like after the CAT is implemented,” Cook said, also noting the importance of embracing new technologies to adapt to the new regulatory environment. “Surveillance in the post-CAT world must be no less quick to embrace new developments in technology, such as machine learning, in order to innovate and adapt to these developments, including new products and market practices.”

With so many changes and data requirements coming into force over the next three months, regulators are facing a greater data management challenge than any individual firm alone. So while there are serious concerns about individual firms failing to meet the compliance deadlines for these new data requirements, regulators must either embrace new technology with open arms, or risk failing those firms that succeed in meeting their obligations. 
