Bank of England’s Ambitious Look at Regulatory Reporting

As the regulator looks at new ways to handle data, there are still a lot of paths to consider.

Future transformation

The aftermath of the financial crisis has seen a massive increase in reporting requirements for trading firms. This has also created unique challenges for the regulators themselves, as they need to be able to take in, store and analyze this sea of information—or else what’s the point?

As a result, the Bank of England (BoE) is exploring new technologies such as artificial intelligence and cloud storage to get to grips with these requirements, as well as improving its ability to handle machine-readable regulation, according to a recent report published by the regulator.

“Given the volume of information created, transmitted and received, embracing leading technology is no longer a choice,” the BoE report said. “The explosion in the volume of regulatory data means supervisors receive more information than they can absorb and analyze using traditional methods.”

The value that regulators have gained from reported data has been a topic of contention for some time, particularly as regulated firms have had to allot considerable monetary and human resources to building out new systems to comply with regulations such as Mifid II and EMIR. As a result, EU regulators are having to catch up with the rest of the industry to deploy technologies that will make sense of the incoming data. Discussing the report, John Kernan, head of product management at trade repository Regis-TR, explains that regulators are only as good as the tools and technologies they use to analyze the data they receive from trade repositories, regulated firms and/or delegated service providers.

“‘Data’ doesn’t automatically equal ‘information’—it needs cutting and dicing in order to reveal trends and insights,” Kernan says. “Simply collecting more lines of data, or more data points, does not mean greater supervisory capability.”

According to the report, regulatory authorities spend, on average, two-thirds of their time manipulating data rather than analyzing it, which calls into question how well equipped supervisors are today to assess systemic risk or market behavior. To help tackle these concerns, the BoE has conducted proofs of concept using AI technology to derive detailed analytics from the data it accumulates. It is also looking at machine-learning techniques for detecting market anomalies and monitoring trading activity. Additionally, the regulator is exploring the potential of developing a data standard and a machine-readable rulebook.
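
The report does not spell out how those proofs of concept work, but a minimal sketch of the kind of unsupervised anomaly detection it describes might look like the following, which fits an isolation forest to a few synthetic trade-report fields. The field names, data and contamination rate here are illustrative assumptions, not the BoE's actual models.

```python
# Illustrative sketch only: unsupervised anomaly detection over trade reports.
# Field names and synthetic data are hypothetical; the BoE has not published
# the technical details of its proofs of concept.
import numpy as np
import pandas as pd
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=42)

# Synthetic trade reports: notional size, price deviation from mid, and time between trades.
reports = pd.DataFrame({
    "notional": rng.lognormal(mean=12, sigma=1.0, size=5_000),
    "price_deviation_bps": rng.normal(loc=0, scale=5, size=5_000),
    "inter_trade_seconds": rng.exponential(scale=30, size=5_000),
})

# Fit an isolation forest; 'contamination' is the assumed share of anomalous reports.
model = IsolationForest(contamination=0.01, random_state=0)
reports["anomaly"] = model.fit_predict(reports) == -1  # -1 marks an outlier

# Flag the outliers for a supervisor to review.
flagged = reports[reports["anomaly"]]
print(f"{len(flagged)} of {len(reports)} reports flagged for review")
```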

Some of its more ambitious plans for the future include using natural language processing, machine learning and big data analytics to chart data points in real time for supervisors. The aim is to provide an instant view of how firms are performing against forecasts, with live updates of key ratios. These detailed analytics would enable regulators to apply real-time shocks and stresses to better predict breaches of regulatory requirements.
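
As a rough illustration of the kind of check such a dashboard could automate, the sketch below applies a hypothetical shock to a firm's capital and tests the resulting ratio against an assumed regulatory minimum. The figures and the threshold are placeholders, not actual BoE requirements.

```python
# Illustrative sketch only: apply a hypothetical shock to a firm's capital position
# and check the resulting ratio against an assumed regulatory minimum.
MIN_CAPITAL_RATIO = 0.08  # assumed threshold, for illustration only

def capital_ratio(capital: float, risk_weighted_assets: float) -> float:
    """Capital as a share of risk-weighted assets."""
    return capital / risk_weighted_assets

def breaches_after_shock(capital: float, rwa: float, capital_shock_pct: float) -> bool:
    """Return True if a percentage hit to capital pushes the ratio below the minimum."""
    shocked_capital = capital * (1 - capital_shock_pct)
    return capital_ratio(shocked_capital, rwa) < MIN_CAPITAL_RATIO

# Hypothetical figures, in millions.
firm_capital, firm_rwa = 1_200.0, 13_000.0
print(capital_ratio(firm_capital, firm_rwa))               # ~0.092 before the shock
print(breaches_after_shock(firm_capital, firm_rwa, 0.15))  # True: a 15% hit breaches the minimum
```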

The BoE did not respond to a request for comment in time for publication.

Embracing the Cloud

The BoE is also turning its attention to cloud technology for its scalability, as well as its cyber and operational resilience, the report says. This follows in the footsteps of the Financial Conduct Authority (FCA), which is looking to migrate all of its operations to the cloud. In February, when WatersTechnology spoke to Nausicaa Delfas, an executive director at the FCA, she explained that the regulator was about halfway through the migration process.

“It is definitely a significant move and we are partly through that process,” Delfas said. “Our aim is to remove our reliance [on] physically contracted or directly contracted data centers in the next two to three years. It is a [work in] progress, and we are seeing many benefits from it already.”

Another potential benefit of using cloud technology relates to how regulators collect data. One approach highlighted in the BoE report would allow the regulator to access a firm’s cloud infrastructure directly at any point through a shared data lake. This, it hopes, would replace the slow and cumbersome process of receiving quarterly reports and reduce the need for ad hoc data requests from the regulator to regulated firms.
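
The report gives no technical blueprint for such an arrangement, but in principle the “pull” model it describes could look something like the sketch below, in which the supervisor queries a shared store on demand rather than waiting for a quarterly submission. The schema, identifiers and data are hypothetical.

```python
# Illustrative sketch only: a 'pull' model in which the supervisor queries a shared
# data lake on demand instead of receiving periodic submissions. The schema and
# data are hypothetical placeholders.
from datetime import date
import pandas as pd

# Stand-in for a firm's shared data lake partition (in practice this might be
# Parquet files in cloud object storage that the regulator is permitted to read).
shared_lake = pd.DataFrame({
    "trade_date": pd.to_datetime(["2019-01-02", "2019-01-02", "2019-01-03"]),
    "instrument": ["ISIN_A", "ISIN_A", "ISIN_B"],
    "notional": [2_000_000, 750_000, 1_250_000],
})

def pull_exposures(as_of: date) -> pd.DataFrame:
    """Query the shared store for a given day instead of requesting an ad hoc report."""
    day = shared_lake[shared_lake["trade_date"] == pd.Timestamp(as_of)]
    return day.groupby("instrument", as_index=False)["notional"].sum()

print(pull_exposures(date(2019, 1, 2)))
```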

APIs and Central Repositories

The BoE outlined four options that it is considering, two of which could transform the entire system by which regulatory reporting is conducted and collected. One option involves accessing firms’ data and systems through individual application programming interfaces (APIs) using a common format. This approach could reportedly reduce submission time from 30 minutes to 10 seconds, but it would require industry firms to be comfortable with permitting access to their systems.
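
The report does not define what that common format would be, but a minimal sketch of an API-based submission, assuming a hypothetical endpoint and schema, might look like this:

```python
# Illustrative sketch only: submitting a report over an API in a common format.
# The endpoint, schema and field names are hypothetical, not a published BoE interface.
import json
import requests

REPORT_SCHEMA_FIELDS = {"firm_id", "reporting_date", "metric", "value"}

def build_submission(record: dict) -> requests.PreparedRequest:
    """Validate a record against the assumed common schema and prepare an API call."""
    missing = REPORT_SCHEMA_FIELDS - record.keys()
    if missing:
        raise ValueError(f"record missing required fields: {sorted(missing)}")
    return requests.Request(
        method="POST",
        url="https://reporting.example-regulator.org/v1/returns",  # hypothetical endpoint
        headers={"Content-Type": "application/json"},
        data=json.dumps(record),
    ).prepare()

req = build_submission({
    "firm_id": "FIRM-0001",
    "reporting_date": "2019-12-31",
    "metric": "liquidity_coverage_ratio",
    "value": 1.32,
})
print(req.method, req.url)  # the prepared request could then be sent by an HTTP session
```

Validating each record against a shared schema at the point of submission is one way such an interface could cut out the back-and-forth of rejected returns, though the specifics would depend on the standard the industry and regulator agree on.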

But according to Kernan, this approach comes with many complexities.

“If we think about just one of the corporate towers looming over Canary Wharf, and the number of trading desks and business lines, front-, middle- and back-office systems that are holding and generating data, the number of counterparties holding disparate parts of datasets, it soon becomes clear that it is a much more complicated task to hook up regulators’ systems to firms’ systems,” he says.

Another, more progressive option proposed by the BoE would be to create a central data repository in which all statistical and regulatory reports would be held.

This would involve working with the industry to build the data utility and would look to enable near real-time analysis of the market. Mark Husler, head of business development at the London Stock Exchange Group and CEO of UnaVista, believes regulators will have to consider the complex nature of regional regulatory requirements and global data protection laws if they decide to take a more transformative approach to regulatory reporting.

“APIs, infrastructures and new technologies will continue to evolve and assist, but we have to ensure that every local [and] regional regulatory rule is adhered to, and where, for example, there are datasets that are considered private,” Husler says. “So under rules or regulations like GDPR [the General Data Protection Regulation] where there are rules about data residency and how data should not leave a specific region… all of those legal and regulatory governance rules do have to be maintained.”
