FRTB data quality issues persist amid shifting implementation dates

Banks are grappling with the market and reference data challenges posed by FRTB’s standardized approach, a task compounded by uncertainty over when the regulation will take effect.

Some banks are struggling to aggregate market and reference data of sufficient quality to feed the models of the Fundamental Review of the Trading Book ahead of the proposed 2025 deadlines.

The director of FRTB implementation at a global systemically important bank (G-Sib) tells WatersTechnology that the firm was forced to completely revamp the way it categorizes data, owing to the difficulty of meeting the risk calculation requirements of FRTB’s standardized approach.

“We had to do a full audit of our data just to identify the source of the data,” he says. “We had to go and look who is responsible for that internal classification of sectors to fill this gap around the data governance in terms of ownership and the different processes around data management.”

The bank, which has been working with Bloomberg to strengthen its FRTB capabilities, made the change after discovering how many gaps were present in its existing data, and tried to fill those gaps by sourcing replacement data. “But sourcing the data is, in itself, the easy part,” he says. “It’s actually about how much trust you have in that data, how do you rate that data, and how fresh is that data? These problems really can be very time-consuming to solve, right?”
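
The audit the director describes, tracing every data field back to a source, an owner, and a refresh date, can be pictured as a simple rule-based screen. Below is a minimal sketch in Python assuming a hypothetical record layout; the field names and the 90-day staleness threshold are illustrative, not the bank’s actual checks.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Hypothetical record layout; field names are illustrative,
# not any bank's or vendor's actual schema.
@dataclass
class RefDataRecord:
    instrument_id: str
    sector: str | None         # internal sector classification
    source: str | None         # where the value was sourced from
    owner: str | None          # team accountable for the field
    last_updated: date | None  # when the value was last refreshed

def audit(records: list[RefDataRecord], max_age_days: int = 90) -> dict[str, list[str]]:
    """Flag governance gaps: missing source, missing owner, or stale data."""
    issues: dict[str, list[str]] = {"no_source": [], "no_owner": [], "stale": []}
    cutoff = date.today() - timedelta(days=max_age_days)
    for r in records:
        if not r.source:
            issues["no_source"].append(r.instrument_id)
        if not r.owner:
            issues["no_owner"].append(r.instrument_id)
        if r.last_updated is None or r.last_updated < cutoff:
            issues["stale"].append(r.instrument_id)
    return issues
```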

Banks across several jurisdictions are intensifying their focus on FRTB, as the US and the UK are expected to implement the rule next year. The EU was part of that group until June 18, when the European Commission announced that it would postpone FRTB implementation by a year, amid concerns that US regulators would fail to meet their July 2025 deadline and that the two regions would fall out of alignment.

Though the US Federal Reserve has not yet announced any delays, a flurry of negative feedback from Republican lawmakers, lobbyists, and even some Democrats on the rule’s size and complexity has put the July 2025 go-live goal on shaky ground. In May, the European Commission warned that concerns over FRTB implementation and possible delays stateside could lead it to push back its own deadlines until the two jurisdictions were aligned; in the end, it was the EC that moved first.

The G-Sib’s director of FRTB implementation says that neither the US nor Europe is pressing ahead with FRTB because each fears going first and getting something wrong.

“The US is a key determinant, especially for European and UK markets, because nobody wants to go first,” he says. “The general idea from a regulatory perspective would be that they don’t want to create regulatory arbitrage, where doing business in one location is better and more capital-prudent.”

I don’t understand, SA

FRTB has been on the minds of risk and compliance professionals at banks since the aftermath of the 2008 financial crisis. It was born out of a desire by the Basel Committee on Banking Supervision to rethink how capital charges for market risk were calculated, so that banks could more efficiently absorb trading book losses in the face of extreme market conditions.

The 16 years since FRTB’s genesis have seen multiple consultation meetings, proposed reforms, and delays. When banks implement FRTB, they do so under one of two models: the standardized approach (SA) or the internal models approach (IMA). Both are particularly data-intensive, requiring years of historical datasets across each asset class the bank trades. Due to legacy data management architecture, the high volumes of data banks host on-premises, and the granular assessments of data required by the BCBS, parsing the data is hard, says the bank source, and making sure it meets the requirements of the FRTB models is even harder.

Sourcing the data is the easy part. It’s actually about how much trust you have in that data
Director of FRTB implementation at a G-Sib

The SA has proven to be by far the most popular among banks in the EU and the US, as well as in jurisdictions where FRTB has already been implemented. In Canada, where FRTB came into force in January of this year, no bank has gone live with an internal model, and in Japan, the country’s three megabanks—MUFG, Mizuho, and SMFG—all opted for the SA when the country’s early roll-out of FRTB began.

Julio de Jesus, a director at EY in Canada, told WatersTechnology’s sibling publication Risk.net that the IMA’s high technology cost was “crazy”.

“Given the amount of technology spend and the time required, it wasn’t worth the capital savings,” he said in February.

This is not to say that the SA is easy, though it may be easier than the IMA. Under the SA, banks need to calculate a range of risk sensitivities—delta, vega, and curvature—across all desks and asset classes in which the bank operates. Banks also need relevant, high-quality reference data to identify positions in scope and map them into predefined regulatory “buckets”, which group together instruments with similar risk characteristics.
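
To make the mechanics concrete: under the sensitivities-based method, each net sensitivity is multiplied by its prescribed risk weight to give a weighted sensitivity, and the weighted sensitivities are then aggregated within each bucket using prescribed correlations. The sketch below illustrates that within-bucket delta aggregation; the risk weights and the single flat correlation are placeholders, not the pairwise values the rules actually prescribe for any given bucket.

```python
import math

def bucket_delta_charge(sensitivities: list[float],
                        risk_weights: list[float],
                        rho: float) -> float:
    """Within-bucket delta aggregation under the sensitivities-based method:
    K_b = sqrt(max(0, sum_k WS_k^2 + sum_{k != l} rho * WS_k * WS_l)),
    where WS_k = RW_k * s_k. A single flat correlation is assumed here;
    the rules prescribe pairwise correlations per risk class and bucket."""
    ws = [rw * s for rw, s in zip(risk_weights, sensitivities)]
    total = sum(w * w for w in ws)
    total += sum(rho * ws[k] * ws[l]
                 for k in range(len(ws))
                 for l in range(len(ws)) if k != l)
    return math.sqrt(max(total, 0.0))

# Illustrative only: two positions mapped to the same (hypothetical) bucket.
charge = bucket_delta_charge(sensitivities=[1_000_000.0, -400_000.0],
                             risk_weights=[0.05, 0.05],
                             rho=0.35)
print(f"bucket delta charge: {charge:,.0f}")
```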

Fausto Marseglia, head of FRTB product management and regulatory propositions at London Stock Exchange Group (LSEG), says that one of the biggest challenges of the SA model is the quality of the reference data needed to calculate the risk buckets and risk weights for the banks’ trading positions.

“To come out with the risk sensitivities calculation, you need first to calculate risk buckets and weights for your trading positions, and that task requires some non-trivial mappings defined by the rules for each relevant security type,” he explains. “And that mapping is actually complex, because you need to have quality reference data.”

Marseglia explains that when LSEG was considering its approach to helping banks with FRTB, the vendor had to be careful, as most banks were unwilling to provide it with sensitive internal trade data. As a result, LSEG decided to source data from external, independent market infrastructure providers, and created an advisory board to oversee the design and build of the company’s Non-Modellable Risk Factor (NMRF) solution.
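
The “non-modellable” in NMRF refers to risk factors that fail the risk factor eligibility test. As the test is commonly summarized, a risk factor passes with at least 100 real price observations over the previous 12 months, or with at least 24 observations and no 90-day window containing fewer than four. A simplified sketch of that check follows; the Basel rule text remains the binding definition.

```python
from datetime import date, timedelta

def passes_rfet(observation_dates: list[date], as_of: date) -> bool:
    """Risk factor eligibility test, as commonly summarized: pass with
    >= 100 real price observations over the previous 12 months, or
    >= 24 observations with no 90-day window containing fewer than 4.
    Simplified sketch; the Basel rule text is the binding definition."""
    window_start = as_of - timedelta(days=365)
    obs = sorted(d for d in observation_dates if window_start <= d <= as_of)
    if len(obs) >= 100:
        return True
    if len(obs) < 24:
        return False
    # Slide a 90-day window one day at a time across the 12-month period.
    day = window_start
    while day + timedelta(days=90) <= as_of:
        if sum(1 for d in obs if day <= d < day + timedelta(days=90)) < 4:
            return False
        day += timedelta(days=1)
    return True
```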

“We had six banks representing all the major jurisdictions and six market infrastructure providers, including LCH and Tradeweb, which are part of LSEG,” Marseglia says. “We put together the specifications of the product, which were reviewed by the banks, and defined the data sourcing strategy together with the market infrastructure providers. Based on information from the advisory board, we built this offering.”

One aspect of LSEG’s FRTB offering is reference data built specifically to address the data quality problems posed by the SA model. This includes dedicated reference data fields for all securities across the seven risk classes defined by the SA rules, including risk buckets, risk weights, and other fields that provide transparency on how the weights and buckets were calculated.

Dodging the slap on the wrist

Apart from looming deadlines, another reason banks are tooling up their models is the threat of potential enforcement actions by regulators once their respective regions enact FRTB.

Thomas Labbe, global regulatory product manager at Bloomberg, says the global path to FRTB implementation has been “a bit of a stop-and-go approach”, with frequent delays lulling some banks into complacency over whether their FRTB calculations are fit for purpose.

“I think the main risk with the standardized approach is that if the data you use as an input into your analytics is not correct, or your interpretation stands out from the consensus, then the risk is that calculation is going to be wrong,” Labbe says. “When it comes to being audited by a supervisor, then obviously the bank faces the risk of not being able to necessarily provide the right explanation.”

Labbe says Bloomberg clients are interested in learning from regions where FRTB has already come into force, and hope that by following the example of successful banks, they can avoid potential teething problems. This interest extends even to jurisdictions where a bank has no presence.

“One of the things we are seeing with our clients is that there’s definitely a lot of attention that is given to other jurisdictions, even if they don’t need to report in that jurisdiction,” he says. “They want to know how it’s going. Is the regulator interacting? How do they solve this type of challenge?”

Labbe explains that while Basel sets the standards, each jurisdiction applies them locally. Bloomberg therefore starts work on dedicated regulatory datasets as soon as a jurisdiction publishes its local rules, so that it can distribute them to clients ahead of the go-live date.

“Whenever a new jurisdiction is coming up with the guidelines—or even a proposal for the guidelines like in the US—that’s when we start to do the work, and we create a number of fields that are specific to a given jurisdiction,” he says.
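
One way to picture the jurisdiction-specific fields Labbe describes is as a Basel-baseline record overlaid with local additions. The sketch below is invented for illustration; the field names and values do not reflect Bloomberg’s actual dataset design.

```python
# Invented for illustration: a Basel-baseline record plus local overrides.
BASEL_BASELINE = {
    "risk_class": "credit_spread",
    "bucket": "4",
    "risk_weight": 0.03,
}

# Hypothetical per-jurisdiction fields, e.g. the local reporting template.
JURISDICTION_OVERRIDES = {
    "EU": {"reporting_template": "CRR3"},
    "US": {"reporting_template": "NPR"},  # proposal-stage rules may change
}

def fields_for(jurisdiction: str) -> dict:
    """Merge the Basel baseline with any local overrides for a jurisdiction."""
    return {**BASEL_BASELINE, **JURISDICTION_OVERRIDES.get(jurisdiction, {})}

print(fields_for("US"))
```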

General uncertainty and American uncertainty

One issue with the SA model is that while certain requirements are prescriptive, others are not. That uncertainty is an added headache for banks, particularly when compared with the risk reporting documentation from other regulatory bodies, which leaves none of what LSEG’s Marseglia calls “space for interpretation”.

“I think the regulations define the standards up until a certain point, but there are details that typically the regulation does not address,” he says. “There is a kind of interesting dichotomy between the rules being not detailed in certain aspects and the fact that they are trying to give to banks some space for interpretation.”

Charlie Browne, head of market data, risk, and quant solutions at GoldenSource, agrees. He draws unfavorable comparisons between FRTB and the European Central Bank’s Risk Data Aggregation and Risk Reporting (RDARR) measures, which build on BCBS 239, a separate Basel Committee standard that G-Sibs have had to comply with since 2016. Browne says RDARR is more efficient because it gives banks no space to stray from specific rules, which creates less uncertainty around what’s required of reporting firms.

If the data you use as an input into your analytics is not correct… then the risk is that calculation is going to be wrong
Thomas Labbe, Bloomberg

“If you look at the FRTB document itself, it is not prescriptive on data lineage and data governance. It says you have to use observable market prices, but it does not talk in detail about data lineage, data governance, and data quality. When you look at the new RDARR [documentation], it focuses on the governance processes around data quality,” Browne says. “It talks heavily about the need to do this and the fact that this is going to be audited, and it covers all valuation and risk regulation.”

Despite these data challenges, most banks now feel ready for the implementation deadline, having been aided by a variety of risk and data vendors for some time now. The G-Sib’s director of FRTB implementation says the bank is “more or less ready”. It still needs to work on the reporting and template creation side of the regulation ahead of its full finalization, and he says that an additional challenge of FRTB has been explaining it to investors.

“The other thing that we find challenging within the FRTB space is that, from a risk management perspective, it is not enough for us just to understand the calculations and be satisfied they’re producing the right results; communicating that to the stakeholders” is another layer of challenge, he says.

Confusion is something that, for better or for worse, has become associated with FRTB implementation across multiple jurisdictions since the BCBS first proposed the review in 2008. Now that the European Commission has delayed the EU’s implementation by a year, attention has turned to the US, where the final, revised package on the so-called Basel Endgame is not expected until September at the earliest, and it isn’t out of the question that it, too, will be delayed until 2026.

Unless and until the Federal Reserve Board confirms a delay, US banks are working towards the current deadline of July 2025. Bloomberg’s Labbe says this is a major pain point for US clients.

“The big uncertainty is what will happen with the US rules, as well as if the rules will be materially changed from the Notice of Proposed Rulemaking,” Labbe says. “Secondly, will those implementations be postponed? So this level of uncertainty means that for our clients, there’s still this element that they’re not sure whether they’re working on the actual final rules, or whether those rules might change even next year.”
