The Rise of Regulation

Despite the growing focus on data management after the financial crisis, many feared an already packed regulatory agenda would mean the data dilemma would not be prioritized. Yet, it seems there was no reason to worry.

Data discussions are now becoming part of the mainstream industry and regulatory agenda. Gone are the days when data standardization and quality topics only concerned a few. Understanding of reference data is growing among industry leaders, and discussions that would have been seen as data-centric years ago are now being talked about at the legislative level.

The financial crisis of the past couple of years has placed data in the spotlight. In the US in particular, there is growing recognition that regulators lack the technical tools to monitor systemic financial risk in today's complex marketplace.

On February 4, Senator Jack Reed introduced S.3005: The National Institute of Finance Act of 2010 "to create an independent research institute, to be known as the 'National Institute of Finance,' that will oversee the collection and standardization of data on financial entities and activities, and conduct monitoring and other research and analytical activities to support the work of the Federal financial regulatory agencies and the Congress."

Reed asked the National Academy of Sciences to study the data and tools needed to monitor systemic risk. "Over the last several decades, a completely unregulated 'shadow-banking' system has metastasized to the point where many of these new products and market participants, such as derivatives and hedge funds, remain completely out of reach of financial regulators," Reed said in a statement.

Workshop

In fact, on November 3, 2009, the National Research Council of the National Academies held a workshop to explore data issues and assess the research needed to regulate systemic risk in response to a written request from Senator Reed of the Senate Banking Committee. The workshop concluded that the US "currently lacks the technical tools to monitor and manage systemic financial risk with sufficient comprehensiveness and precision."

Washington, DC-based John Liechty, associate professor of marketing and statistics at Penn State University and a founding member of the Committee to Establish a National Institute of Finance (CE-NIF), says the effort to create a National Institute of Finance as part of the broader financial regulatory community has substantially raised awareness of the importance of both data and analytic models with respect to understanding systemic risk.

"We believe there is a growing awareness that without the access to the detailed transaction and position data needed to understand the network of contractual relationships between financial firms, our regulators will continue to be flying blind with respect to systemic risk," says Liechty, adding that members of the committee to establish the NIF have been able to repeatedly raise visibility of the need for better data, to the extent that the regulatory community is actively discussing how to better co-ordinate their efforts.

Washington, DC-based Mike Atkin, managing director at the EDM Council and member of the committee to establish the NIF, says it has been surprising to see how regulators and legislators are looking into and talking about data and how they now understand the importance of standards. "It has been terrific to see that Senator Reed is interested enough in data to ask the National Academy of Sciences to study the data and tools needed for systemic risk regulation, as well as introduce legislation and hold hearings to make it happen."

The NIF has also received the support of academia. In February, six recipients of the Nobel Memorial Prize in Economic Sciences, professors H. Markowitz, R. Engle, R. Merton, M. Scholes, W. Sharpe and V. Smith, jointly endorsed the proposal to establish the NIF in a letter delivered to Senator Dodd and Senator Shelby, respectively the chairman and ranking member of the US Senate Committee on Banking, Housing and Urban Affairs.

In the letter, the six Nobel Prize recipients strongly urged Senator Dodd and Senator Shelby "to include in the US Senate's financial regulatory reform legislation, the authorities and resources needed to assure that the US government will have the understanding, data and analytical capabilities proposed by the CE-NIF that are necessary if government regulators are to have the tools needed to safeguard the US financial system."

Support has been ongoing. In August 2009, the American Statistical Association also endorsed the NIF. "We believe a new data and analytic infrastructure is required to maximize the effectiveness of any financial regulatory system. In view of this reality and recognizing the current effort to reform the financial regulatory system, we add our voice to the call for the creation of a National Institute of Finance containing a Financial Data Center and an Analytic Research Center," according to a statement in support of the NIF issued by the board of the American Statistical Association.

Business leaders, policy-makers, regulators and legislators appear to be joining the dialogue with the data community, and this may be the first step needed to enable change. On February 12, Federal Reserve Governor Daniel K. Tarullo emphasized how improved data is essential for monitoring systemic risk and for implementing a macro-prudential approach to supervision, in a speech before the Subcommittee on Security and International Trade and Finance of the US Senate Committee on Banking, Housing, and Urban Affairs in Washington, DC.

In the speech, titled "Equipping financial regulators with the tools necessary to monitor systemic risk," Tarullo said regulators should have greater power to collect data. He also supported the notion of a council of existing regulators to collect and analyze data on systemic risks in the financial system, as opposed to creating a new agency for this purpose.

"The recent financial crisis revealed important gaps in data collection and systematic analysis of institutions and markets. Remedies to fill those gaps are critical for monitoring systemic risk and for enhanced supervision of systemically important financial institutions, which are in turn necessary to decrease the chances of such a serious crisis occurring in the future," he said.

Similar discussions began in Europe in 2009. In February that year, Jean-Claude Trichet, president of the European Central Bank, delivered a keynote address, Remarks on the Future of European Financial Regulation and Supervision, to the Committee of European Securities Regulators (Cesr) in Paris. Trichet emphasized that standardization is essential to achieving higher levels of transparency, and welcomed "dialogue between the ECB and Cesr on the possibility of creating a standard for reference data on securities and issuers, with the aim of making such data available to policy-makers, regulators and the financial industry through an international public infrastructure."

Data experts highlight that there is a need to ensure collaboration and co-ordination between all the initiatives. EDM Council's Atkin says we have to ensure there is enough co-ordination and alignment between the efforts taking place in the US and in Europe. "The financial crisis was the trigger point, and what became clear to everyone is that there is a need for comparable data and that at a minimum level from a systemic risk standpoint you have to get the data right," he said.

Changing Strategies & Standardization

With the increased regulatory pressure, market participants say the reference data market can no longer be described as unchanged from previous years. At the Third Annual Reference Data Management in the Banking Industry conference in London in February, London-based PJ Di Giammarino, chief executive at think-tank JWG, said those who believe nothing has changed have fundamentally missed the point. "If you are doing a job in data in the financial services this year you have to take note of what the new requirements are because the bar has been raised," he said.

The first months of the new year have seen an increased focus on involving the wider industry in data conversations. Data practitioners are reaching out to the regulators and trying to get more market participants involved in collaborative work. "It's not just about the data, but about engaging the broader community," said Di Giammarino. "Whether senior management wants to know it or not it's our job to get the message across to them, and (you need to) ensure you are aware of what everyone else is doing, how you compare to the others ... senior management needs to get this and needs to keep hearing that there are new requirements," he added.

Increasing regulation and the need to address the new data requirements have placed data quality in the spotlight. Far from regarding more regulatory supervision as a burden, data practitioners now see it as a driver to push the agenda forward. Frankfurt-based Francis Gross, head of the external statistics division at the European Central Bank and speaker at the conference, said: "Regulation is not only about the regulators taking control, but it's about market participants and having the certainty that everyone plays the game and that they do so in the manner you do too."

Regulation could become the catalyst for change when it comes to reference data and standards. There is a sense that regulators are starting to get more involved in industry discussions than previously, and that this also helps get data managers around the table. London-based Julia Sutton, head of customer data at Royal Bank of Canada Capital Markets, said: "I think that (regulatory participation) is what has been missing a lot of the time."

To solve reference data issues such as business entity identification, Arja Voipio, senior advisor to the Financial Supervisory Authority in Finland and chair of the Cesr technology committee, said the industry should first come up with a solution, and then market participants should speak to the regulators to understand their views. Voipio said if the industry identified a solution on a voluntary basis, implementation would be less complex than if it were part of a European directive.

For now, it is uncertain how this type of voluntary work will play out, but the data debate has at least come a long way. Data is now discussed at the top level, and there should be more than enough regulatory momentum to sustain support for data management initiatives going forward. The reference data market is changing, and now is the time to prepare for data reform.
