BMO Centralizes Data Management, Infrastructure
Canadian banking group BMO Financial Group is centralizing its firm-wide market data management and enhancing its market data infrastructure to enable the bank to source and distribute market data more efficiently and better manage its data spend.
BMO is now centralizing its market data collection function and building a "hub and spoke" architecture. This will ultimately enable the bank to source feeds directly, supporting the needs of its latency-sensitive algorithmic trading operations, over a dedicated trading network connecting its global co-location sites. That effort began last year with the rollout of a network reserved solely for market data and trading traffic. BMO will consolidate the feeds using an in-house ticker plant, then distribute the data back over the network to end-users within the bank, who currently rely on third-party consolidated datafeeds, says Brant Arseneau, chief information officer of BMO Capital Markets.
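In outline, the hub-and-spoke model Arseneau describes has a central ticker plant (the hub) normalizing raw feeds from multiple venues into a common format and fanning the data out to subscribing applications (the spokes). The sketch below is a hypothetical illustration of that pattern, not BMO's implementation; all class and field names are invented:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

# Normalized tick format produced by the hub.
@dataclass
class Tick:
    symbol: str
    price: float
    size: int
    venue: str

class TickerPlant:
    """Hub: consolidates raw venue feeds and fans out normalized ticks."""
    def __init__(self) -> None:
        self._subscribers: Dict[str, List[Callable[[Tick], None]]] = {}

    def subscribe(self, symbol: str, callback: Callable[[Tick], None]) -> None:
        # A "spoke": an end-user application registering interest in a symbol.
        self._subscribers.setdefault(symbol, []).append(callback)

    def on_raw_message(self, venue: str, raw: dict) -> None:
        # Each venue feed has its own wire format; normalize it at the hub
        # so downstream consumers see one schema regardless of source.
        tick = Tick(symbol=raw["sym"], price=float(raw["px"]),
                    size=int(raw["qty"]), venue=venue)
        for callback in self._subscribers.get(tick.symbol, []):
            callback(tick)

# Usage: one desk subscribes, two venues publish the same symbol.
plant = TickerPlant()
received: List[Tick] = []
plant.subscribe("BMO", received.append)
plant.on_raw_message("TSX", {"sym": "BMO", "px": "91.25", "qty": "100"})
plant.on_raw_message("NYSE", {"sym": "BMO", "px": "91.27", "qty": "200"})
```

The key property is that consumers never touch venue-specific formats: adding a new direct feed only requires a new normalization path at the hub.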
A centralized data distribution infrastructure will also enable the bank to scale more cost-effectively to support growth. Previously, BMO had limited centralized data collection and distribution capabilities, with data purchased primarily by individual business lines as needed and delivered on a point-to-point basis, Arseneau says.
These technology enhancements are necessary to make it easier for the bank to enter new markets and offer services that compete with other large global banks, says Paul Rowady, senior analyst at Tabb Group, adding that the bank can take advantage of being able to roll out brand-new infrastructure, rather than having to replace legacy equipment. However, having a knowledgeable team to lead the initiative is crucial, Rowady adds. “You have to change a mindset that’s used to the old ways…. You need to assemble a team that has had experience developing the latest and greatest somewhere else, and then give them the bandwidth and authority to do something truly revolutionary, so they can have greater confidence that they’ve caught up,” he says.
Previously, market data was managed separately by individual business lines at the bank. Hence, a crucial first step towards centralizing the bank’s market data management was to establish a central market data team—initially to serve the Capital Markets group, though the group’s remit is now being expanded to include data requirements from across the entire bank.
A skilled team is important because business users who have to go through the team in order to gain access to market data expect high levels of service, Arseneau says. “There are three pillars a good market data group must have: management of market data and vendors; engineering to create the right solution for the firm; and global support to keep the market data up and running. We are continuing to build out our data group to ensure we have all three,” he adds.
The initiative to overhaul the bank’s data management and infrastructure began in the Capital Markets group, when Arseneau joined the firm in 2009 as managing director and head of electronic trading capabilities. After being appointed chief information officer at BMO Capital Markets this May, he was tasked with expanding the initiative to include the entire bank—and similarly expanding the data group’s coverage—though he says the initiative is largely being driven by the Capital Markets group because it consumes the majority of the bank’s total market data spend.
This central data group is now working on centralizing BMO’s content management, and is currently processing the bank’s data consumption through an unnamed third-party market data management tool to obtain an accurate inventory of data usage and demand. Once this is complete, the data team can then identify and remove unused or duplicative products and services.
In addition, a centralized team can collect and review data usage statistics at an enterprise level as well as by business line. The bank can use these statistics to create normalized metrics, such as the cost of market data per trader and per non-trading user, and to provide usage and spend reports that help business managers decide which products to use, Arseneau says.
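The normalized metric described here can be sketched as an average monthly data cost per user, split by role. The records and figures below are invented purely for illustration:

```python
from collections import defaultdict

# Hypothetical usage records: (user, business_line, role, monthly_cost).
records = [
    ("alice", "equities", "trader",     1800.0),
    ("bob",   "equities", "trader",     2200.0),
    ("carol", "equities", "operations",  400.0),
    ("dave",  "rates",    "trader",     1500.0),
]

def cost_per_user_by_role(records):
    """Average monthly market data cost per user, grouped by role."""
    totals = defaultdict(float)
    counts = defaultdict(int)
    for user, line, role, cost in records:
        totals[role] += cost
        counts[role] += 1
    return {role: totals[role] / counts[role] for role in totals}

metrics = cost_per_user_by_role(records)
# metrics["trader"] is the normalized spend per trader across business lines.
```

Grouping by business line instead of role is the same aggregation keyed on a different field, which is what makes a central inventory useful: one dataset answers both enterprise-level and desk-level questions.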
The data group also developed a dashboard tool for monitoring the bank's infrastructure, which aggregates statistics from proprietary and third-party monitoring tools, including data consumption, usage and latency figures. The group plans to expand the tool to capture additional metrics, such as tick-to-trade latency, so the firm can monitor the entire trade lifecycle, he adds.
Copyright Infopro Digital Limited. All rights reserved.