As Public Cloud Gains Traction, Concerns & Challenges Grow

WatersTechnology looks at more than 20 cloud-based projects and initiatives to see how banks, asset managers and vendors are embracing public providers, and the inherent problems involved.


I’ve said and written this before, but when I started writing for WatersTechnology a decade ago, the idea of banks and asset managers turning to public cloud providers for important data functions was anathema. Ten years later, the tide has certainly changed. Not only are capital markets firms increasing their reliance on public cloud providers—namely, Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform (GCP) and IBM Cloud—but they are now increasingly employing a multi-cloud strategy. And even for those laggards who have yet to embrace the cloud, their vendors almost certainly have.

For this story, I’ll look at some interesting public and hybrid cloud projects that we wrote about in 2019, as well as some of the challenges capital markets firms have faced in their efforts to migrate to the cloud. As with our earlier look at machine learning, this is by no means meant to be a definitive showcase of cloud projects on Wall Street, but these are certainly some of the more interesting endeavors we covered over the last year. Hopefully, this list helps to show just how the industry is embracing the cloud, and sparks some ideas to kick around with your teams.

If I’m missing something or have something wrong, shoot me an email: anthony.malakian@infopro-digital.com

The Use Cases

One of the bigger projects undertaken in 2019 was the pairing of IBM and Bank of America, which are combining forces to build a public cloud specific to the financial services industry. While BofA has successfully built its own private-cloud environment, this endeavor will serve as the next evolution of the bank’s cloud journey.

According to the bank, when its private-cloud project began in 2012, it set a target of running 80% of its workloads on its own private cloud by the end of 2019; that goal was reached in September. At the outset, BofA had 200,000 servers and roughly 60 data centers. It has since pared that down to 70,000 servers and more than halved its data centers, to 23. The bank now spends $2.1 billion less per year on infrastructure than it did in 2012, due in large part to the private cloud, it said.

But this latest announcement is not an about-face. Executives at the bank have noted in the past that the scale and economics of the public cloud will eventually catch up with its private environment as more users adopt public-cloud services, and this endeavor is meant to help the bank prepare for that future. Nor is the IBM-BofA partnership exclusive to the Charlotte, NC-headquartered bank; rather, the service will cover the entire financial services market, from institutions focused solely on retail banking to banks and asset managers in the wholesale capital markets.

No specific launch date has been set, but the two intend a gradual roll-out of different components. The bank and other participating parties will be able to deploy their workloads incrementally, from the most mundane to advanced use cases such as data science, model development, and deep learning, says Curt Leeman, IBM’s managing director covering Bank of America.

***

Also of interest was CME Group’s announcement that the exchange operator is making real-time market data from all its venues available via Google Cloud. The exchange hopes to expand its client base and the reach of its data by using a new connectivity model to simplify data access for end users.

“We are putting all our real-time data into Google Cloud, and converting all our Market Data Platform (MDP) channels into a Google service called Pub/Sub, so anyone can access them via Google from anywhere on the planet,” says Adam Honoré, executive director of data services at CME. “The specific use case for this is how do we take advantage of native cloud services to lower the barrier to accessing our data. …We are creating a low-cost global transport solution for all our market data.”
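For a sense of what that “low-cost global transport” model looks like from a consumer’s point of view, here is a minimal sketch of pulling messages from a Google Cloud Pub/Sub subscription in Python. The project and subscription names are placeholders, not CME’s actual identifiers, and the message handling is deliberately simplified.

```python
# Minimal sketch of consuming a market data channel exposed as a Google Cloud
# Pub/Sub subscription. Project and subscription names are placeholders.
from concurrent.futures import TimeoutError

from google.cloud import pubsub_v1

PROJECT_ID = "my-gcp-project"            # hypothetical consumer project
SUBSCRIPTION_ID = "cme-mdp-channel-sub"  # hypothetical subscription name

subscriber = pubsub_v1.SubscriberClient()
subscription_path = subscriber.subscription_path(PROJECT_ID, SUBSCRIPTION_ID)

def on_message(message):
    # Each message would carry one market data update; decode and process it here.
    print(f"received {len(message.data)} bytes")
    message.ack()

# Open a streaming pull and listen for a minute before shutting down.
streaming_pull_future = subscriber.subscribe(subscription_path, callback=on_message)
try:
    streaming_pull_future.result(timeout=60)
except TimeoutError:
    streaming_pull_future.cancel()
    streaming_pull_future.result()  # block until shutdown completes
```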

***

Looking at a more niche implementation, Scotiabank initiated a project that would allow it to use cloud GPUs to run its valuation adjustments (XVA) program. With the project gaining traction, the results have been impressive.

According to the bank, the runtime for risk calculations and derivatives pricing using cloud GPUs is 30 times faster, allowing brokers to deliver more accurate derivatives prices in 20 seconds, a calculation that would previously have taken 10 minutes. It also allows for more nuanced risk analysis, thanks to more detailed scenario modeling that can assess more than 10 times the number of scenarios previously possible.

“The scale of XVA means that we need to lean on the scalability of public cloud for compute power and couple that with data at scale from our internal global data platform,” says Stella Yeung, chief information officer at Scotiabank Global Banking & Markets. “This combination lets us deliver, in real time, to the traders the information that they require to serve our global clients.”

Andrew Green, managing director and lead XVA quant at Scotiabank, says the bank already had a cloud-first policy in place before it started this particular overhaul. When combined with a public cloud infrastructure (for valuation adjustments, Scotiabank is using the Microsoft Azure cloud and its NC24 virtual machines), GPUs are better equipped than traditional CPU cores to handle these types of computationally intensive calculations.
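To illustrate why GPUs suit this kind of workload, here is a minimal sketch of an embarrassingly parallel Monte Carlo simulation using the CuPy library, which mirrors the NumPy API on Nvidia GPUs. This is purely illustrative; it is not Scotiabank’s implementation, and the example prices a plain vanilla option rather than running a full XVA calculation.

```python
# Minimal sketch of a GPU-accelerated Monte Carlo simulation, the kind of
# embarrassingly parallel workload that underlies XVA-style calculations.
import math

import cupy as cp  # NumPy-compatible API that executes on the GPU

def simulate_terminal_prices(s0, r, sigma, t, n_paths):
    """Simulate terminal asset prices under geometric Brownian motion."""
    z = cp.random.standard_normal(n_paths)  # one standard normal draw per path
    drift = (r - 0.5 * sigma ** 2) * t
    return s0 * cp.exp(drift + sigma * math.sqrt(t) * z)

# Price a vanilla call by averaging discounted payoffs over 10 million paths.
s_t = simulate_terminal_prices(s0=100.0, r=0.02, sigma=0.25, t=1.0, n_paths=10_000_000)
payoff = cp.maximum(s_t - 105.0, 0.0)                 # call payoff, strike 105
price = math.exp(-0.02 * 1.0) * float(payoff.mean())  # discount and average
print(f"Estimated call price: {price:.4f}")
```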

***

On the custodian side of the Street, Northern Trust has adopted a cloud DevOps framework to allow it to more efficiently develop technologies, including its recent front-office solutions for managing the investment lifecycle of family offices and endowments.

The technology was built in collaboration with an undisclosed start-up on AWS and over the summer went live with its first beta client. The firm leverages the scalability of the cloud and has the ability to spin up an environment in around 10 minutes. The adoption of the cloud framework enabled Northern Trust to go from product idea to market in nine months, a fraction of the time it took when using traditional development processes and architectures, says Joerg Guenther, CIO of Northern Trust.
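As a rough illustration of what “spinning up an environment” programmatically can look like on AWS (this is not Northern Trust’s actual tooling), the sketch below launches a hypothetical CloudFormation stack and waits for it to come up. The stack name, template URL, and parameters are placeholders.

```python
# Minimal sketch of programmatically spinning up an environment on AWS by
# launching a CloudFormation stack. All names and URLs are placeholders.
import boto3

cfn = boto3.client("cloudformation", region_name="us-east-1")

response = cfn.create_stack(
    StackName="dev-env-demo",  # hypothetical stack name
    TemplateURL="https://s3.amazonaws.com/my-bucket/dev-environment.yaml",  # placeholder template
    Parameters=[{"ParameterKey": "InstanceType", "ParameterValue": "t3.medium"}],
    Capabilities=["CAPABILITY_IAM"],
)

# Block until the environment is ready (typically minutes rather than weeks).
waiter = cfn.get_waiter("stack_create_complete")
waiter.wait(StackName=response["StackId"])
print("Environment ready:", response["StackId"])
```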

For the project, the firm recruited a head of product and an external tech team to work alongside existing members of Northern Trust’s development team to build out the front-office product.

***

And while the vendor community has long embraced the cloud, the biggest data players made some big moves in 2019. For example, Bloomberg is in the process of shifting its entire data estate and commercial offerings to the cloud as part of what it is calling its One Data strategy. Tony McManus, chief information officer and global business manager of enterprise data at Bloomberg, says the migration process will be done in incremental stages and the company expects to release its first batch of datasets on to the cloud in the first quarter of 2020.

“A lot of our clients are moving their machine learning or statistical analysis processing into the cloud. Therefore, having large volumes of historical time-series data available to them in native format is very important. That is the next big set of milestones on the roadmap,” McManus says.

Bloomberg also announced on September 12 that its flagship real-time market data feed, B-Pipe, will be rolled out globally through AWS’s PrivateLink, which is designed to eliminate exposure of data to the internet and to offer private connectivity between on-premises and virtual environments. Bloomberg says that B-Pipe streams 35 million instruments across all asset classes, including data from 330 exchanges and 5,000 contributors, through a common API.
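For readers unfamiliar with PrivateLink, the consumer side boils down to creating an interface VPC endpoint that points at the provider’s endpoint service. Here is a minimal sketch using boto3; the VPC, subnet, security group, and service names are placeholders, not Bloomberg’s actual endpoint service.

```python
# Minimal sketch of creating an interface VPC endpoint (AWS PrivateLink) so a
# feed can be consumed without traversing the public internet.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")

endpoint = ec2.create_vpc_endpoint(
    VpcEndpointType="Interface",
    VpcId="vpc-0123456789abcdef0",                                 # your VPC (placeholder)
    ServiceName="com.amazonaws.vpce.us-east-1.vpce-svc-EXAMPLE",   # provider's service name (placeholder)
    SubnetIds=["subnet-0123456789abcdef0"],
    SecurityGroupIds=["sg-0123456789abcdef0"],
    PrivateDnsEnabled=False,
)
print("Endpoint ID:", endpoint["VpcEndpoint"]["VpcEndpointId"])
```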

***

Beyond that, Chicago-based data and investment research provider Morningstar has revamped its tick data delivery architecture into a cloud-based solution. For the project, it is partnering with Australian big data technology platform vendor RoZetta Technology.

The new version of Morningstar’s Tick Data Solution will go live at the end of July, using a platform built by RoZetta, running on Amazon’s cloud environment. Morningstar hopes the revamp will make it quicker and easier for clients to access tick data from the vendor.

Morningstar’s existing Tick Data service stems from its acquisition of London-based ticker plant and datafeeds vendor Tenfore Systems in 2003. Since then, Morningstar has collected and stored—and in most cases, commercialized—around 2.5 petabytes of tick-level market data, which is growing exponentially year-on-year, covering 200 stock trading venues, or roughly 99% of global equities coverage. The dataset includes historical tick data back to 2003, with 10 years of US composite data, exchange messages and outage information, and the ability to filter by symbol or exchange, and to view market-by-order or market-by-price. Data points include trade date and time, exchange time, volume, trade price, last bid and last offer.

“Our challenge was to get that into the hands of clients—and the way that the data was traditionally stored made that challenging,” says Matt Spedden, global head of equities and market data solutions at Morningstar, citing the vendor’s legacy storage and extraction environment, which used older technologies and typically delivered data via flat file or physical media, such as hard drives, for large volumes of data.
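To make the shape of that dataset a little more concrete, here is a minimal sketch of filtering a tick-level extract by symbol using the kinds of fields described above (trade time, exchange time, volume, trade price, last bid, last offer). The file path and column names are hypothetical, not Morningstar’s actual schema.

```python
# Minimal sketch of working with a tick-level extract; path and column names
# are hypothetical placeholders.
import pandas as pd

# Load a one-day extract of tick data.
ticks = pd.read_parquet("ticks_2019-12-02.parquet")

# Filter to a single symbol and order by the exchange timestamp.
vod = ticks[ticks["symbol"] == "VOD.L"].sort_values("exchange_time")

# A simple derived view: each trade alongside the prevailing quote.
print(vod[["trade_time", "exchange_time", "trade_price", "volume",
           "last_bid", "last_offer"]].head())
```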

***

In a similar vein, SmartStream Technologies followed in the footsteps of Morningstar, as well as Trading Technologies and MarkitSERV, when it launched SmartStream Air, the firm’s cloud-native, AI-enabled reconciliations platform, at the Sibos conference in September 2019. As noted earlier this week, the platform will allow it to better deploy machine-learning tools.

***

Also, Bureau van Dijk, which is owned by Moody’s Analytics, is beginning a full-scale rebuild of its risk management platform, Compliance Catalyst, and in 2020 the vendor plans to move the offering to the cloud. A handful of customers have been testing the revamped product in beta since August. The new tool features automated and enhanced customer due diligence screening for anti-money-laundering (AML), know-your-customer (KYC), and anti-bribery-and-corruption (ABAC) rules, and the vendor will also look to incorporate machine-learning techniques to improve alerts and analysis.

***

And the ability to leverage artificial intelligence has proven to be a boon for the cloud community. HSBC, for example, is in the process of implementing a large-scale project that entails using machine-learning technology to measure the quality of its data across five dimensions (accuracy, completeness, uniqueness, validity, and consistency) and using granular details to link correlated data together.

It is no small task, as the firm is pulling information from multiple systems across several business lines and jurisdictions. The data will be viewable on data quality dashboards, where the user can view critical data elements and identify the real value of the data that the system aggregates. Chuck Teixeira, chief administrative officer and head of transformation at HSBC’s global banking and markets business, explains that its data management teams are leveraging AI to index and tag data from trillions of transactions and external sources to build a reusable golden source of data.

“Part of the challenge other banks and [that we also] have is that we have lots of data pools, but the problem is, if you don’t tag that data and index it, how do you find it again? So that is part of what we have built, a reusable data asset. And this has been a significant undertaking over the last year,” he says.

In the second phase of its transformation project, HSBC began migrating the data to a cloud-based data lake in June to be able to utilize it for a variety of use cases, such as to build a client intelligence utility on the cloud. The platform will use the cleansed data, captured from trade lifecycles and external sources, to better understand the needs and requirements of its clients. It will ultimately act as a single part of a more comprehensive client services project, Phoenix, in which HSBC intends to collaborate with AI partners.
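As a simplified illustration of what scoring data against some of those quality dimensions can look like in practice, here is a minimal sketch in Python. It is not HSBC’s implementation; the sample records, column names, and business rules are hypothetical.

```python
# Minimal sketch of scoring a dataset against three of the quality dimensions
# mentioned above: completeness, uniqueness, and validity. All data and rules
# here are hypothetical.
import pandas as pd

trades = pd.DataFrame({
    "trade_id": ["T1", "T2", "T2", "T4"],            # T2 is duplicated
    "notional": [1_000_000, None, 250_000, -50_000],  # one missing, one negative
    "currency": ["USD", "EUR", "EUR", "XXX"],         # XXX is not a valid code
})

valid_currencies = {"USD", "EUR", "GBP", "JPY"}

quality = {
    # Completeness: share of non-null values in a critical field.
    "completeness": float(trades["notional"].notna().mean()),
    # Uniqueness: share of rows whose key appears exactly once.
    "uniqueness": float((~trades["trade_id"].duplicated(keep=False)).mean()),
    # Validity: share of rows passing simple business rules.
    "validity": float(((trades["notional"] > 0)
                       & trades["currency"].isin(valid_currencies)).mean()),
}
print(quality)
```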

***

High-frequency trading (HFT) firm Grasshopper has partnered with GCP to build out its quantitative research and data-processing platforms to improve its trading capabilities. By leveraging GCP tools such as TensorFlow, the open-source machine-learning library; BigQuery, Google’s serverless data warehouse; and Cloud Dataflow, which is used for processing both batch and real-time streaming data, the Singaporean firm and the cloud giant have worked closely on several internal projects.

One of those projects is the firm’s in-house Java application, Ahab. Named for Herman Melville’s fictional whaling captain, it allows traders to “listen” to live market data and make better and faster trading decisions. It is built on Apache Beam, the open-source programming model.

Tan T-Kiang, who holds the dual title of chief investment officer and chief technology officer at Grasshopper, says the data that flows through the application can be thought of as fish, gobbled up by a bottomless whale. Moby Dick metaphor aside, Ahab sources its data directly from exchanges and calculates, in real time, the value of order books, or the lists of buy and sell orders at any trading venue, and how that can impact a stock’s price. The data is tied into Solace’s PubSub+ event broker, used alongside GCP, which handles and sorts information from multiple sources, eliminating the need for Grasshopper’s engineers to deal with basic network connectivity. The resulting data log then gets stored in BigQuery.
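For readers unfamiliar with Apache Beam, the sketch below shows the general shape of such a streaming pipeline: read messages from a Pub/Sub topic, window them, aggregate per symbol, and write the results to BigQuery. It is a simplified stand-in, not Grasshopper’s Ahab (which is written in Java), and the topic, table, and parsing logic are placeholders.

```python
# Minimal sketch of a streaming pipeline in the Apache Beam style described
# above. Topic, table, and parsing logic are placeholders.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.transforms import window

options = PipelineOptions(streaming=True)

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadMarketData" >> beam.io.ReadFromPubSub(topic="projects/my-project/topics/market-data")
        | "Parse" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
        | "Window" >> beam.WindowInto(window.FixedWindows(1))  # 1-second windows
        | "KeyBySymbol" >> beam.Map(lambda tick: (tick["symbol"], tick["price"]))
        | "MaxPricePerWindow" >> beam.CombinePerKey(max)
        | "ToRow" >> beam.Map(lambda kv: {"symbol": kv[0], "max_price": kv[1]})
        | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
            table="my-project:market.prices",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
    )
```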

“One of the biggest things we’ve been trying to solve is that when you start off, let’s say, 10 years ago, you built a database, and the market was probably 10 times smaller in terms of data. That system was fine, and then a few years later or even a year later, you have to re-tool because it’s not good enough anymore,” Tan says. “Or you, as a hedge fund or an HFT firm, decided to add five more markets to what you’re covering. And what you’re storing now is maybe 50 times or 100 times more data.”

***

Finally, there was also a major acquisition that happened, in part, because of the need to move to the cloud. In October, Pittsburgh-based Confluence Technologies announced that it was buying StatPro, the cloud-based UK firm that offers portfolio analytics through its SaaS-delivered tools. Its flagship Revolution platform uses a hybrid cloud model that leverages both AWS and Azure architectures.

As one executive told WatersTechnology, StatPro was “very gutsy when they built Revolution from the ground up as a pure SaaS [software-as-a-service] solution. They effectively end-of-life’d [sic] their existing legacy-deployed solution [StatPro 7] and went all-in by investing in a next-generation, cloud-based system, betting that they would end up with increased revenue to offset the loss of revenue from their legacy product line. No other firm in the asset management software business has been so daring,” they said after the acquisition. “Competitors have implemented ‘faux’ SaaS offerings by putting boxes in data centers and letting firms remote access them. So I think the sale is cashing in on this initiative. Going private will make it easier for StatPro to continue with this strategy without having to answer to the markets.”

***

It’s also important to remember, though, that these cloud migration projects are not simple. While they are absolutely necessary to help a firm future-proof itself, there is a certain amount of Band-Aid ripping involved, along with unforeseen hiccups and stubborn client migrations. While I previously cited MarkitSERV and Trading Technologies as examples of ambitious projects, both have experienced delays.

Trading Technologies was an early mover when it looked to sunset its flagship X_Trader order management system (OMS) in favor of the new TT platform. The migration has been slow, but it is progressing, and the trading platform provider has since launched its infrastructure-as-a-service (IaaS) offering. Earlier this year, TT announced Graystone Asset Management as the first formal client of its consolidated IaaS product.

MarkitSERV, IHS Markit’s post-trade processing business, has gone live with its cloud-based TradeServ platform, starting with foreign exchange (FX) non-deliverable forwards (NDFs). The plan was to launch credit on TradeServ in the first half of 2019, but that rollout has been delayed.

“The re-platforming of our trade confirmation and processing service for credit derivatives is one of the largest and most complex upgrades ever offered to the market,” according to a spokesperson. “A large group of dealers, asset managers and infrastructure players, including clearinghouses, are currently conducting testing to validate the interaction between participants and other credit market systems. The time and rigor we are putting into testing will help ensure that the launch is as seamless and as transformative as we know it can be.”

On the plus side, as of mid-October, the new TradeServ platform had processed about 1.4 million NDF trades for clearing, and 34 financial institutions were active on the platform for FX.


Cloud Nuts & Bolts

For the April issue of WatersTechnology, we wrote a 4,000-word feature that delved into many of the problems firms face when moving to the public cloud. Early adopters have had to learn lessons around migration strategy and architecting, cost management, security, staying abreast of updates, how to properly deploy open-source tools, and, crucially, how to change the culture of the firm while trying to attract new talent with unique skillsets. All of these considerations, taken together, help inform which of the public cloud providers a firm chooses.

For that feature, I spoke with the CTOs of Blackstone and Bank of America and Northern Trust’s CIO, as well as senior executives from Numerix, Trading Technologies, and Cloudreach. If you would like more detail on some of the lessons learned along the way by banks, buy-side firms, and vendors alike, click here.

***

Not all challenges are internal. As firms are increasingly turning to public cloud providers for help, regulators around the globe are starting to take notice.

To see how the regulatory ball is starting to roll, consider that financial companies in Europe must now maintain a register of all their outsourcing arrangements under new guidelines, as regulators worry that cloud services are concentrated among just a handful of providers.

The European Banking Authority’s new guidelines on outsourcing came into force at the end of September, replacing the existing 2006 guidelines and some 2017 guidance on cloud outsourcing. They apply to a wider range of companies than before: any firm that falls within the EBA’s mandate, which means credit institutions and investment firms like banks and hedge funds, as well as payment and electronic money institutions, a category that will pull fintech companies into scope. The guidelines don’t apply to insurers.

***

One only needs to look at Google to see how regulators are already forcing tech giants to reconsider how they work with financial services institutions. Speaking at the Google Next conference in November, Suzanne Frey, GCP’s vice president of engineering, noted that her company is taking an industry-specific approach to service-level agreements. That includes writing audit rights into its contracts to adhere to EU guidelines on outsourcing, which were updated this year and require that outsourcing contracts set out users’ rights to audit providers’ premises, including devices, systems, and networks.

“We have tweaked our contracts to be very specific to industries and even in some cases the regions in which we are working. …We have been working with financial regulators in Europe and worldwide on engagement and audits to expose the full depth of our operations, how we handle information, how we handle business continuity, and the like. And then we are continuing to invest in various compliance regimes,” she said.

***

The regulators themselves are in a bit of an awkward position. On the one hand, they need to do a better job of monitoring cloud proliferation and understanding what kind of risk it brings to the markets. On the other, they need to evolve technologically alongside those they oversee, and that means embracing the cloud. The Financial Conduct Authority (FCA), for example, is turning to the cloud to improve its analytics capabilities and, potentially, to streamline the reporting process.

Over the last few years, the FCA has moved almost all of its technology estate to the cloud, allowing it to roll out a major data analytics pilot program across the organization. The regulator can leverage huge amounts of data from the roughly 59,000 firms under its purview. Over the course of 2019, the FCA brought together all the data initiatives happening across the organization, using data lakes and new methods of data collection. The idea now is to reduce areas of overlap and identify pockets of resistance.

“We are constantly being asked for greater efficiencies, to be more effective around monitoring that broad sweep of sectors that we have,” says Steven Green, the FCA’s head of central data services.

  • READ MORE: My colleague Jo Wright spoke with several lawmakers, lawyers, and end-users to see how regulators could approach supervising public cloud companies in the future. Also, as cloud computing becomes an ever-more critical component of any modern financial technology infrastructure, cloud deals are coming under increased regulatory scrutiny; Jo takes a deep dive into security and SLAs.

People Problem

As mentioned earlier, and as you’ll hear many tech executives say, there’s another reason to embrace the cloud: talent acquisition and retention. It’s true: there are few (if any) talented programmers and engineers who want to work on Cobol-based legacy platforms. But firms are learning both that such talent is in short supply and that changing the hearts and minds of existing employees is rarely a smooth process.

The London Stock Exchange Group (LSEG) is moving forward with an ambitious program under which 60% of service delivery and corporate computing will operate in the public cloud by 2021.

While speaking at the Extent Conference, hosted by Exactpro, in September, Ann Neidenbach, chief information officer for the LSEG, noted that the company has already moved 40 workloads to the cloud, but it has encountered something of a people problem, as this project aims to institutionalize a DevOps methodology for cloud deployment going forward.

“As you have that culture transition, it is hard for the developers to go through this transition,” Neidenbach said. “‘What do you mean they rejected my code? Why can’t I move this server? Why can’t I do this, this, this in order to pass our chief information security officer’s checklist?’” a developer may now ask, she added.

Howard Boville, Bank of America’s chief technology officer, also points to the human-capital element of the project as key to making this work. He advocates creating a governing body, appropriate even for smaller institutions, that includes “representation across all areas of the enterprise, while remaining customer-focused and managing risk appropriately.”

Additionally, he says it is necessary to create an education program, since this is a significant paradigm shift. “Cloud migration and implementation is a cultural change requiring full training and communication planning to support the migration strategy,” Boville says.

Blackstone’s CTO Bill Murphy adds that moving to the cloud is “all about creating velocity in our technical deployment.” As such, it is vital to have people in place who have done this before, and sometimes that means bringing in outside experts to keep the process moving; otherwise, momentum will stall. Blackstone has both hired developers with cloud expertise and invested in a cloud specialist, Cloudreach, to help move its cloud-development project along more quickly.

Meanwhile, Axa’s head of data services, Edosa Odaro, says that the struggle to make optimal use of cloud computing is changing. “The battlefield has shifted from technology to talent,” he says. “That is why you see the competition out there for the best.” When asked how to avoid common pitfalls, Odaro says: “If it was a single word, it should be ‘talent.’ You need to have more focus around getting the right people and the right mindset, helping you actually deliver, whether it’s data or technology.”

Firms have been focusing on getting the tools and technologies right, thinking that, “You just buy a new tool, a new shiny thing, and it will just happen,” Odaro says. “Unfortunately, that is not the case.”
