Vendors Continue to Move Products, Services to the Cloud: Some Examples

Last year, most (if not all) financial technology providers either completed or started major projects that involved moving their products and services to the cloud. WatersTechnology looks at 15 of the more interesting cloud-migration initiatives from 2020.

We wrote over 150 articles in 2020 that in some way touched on either a company developing or rolling out a cloud-based tool, or that focused on regulation of cloud providers and services.

Essentially, capital markets firms are increasingly embracing the major public cloud providers, specifically Amazon Web Services (AWS), Google Cloud Platform (GCP), Microsoft Azure, and IBM Cloud. And thanks to the expansion of cloud tools and platforms moving to the web, tech firms and end-users alike are leveraging APIs to deliver their services, which provides for a more connected industry in the cloud. When combined, these two movements are helping to push along the desktop application interoperability movement in the capital markets. And, finally, while banks had previously bristled at the idea of using open-source tools, much less contributing to these communities, as they move to the cloud, that ethos is quickly changing.

To show how this great migration is unfolding in the capital markets, we looked at 15 major technology providers and projects that were either started or completed last year. Again, this list represents less than 10% of the stories we wrote about cloud in 2020, but it should provide a good look at the benefits and challenges of making this transition. Trust me, we’ll be writing a lot about cloud-migration projects in 2021…and well beyond.

FactSet

FactSet had a busy 2020 expanding its cloud strategy.

First, the vendor migrated its physical ticker plant to AWS’s Amazon Elastic Compute Cloud (Amazon EC2). FactSet began the project in early 2020, and expects to complete the rollout this year. The new plant is initially running in parallel with its current ticker plant, which is hosted in multiple datacenters, including two major datacenters in the US, with local feed handlers deployed around the world to capture data locally from exchanges. Once the migration is complete, FactSet will decommission the old ticker plant.

“Cloud gives us features and functions that are cost-prohibitive in any other way—such as deploying infrastructure globally—and clients get lower latency and great quality regardless of market volumes. With ticker plants, you have to size for the highest market volumes, so you need a lot of capacity. And when you onboard new clients, it takes time to get infrastructure in place,” said Gene Fernandez, chief product and technology officer at FactSet. “This will allow us to onboard new exchanges very quickly and onboard new clients in a fraction of the time—and we’re going to gain a ton of operational efficiency.”

Second, FactSet partnered with Snowflake, a cloud-based and cloud-agnostic data-warehousing platform. Through the agreement, customers can now access 58 of FactSet’s datasets in the Snowflake environment. The release comprises 34 datasets from third parties through Open:FactSet Marketplace, alongside 24 proprietary FactSet feeds, which include fundamentals; supply chain data; geographic revenues; point-in-time consensus estimates; information on spending trends; news sentiment; and environmental, social, and governance (ESG) data, among others.

Additionally, while this article focuses on projects that went live last year, this week we wrote about how FactSet has launched Concordance, a new service for performing outsourced data mapping of traditional, alternative, and clients’ proprietary data. The service leverages FactSet’s existing operations and technology, as well as its relationship with cloud provider Snowflake, to provide mapping of data stored in mutual clients’ cloud environments.

Refinitiv

Refinitiv’s latest release of its enterprise data platform steps up its cloud compute capabilities, and rebrands the platform from Thomson Reuters Enterprise Platform, or TREP, to Refinitiv Real-Time Distribution System (RTDS).

The name change is part of a project to eliminate instances of branding relating to former owner Thomson Reuters, but also sees the vendor roll out simplified and standardized branding across its product lines.

This version and upcoming releases already scheduled on the vendor’s development roadmap add significant capabilities to support the ongoing rollout of RTDS in the cloud as an alternative to traditional client-premise deployments of its data platform, said Steve Moreton, global head of product management at technology support provider CJC.

“For the components released … it’s essentially the same software,” he said. “The software contains many enhancements, however—for example, on the ADH, improvements to load balancing.”

Moreton added that one notable change is that the platform’s Advanced Data Hub component, which has been rebranded as Refinitiv Real-Time Advanced Distribution Hub, is now certified for Amazon Web Services (AWS), which will help support Refinitiv’s plans for rolling out RTDS in the cloud. Another small but “important and symbolic” change is support for Amazon Linux 2, an operating system designed to work best in AWS environments.

“The TREP/RTDS software is very mature. We support around 800 clients, and each has different configurations of TREP—everything from low-latency, co-located clients, to those who use it to delay data for days before they access it. Wherever you are on that spectrum, you have a choice of using TREP/RTDS on a fully deployed or cloud-deployed model,” Moreton said. “I only know one or two firms that want to keep using on-site deployed technology. … The multicast element is still a challenge in the cloud, but you’ve got to move with your firm and your CTO. And the appetite is definitely there to move to the cloud option.”

SS&C Advent

In the institutional asset management and hedge fund world, SS&C Advent is best known for the Geneva portfolio management system, the Moxy order management system, the APX client management solution, and Genesis, the company’s portfolio construction and rebalancing offering. Of those fairly ubiquitous tools, only Genesis is cloud native. It’s going to take time, but the vendor plans to move those other platforms to the cloud over the coming years.

“We’d like to do a transformation where we can offer an entire suite that is cloud-based, whether that means building additional cloud-native capabilities, or transforming some of our legacy solutions, like a Geneva, to be able to plug-and-play with a much more open cloud platform,” said Karen Geiger, co-general manager of SS&C Advent. “This would be a multi-year endeavor, but we’re starting by building off what we already have with Genesis, and extending that.”

While Genesis does rebalancing and order creation, Geiger said that there are opportunities to expand further into the front and middle offices thanks to the platform’s software-as-a-service delivery mechanism. More than that, though, the company is considering how to move an offering like, say, Moxy—which SS&C Advent can currently host for users—and make it a more cohesive front-to-back experience via the cloud.

Geiger said the way that she thinks about this transition is a user can pick and choose which components of the Advent offering make the most sense for them. In this scenario, certain specialized components might fit better for a hedge fund rather than a traditional asset manager, but those pieces would nonetheless “seamlessly talk to each other” so that users could scale up or down depending on their needs.

So while Genesis was the first cloud-native platform under Advent’s asset management umbrella, the long-term goal is to have every platform deployed via a SaaS model.

Charles River

Last year, it was announced that the flagship Charles River Investment Management Solution (IMS) would be deployed on the Microsoft Azure cloud platform, with Charles River also incorporating Microsoft Power BI, a data visualization tool, and Microsoft Teams, a communications platform, into IMS. In April, CRD announced that IMS had “achieved platform and operational readiness on Microsoft Azure,” and that the vendor was migrating “new and existing SaaS (Software-as-a-Service) clients onto the new Azure infrastructure.”

“This is one of the most important projects for 2020, and we’re very pleased with the client response and the third-party’s work with us as well,” John Plansky, chief executive officer of Charles River, told WatersTechnology.

This cloud migration initiative with Azure will allow Charles River to deploy system upgrades more easily and to add new products to IMS, Plansky said. Additionally, the Azure partnership is just the first piece of a broader cloud strategy, as CRD is currently evaluating other potential partners.

“We need to make sure that we don’t lock ourselves in, and we already have a multi-cloud strategy at State Street,” Plansky said. “Our intention is to move Charles River in its entirety to the cloud. Those systems that are closest to the front office and to the data are what we are proactively moving to a common cloud architecture. State Street does have a private cloud that we leverage as well for other solutions.”

Confluence

In 2020, Confluence completed the initial tech integration of London-based StatPro, which it acquired in October 2019. Next up for the combined entity—which will operate under the Confluence moniker—will be an expansion of its offerings and client base in three targeted areas: risk analytics, performance measurement, and regulatory reporting.

But the larger goal, said Todd Moyer, Confluence president and chief operating officer, is to become a fully cloud-native and cloud-enabled platform leveraging microservices and open-architecture technology as key components of its technology stack.

“I think the biggest change you’ll see from Confluence going forward is that, to us, it’s about ease of access to our technology,” Moyer said. “If you’re dealing with legacy solutions that have heavy installs and a lot of need for human involvement, we feel like that will accelerate the necessity to move everything into a truly cloud-native platform.”

Revolution, which was StatPro’s flagship SaaS-based performance and risk measurement platform, is central to Confluence’s roadmap. It acts as a wrapper for the other acquired business units from StatPro, which include Milan-based ECPI, an ESG research and index business; InfoVest, a South African software provider specializing in data warehouse, ETL, and reporting software; advanced-risk-metrics specialist Investor Analytics; and fixed-income-analytics service provider UBS Delta. For now, Confluence has decided not to sunset any components from either company. However, it may decide to wind down certain legacy apps as it pursues its broader cloud and microservice strategies.

Nasdaq

Earlier last year, Nasdaq unveiled Nasdaq Cloud Data Service (NCDS), a cloud-based delivery option for its most popular market data services. It is designed to reduce the cost and time to market associated with deploying datafeeds from the exchange, and aimed at applications, web services, and visual displays.

Hosted by Amazon Web Services—though the service is cloud-agnostic and could be made available via any cloud operator—the data available includes a mix of real-time streaming and historical datafeeds and datasets, including the Nasdaq TotalView real-time feed, Nasdaq Last Sale, Nasdaq Basic, Nasdaq Global Index Data Service, Nasdaq Fund Network, and the Quandl alternative data platform.

Clients access the data using an open-source software development kit (SDK) available via GitHub to connect to a version of Nasdaq’s data API in the AWS cloud.
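The delivery pattern described here, an SDK wrapping a cloud data API so that clients simply iterate over messages rather than provisioning hardware or leased lines, can be sketched schematically. Everything below is hypothetical: the class names, message fields, and stubbed sample feed are illustrative stand-ins, not the actual NCDS SDK.

```python
from dataclasses import dataclass
from typing import Iterator

@dataclass
class Trade:
    """Toy last-sale message; real feed messages carry many more fields."""
    symbol: str
    price: float
    size: int

class CloudFeedClient:
    """Schematic SDK-style client. A real client would authenticate
    against the cloud endpoint and stream live messages; this one
    replays a stubbed sample so the pattern is runnable anywhere."""

    def __init__(self, api_key: str):
        self.api_key = api_key  # credentials would be validated server-side

    def stream(self, topic: str) -> Iterator[Trade]:
        sample = [
            Trade("AAPL", 182.50, 100),
            Trade("AAPL", 182.52, 250),
            Trade("MSFT", 411.10, 50),
        ]
        if topic == "trades":
            yield from sample

# Consuming the feed is just iteration: no hardware, no leased lines.
client = CloudFeedClient(api_key="demo-key")
last_sale = {}
for trade in client.stream("trades"):
    last_sale[trade.symbol] = trade.price  # track last sale per symbol
```

The point of the sketch is the shape of the integration: a credentialed client object and a message iterator, which is the kind of interface cloud-born fintechs expect.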

“For years, when people inquired about accessing data from Nasdaq and asked about specifics for an API, we gave them a laundry list of ways to connect to the data, which might include buying hardware, or leasing lines,” said Brandon Tepper, VP of Americas for Nasdaq’s Global Information Services business. “But a lot of new clients and fintechs were born in the cloud, so it made sense for our data to talk to them directly in the cloud, because they’re not used to buying hardware and leased lines.”

In addition to cloud-native data subscribers, many traditional financial firms with large datacenter infrastructures that are looking to migrate parts of their business to the cloud will be able to focus on developing their applications faster, Tepper said.

So Nasdaq decided to make its API available in the cloud, which reduces setup time from months to days, he said. “To us, having a client take months to get access to our data is a hurdle. If we can deliver it in hours or days, it reduces time to market and allows clients to get on with building their apps.”

In addition to cost and time-to-market benefits, another advantage of the shared delivery model is that firms using Quandl’s datasets can now access Nasdaq’s “traditional” streaming market data alongside them, Tepper said.

SmartStream

SmartStream Technologies is in the process of rewriting its entire solutions suite into cloud-native software so the vendor and its clients can exploit the cost and operational benefits of serverless cloud computing.

Having already released cloud-native versions of its Aurora (formerly known as Corona) digital payments processing platform and its SmartStream Air (and just recently, Air 2) artificial intelligence (AI)-enabled data reconciliation platform, the vendor is now planning to leverage the benefits of cloud-native software across its entire lineup of solutions, said Nick Smith, senior vice president of managed services at SmartStream.

Though the work is not yet complete, Smith said the applications comprising the vendor’s TLM solutions suite for corporate actions processing, collateral management, cash and liquidity management, and confirmations management are all “in flow” toward being migrated to cloud-native versions.

OneMarketData

OneMarketData is in the last leg of its two-year migration project to deploy its data solutions on the cloud.

The vendor’s two offerings, OneTick, an enterprise platform for managing and analyzing tick data, and Tick Data, a service for historical, intraday, and time-series data, have been made fully available on Amazon Web Services (AWS). OneMarketData is currently building out its real-time data feed and processing capabilities—which support equities, futures, and options—using the cloud architecture. Users can access the data feeds through a deployed platform, as a managed service, via a proprietary GUI for querying the data, or via an application programming interface (API) written in Python.

Although OneMarketData’s cloud journey began around the end of 2018, the planning phase stretches back much further. Jeff Banker, senior vice president of market development, said the idea sprouted when the firm acquired Tick Data in 2015, which, at the time, had already been running its entire infrastructure and data files on AWS.

“When we bought the company, it became obvious to us at the time that there were some benefits and practical metrics that would serve us well in terms of [cloud] conversion. So, we slowly started the process of aggregating all our data in AWS, then testing our software in AWS, our security processes, and so forth. So, it has been an optimization journey,” he said.

Prior to the migration, OneMarketData ran its data, servers, and solution through Equinix’s NY4 or NY2 co-location centers, or at private cloud facilities. When the firm bought Tick Data, it integrated the acquired data feed into OneTick, making both services available on one platform, while transitioning to the public cloud.

Xignite

Web services data provider Xignite unveiled a suite of microservices for data management, storage, and distribution in the cloud. The vendor said the suite will help companies that are already moving certain data and functions from in-house operations to cloud storage to migrate from monolithic legacy data architectures to more agile and less expensive cloud services and data sources.

Stephane Dubois, CEO of San Mateo, Calif.-based Xignite, said the company has been working on the suite of microservices, dubbed Xignite Enterprise Microservices, for around three years, and is already using them in-house to support its own data management and distribution activities.

“When you are building a cloud, you architect it in a certain way, and if you do it right, you end up building microservices,” Dubois said. “We built them out of necessity to allow us to scale, and now we are offering them to our clients.”

HPR

While HPR established itself as an ultra-low-latency hardware provider under the moniker Hyannis Port Research, the vendor has slowly entered into the cloud space. One of the best examples of this shift was the creation of Databot, a market data distribution platform that is delivered, initially, as part of its Omnibot switch.

The offering will be available via a field-programmable gate array (FPGA) or via a software-based cloud solution. HPR will first go to market with the FPGA implementation within Omnibot. But, because Databot is built into Omnibot, which also serves as a router and pre-trade risk gateway, firms will eventually be able to purchase a software-based version of the appliance.

“Historically, we haven’t been that interested in the data market because it’s been a crowded vendor space,” said Tony Amicangioli, founder and CEO of HPR. “We now believe, though, that by providing leading performance and the completeness [of a service], it will allow our clients to achieve better performance, reliability, and efficiency.”

This software version of Databot will allow HPR to target large investment banks while pushing into the cloud services space. Amicangioli said that while HPR’s institutional clients predominantly use its FPGA products, the vendor is seeing a growing base for its software tools among individual users within firms.

IHS Markit

Last year, IHS Markit rolled out a new product dubbed Risk Bureau, which aims to help buy-side firms calculate and model their risk using alternative data, machine learning, and cloud computing.

By leveraging GPUs run on the AWS Cloud and incorporating machine learning, IHS Markit has made calculating valuation adjustments (XVAs) for complex and simple derivatives portfolios 200% to 250% faster than traditional Monte Carlo models. XVAs include credit valuation adjustments, funding valuation adjustments, collateral valuation adjustments, and capital valuation adjustments.

Where a Monte Carlo simulation uses a forward-looking stochastic process, Risk Bureau works backward with a regression technique. By pre-computing all the different simulated parts with machine learning, users who are plotting and moving data points around on a graph can cut the lag time associated with calculating paths of single lines down to milliseconds.
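The pre-compute-then-query pattern described above can be sketched in miniature. The example below is a simplified stand-in, not IHS Markit’s actual method: it runs an expensive Monte Carlo valuation once per point on a grid of spot prices, then answers later queries by interpolating over the pre-computed grid instead of re-simulating (interpolation here standing in for the product’s regression technique; all instruments and parameters are illustrative).

```python
import bisect
import math
import random

def mc_price(spot, strike=100.0, vol=0.2, r=0.01, t=1.0, n_paths=20_000, seed=7):
    """Forward Monte Carlo valuation of a European call (GBM terminal prices)."""
    rng = random.Random(seed)
    drift = (r - 0.5 * vol * vol) * t
    disc = math.exp(-r * t)
    total = 0.0
    for _ in range(n_paths):
        z = rng.gauss(0.0, 1.0)
        terminal = spot * math.exp(drift + vol * math.sqrt(t) * z)
        total += max(terminal - strike, 0.0)
    return disc * total / n_paths

# Expensive step, done once up front: simulate on a grid of spot prices.
grid = [60.0 + 5.0 * i for i in range(17)]   # spots 60, 65, ..., 140
precomputed = [mc_price(s) for s in grid]

def fast_price(spot):
    """Cheap step, done per query: interpolate the pre-computed values,
    so plotting or shocking a point never re-runs the simulation."""
    i = min(max(bisect.bisect_left(grid, spot), 1), len(grid) - 1)
    x0, x1 = grid[i - 1], grid[i]
    y0, y1 = precomputed[i - 1], precomputed[i]
    return y0 + (y1 - y0) * (spot - x0) / (x1 - x0)
```

Re-pricing a shocked spot now costs a table lookup rather than tens of thousands of simulated paths, which is the source of the milliseconds-scale responsiveness described above.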

Linedata

Linedata is moving many of its platforms to the public cloud as the company looks to shift to a continuous delivery model.

Sujit Mascarenhas, vice president of engineering at Linedata, said moving to a public cloud from a private cloud setup has been a lengthy process. The vendor is already using Amazon Web Services (AWS), and it is now looking to tap into Microsoft’s environment.

One example of this transition is its Capitalstream lending and leasing software platform for credit origination and risk management. In 2019, Linedata began to take the private cloud version of the platform and code it for AWS. Capitalstream is the first platform on the credit side of the company to be moved to the public cloud.

Linedata also has plans to transition its intelligence dashboard interface, Clarity; its order management system, Longview; and its portfolio management system, Global Hedge, in the coming months. Mascarenhas said clients who do not want to use services on a public cloud can still use a hybrid system with both private and public cloud deployment.

Broadridge

In January 2020, Broadridge Financial Solutions announced it had partnered with IBM to build what will be known as the Broadridge Private Cloud, aimed at modernizing Broadridge’s infrastructure to provide automated private cloud services for critical workloads, while adding velocity and scale.

One early project focused on Broadridge’s data intelligence platform, which is catered to the corporate bonds market. Mark Schlesinger, Broadridge’s chief information officer, said that the platform will help buy- and sell-side firms unlock liquidity in the corporate bond market.

“This really takes the manual process out of this marketplace and brings it online and helps clients identify natural buyers and sellers, and enable the best execution through this AI-enabled electronic trading platform,” he said. 

Meanwhile, on the wealth management front, Broadridge is building what it’s calling a next-gen platform that will rely on private and public cloud components.

Crux

Data engineering and delivery company Crux Informatics is expanding its existing relationship with Google Cloud Platform (GCP) by integrating its data catalog—consisting of more than 15,000 datasets sourced from more than 100 data vendors—with Google’s own data universe. 

The move will allow all GCP customers to access Crux’s datasets through the cloud offering, as Google is able to repackage and resell Crux services to its own clients. For instance, if a GCP account holder is seeking particular datasets that happen to be part of Crux’s data catalog, the GCP salesperson can turn those datasets on in the customer’s cloud environment immediately, even if the customer is not also a Crux client, provided it holds a valid license or evaluation license for any third-party vendor data it wants to access. Crux can help users obtain licenses, if needed, though it wouldn’t be a party to them.

The data is delivered directly into users’ cloud environments, and specifically into BigQuery, Google’s fully managed data warehouse, so users can run analytics right away. Crux also has partnerships with GCP’s peers: Amazon Web Services, Microsoft Azure, and Snowflake Data Warehouse.

Citco

Fund administrator Citco launched a platform called Data Services that aggregates client information from internal and external sources. Data Services is a web application with the option to use the platform via an API. It is built on a data lake architecture in Amazon Web Services.

Data Services brings together, curates, and stores data from all the different Citco systems, as well as data from external vendors such as private equity fund administration software Investran or software provider Yardi.

Earlier in 2020, Citco migrated its portfolio management and accounting platform Æxeo Technology to AWS.
