Waters Wrap: On cloud migrations and VCRs

Financial services firms are increasingly embracing public cloud offerings, but there have been stumbles along the way, including around scalability, throttling, and a lack of true multi-cloud connectedness. These are lessons that must be learned if firms want to be able to take advantage of innovative new technologies, Bill Murphy tells Anthony.

In Europe and the US, there are two interesting discussions going on about consolidated tapes. In the States, the Securities and Exchange Commission’s (SEC’s) new market data system needs an independent administrator, which will be selected by requests for proposals (RFPs). As Jo Wright explains, the road ahead is murky due to underlying questions and ongoing litigation. And across the Atlantic, Josephine Gallagher writes that some firms are voicing concerns that mandating multiple consolidated tape providers could create fresh problems around data fragmentation and connectivity costs.

For the wonks out there, both of those articles will walk you through the various complexities surrounding these plans. But in this space, we’re talking about cloud.

Secrets of the cloud

Many years ago, I bought the movie The Godfather on VHS (parts I and II … don’t get me started on III). A few years later, I bought them on DVD. A couple of years after that, it was Blu-ray because a friend said the picture and sound quality were “so much better!” (Overrated.) Finally, about a year ago during the pandemic, I bought them on Amazon Prime, where they’re now stored in the cloud and I can watch them any time I like. I truly hope that this will be the last time that I give royalties to Mario Puzo and Francis Ford Coppola.

The cloud is certainly a game-changer: I can watch The Irishman on Netflix at my own leisurely pace rather than suffering through three hours and 30 minutes in a New York City movie theater? Yes, please! Sorry Frances McDormand, but Nomadland was just fine on my couch—I ain’t going back to the brick-and-mortar if I don’t have to.

As cloud and streaming services disrupt the business of making movies, banks are experiencing their own growing pains as they grapple with legacy systems and the accumulation of technical debt that these systems bring about. Essentially, legacy systems are a bank’s version of the VHS tape, Bill Murphy recently explained to me. They want to move to the cloud, but the migration process hasn’t been as easy as hoped.

Murphy has seen it all these last two-plus decades. He joined Capital IQ as its CTO soon after its founding in 1999. In 2011, he was named CTO of private-equity giant Blackstone. About a year ago, he joined Cresting Wave, which helps firms connect with innovative technologies. (And, most importantly, he’s a frequent guest on the Waters Wavelength Podcast.)

He says people are “starting to panic” because they’re realizing that new security products, data storage/scaling offerings, containerization techniques, and data science platforms largely do not run on-premises.

“Not being public cloud is soon going to be—actually, I think it already is—like having a VCR. You can watch your old movies and maybe the old movies are fine, but you can’t get any new releases on VHS anymore,” Murphy says. “So you’re literally stuck watching Star Wars, Episode 4 over and over again on your VCR, while other people are getting the new releases.”

Cloud challenges

To highlight some of the challenges financial services firms are facing, Max Bowie recently wrote about the scaling issues that some are experiencing after transitioning to a public cloud provider.

From Max’s story:

At its technology conference in June, industry regulator Finra revealed a startling statistic—the amount of information being collected to support the SEC-mandated Consolidated Audit Trail (Cat) of all US equities quote and trade data was straining the limits of the capacity it had provisioned in Amazon Web Services’ public cloud to run the Cat.

“The peak volumes of the Cat have exposed some scalability issues in the public cloud. Volumes keep going up. When the Cat was originally contemplated, the original plan stated that peak volumes would be 80 billion market events in a single day,” said Finra CIO Steve Randich at the conference.

It must be noted that Randich said that “Finra is a public cloud company,” and 99% of its data and tech are in the public cloud, but there have been surprises. For others, the issue of throttling has presented challenges.

“What we did not appropriately plan for was the throttling of services during our migration process,” Trevor Hicks, CTO at Wetherby Asset Management, told Max. “We are moving a significant amount of data to the Microsoft cloud, and based on the time of day and how much data we are moving, Microsoft throttles how much speed/resources are available to us. This is something we were aware of when we started the migration project, but we didn’t fully understand the actual impact it would have on our project timeline until we started to experience the throttling for ourselves.”

As a result, the firm extended the six-month project by around 45 days to compensate for delays in data transfer and to adjust its migration strategy, completing the migration around the end of September.
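A common engineering response to provider-side throttling, which Hicks’ team ran into mid-migration, is to retry with exponential backoff and jitter so that a throttled transfer slows down gracefully instead of failing outright. This is a generic sketch of that pattern, not a description of how Wetherby or Microsoft actually implemented their migration; the function and parameter names are illustrative.

```python
import random
import time


def with_backoff(op, is_throttled, max_retries=5, base_delay=0.5, sleep=time.sleep):
    """Retry `op` when the provider signals throttling.

    `is_throttled(exc)` decides whether an exception is a throttle signal.
    Delays grow exponentially with random jitter, so retries from many
    clients don't all arrive in lockstep and re-trigger the throttle.
    """
    for attempt in range(max_retries):
        try:
            return op()
        except Exception as exc:
            # Re-raise anything that isn't throttling, or if we're out of tries.
            if not is_throttled(exc) or attempt == max_retries - 1:
                raise
            delay = base_delay * (2 ** attempt) * random.uniform(0.5, 1.5)
            sleep(delay)
```

The wider lesson in Hicks’ account still holds: backoff protects individual calls, but the cumulative slowdown has to be budgeted into the project timeline.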

Another recent story that we wrote looked at how the Bank of Montreal (BMO) is starting the second phase of its cloud migration and development strategy. Lawrence Wan, the bank’s chief architect and innovation officer, told Nyela Graham that going forward, BMO wants to be “cloud first, cloud native” when it comes to development and engineering.

Still, he also said that half of the bank’s workflows are likely to remain on-premises over the next few years, articulating just how slow these transitions can be at a company that is heavily regulated with a litany of legacy systems.

“We will always have an on-prem presence and capability,” Wan says. He estimates that in three years, BMO will have 30% of workloads in the cloud, 50% on-premises, and 20% on an external service provider.

Reading these stories, a few questions popped into my mind. First, as the 2008 global financial crisis showed, the markets are more interconnected and complex than we previously thought, which gives rise to risks that are largely unforeseen. As more institutions rely on a handful of Big Tech cloud providers, even if you’re well architected for cloud scalability and new product development, what about third- and fourth-party risk?

As Hicks told Max: “One of our third-party service providers (which leverages AWS) bumped into their bursting limits as a result of a misconfiguration or misuse of their platform by another (unrelated) customer. Although we had nothing to do with this other customer, the services that the third party provides to us were severely degraded and brought our business to a crawl. Fortunately, we had considered this potential in a previous risk assessment of the third-party provider and had appropriate workarounds available to our employees so they could continue their work.”

This is something that Murphy emphasizes—the cloud providers can handle scaling, but it’s a people and legacy problem that firms need to address if they want to take advantage of all the public cloud’s good stuff. So, for example, he says he was recently speaking with someone at Google who said they can spin up 10,000 cores in 40 seconds.

“Ten thousand machines in 40 seconds … that’s pretty elastic,” Murphy says.

“Now, if your software isn’t architected in a way to take advantage of the elasticity that the cloud providers have, that’s when it gets dicey. If you have all of your major workloads running on a single database server and there’s significant state required in the logic that you’re running so it has to all run together, then it’s very hard to be elastic there, because you have to replicate large parts of your data for each workload. But, if you’ve rethought your architecture where you build it in such a way that the workload can be spread more easily across different services and horizontally across multiple resources, then the cloud becomes a huge advantage as it’s extremely elastic. So it’s more about the software architectures of the workloads dictating the perceived limitations of the cloud than it is about the cloud providers themselves,” Murphy says.

Finally, there’s this idea of “multi-cloud”—every company I talk to likes to say they’re multi-cloud, cloud-agnostic. While it’s true that almost everyone has contracts with at least two of the big four cloud providers—Amazon Web Services, Google Cloud, Microsoft Azure, and IBM—these are clunky connections, at best.

Here’s something that Finra’s Randich said but that didn’t make Max’s story:

“The big lesson learned is, when we went down this path originally, we naively thought we could do a multi-cloud approach—for example, take our surveillance apps and run them seamlessly between Azure and AWS. And we learned over time that this is not feasible … and the whole industry is struggling with vendor lock-in. The industry is not yet commoditized to the point that we’re vendor-agnostic about cloud workloads. There have been noted efforts to get there, whether it’s Netflix or GE, but nobody has gotten there in terms of running the same workload across multiple vendors—I think that’s going to take a while.”

Murphy echoes this point, saying that he has not seen instances of true interconnectedness and failover.

“The whole multi-cloud thing is a bit of a hoax—no one is doing it truly generic because it’s too hard and it’s costly to make it generic. They have multi-cloud, but it’s multi-cloud by service,” he says.

To wrap things up, I’ll offer up another analogy from Murphy that has nothing to do with movies. To keep hammering it home, this comes down to a people and legacy technology problem. Newly designed golf clubs are better at driving a ball and preventing slices than clubs from, say, 10, 20, or 30 years ago. So if all things are equal between two golfers, but one has more advanced clubs and knows how to use them, that person is likely to perform better.

Conversely, you can give a novice the best clubs in the world, but if they don’t have good technique, that ball is going into the trees no matter what.

“It’s all about design and architecture in order to take advantage of these new technologies in the cloud,” Murphy says. “It’s a people and legacy problem. I either know how to [architect to migrate to the cloud], but unfortunately I have to move so much [think: technical debt] to get it to the new, or I don’t know how to do it in such a way to make the migration efficient. Someone who says, ‘All of these golf clubs can’t get the ball to the green’—well, I don’t know … do you have the right swing to get it there?”

Have thoughts on cloud, golfing, or how Godfather III was an abomination? Let me know: anthony.malakian@infopro-digital.com.

Image: “Mount Starr King, Yosemite” by Albert Bierstadt, courtesy of the Cleveland Museum of Art’s open-access program.
