Barclays Nearly Finished with First Major Quantum Computing Experiment

Dr. Lee Braine walks through how the bank is experimenting with quantum computing and where the field is heading.

Barclays aims to conclude its first experiment applying quantum computing to settlement optimization in a few months’ time.

Dr. Lee Braine, of Barclays’ Chief Technology Office, tells WatersTechnology that the bank has been focusing on exploring the application of quantum computing to settlement optimization. “We are aiming to complete our first significant experiment in the next few months,” he says. “And then we’ll look to publish our results off the back of that.”

The challenge is deciding which variable to optimize: the number of transactions settled or the total value of the transactions settled. “There are implicit connections between different trades; for example, chains of back-to-back trades, where it may not be explicit whether a particular trade is an intermediate or an end-point in the chain, but you can connect them via a netting algorithm to settle together as a unit in terms of optimization,” he says.
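To make those chains concrete, here is a minimal sketch of chain netting. The trade representation and the linking rule (trades in the same security that share a counterparty settle as one unit) are illustrative assumptions, not Barclays’ actual netting algorithm:

```python
from collections import defaultdict

# Toy trades: (trade_id, seller, buyer, security, quantity).
# T1 and T2 form a back-to-back chain through intermediate party B.
trades = [
    ("T1", "A", "B", "XYZ", 100),
    ("T2", "B", "C", "XYZ", 100),
    ("T3", "D", "E", "QRS", 50),
]

def netting_units(trades):
    """Group trades into units that can settle together, by linking any
    two trades in the same security that share a counterparty."""
    parent = {t[0]: t[0] for t in trades}  # union-find over trade ids

    def find(x):
        while parent[x] != x:
            parent[x] = parent[parent[x]]  # path halving
            x = parent[x]
        return x

    def union(a, b):
        parent[find(a)] = find(b)

    touches = defaultdict(list)  # (party, security) -> trade ids
    for tid, seller, buyer, sec, _ in trades:
        touches[(seller, sec)].append(tid)
        touches[(buyer, sec)].append(tid)
    for tids in touches.values():
        for other in tids[1:]:
            union(tids[0], other)

    units = defaultdict(list)
    for tid, *_ in trades:
        units[find(tid)].append(tid)
    return list(units.values())

print(netting_units(trades))  # [['T1', 'T2'], ['T3']]
```

Settled as a unit, the T1–T2 chain nets the intermediate party B flat, which is the kind of implicit connection a settlement optimizer can exploit.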

Braine continues: “It is what is called an NP-complete problem in math, meaning it would take an extremely long period of time to come up with the optimal combination of trades that should be settled together. So that led us to a hypothesis that this challenge could be a good problem to be solved on a quantum computer that could potentially explore all the combinations in order to come up with the optimal combination.”   
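A rough way to see the combinatorial blow-up Braine describes: with n trades there are 2^n candidate subsets, and whether one trade can settle may depend on which others settle alongside it. The brute-force sketch below (its trades, balances, and net-cash feasibility rule are assumptions for illustration, not the bank’s formulation) finds the maximum-value feasible subset by checking every combination:

```python
from itertools import combinations

# Toy batch settlement: choose the subset of trades with maximum total
# value such that no party's net cash position goes negative. Brute force
# visits all 2^n subsets, which is what becomes intractable at scale.
trades = [  # (trade_id, payer, receiver, cash_amount)
    ("T1", "A", "B", 60),
    ("T2", "B", "C", 50),
    ("T3", "C", "A", 40),
    ("T4", "A", "C", 30),
]
balances = {"A": 20, "B": 20, "C": 10}  # cash each party starts with

def feasible(subset):
    net = dict(balances)
    for _, payer, receiver, amount in subset:
        net[payer] -= amount
        net[receiver] += amount
    return all(v >= 0 for v in net.values())

best = max(
    (s for r in range(len(trades) + 1)
     for s in combinations(trades, r) if feasible(s)),
    key=lambda s: sum(t[3] for t in s),
)
print([t[0] for t in best], sum(t[3] for t in best))  # ['T1', 'T2', 'T3'] 150
```

In this instance no single trade is feasible on its own, yet the three-trade chain settles, so trades cannot be scored in isolation; that interdependence is what makes the exact problem explode combinatorially.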


The challenge for Barclays—like everyone else experimenting with the technology—is that quantum processors currently don’t have a large number of qubits, the fundamental units of information in a quantum computer.

“You need to represent your problem using, depending on the specific quantum processor, on the order of 20 to 50 qubits at most,” Braine says. “Whereas to represent a full transaction set of maybe 50,000 trades, with all its associated data attributes and all the processing stages they progress through, you’d need several million qubits. So we have to abstract the nature of the problem and then run experiments using both a simplified algorithm and a reduced dataset. This effectively is a ‘toy solution’ to the problem.”

Braine says his team has to think through what the problem is, and then how to construct an abstracted version of it, so that not only is the number of qubits reduced to what is viable on a current quantum processor, but the number of processing steps also fits within the quantum-coherence time before the quantum state collapses. “We are currently demonstrating via a proof-of-concept that the idea is viable. But we won’t run it on actual trades, partly because there aren’t enough qubits yet. So it is purely a simulation,” he adds.
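As an illustration of what such an abstraction can look like (the encoding below, with its trade values, dependency rule, and penalty weight, is an assumed example, not Barclays’ formulation), each trade collapses to a single binary variable, conceptually one qubit, constraints become penalty terms in one objective, and the whole search space is evaluated classically, i.e. purely as a simulation:

```python
from itertools import product

# Each trade becomes one settle/don't-settle bit (conceptually one qubit),
# so three trades need only three variables. Constraint violations are
# charged as penalties inside a single objective function.
values = [60, 50, 40]  # value of settling trades T1, T2, T3
PENALTY = 100          # charged if T2 settles without T1 (back-to-back)

def objective(x):
    """Score an assignment x, a tuple of 0/1 decisions, one per trade."""
    score = sum(v for v, xi in zip(values, x) if xi)
    if x[1] and not x[0]:  # T2 depends on T1's delivery
        score -= PENALTY
    return score

# Classical exhaustive run over the 2^3 assignments that a quantum
# processor would explore in superposition.
best = max(product([0, 1], repeat=len(values)), key=objective)
print(best, objective(best))  # (1, 1, 1) 150
```

Penalty encodings of this shape are how constrained optimization problems are commonly mapped onto quantum optimizers, though at this scale the instance remains trivially solvable classically.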

Braine says that over the last few years there has been an acceleration in progress, with both coherence times and qubit counts increasing. Coherence time, the period for which qubits can retain quantum information, has been a key hurdle to developing practical applications of quantum computing.

When looking for practical use cases, Braine says, his team’s methodology is to question the nature of the problem and ask whether it maps to another problem for which a known quantum algorithm exists. If it does, they leverage that algorithm.

“Inevitably, there will be applications in non-financial areas, such as quantum chemistry, where it is much easier to model behavior with a small number of qubits,” Braine says. “I see applications there occurring sooner rather than later. I think the challenge in financial services will be in scaling up that number of qubits, but from our perspective, the important thing is to be what is called ‘quantum ready.’”

Braine says the objective for Barclays at this point is for people to understand the opportunities and the threats by coding and running some quantum computing programs, so the bank can form an informed view on the timeline for deploying these technologies. From a hardware and industry-adoption perspective, as noted above, the challenge is abstracting complex problems to fit a smaller number of qubits.

Barclays has been involved in quantum computing since the summer of 2017. It is part of the IBM Q Network, a global community of companies, startups, academic institutions, and research labs working in the field.

  • LISTEN: Kathryn Guarini, vice president of IBM Industry Research, and Bob Sutor, vice president of IBM Q strategy and ecosystem for IBM Research, talk about how trading firms could potentially use a quantum computer, why people shouldn’t be so concerned about quantum computers destroying blockchains, and what a realistic timeframe might look like for commercialized quantum computers.
