Feature: Is Wall Street Ready for Quantum Computing?

Scott Aaronson, associate professor of electrical engineering and computer science, Massachusetts Institute of Technology

Is the financial services industry ready for the processing power promised by the next disruptive technology? Rob Daly looks at advances in quantum computing and what they mean for the investment banking community.

The “quantum” prefix has appeared numerous times in the names of science fiction movies, television series and books—so often that its significance has been obscured. But outside of pop culture, quantum computing is set to shake up the financial services industry and indeed any computing-dependent business. The only question is when it will be available.

The power of quantum computing comes from exploiting the quirky behavior found in quantum mechanics. The best-known example, and the basis of quantum computing, is the famous double-slit experiment, in which the experimenter shines a light through a screen with two slits onto a second screen, which shows bright areas interleaved with dark fringes where the light doesn’t go. “If you close one of the slits, the light does go there,” says Scott Aaronson, associate professor of electrical engineering and computer science at Massachusetts Institute of Technology (MIT). “This result seems to violate the laws of probability because if you increase the number of ways something could happen, it should increase the chance that it would happen. Yet you see exactly the opposite.”

Strange Behavior
After about 20 years of pondering this phenomenon, physicists theorized that the photons were in a “superposition,” which allowed them to go through both slits simultaneously, and that the dark fringes were caused by destructive interference.
“Where you see the dark fringes is where the two paths that the photon could have taken are,” says Aaronson. “One contributes a positive amplitude and the other contributes a negative amplitude that cancel each other out.”

For each “qubit,” or individual quantum particle, added to a system, the number of possible states doubles—the total is two raised to the power of the number of qubits. In other words, 100 qubits give 2^100 possible states, and 1,000 qubits give 2^1,000.
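The doubling is easy to check numerically; a minimal sketch in Python (the exact integers quickly outgrow what any spreadsheet will display):

```python
# Each additional qubit doubles the number of basis states a quantum
# register can be in superposition over: n qubits -> 2**n states.
def state_space_size(num_qubits: int) -> int:
    return 2 ** num_qubits

for n in (1, 10, 100):
    print(n, state_space_size(n))
```

Python's arbitrary-precision integers make even `state_space_size(1000)` an exact, if enormous, number.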

“The entire trick of quantum computing is to arrange things so that the different paths that coincide with the wrong answer interfere destructively with each other,” Aaronson says. “On the other hand, the different paths that lead to the right answer should all have the same sign, or be in phase with each other, so that all of their amplitudes add up. If you can arrange that, you can measure the results at the end of the process and observe the right answer with high probability.”
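The cancellation Aaronson describes can be illustrated with a toy calculation: an outcome's probability is the squared magnitude of the sum of the amplitudes of all paths leading to it, so opposite-sign amplitudes cancel while in-phase ones reinforce. A minimal sketch, not a real quantum simulation:

```python
def probability(amplitudes):
    """Probability of an outcome = |sum of path amplitudes|^2."""
    return abs(sum(amplitudes)) ** 2

# Two equally weighted paths, out of phase: they cancel -- a dark fringe.
print(probability([0.5, -0.5]))  # 0.0
# The same two paths in phase: their amplitudes add up.
print(probability([0.5, 0.5]))   # 1.0
```

Note the contrast with classical probability, where adding a second path to an outcome could only raise its likelihood.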

Academia
This research would likely have remained in the world of academia had MIT professor Peter Shor not devised his factoring algorithm in 1994, which efficiently breaks any integer into its prime factors. Theoretically, run on a sufficiently powerful quantum computer, the algorithm could crack public-key encryption, which is based on multiplying two very large prime numbers and using the resulting product as an encryption key. Prior to quantum computing, these products were so large that conventional computing systems would have taken decades or longer to find their factors.
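To see why factoring is the linchpin: the obvious classical attack is trial division, whose cost grows with the size of the smaller prime factor. A toy sketch of that attack, assuming nothing about the real key material (production RSA moduli run to hundreds of digits, which is exactly what makes the classical approach hopeless and Shor's algorithm threatening):

```python
def trial_division(n: int) -> list:
    """Factor n by trying divisors up to sqrt(n) -- fine for small n,
    utterly infeasible for the hundreds-of-digits products in real keys."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# A tiny "RSA-style" modulus: the product of two primes.
print(trial_division(61 * 53))  # [53, 61]
```

For a 2,048-bit modulus the loop would need on the order of 2^1024 iterations, which is the gap Shor's algorithm closes.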

Various governmental agencies took note of this development and began funding research into quantum computing.

“There is a national security reason to research this and to get the technology working,” says Geoff Woollacott, a senior analyst at technology market analysis firm Technology Business Research (TBR). “There’s a theory that the future of warfare will involve hacking to take down the enemy’s power grid and other things.”

Besides factoring numbers, quantum computers can act as general-purpose quantum simulators that researchers could use for protein folding, determining the action of a drug molecule or the behavior of high-temperature superconductors, according to Aaronson. “Today, we know a fair amount about what a quantum computer would and would not be good for. The bottom line is that a quantum computer offers incredible speedups for a few specific problems, but probably only modest speedups for more general problems,” he says.

A quantum computer could also be used to tackle optimization problems via the adiabatic algorithm. “This is a heuristic algorithm, which means it’s not guaranteed to work,” says Aaronson. “It might work on some instances of a problem but not on others. We can construct theoretical cases where it does well and others where it does poorly.”
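A classical cousin of such a heuristic is simulated annealing, which has the same character Aaronson describes: it often lands on a good answer but carries no guarantee. A minimal sketch on a one-dimensional cost landscape (the seed, step count and schedule are arbitrary choices for illustration):

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def anneal(cost, start, steps=5000, temp=2.0, cooling=0.999):
    """Simulated annealing: a heuristic that, like the adiabatic
    algorithm, may find a good answer but comes with no guarantee."""
    x = best = start
    for _ in range(steps):
        candidate = x + random.uniform(-1.0, 1.0)
        delta = cost(candidate) - cost(x)
        # Always accept improvements; accept worse moves with a
        # probability that shrinks as the "temperature" cools.
        if delta < 0 or random.random() < math.exp(-delta / temp):
            x = candidate
            if cost(x) < cost(best):
                best = x
        temp *= cooling
    return best

# A simple cost landscape whose minimum sits at x = 2.
print(anneal(lambda x: (x - 2) ** 2, start=-5.0))
```

On rugged landscapes with many local minima, the same code can stall far from the true optimum—exactly the "works on some instances, not on others" behavior in the quote.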

Ready for Prime Time?
Whether or not quantum computing will be available outside of the research community remains to be seen.

“Right now, theory is way ahead of experiments—building even a small quantum computer is an incredibly hard problem,” says Aaronson.

However, D-Wave Systems, a privately held company based in Burnaby, British Columbia, has been working on the problem since it was founded in 1999. In 2003, it developed its quantum-computing platform and has since doubled the number of qubits it can support every year, according to Geordie Rose, founder and CTO of D-Wave Systems.

In May of 2011, D-Wave announced the first public sale of its 128-qubit D-Wave One to aerospace giant Lockheed Martin for a reported $10 million.

Aaronson says he questions whether the platform is truly a quantum computer or some sort of hybrid between a quantum computer and a conventional computing platform, given that D-Wave appears to be several steps ahead of academia. “It would help if D-Wave would demonstrate quantum entanglement, where two particles are connected in some way that cannot be explained through classical correlation, within its system,” he says. “It would be circumstantial evidence that something quantum is happening.”

Rose attributes the difference in results to the basic difference between academic and commercial research.

“We are building a new type of computing system that competes with conventional systems for certain extremely important applications,” says Rose. “In order to legitimately try to build a computer to compete against systems that have had trillions of dollars of investment and 50 years of the world’s best technical people making them better, you need to take the problem seriously.”

On the other hand, the research community is more interested in building devices, according to Rose. “Building a computer is very different from doing research into devices,” he says. “A computer is built of devices. If you open your processor in your laptop and stare at it using the right type of microscope you will be able to look down into it and see devices—transistors, memory elements and whatnot. However, your computer is not a thrown-together series of devices—it is a system that is ultimately designed to satisfy the needs of the user. Computers, even quantum computers, are tools we build for users. They aren’t an end in themselves,” he adds.

At the heart of the D-Wave One is a quantum computer, but Rose says D-Wave’s goal is not to build a quantum computer for the sake of simply building one. Instead, the vendor developed the hardware in order to improve the performance of machine learning—the process of taking a large amount of complex data and developing automated ways to extract meaning from it.

“It could be predicting the price of a financial instrument in the future, finding a cure to a disease or detecting objects in images,” says Rose. “There are a large variety of applications of machine learning that are now transforming industry, and the importance of these techniques will continue to grow in the future as data volumes and complexity continue to rise.”

The basic machine-learning algorithms that D-Wave uses within the platform have been around in the public domain for years, but their performance scales poorly in a conventional computing environment. To improve the performance of these algorithms, D-Wave increased their complexity and offset it by running them on D-Wave One.

Making the Quantum Leap
No one associated with quantum computing believes that it will eventually replace the familiar x86 family of processors. “The platform is like a co-processor that does something very well,” explains Rose. “You just have to figure out how to use that co-processor to get an advantage to the problem you’re trying to solve. You can define a problem mathematically and select the platform on which to run it. It could be your conventional gear that would take 1,000 years. Or you could send the problem to our system, which might be able to do the same task in a couple of milliseconds.”

To connect to the D-Wave One, the vendor provides an application programming interface (API) that allows developers to send a call to the hardware with a defined problem. The platform solves the problem and sends back the answer, which can then be integrated into the consuming applications. Users can host their own D-Wave One system in their datacenter or access a remotely hosted version managed by D-Wave Systems.
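D-Wave's actual API is not documented here, but the request/response pattern described—define a problem, send it to the co-processor, consume the answer—can be sketched. Every name below is invented for illustration and is not D-Wave's API:

```python
# Hypothetical sketch only: the function and field names are invented,
# not taken from D-Wave's actual interface.
def solve(problem: dict) -> dict:
    """Stand-in for a call to remote solver hardware: accept a
    mathematically defined problem, return the best candidate found."""
    best = min(problem["candidates"], key=problem["cost"])
    return {"solution": best}

# The consuming application defines the problem, ships it off,
# and integrates the returned answer.
answer = solve({"candidates": [7, 3, 9, 1], "cost": lambda x: x * x})
print(answer["solution"])  # 1
```

In a real deployment the call would be an authenticated network request to either an on-premises box or the hosted service; the integration pattern on the client side is the same either way.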

Firms hosting the platform in their own datacenters will need to make minimal changes in terms of power and cooling.

“The actual temperature the processor runs at is extremely cold, but the user doesn’t need to deal with it—the cryogenics are hidden in the system,” says Rose. “The chips run at almost absolute zero, but the cooling system runs inside the box in a closed cycle—you set it and forget it.”

Firms accessing the hosted D-Wave offering will notice that they do not have to send the same volume of data they would to conventional systems, since machine-learning algorithms can significantly compress the necessary data.

“Imagine that you’re in a room, and on average there are 10^17 photons hitting your eye, but you don’t see the photons, you see a coffee cup. Your brain parses a large number of photons into a small amount of information, which is then labeled as one piece of information—a coffee cup,” says Rose.

Takeoff?
Beyond the various government signals intelligence organizations, will quantum computing take off? The strategic investors in D-Wave Systems—the Business Development Bank of Canada, Draper Fisher Jurvetson (DFJ), GrowthWorks, Harris & Harris Group, International Investment and Underwriting (IIU) and Kensington Capital Partners—seem to think so.

D-Wave officials are keeping mum about their future business plans. However, the current leadership includes Vern Brownell, the vendor’s president and CEO. Prior to joining the company, Brownell launched high-end computing vendor Egenera, and spent 11 years as CTO at Goldman Sachs. “This jump in hardware computation will get gobbled up, whether it’s available now or in 15 years,” says TBR’s Woollacott. “The government will see it first, the financial services industry will be second, and research and development will be next. It will be like the workstation adoption wave—the more things change, the more they stay the same.”
