NATAS Panels: Latency, Capacity Prompt Greater Hardware, Cloud Adoption

Panelists: Nikhil Bagga, UBS; Olivier Baetz, NovaSparks; Gil Tene, Azul Systems; Fergal Toomey, Corvil; and Yuri Salkinder, Credit Suisse; moderated by Scott Sullivan, Celent

Despite the increasing cost and difficulty of gaining a trading advantage from speed, the race to zero is expected to continue apace as firms roll out new technologies to cut latency relative to their competitors, according to a panel of experts at last week’s North American Trading Architecture Summit, hosted by sibling news sites WatersTechnology and Sell-Side Technology.

However, panelists warned that firms are more closely scrutinizing expenditure on latency-related projects, and that the industry must reach consensus on latency monitoring standards to achieve meaningful gains in future.

“The latency race means different things for different firms. And there is a limited number of firms who will compete for every last nanosecond… and for whom that will become increasingly expensive… but who will soon be using full-loop FPGAs [Field Programmable Gate Arrays]—and not just for data and execution, but also for the algorithm in the middle,” said Olivier Baetz, vice president of operations at hardware-accelerated data capture, processing and distribution technology vendor NovaSparks. “And as they press ahead, others will have to follow, so as to not let the gap get too big, and to stay in the game.”

So far, the complexity and cost of FPGA solutions have been a barrier to broader adoption, forcing firms to make critical choices about what belongs on a low-latency infrastructure and what should be handled elsewhere. “My rule of thumb is to use hardware when you have to provide absolutely cutthroat latency—and only then,” said Yuri Salkinder, director at Credit Suisse. “If the extra 15 or 20 microseconds is not absolutely critical, don’t use hardware—let that be taken care of using regular architectures, which are usually more flexible.”

However, some firms are looking at hardware acceleration as a means to improve performance more broadly rather than just to increase outright speed, whether to run high-performance options calculations or to reduce their overall hardware footprint, panelists said. “These ultra-low-latency teams are like little development labs that can play around with the FPGA cards, then transfer their knowledge to the rest of the firm,” said Nikhil Bagga, head of equities IT at UBS.

For example, FPGAs could be used to increase determinism and reduce jitter, Baetz said. “What aspect of low latency do you care about: is it consistency, or is it being really fast some of the time, even if it means you are really slow at other times—i.e. sacrificing consistency—because you won’t find many exchanges willing to do that,” said Gil Tene, chief technology officer of Azul Systems. “People are already gaming variability—if you are very fast but predictably freeze every 15 seconds, you’re very exposed.”
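Tene’s distinction between consistency and raw speed is, in practice, an argument for looking at the whole latency distribution rather than an average. Below is a minimal, illustrative sketch (the sample data and the “stall” behavior are invented for demonstration, not figures from the panel) showing how percentile summaries expose jitter that a median alone would hide:

```python
import random
import statistics

def summarize_latency(samples_us):
    """Summarize latency samples (microseconds) by percentile.

    A low median combined with a large p99/max gap is the signature of
    jitter: fast most of the time, very slow occasionally.
    """
    ordered = sorted(samples_us)
    def pct(p):
        return ordered[min(len(ordered) - 1, int(p * len(ordered)))]
    return {
        "p50": round(pct(0.50), 1),
        "p99": round(pct(0.99), 1),
        "p99.9": round(pct(0.999), 1),
        "max": round(ordered[-1], 1),
        "stdev": round(statistics.pstdev(ordered), 1),
    }

# Two invented systems with similar medians; the second stalls roughly
# once every thousand messages (think of a periodic pause).
steady = [random.gauss(20, 2) for _ in range(100_000)]
spiky = [15_000 if random.random() < 0.001 else random.gauss(15, 2)
         for _ in range(100_000)]

print("steady:", summarize_latency(steady))
print("spiky: ", summarize_latency(spiky))
```

The “spiky” system looks faster at the median, but its p99.9 and maximum reveal the predictable freezes Tene warns can be gamed by counterparties.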

But every dollar spent is wasted if the results are not measured, and measurement increasingly demands a standardized approach as the figures involved become ever more granular. “People are becoming more circumspect about making investments in technology around latency because they want to be sure they will achieve a return,” said Fergal Toomey, chief scientist at latency monitoring software vendor Corvil. “You have to understand how fast you need to be before you spend a lot of money meeting that target. For example, if an exchange’s latency is fluctuating at around a millisecond, the returns you can get from reducing your latency to microseconds are limited.”
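Toomey’s exchange example can be made concrete with back-of-the-envelope arithmetic. The figures below are hypothetical, chosen only to illustrate the point: when an upstream component fluctuates by roughly a millisecond, shaving tens of microseconds off your own stack barely changes the end-to-end picture.

```python
import random
import statistics

def end_to_end(own_latency_us, trials=100_000):
    """Own (fixed) latency plus an exchange whose latency is assumed,
    purely for illustration, to fluctuate around a millisecond."""
    samples = [own_latency_us + max(0.0, random.gauss(1_000, 300))
               for _ in range(trials)]
    return statistics.mean(samples), statistics.pstdev(samples)

for own in (100, 20, 5):  # progressively more expensive internal stacks
    mean, spread = end_to_end(own)
    print(f"own={own:>3}us  end-to-end mean={mean:7.1f}us  spread={spread:6.1f}us")

# The spread stays around 300us in every case: the exchange dominates the
# distribution, so the return on cutting your own latency from 100us to 5us
# is limited, which is exactly the point about knowing how fast you need to be.
```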

FIX Protocol’s FIX for Inter-Party Latency working group is defining standards for how latency should be monitored and reported, which panelists said are much-needed. “We would ask an exchange for latency figures and be told the matching engine latency, which is only one part of the process. There needs to be consistency and a common language,” Salkinder said.

These standards may become even more important going forward as latency monitoring shifts from a pure focus on raw speed to playing different roles for different segments of the industry. For example, Toomey said, exchanges and sell-side firms are producers of latency data, measuring execution latency and collecting that information to benchmark their performance and promote their services to clients, while buy-side firms may want to collate the data to monitor service providers and decide where to place strategies.

“I philosophically look at this race as never-ending. Even if you call a truce and say, ‘Let’s all agree not to go any faster,’ someone would break the truce because there’s a lot of money at stake,” Tene said. “Right now, we talk about nanoseconds on the wire, but I think the next step is [eliminating] the wire itself—not co-locating in the same country, city or block, but co-locating on the same machine so there is no wire.”

Capacity Courts Cloud
Meanwhile, lower-latency, higher-frequency data is pushing up overall data volumes, and with them capacity and storage requirements. In response, firms are looking to reconfigure their infrastructures to avoid unnecessary data distribution, and are exploring cloud technology to help confront these challenges, according to speakers on a separate panel at the same event.

Statistics from the Financial Information Forum show that message-per-second traffic has increased 63 percent over the past year, while the total number of messages flowing through equity and options exchanges has increased by more than 70 percent over the same period. Meanwhile, trading volumes have fallen, meaning that the growth in message rates is a direct result of higher volatility, said Arsalan Shahid, program director for the FIF.

Patrick Myles, chief technology officer at trading platform and data distribution technology vendor Caplin Systems, told attendees that a key issue for the web-based trading applications the vendor builds for its bank and online broker clients, which use them to distribute prices to their trading customers, is getting market data out to those end users’ front-end dashboards. To address this, the vendor focuses on building out the capacity of the back-end servers that must consume full order books, while pushing to each front end only the data that individual clients need, rather than increasing capacity across the whole infrastructure so that every update is sent to every client, Myles said.
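The pattern Myles describes, consuming full depth centrally but fanning out only what each client has asked for, is essentially subscription-based filtering at the edge. The sketch below is a minimal illustration of that idea; the class and method names are invented for demonstration and are not Caplin’s API.

```python
from collections import defaultdict

class SubscriptionFanout:
    """Consume full order-book updates centrally, but push to each client
    only the instruments it has subscribed to."""

    def __init__(self):
        self.subscriptions = defaultdict(set)  # client_id -> set of symbols
        self.clients = {}                      # client_id -> send callback

    def connect(self, client_id, send):
        self.clients[client_id] = send

    def subscribe(self, client_id, symbol):
        self.subscriptions[client_id].add(symbol)

    def on_book_update(self, symbol, best_bid, best_ask):
        # The back end sees every update; each front end receives only
        # top-of-book for the symbols it cares about.
        update = {"symbol": symbol, "bid": best_bid, "ask": best_ask}
        for client_id, symbols in self.subscriptions.items():
            if symbol in symbols:
                self.clients[client_id](update)

# Usage: two dashboard clients subscribing to different symbols.
fanout = SubscriptionFanout()
fanout.connect("desk-1", lambda u: print("desk-1 <-", u))
fanout.connect("desk-2", lambda u: print("desk-2 <-", u))
fanout.subscribe("desk-1", "VOD.L")
fanout.subscribe("desk-2", "BARC.L")
fanout.on_book_update("VOD.L", 101.2, 101.3)   # delivered to desk-1 only
fanout.on_book_update("BARC.L", 178.5, 178.6)  # delivered to desk-2 only
```

The capacity question then concentrates on the servers running `on_book_update`, rather than on every link between the back end and each client.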

In addition, for less latency-sensitive needs, rather than distributing data internally, firms could provide a centralized, cloud-hosted time-series database that allows separate business units to subscribe only to the data they need, minimizing unnecessary replication, said Peter Mager, chief technology officer at Davidson Kempner Capital Management.
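As a rough sketch of the idea Mager describes, the toy store below keeps a single copy of tick data and lets each desk query just the symbols and window it needs, rather than each unit holding a replicated feed. The class and method names are hypothetical, not a reference to any particular product.

```python
import bisect
from collections import defaultdict

class CentralTickStore:
    """Toy stand-in for a shared, cloud-hosted time-series database: one
    copy of the data, with each business unit pulling only its slice."""

    def __init__(self):
        self._ticks = defaultdict(list)  # symbol -> sorted [(timestamp, price)]

    def append(self, symbol, timestamp, price):
        bisect.insort(self._ticks[symbol], (timestamp, price))

    def query(self, symbols, start, end):
        # A desk asks for just its symbols and time window, rather than
        # receiving (and storing) a fully replicated feed.
        out = {}
        for sym in symbols:
            rows = self._ticks.get(sym, [])
            lo = bisect.bisect_left(rows, (start, float("-inf")))
            hi = bisect.bisect_right(rows, (end, float("inf")))
            out[sym] = rows[lo:hi]
        return out

store = CentralTickStore()
store.append("AAPL", 1, 600.1)
store.append("AAPL", 2, 600.3)
store.append("MSFT", 1, 31.0)
# A desk pulls only the AAPL ticks in its window; nothing else is copied.
print(store.query(["AAPL"], start=1, end=2))
```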

However, Howard Halberstein, vice president and lead Unix solutions architect at Deutsche Bank, said that the cloud does not always prove to be a cost-cutting measure—particularly for larger banks that have a significant fixed-cost infrastructure already in place. “When you start to remove servers from internal cost centers and put them out to the cloud, you’re not changing your fixed costs. Your datacenter’s not going away. You’re not laying off one person per server. In the short run, it’s extremely painful to get on the cloud,” he said.

As such, cloud-based solutions tend to be more in demand among smaller institutional banks without a large datacenter footprint, as well as among start-up buy-side firms that can begin leveraging the cloud from launch, Myles said.
