Need to know
- Market data platforms control the distribution of content from datafeeds to end-users and applications throughout an organization.
- Through attrition and acquisitions, Refinitiv has established itself as the dominant provider of market data platforms.
- New technologies—particularly the evolving use cases of cloud—are opening up this space to a new breed of platforms, potentially enabling users to replace part or all of their legacy data platforms.
- Changing user requirements are creating opportunities for alternative providers to win parts of that business, resulting in a more competitive marketplace for data infrastructure.
Market data feeds are often referred to as firehoses, spewing out torrents of data. But while data enriches financial firms’ trading operations in the same way water brings life to the earth, it’s not a river; it doesn’t flow naturally from source to soil, but is rather a complex, manufactured irrigation system that must be carefully managed to ensure it reaches the places where it’s needed. To do that, firms employ market data platforms to make sure the right data reaches the right consumers and applications.
First, a history lesson. For many years, Refinitiv (and before it, Thomson Reuters, and Reuters) has enjoyed the dominant—even monopolistic—position in this market, with (chronologically) its Triarch platform; the Tib platform sold under an arrangement with Tibco; then RMDS (the Reuters Market Data System), which combined features of both; then Trep (the Thomson Reuters Enterprise Platform), which under new ownership of the London Stock Exchange Group (LSEG), has been rebadged RTDS (Real-Time Distribution System).
Over the years, rival platforms either fell by the wayside, or were acquired by others—often by Reuters itself. One that gained some significant traction in the mid-2000s was Wombat Financial Software, a startup that foresaw the industry’s obsession with low latency and built a streamlined data platform and feed handler solution that suited high-performance trading desks, which frequently deployed Wombat while the rest of their business used RMDS.
Wombat was acquired by the ill-fated NYSE Technologies, and some of its assets were eventually bought by SR Labs (which was later rebranded as Vela), a similar startup that also acquired other fintech companies before ultimately selling to hardware ticker plant vendor Exegy earlier this year. The deal creates some tech synergies and potentially helps grow the footprint of each party’s existing business, but it also sets the stage for further acquisitions (or even for the combined company itself being acquired) to create a broader, full-service fintech solutions provider.
But the Exegy–Vela deal, along with other M&A activity and new developments from other vendors, also comes at a time when Refinitiv has been going through major changes that could allow other providers to gain a foothold. In just the past few years, the vendor was hived off from Thomson Reuters (its news organization retained by its Canadian former owner), sold to private equity firm Blackstone Group, and then flipped to the London Stock Exchange Group. While LSEG figured out how to absorb the company, clients griped about service and about uncertainty over the platform’s future. To its credit, Refinitiv responded by unveiling plans to make RTDS available in the cloud and by hosting regular client briefings with its executives and developers, but it also warned clients to update their software quickly to retire older versions whose code referenced its former owner, Thomson Reuters.
All this gave some clients pause for thought, and even momentary uncertainty creates a gap for others trying to muscle in on Refinitiv’s space. For example, Pegasus Enterprise Solutions—a tech startup founded by two former Reuters execs and one of their market data clients—this year released its MarketsIO Platform, which CEO Terry Roche says is “the first platform with the capabilities to replace a platform with the capabilities of Trep.”
Refinitiv was unable to make a spokesperson available in time to participate in this article.
Roche says MarketsIO will help firms eliminate vendor lock-in to proprietary technologies and content, making it easier for firms to integrate best-of-breed third-party content and tools, increasing competition and reducing costs.
“The ability to make change and address costs requires choice. Vendor lock-in eliminates that ability. Without choices, firms have no option but to pay higher prices. And with industry consolidation, I don’t see that changing,” Roche says. “We’re building an open-technology feature to enable the capital markets to operate as a modern industry, to exchange standards, to recover and monetize their IP, and transform the fabric of data that’s been unchanged for 30 years.”
One key to Pegasus’ proposition is that it’s not a data company, and so has no interest in building a platform that advantages proprietary datasets. Its aim is purely to provide a suite of tools—from its MarketsIO EventStream platform to APIs, an Excel add-in, a data viewer for operations staff, a Control Center entitlements service, and other components—all of which, Roche says, require one-tenth of the code footprint of traditional platforms and should deliver savings of at least 50%. These tools aim to deliver the mechanisms and controls by which a client firm can use anyone’s data how and where it wants.
“Our mission is to empower those who create and consume data, to unlock them, and to provide a competitive environment for market data that provides choice and lower cost,” Roche says. “The first step to cutting platform costs, we suggest, is to obtain platform independence from the technology you rely on. And the way to do that is to have a high-performance API suite to connect to your other systems using standard interfaces. That makes development teams more efficient, so that when they make changes, those take place in a more efficient way.”
Build bridges, not barriers
Rob Wallos, chief innovation officer at West Highland Support Services, who served as global technology director at Thomson Reuters between 2010 and 2015, thinks the API model has the potential to serve as the foundation of next-generation data platforms, augmented with services that add value.
“I feel like REST APIs are probably sufficient for most applications, outside of low-latency ones. So, I would identify those and marry them to an API provider where I could change things easily. I would do that first, and that would take a lot of applications off the table. Then I’d move on to the next demographic, such as front-office users who need tools, analytics, and rich data—areas where providers like Refinitiv shine—and take advantage of the interplay between the data,” Wallos says.
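In practice, the pattern Wallos describes is a thin, provider-agnostic REST layer sitting between applications and whichever vendor supplies the data. Below is a minimal sketch in Python; the gateway URL, endpoint path, and response fields are hypothetical assumptions for illustration, not any vendor’s actual API.

```python
import requests  # standard third-party HTTP client

# Hypothetical internal gateway; applications depend on this interface,
# not on any specific data vendor's API.
BASE_URL = "https://marketdata.internal.example.com/v1"

def get_quote(symbol: str, token: str) -> dict:
    """Fetch the latest quote for a symbol from whichever provider
    currently sits behind the internal gateway."""
    resp = requests.get(
        f"{BASE_URL}/quotes/{symbol}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=5,
    )
    resp.raise_for_status()
    # Assumed response shape, e.g. {"symbol": "VOD.L", "bid": ..., "ask": ...}
    return resp.json()

print(get_quote("VOD.L", token="<session-token>"))
```

Because consuming applications bind only to the internal interface, swapping the provider behind the gateway does not require rewriting them, which is precisely the “change things easily” property Wallos wants.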
Making developers more efficient empowers them to make changes—potentially to replace large parts of firms’ existing infrastructures, or to create structurally independent “bridges” between existing infrastructure and different data sources and applications, without being locked into a specific vendor’s content or to one central component that can’t be replaced.
One data and technology vendor that has touted the idea of bridges to future-proof firms’ infrastructures is Activ Financial, which was this year acquired by IT infrastructure provider Options Technology. Danny Moore, CEO of Options—who, in his prior role as COO of Wombat, competed directly against Activ—notes the investment that Activ made over the years in identifier mapping, data transformation, data conflation, and other “boring” but necessary functions to create “a very complete” platform that will combine co-located data capture at exchanges with its platform running in multiple cloud environments.
That in itself, with Activ’s enterprise platform and broad data coverage, provides the basis to displace elements of existing platforms. But where Moore sees even greater potential, much like Pegasus, is in becoming an independent enabler for content providers.
“We have global distribution, distributed data capture, and standardized formats … and what we want to do is create something like an app store that enables exchanges and other data providers to get their high-quality data to market,” he says. “Then it becomes easier to have conversations with data providers about what they want to create and how they want to commercialize it—we become that enablement layer.”
Show me the money
The prevailing theme driving firms to consider new platforms is cost, and the potential for new technology to reduce it.
“Back in the early 2000s, when Reuters was facing competition from startups like Wombat, HyperFeed, Infodyne, and Activ Financial, the main issue wasn’t cost—it was data quality and speed,” Wallos says. “Now, I think interest in other platforms is a question of cost. Companies feel the price points they’re paying for feeds versus what they’re getting … and some feel they can do better with a smaller provider,” such as one with a lightweight platform that runs in the cloud and serves a smaller, more focused set of use cases.
While the most latency-sensitive trading firms may decide it’s worth the expense of building their own solutions or buying ultra-high-performance solutions from niche providers, most firms consider market data platforms a necessary expense rather than a strategic investment that contributes directly to business or revenue growth. And for that majority of firms, switching platforms—while a potentially expensive and complex undertaking—has the potential to yield significant cost savings, especially if viable alternative platforms exist to keep prices competitive.
This is one of the drivers that New York-based data and technology vendor MayStreet—like Pegasus, Options, and others—is hoping to capitalize on. MayStreet has traditionally built components such as its Bellport feed handlers to support high-performance data needs, but now sees much greater potential in offering these collectively as a platform: an integrated suite comprising its feed handlers, its Data Lake, and its Analytics Workbench.
“Financial firms spend hundreds of billions of dollars per year globally on capital markets IT—$33 billion on data alone, and 10 times that on making that data usable,” says MayStreet CEO Patrick Flannery. “I think there’s a much bigger business to be built here, given that the financial services industry is very competitive and there is not a lot of advantage to be gained from building this in-house.”
Significant parts of that spend could be replaced with combinations of components that create an on-demand data infrastructure, Flannery says, adding that relevant use cases range from trading groups to risk, compliance, and trade desk support departments.
“Too often, a firm might be paying a lot of money for different solutions in the front office and middle office, such as a low-latency solution in the front office, and other systems for risk, reconciliation, and so on,” he says, adding that using the same underlying building block-style components could potentially deliver “significant” cost-of-ownership improvements. “We did a crude comparison of our Data Lake against [collecting data via] co-location. A tier-one bank might pay $15 million per year for a data lake. Now, it depends how many venues that bank connects to—many need far fewer than 300 venues—but we can deliver that for one-fifth of the cost.”
And it’s not just smaller, agile tech startups driving change: Some of those venues themselves also believe there’s an opportunity to play a role beyond just providing exchange content. This was the case with LSEG’s acquisition of Refinitiv, and now Nasdaq is also eyeing this space with its new cloud-based Nasdaq Data Fabric offering, which it says allows firms to outsource large parts of their data infrastructure.
“There are all these technical challenges that we’ve become used to. But technology has improved a lot, and cloud is making more things possible,” says Bill Dague, head of alternative data at Nasdaq. “With Data Fabric and Data Link, we think that for the content we have, there’s a better way to distribute it, and for the content that clients have, there’s a better way to distribute that, too.”
Data Fabric is not just a platform for distributing data from Nasdaq’s exchanges: It is a companion to the vendor’s Data Link offering of third-party data, based on its 2019 acquisition of alternative data marketplace Quandl. Beyond that, it also allows clients to use it as the delivery mechanism for any other third-party vendor data they consume. Instead of maintaining direct links to multiple vendors, firms access Data Fabric via existing connections to Nasdaq, while Nasdaq obtains the data from vendors via its own existing connections, eliminating one side of the cost triangle. A further feature, currently in testing, will let firms use Data Fabric to distribute and manage their internal data as well.
Nasdaq looks at Data Fabric as an additional channel for data sources to reach potential clients, rather than something designed to replace vendors themselves—though it is clearly designed to replace aspects of data infrastructure currently controlled by other vendors. In short, Data Fabric potentially becomes the conduit for all of a firm’s data from a multitude of sources, via dedicated, isolated, and secure channels.
“For example, if someone subscribes to Bloomberg Fundamentals, we go and pick it up. If a second client needs that, we go and get it again—we’re not trying to collect it once and federate it … and we don’t want to become a redistributor,” Dague says. “We think of our platform more like a Snowflake, Databricks, or even some of Amazon Web Services—as an extension of the client’s infrastructure.”
In addition, firms can use Data Fabric for compliance and governance tasks, such as to centrally track and manage purchases and usage, entitlements, and reporting. That alone would make this a compelling proposition, but Nasdaq’s existing presence and scale as a market and infrastructure operator may make it an easier sale to potential users thinking twice about big migration projects.
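To make the entitlements piece concrete, here is a toy, in-memory sketch of the kind of check a central service performs before releasing a dataset to a user. All names are illustrative; a production service such as Nasdaq’s would back this with persistent storage, audit logs, and usage reporting.

```python
from dataclasses import dataclass, field

@dataclass
class Entitlements:
    """Toy registry mapping users to the datasets they may access."""
    grants: dict[str, set[str]] = field(default_factory=dict)

    def grant(self, user: str, dataset: str) -> None:
        self.grants.setdefault(user, set()).add(dataset)

    def check(self, user: str, dataset: str) -> bool:
        # A real service would also log this lookup for usage reporting.
        return dataset in self.grants.get(user, set())

ents = Entitlements()
ents.grant("trader_1", "vendor_fundamentals")
assert ents.check("trader_1", "vendor_fundamentals")
assert not ents.check("trader_1", "level2_depth")  # never granted
```

Centralizing this check is what lets a platform operator track purchases and usage in one place, rather than reconciling entitlements across every vendor connection.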
“People know that we understand markets, we understand data, and we understand how to run mission-critical systems,” Dague says.
It’s not all about the Benjamins
But cost isn’t the only driver of change: Nasdaq argues that Data Fabric addresses a combination of cost and complexities, as well as the need to spin up access to data just as rapidly as firms can spin up cloud resources, so as to be able to quickly take advantage of new trading opportunities. “Every day that a firm doesn’t have access to a dataset is a day it’s not in the market, and not making money,” Dague says.
But another reason that data platforms need to change—beyond keeping up with the advances of new technologies—is the need to keep up with the advancing needs of a changing user base.
Hence, what’s more important than being able to displace the data platforms available today is what a platform’s technology roadmap looks like over the next decade, says James Bomer, COO of the Activ business at Options Technology. “Going forward, we can expect to see more diverse applications requiring more diverse sets of data,” he says.
Data giant Bloomberg experienced this while developing its BQuant platform for creating and publishing research, which was originally designed to meet specific needs of researchers and analysts. Bloomberg had already built desktop and Microsoft .Net tools for building apps for clients, but these were aimed at developers and were “clunky” for analysts to use. Instead, the team wanted to be able to publish directly from Python notebooks.
“In the beginning, it was a desktop product—a terminal for quants. But we quickly realized this was something that could be used across an enterprise,” says Tony McManus, global head of Bloomberg’s Enterprise Data division, and himself a former director at Wombat and MD at NYSE Technologies.
“We envisaged this as being primarily for quants to do research,” echoes Bloomberg CTO Shawn Edwards. “But the biggest adoption was among firms using it for internal publication of research to their own workflows. It showed that quants needed a way to communicate with other users within their firms.”
And by the time the vendor released BQuant Enterprise earlier this year, which allowed users to share the output of BQuant across their firm, it was clear that BQuant’s potential had evolved far beyond its initial use cases.
“Originally, we weren’t really targeting quantitative programmers. But our user base was changing. … So, early on we recognized that this was something that could be truly transformational,” Edwards says.
At one level, BQuant Enterprise provides greater compute power than the desktop BQuant could alone, allowing users to run automated machine-learning computations on vast quantities of internal and third-party data. It’s cloud-native, using Kubernetes to orchestrate containers running Jupyter, alongside Apache Spark for big-data processing, NumPy for mathematical functions, and other data analysis tools and open-source libraries.
“It takes you from ideation through testing and to production, so you have the complete workflow,” Edwards says.
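For a sense of what that stack containerizes, the research loop at the heart of such platforms is, in spirit, a short stretch of NumPy and pandas in a Jupyter notebook. The following is a generic sketch using synthetic prices (it does not use BQuant’s actual APIs, which Bloomberg documents separately):

```python
import numpy as np
import pandas as pd

# Synthetic daily closing prices, one column per instrument, standing in
# for data a platform like BQuant would supply.
rng = np.random.default_rng(0)
prices = pd.DataFrame(
    rng.lognormal(mean=0.0002, sigma=0.01, size=(252, 3)).cumprod(axis=0) * 100,
    columns=["AAA", "BBB", "CCC"],
)

returns = prices.pct_change().dropna()             # daily returns
vol = returns.std() * np.sqrt(252)                 # annualized volatility
momentum = prices.iloc[-1] / prices.iloc[-21] - 1  # one-month momentum

# A toy factor score: reward momentum, penalize volatility, z-scored
# across the universe. This is the shape of a factor-scoring workflow.
score = ((momentum - momentum.mean()) / momentum.std()
         - (vol - vol.mean()) / vol.std())
print(score.sort_values(ascending=False))
```

The platform’s value-add is running exactly this kind of notebook against governed, entitled data at enterprise scale, and letting its output be published to other users.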
But on another level, it goes beyond a quant-focused platform to provide broader capabilities for sharing data across an organization.
“We see BQuant as not only a solution for building factor scoring and pre-trade portfolio construction and analysis, but as a long-term platform for many solutions. We’ll get to other areas like post-trade analysis, transaction cost analysis, and analyzing trading algorithms—we don’t ring-fence or limit how or where clients can use it,” Edwards says. In fact, the platform will play a strategic role in Bloomberg’s efforts to integrate alternative datasets from its recent acquisition of Second Measure and to build out a more powerful set of alternative data-focused tools for quants and analysts—all linked to other data within Bloomberg—that can be used to enrich users’ existing workflows.
A paradigm shift
Like BQuant, Ingenii—a startup formed by former executives of managed data services platform Hentsu, which was acquired by buy-side technology firm Portfolio BI in February—is focusing on research with the initial launch of its own cloud-based distribution platform. But it sees the potential for that platform to grow into an enterprise offering, serving different tiers of users throughout an organization, from traders to senior management.
Among its target base of hedge fund clients, firms are approaching research differently from the past, trying out new datasets to see if they work—and if they fail, to fail fast. That requires significant data engineering resources for a firm to do in-house.
“What we’re finding is that most of the time, even if they have data engineers in-house, they don’t want them doing that full-time,” says Ingenii CEO Christine Johnson. The vendor’s proposition is to give away its platform and charge maintenance fees for updates and additional features. Johnson says Ingenii’s API connectors can already handle 90% of data that firms might need out of the box, and that the vendor can add other sources and visualization tools as required by clients.
However, looking ahead, Johnson sees Ingenii as moving beyond an infrastructure play toward a different level of data management, research and analysis. “There’s a massive technological convergence coming in the next three to five years between quantum computing and artificial intelligence that will allow you to compute things simultaneously on a single machine with more power than we can comprehend,” she says. “So, if you’re a hedge fund manager and want to research multiple datasets that in the past would take weeks to run, that’s huge. That, and the ability to consume massive amounts of data in parallel, will also contribute to the evolution of AI. You can’t do that on clunky old architectures—you have to be in the cloud, and using quantum.”
Like Johnson and so many others, Bill Bierds, president and chief business development officer of BCC Group, is also bullish on the cloud as the new domain of market data platforms. And that’s hardly surprising: First, BCCG operates a cloud-based data platform. And second, the evidence is growing that the cloud can provide a home for real-time (if not yet ultra-low-latency) market data, beyond just massive storage and compute resources.
In recent months, Nasdaq’s launch of Data Link and Data Fabric—as well as its recent announcement of plans to run its matching engine in Amazon Web Services’ cloud—has demonstrated this, as have Google’s alliance with CME Group to make more data available via Google’s cloud, and FactSet’s move to make some 30 datasets available via AWS’ Data Exchange cloud data marketplace.
“Market data is going to the cloud. It’s going to happen over the next three to five years, and we want people to be more aware and better prepared,” Bierds says. “There still seems to be an incorrect perception that cloud is not reliable or ready—but it is. I think people who are talking about problems with data in the cloud are thinking about ultra-low latency or co-located data. But we’re delivering double-digit millisecond speeds for clients.”
To reach the “right” people talking about cloud strategically within firms, BCCG has partnered with IBM and KPMG to bring its data-specific expertise to the advisory firms’ consulting efforts. “For example, IBM is already talking to most business areas within these firms, so we can leverage the relationships they have … to get to the right executives who have cloud on their minds,” Bierds says. “We’re talking to market data managers, and we need to reach a different audience.”
‘Banks don’t like to rip things out’
Even when development is driven by evolving end-user needs, new technologies and capabilities often contribute both to driving those changing user demands and to turning concept into reality.
“The emergence of really rich open-source software, such as Python and Jupyter notebooks—especially for quants and other research communities—is an important contributing factor,” says Bloomberg’s Edwards. “We are prominent contributors to and consumers of open-source software—we have funded JupyterLab, we’ve had people on Project Jupyter’s steering council … and we’ve even open-sourced some things,” such as BQPlot, the vendor’s interactive charting and plotting tool for Jupyter.
But whatever the driver, it’s one thing for a vendor to build a new solution; it’s another thing entirely to persuade user firms that they need to fix something that ostensibly ain’t broke.
“I think existing platforms like RTDS will probably be around for another decade because they serve a purpose that I don’t see changing—collecting and aggregating data—and data rates will only continue to increase,” says Brennan Carley, who recently retired from Refinitiv after a decade at the vendor in various senior roles, including running its Enterprise Data Solutions business, which includes responsibility for RTDS. “The main users of these platforms have a huge amount of cost sunk into them—not just software licenses, but everything that’s built around them. And banks don’t like to rip out things that they’ve sunk lots of money into.”
Indeed, MayStreet’s Flannery says one of the practical issues that the vendor has to deal with is whether the market is ready to handle a big change. “Right now, if we said ‘it’s all or nothing,’ firms would choose nothing. So, we need to have piecemeal ways to support customers. We think that approach allows us to engage with customers,” he says.
Thus, Flannery prefers the softly-softly approach rather than suggesting a “big bang” cutover, saying that its platform is “not necessarily about replacing existing solutions; it’s more about new use cases, such as tick collection and storage. Having said that … we see lots of opportunities where firms could remove use cases where they use more cumbersome and costly solutions.”
One reason why solutions like those from Pegasus, Activ, and others that abstract critical layers of infrastructure offer such encouragement is that firms can feel trapped by their technology choices.
“When I was at Citi, I thought I got good value from my Trep license,” says West Highland’s Wallos, who before joining Thomson Reuters in 2010 spent four years as global head of market data architecture at Citi, and the seven years prior to that at Bear Stearns running the firm’s RMDS and Wombat architectures. “But had I wanted to move everybody at Citi off Trep, it would have taken years and $25 million in development costs. We had something like 400 suppliers connected to Trep. And every time you want to change the data structure or change an API, the development team has to manage that. Every little change that’s not planned can be a major issue.”
That’s why there is understandable trepidation on the part of financial firms, adds BCCG’s Bierds. “People are very nervous about moving away from Refinitiv and Trep/RTDS because they’ve spent 30 years building everything around those platforms. So now, if we go into a Trep customer, we are very prescriptive, and we explain that you have to write applications differently so that they aren’t locked in to one vendor’s data. Technology shouldn’t be the reason you’re stuck with a provider.”
When firms are more open to big changes, it’s often because they have already undergone some level of major structural change. One senior market data technology executive at a European bank says there were three catalysts for the firm becoming more open to new solutions. One was a scaling-back of some business lines, drastically reducing the scale of its market data consumption. The second was the adoption of a cloud-first strategy to align with exchanges and brokers making data available via the cloud, to save money without adding complexity. The third is that whereas the bank has traditionally been conservative about adopting new and cutting-edge technologies, nearly two years under Covid-19 pandemic restrictions has made it more open to trying new things. “Now, they’re more willing to say, ‘Have a go,’ than before,” the executive says.
But that approach can be rare. John Greenan, CEO of technology advisory firm Alignment Systems, says that with the trend toward cost-cutting and outsourcing among larger investment firms over the past decade, innovation isn’t being driven by the big banks with large data budgets, but rather by institutions willing to challenge established practices and to try new things, such as blockchain and peer-to-peer networks.
For example, Greenan highlights the Pyth Network, a peer-to-peer network where vendors and liquidity providers contribute their market data and gain access to that of their peers. Though currently heavily weighted toward crypto content, the network counts several major firms among its members, including the Chicago Trading Company, Flow Traders, Jane Street, Jump Trading, and Susquehanna, as well as startup exchange Memx, IEX Cloud, and the Bermuda Stock Exchange.
“Right now, the primary use cases are around crypto trading,” says a Pyth official. In those use cases, traders need market data on traditional asset classes to provide a benchmark for valuing crypto assets. Pyth currently carries only limited data on the equities and foreign exchange markets, but the official says he can imagine both expanding the symbols provided in each asset class and potentially covering other asset classes, based on demand from participant firms.
Thus, Pyth’s main play may be in displacing content sources. The official says the network also has the potential to provide a market data infrastructure, but warns that the Solana blockchain platform on which Pyth runs, while fast enough for many data needs, may be too slow to supplant existing data platforms entirely.
“Solana updates every 400 milliseconds or so, which is fast for a blockchain, but slow for equities markets, for example. So I don’t think it will ever completely replace the types of platforms that our participant firms currently use,” the Pyth official says.
Nevertheless, firms could use the data from Pyth to power non-latency sensitive applications, or to potentially build a “serverless market data terminal … without needing any of the back-end data engineering or architecture.” Though these datasets don’t contain instrument-level reference data and standard identifiers, he says that initiatives to issue securities and publish data via distributed ledgers could solve that challenge.
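As a sketch of what such a “serverless” consumer might look like, the following polls a hypothetical HTTP endpoint exposing the latest on-chain prices. The endpoint and response shape are assumptions for illustration, not Pyth’s actual interfaces.

```python
import time
import requests

# Hypothetical endpoint returning a map of symbol -> latest on-chain price.
FEED_URL = "https://onchain-prices.example.com/latest"

def watch(symbols: list[str]) -> None:
    """Poll an on-chain price feed and print updates. With the chain
    producing new state roughly every 400ms, polling faster than that
    yields no fresher data, which is the ceiling described above."""
    while True:
        resp = requests.get(
            FEED_URL, params={"symbols": ",".join(symbols)}, timeout=2
        )
        for sym, px in resp.json().items():
            print(f"{sym}: {px}")
        time.sleep(0.4)  # match the ~400ms chain update cadence

watch(["BTC/USD", "AAPL"])  # runs until interrupted
```

No feed handlers, ticker plant, or distribution middleware sit between the consumer and the data, which is the appeal; the chain’s update cadence is the constraint.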
And though adoption remains uneven, making firms unlikely to re-point existing systems to this new paradigm immediately, he says the gradual migration of these data types on-chain will start to make it a more compelling proposition for firms as existing systems reach the end of their life, or contract term, and firms need to look for replacements.
“This is the moment where the market data industry gets with the program, or goes the way of Blockbuster, film cameras, and black-and-white TVs,” he says.
Sowing the seeds of change
But while all providers are making advances toward new ways of delivering data, Bloomberg’s McManus warns that change won’t happen overnight. “It’s not like one day you have a datacenter and the next day you have a cloud. There will be years of operating in a hybrid model,” he says, adding that the richer the ecosystem of content, tools, and services that a vendor provides, the more complex that process becomes. “The question for me is how you help customers navigate that complexity.”
And there are efforts underway, leveraging technology advances, that are designed to cut out that complexity altogether. Perhaps the real question is whether the industry will support or stifle them.
In the past, data delivery required a large technology footprint because that’s what was necessary to support distribution of real-time data. Terminals ran on proprietary desktops, but with the evolution of the internet, many forms of data displays and services were able to migrate to being web-based.
Now, tools exist to move even more of that data distribution process online, reducing the need for costly infrastructure in-house or even dedicated platforms in the cloud. At the very least, the data platforms of tomorrow have the potential to look very different from those of the past two decades—and most likely will involve combinations of all those mentioned above, and more, rather than being the domain of a single platform or provider.
Data is the water that nourishes the soil from which trade ideas grow. But to yield change, the industry must nurture and cultivate a fertile ecosystem. With cost pressures rising and so many viable alternatives now on offer, perhaps financial firms are ready to get their hands dirty.