Standards Leaders Set Sights on Upper-Level Ontology for Industry
TC 68 chair sets 2020 goal for industry-wide upper-level semantic ontology.
The new head of a key technical committee is pushing ahead with ambitious goals for financial industry standards, most notably an upper-level ontology he hopes to establish by 2020.
In January, FIX stalwart Jim Northey was named chair of Technical Committee (TC) 68, which authors, supports and maintains ISO 20022, a single standardization approach for financial services developed through the International Organization for Standardization (ISO).
Northey says the semantic ontology was “one of my big goals going in,” adding that he and his subcommittee chairs are currently discussing which group is best suited to take on the project, and how to ensure it has “proper support” from industry and national standards bodies.
In the meantime, TC 68 is moving forward with projects that support the larger goal of an upper-level ontology. The committee is incorporating semantics into its reference data standards, with its SC 9 information exchange subcommittee currently finalizing technical reports on semantic representation of ISO 20022. Northey says TC 68 is proposing an advisory group on best practices and data consistency, which he hopes will be approved during ISO/TC 68’s mid-May plenary meeting in Paris.
“I would hope to see that same core group of people… go on to do this next round of work [on the upper-level ontology],” he says. “At the same time, as we’re revising our existing reference data standards, I would like to see work items in the 2019 timeframe start to have them modeled and viewed in semantic terms.”
As the industry moves away from legacy messaging toward Internet of Things devices and mobile payments, data items are becoming increasingly important, Northey says.
“If you don’t have really concise data definitions and identifiers, then you’re not going to have any chance of standardization. I think the industry has shown that semantics is a much better way to improve the quality and the definition, and then [you] make those actionable by things like machine learning and these other advanced tools. So that’s definitely the way we’re going,” he says.
The work is already keeping the TC 68 committee very busy. “I would say the number of standards we have going on simultaneously has never been greater. We’re working on a natural person identifier as part of regulatory response; we’re looking at crypto asset token identifying standards; and we’re doing some revisions to the LEI [legal entity identifier] standard. There’s a lot of stuff going on,” he adds.
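To give a sense of what modeling reference data "in semantic terms" might look like in practice, here is a minimal sketch using Python's rdflib library to describe a single legal entity and its LEI as machine-readable triples. Everything in it, including the namespace URI, the property names and the sample identifier, is invented for illustration and is not drawn from ISO 20022, ISO 17442 or any TC 68 work product.

from rdflib import Graph, Literal, Namespace, RDF, RDFS

# Hypothetical vocabulary namespace, used for this sketch only.
FIN = Namespace("http://example.org/fin#")

g = Graph()
g.bind("fin", FIN)

# The "concise data definitions" layer: a class and a property, with a
# human-readable definition attached directly to the property.
g.add((FIN.LegalEntity, RDF.type, RDFS.Class))
g.add((FIN.hasLEI, RDF.type, RDF.Property))
g.add((FIN.hasLEI, RDFS.domain, FIN.LegalEntity))
g.add((FIN.hasLEI, RDFS.comment,
       Literal("20-character legal entity identifier, per ISO 17442")))

# One instance: a made-up entity with a placeholder identifier (not a real LEI).
entity = FIN.ExampleBankPLC
g.add((entity, RDF.type, FIN.LegalEntity))
g.add((entity, FIN.hasLEI, Literal("EXAMPLE0000000000012")))

print(g.serialize(format="turtle"))

Expressed this way, a definition travels with the data itself and can be queried or validated by generic tooling, which is the property that makes semantic models usable by the machine-driven processes Northey describes.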