Zurich Insurance Completes Two-Year Data Migration Effort
Project reduces cost and complexity at global insurance firm
Service provider BackOffice Associates (BOA) and Swiss-based Zurich Insurance have spent the past two years migrating data from the global insurer's legacy systems into one standard system.
Zurich, which has grown since its founding in 1872 into a business employing 55,000 staff in 170 countries, had developed or acquired 23 SAP enterprise resource planning systems, each with its own data standards. In 2013, the insurer was looking to harmonize data between geographies and find a more efficient and controlled way of doing data cleansing and migration, says Zurich SAP convergence project manager Andrea Smith.
"We needed full traceability and greater confidence among business users that we were cleansing, migrating and reconciling data at a level that they could own and defend to internal people who wanted to know how we had migrated everything with accuracy," says Smith. Zurich chose BOA partly because the provider acquired Zurich's partner at the time, Entota, but also because of BOA's track record with complex information governance projects.
BOA's chief executive officer for Europe and Africa, Clive Bellmore, says his company's experience with large clients and its long-standing partnership with SAP were also major factors in its suitability to help Zurich.
He believes that BOA's approach is effective because of its consultants' focus on the business of the client. "Our consultants are functional businesspeople who also understand data. That is the way we approach this. We understand what Zurich is trying to do as a business, and then we align the skills within our consulting base to support that."
Smith agrees that this approach has been effective at Zurich: "The methodology that the consultants bring with them, the domain expertise, helps take our technology work out of the picture," she says. "Instead of spending four to six months talking about setting up a toolset in deeply technical terms and then waiting until the very end to do some form of data cleansing, you can start earlier on the business part of the equation."
Implementing the Project
Implementation of this "ambitious" project, says Bellmore, involved running multiple waves of data migration in parallel across different geographies.
BOA began with an assessment of the data in the source systems, before mapping out the rules and definitions to be applied to the data before it was moved into the target SAP system.
"The data assessment is about looking at data from a business perspective and establishing what Zurich wants to achieve globally in their business process," says Bellmore. "Doing that upfront work gives us a detailed view of the scope and the culture of the client, about the problems we may be going to face with them. It allows us to shrink the size of the project to be relevant to the business."
After the assessment, the data was migrated using SAP Data Services Migration Accelerator, a software platform that provides a migration framework intended to be usable by businesspeople, and to remain repeatable and re-usable after the migration goes live, says Bellmore.
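The assess-then-map-then-migrate pattern described above can be sketched in a few lines of Python. This is an illustrative outline only, not Zurich's or BOA's actual configuration: the field names, mapping rules, and cleansing functions are hypothetical stand-ins for the business rules a real project would define during the assessment phase.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class MappingRule:
    """One rule mapping a legacy source field to a target field,
    with a cleansing/standardization transform applied on the way."""
    source_field: str
    target_field: str
    transform: Callable[[object], object]

def assess(records, required_fields):
    """Profile source data: count records missing each required field."""
    issues = {f: 0 for f in required_fields}
    for rec in records:
        for f in required_fields:
            if not rec.get(f):
                issues[f] += 1
    return issues

def migrate(records, rules):
    """Apply the mapping rules to each source record to build target records."""
    out = []
    for rec in records:
        target = {}
        for rule in rules:
            target[rule.target_field] = rule.transform(rec.get(rule.source_field))
        out.append(target)
    return out

# Hypothetical rules: trim and title-case names, standardize country codes.
rules = [
    MappingRule("cust_name", "CustomerName", lambda v: (v or "").strip().title()),
    MappingRule("ctry", "CountryCode", lambda v: (v or "").strip().upper()[:2]),
]
source = [
    {"cust_name": "  acme ltd ", "ctry": "gb"},
    {"cust_name": "Beta AG", "ctry": ""},
]
print(assess(source, ["cust_name", "ctry"]))  # one record is missing a country
print(migrate(source, rules))
```

The upfront `assess` step mirrors the data assessment Bellmore describes: it surfaces quality problems before any records are moved, so the mapping rules can be scoped to the issues that actually exist.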
99% Accuracy
Smith is very happy with the results of BOA's migration efforts: "The net result was that through several data load cycles through 2014—we had about three before the production data load—we took all that data in, raw and ugly and uncleansed as it was, and by the end of the year completed a far more accurate, far more controlled, far better data migration for, in the UK's case, probably our largest deployment.
"The amount of data we had to cleanse and migrate, the number of interfaces that we had to stand up, the number of customizations that we delivered in parallel to all of this work going on—it was a huge challenge and it was done well."
Zurich says all ledger and non-ledger data was reconciled and signed off with higher than 99% accuracy. In the UK, during an eight-month period, Zurich cleansed 14 years' worth of legacy data: 230 million data items with more than 99% accuracy. For Zurich Mexico Santander, a joint venture across three business units with Santander Bank, Zurich migrated all legacy data from multiple source systems without a direct connection. In 2014, Zurich spent seven percent less on data migration than in 2013, the insurer says.
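A reconciliation sign-off of the kind described above boils down to comparing migrated data items against the cleansed source, field by field, and checking the match rate against a threshold. The sketch below is a hypothetical illustration of that idea; the record layout and the 99% threshold are taken from the figures reported in the article, but the comparison logic is an assumption, not Zurich's actual reconciliation process.

```python
def reconciliation_accuracy(expected, actual):
    """Return the percentage of individual data items that match
    between two parallel lists of records."""
    total = matches = 0
    for exp, act in zip(expected, actual):
        for field, value in exp.items():
            total += 1
            if act.get(field) == value:
                matches += 1
    return 100.0 * matches / total if total else 0.0

# Hypothetical ledger records: one amount was migrated incorrectly.
expected = [{"ledger": "GL01", "amount": 120.50},
            {"ledger": "GL02", "amount": 99.99}]
actual   = [{"ledger": "GL01", "amount": 120.50},
            {"ledger": "GL02", "amount": 100.00}]

accuracy = reconciliation_accuracy(expected, actual)
print(f"{accuracy:.1f}%")  # 3 of 4 items match
print("Sign off" if accuracy > 99.0 else "Investigate")
```

Counting at the level of individual data items, rather than whole records, is what makes a figure like "230 million data items with more than 99% accuracy" auditable: every item either matched or it did not, which supports the fully traceable record Smith describes below.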
Smith concludes: "For me, the most important result was that the business could sign it off with confidence and could provide the auditors with a fully traceable record of how all that cleansing and migration work was done. Compared to the years previously, that was a significant difference and worth every penny we spent on the new tool and every minute we spent with consultants."
Copyright Infopro Digital Limited. All rights reserved.