A “top tier” global pharmaceutical corporation.
When talking with our customers, we’re often surprised to find them on the cutting edge of innovation in their own industry, yet clinging to tradition when meeting IT challenges, particularly data migrations. One such client, a global pharmaceutical corporation, faced the challenge of migrating one million documents from Documentum Compliance Manager to CSC FirstDoc as the result of a corporate merger.
The migration was to be performed with existing migration tools by an IT services firm with a strong track record at the client. This third party, recommended by CSC, used off-the-shelf migration software.
Analysis, configuration and customization of the migration process took five months, and the complete migration would require about 3,000 transformation rules.
Management had an ironclad mandate to complete the entire project in 90 days. The client decided to test the migration with its de facto testing method: manual review of source and destination records, with the sample size determined by ANSI sampling. In theory, this traditional method uncovers errors effectively for high-volume, repeatable processes and is often used in manufacturing. The initial sample-based testing focused on verifying that the migration rules were implemented “as designed”; that is, of the approximately 3,000 migration rules, specific rules were selected for verification via ANSI sampling.
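To make the contrast concrete, here is a minimal sketch of what a sampling-based check amounts to in practice. The lot-size tiers, function names and numbers below are illustrative placeholders, not the actual ANSI/ASQ Z1.4 plan the client followed.

```python
import random

# Illustrative lot-size -> sample-size tiers, loosely in the spirit of a
# single-sampling plan; these numbers are placeholders, not the client's plan.
SAMPLE_SIZE_TIERS = [
    (1_200, 80),
    (10_000, 200),
    (500_000, 500),
    (float("inf"), 1_250),  # catch-all tier for very large lots
]

def sample_size(lot_size: int) -> int:
    """Return the number of records to pull for manual review."""
    for upper_bound, size in SAMPLE_SIZE_TIERS:
        if lot_size <= upper_bound:
            return min(size, lot_size)

def pick_sample(record_ids: list[str]) -> list[str]:
    """Randomly select the records a reviewer would compare by hand."""
    return random.sample(record_ids, sample_size(len(record_ids)))
```

Even the largest tier inspects only a tiny fraction of a million-document lot, which is why sampling can suggest a process is “mostly right” but cannot say which specific records are wrong.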
In initial testing of the documents and metadata migrated for user acceptance testing, about 1,000 documents were sampled. If finding errors was the goal, sampling was a resounding success: as expected for any moderately complex migration, a Pandora’s box of problems opened up.
As issues were addressed and testing continued, the headaches compounded: errors continued to be discovered, and additional QA support was required from multiple resources for an extended period. Worst of all, after more than three months of testing (200+ person-days), exceeding the absolute deadline for the entire project, the client still did not have confidence that the migration was functioning “as designed”.
And all of this on a modest sample of the UAT migration (itself a very small piece of the total migration scope). Since the number of potential errors was still unknown, there was no way to determine when the migration could actually be completed.
The team needed a new approach to production migration qualification, and fast. In place of the traditional sampling approach, 100% testing was now on the table. Valiance Partners’ TRUcompare software, purpose-built to validate GxP migrations, was adopted for qualification of all future batches. The planned production migration comprised seven batches.
With only a two-week lead time, TRUcompare was configured for automated data migration testing. TRUcompare IQ/OQ was also performed during this period, and the OQ confirmed the viability of the testing process. Two weeks was all TRUcompare needed for analysis, configuration and customization, roughly 10 percent of the five months the original effort had consumed.
Here are the migration testing results:
In each case, the Migration Specification was updated, the migration process was modified and re-executed, and TRUcompare was used to perform 100% regression testing. The overall timeframe for completing each batch and “getting it right” was about one week.
The virtues of automated data migration testing are many. Fundamentally, it turns a difficult-to-manage process into a deterministic one. It makes testing of 100% of the migrated data and content viable, thereby minimizing surprises and increasing user confidence.
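As an illustration only, here is a minimal sketch of what 100% verification means in code: every migrated record is compared, field by field, against the value the transformation rules say it should carry. The record structures, field names and transform() callable are hypothetical; this is not TRUcompare’s implementation.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Mismatch:
    doc_id: str
    field_name: str
    expected: object
    actual: object

def verify_batch(
    source_records: dict[str, dict],
    target_records: dict[str, dict],
    transform: Callable[[dict], dict],
) -> list[Mismatch]:
    """Compare every record in the batch against the spec, not a sample."""
    mismatches: list[Mismatch] = []
    for doc_id, src in source_records.items():
        tgt = target_records.get(doc_id)
        if tgt is None:
            # The record never arrived in the target system at all.
            mismatches.append(Mismatch(doc_id, "<record>", "present", "missing"))
            continue
        expected = transform(src)  # apply the documented migration rules
        for field_name, exp_value in expected.items():
            if tgt.get(field_name) != exp_value:
                mismatches.append(
                    Mismatch(doc_id, field_name, exp_value, tgt.get(field_name))
                )
    return mismatches
```

Because every record is checked, an empty mismatch list is a deterministic statement about the whole batch rather than a statistical inference drawn from a sample.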
As the project continued, the company knew it could set reasonable deadlines with confidence, which would have been impossible without an automated testing capability. Migration qualification time was reduced to two days per batch. TWO DAYS, instead of an open-ended effort that could uncover only a subset of the total errors. 100 percent document verification was performed for the entire batch to ensure that every document was classified correctly with the proper metadata.
Note that if the company had used 100 percent testing during unit testing of the migration process, the data would have been fully verified before the first batch was migrated, saving the more than three months of the overall deployment spent on sample-based testing.
This particular pharmaceutical company has many fine traditions, but ANSI sampling for data migration testing is no longer one of them. 100 percent testing of data migrations is a new concept to many, but this example illustrates the obvious value that such an approach and technology can bring to the migration process.
A migration process is driven by user requirements, which the migration team translates in order to repurpose legacy data for use in a different application. The users’ requirements and understanding of the new application typically evolve throughout the process, as do the migration team’s understanding and ability to implement those requirements. In effect, the users and the migration team should expect to arrive at something close to an “80 percent solution” initially. Thereafter, a viable migration process must assume that a number of iterations will be needed to move from 80 percent to nearly 100 percent. Automated testing makes this inherently iterative process far more efficient and effective.
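In pseudocode terms, the iterative loop described above might look like the sketch below. The run_migration, verify_batch and update_rules callables are hypothetical stand-ins for project-specific steps, passed in as parameters so the loop itself stays generic.

```python
from typing import Callable

def qualify_migration(
    rules: dict,
    source_records: dict[str, dict],
    run_migration: Callable[[dict, dict], dict],
    verify_batch: Callable[[dict, dict, dict], list],
    update_rules: Callable[[dict, list], dict],
    max_iterations: int = 10,
) -> dict:
    """Iterate: migrate, verify 100% of the records, revise the rules, repeat."""
    for iteration in range(1, max_iterations + 1):
        # Execute the current transformation rules against the source data.
        target_records = run_migration(rules, source_records)
        # Check every record, not a sample, against the migration specification.
        mismatches = verify_batch(source_records, target_records, rules)
        print(f"Iteration {iteration}: {len(mismatches)} mismatches across "
              f"{len(source_records)} records")
        if not mismatches:
            return rules  # the batch is fully qualified
        # Fix the specification/rules based on exactly what failed, then retry.
        rules = update_rules(rules, mismatches)
    raise RuntimeError("Batch did not qualify within the allowed iterations")
```

With 100 percent verification inside the loop, each pass tells the team exactly which records and rules still miss the mark, so the path from the initial 80 percent solution to a fully qualified batch is measured rather than guessed.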
The complexity of this migration and the risks associated with error force us to rethink the viability of sample-based testing. Verification at this scale is simply no longer feasible by hand.
Have a similar challenge?
Or, call us at: +1.800.880.4540
UK and Europe: +353 1 4693722