THE PLM STATE

The PLM State: Big Bang - Great for Universe Creation, Bad for Data Migration

One of the most prevalent and widely accepted theories of how the universe was created is the "Big Bang" theory. According to the ever-reliable Wikipedia, "The Big Bang was the event which led to the formation of the universe, according to the prevailing cosmological theory of the universe's early development (known as the Big Bang theory or Big Bang model). According to the Big Bang model, the universe, originally in an extremely hot and dense state that expanded rapidly, has since cooled by expanding to the present diluted state, and continues to expand today." In the world of data migration, "Big Bang" means trying to move all of your data from one system or environment to another in a single exercise. While the Big Bang seemed to work well for creating the universe, I am afraid the approach is far less effective for data migration. This article will discuss the downsides of a "Big Bang" approach to data migration and the advantages of incremental migration, an approach I will refer to as the "slow drip" method.

We have discussed some of the challenges involved in migrating data quite thoroughly in previous articles. In "What's the big deal about data migration?" we looked at the difficulty of migrating information without disrupting current production, along with some of the best and worst practices around data migration. In "PLM Migration, No Data Left Behind" we covered the necessity and difficulty of moving older information and revisions to new environments. In this article we will analyze the feasibility of attempting to migrate data in a compressed timeframe, typically over a weekend. The way it usually works is that a company identifies its new data management system and sets a go-live date. The objective is to move all of the data from the old system to the new system and shut down the old system in as short a time as possible. When making a change it is natural to want it done quickly, without a lingering transition. Unfortunately, the nature of data migration does not lend itself to this approach.

Most data migration projects involve large amounts of data, so there is a practical limit set simply by the time it takes to move that much data from one system to another. Further complicating the issue is the condition of the information and whether or not it is suitable for migration. Ideally, a company would spend time before the migration scrubbing the data and making sure it is in proper condition. There is really no automated way to cleanse data, and often the only way to know there are issues with it is to try to move it to a new system. We have also previously discussed historical data, the challenges of moving it, and whether a company really needs access to older product information. In a compressed timeframe it is difficult to assess how important that data is, so the first instinct is to take it all just in case. If a migration can be extended over a longer period, decisions about legacy data can be made based on a better understanding of how older data is actually used.
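While cleansing itself cannot be fully automated, the obvious structural problems can at least be surfaced before a load is attempted. The sketch below assumes item metadata has been exported from the legacy system to a CSV file and flags records with blank required attributes or references to files that no longer exist; the file name, column names, and required-attribute list are hypothetical, not any vendor's actual schema.

```python
import csv
import os

# Hypothetical export of legacy item metadata; column names are assumptions.
EXPORT_FILE = "legacy_items.csv"
REQUIRED_ATTRS = ["item_number", "revision", "description", "owner"]

def audit_export(path):
    """Flag records that would likely fail a load into the new system."""
    issues = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            item = row.get("item_number") or "<unknown>"
            # Missing or blank required attributes
            for attr in REQUIRED_ATTRS:
                if not (row.get(attr) or "").strip():
                    issues.append((item, f"missing attribute '{attr}'"))
            # Broken link: referenced CAD file no longer on disk
            cad_path = (row.get("cad_file") or "").strip()
            if cad_path and not os.path.exists(cad_path):
                issues.append((item, f"broken file reference: {cad_path}"))
    return issues

if __name__ == "__main__":
    for item, problem in audit_export(EXPORT_FILE):
        print(f"{item}: {problem}")
```

An audit like this does not fix anything; it simply gives the team a punch list to work through before, rather than during, the cutover weekend.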

Basically, in most cases two issues drive a compressed migration schedule. The first is a strong desire to move forward on a new platform and eliminate a legacy system. Often a company is facing hardware obsolescence that is undermining the stability and viability of its current system, and there is a sense of urgency to move the data onto a new platform before the old system crashes and becomes unusable. The second is dissatisfaction with the capabilities of a legacy system and a desire to move to a new application to improve user productivity. Both are strong reasons to migrate quickly, but there are ways to accomplish this without the stress and shortcuts that inevitably occur with rapid data migration. Virtualizing a legacy data management tool can allow a company to move to newer, more reliable hardware and set up backup solutions to protect historical information, which buys time to assess what information really needs to be in the new system. Companies hoping to make productivity gains by moving to a new system should also consider that these benefits can easily be nullified by importing flawed information into it. It is better to take the time to make sure the data is accurate than to bring it into a new environment and undermine confidence in the new system.

There is a middle ground between a rapid migration and allowing the migration to dictate when you stand up a new system. I mentioned the "slow drip" method as an alternative migration methodology. One of our most successful data migrations involved a company with multiple business units that released products at different times throughout the year. Migrating their data could potentially disrupt product releases and severely impact the business units' performance, and given how the company functioned there would never be a time when all the business units were aligned and could move to a new platform simultaneously. To address this we created tools that allowed each division to migrate when it was ready. Each unit could analyze what information it really needed in the new system and pick the opportune time to shift over. Over the course of a year all of the business units moved over, and the legacy environment was then shut down. Obviously this approach will not work for every company, but moving discrete product lines or business units is one way to break a migration into more manageable pieces. Another is to migrate current projects or versions first and then move historical information as it is needed. This gives a company more time to assess the need for previous versions of data and to scrutinize it for data integrity. Moving data over in small increments also allows for cleansing as you go, making it more feasible to address missing attributes, broken links, or any of the other challenges that migrating CAD data presents.
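To make the "slow drip" idea concrete, here is a minimal sketch of migrating one business unit at a time, validating each batch before it is committed to the new system. The wave names, dates, and the extract/validate/load functions are all placeholders for whatever the source and target systems actually expose; nothing here reflects a specific PLM vendor's API or the tools described above.

```python
from datetime import date

# Hypothetical migration waves: each business unit moves when it is ready,
# rather than everything moving over a single weekend.
MIGRATION_WAVES = [
    {"unit": "Consumer Products", "cutover": date(2011, 3, 14)},
    {"unit": "Industrial",        "cutover": date(2011, 6, 20)},
    {"unit": "Aerospace",         "cutover": date(2011, 10, 3)},
]

def extract(unit):
    """Placeholder: pull this unit's records from the legacy system."""
    return []  # e.g. a list of item dictionaries

def validate(records):
    """Placeholder: keep only records that pass integrity checks."""
    return [r for r in records if r.get("item_number")]

def load(unit, records):
    """Placeholder: write validated records into the new system."""
    print(f"{unit}: loaded {len(records)} records")

def migrate_wave(unit):
    records = extract(unit)
    clean = validate(records)
    rejected = len(records) - len(clean)
    if rejected:
        print(f"{unit}: {rejected} records need cleansing before load")
    load(unit, clean)

if __name__ == "__main__":
    # Only units whose cutover date has arrived are migrated in this pass.
    for wave in MIGRATION_WAVES:
        if wave["cutover"] <= date.today():
            migrate_wave(wave["unit"])
```

The point of the structure is that each wave is small enough to extract, check, and correct on its own schedule, so a problem in one business unit's data never holds the whole company's go-live hostage.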

The desire to move data to a new system in a "Big Bang" is understandable, but the challenges this type of approach presents can be expensive in both time and money. If at all possible, extending the migration and moving information over in small increments offers a chance to correct errors and ensure that information is properly structured for the new environment. New technology and methods can help facilitate a gradual migration that is less traumatic to the organization and ensures a higher likelihood of success with the new system.

[Edit: Repost from 2011]
