In my previous blog, The PLM State: Drain Your Data Swamps, I made the case that data swamps (misplaced, forgotten, or ignored data that isn't readily available for use) are an expense, and that companies should recover that expense by making the data available again to add value and inform business decisions. The following day, Oleg Shilovitsky wrote in his Beyond PLM blog that PLM cannot accomplish this, and listed three reasons why. I agree with much of what Oleg had to say, and I'd like to add some thoughts to his points based on my experience with PLM migrations.
I'm in absolute agreement with Oleg when he says that legacy data import can be a difficult and complex task. PLM implementation is itself a process that demands careful consideration and expertise to produce a solution that genuinely adds value to your business. This is the raison d'être for companies like Zero Wait-State. More than that, you cannot just throw a bunch of semi-canned scripts at your data and expect a viable business solution to result. Beyond using a contracted service, it takes a business partner with PLM and engineering experience to execute this successfully, and, frankly, that's why I'm proud to be working for ZWS. It is in nobody's interest to underestimate the level of effort involved here.
Oleg is also right when he states that many companies shy away from migrating their existing data into a PLM system because of the perception that it is too difficult and expensive to attempt. Siloed datasets resist easy integration. But being difficult doesn't mean being impossible, nor does it mean the effort isn't a worthwhile project to undertake. Remember, you paid for the data you possess and you're paying to keep it. It's time to put it to good use.
Here are the three pain points surrounding PLM and draining your data swamp that Oleg brought up in his blog.
- "Limited data modeling. Although all PLM systems have some kind of “flexible data modeling” capabilities, it is a lot of work to get data modeling done for all legacy data. Just think about converting all existing data structures into some sort of data models in existing PLM systems. It can be lifelong manual project."
We're in agreement again that this can take a lot of effort. Here it makes sense to look at the expected return on investment because different companies can have vastly differing legacy footprints. You want to drain your data swamp of all the data that can be useful to your current and future endeavors, but if that data has truly reached its end-of-useful-life then it's time to retire that data altogether. Stop paying to keep it around (assuming there are no legal requirements to save it). ZWS uses an outcome-based approach to identify what data is relevant to migrate into PLM and what form it should take. Just because it's in your data swamp doesn't mean you want it in your PLM solution.
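To make the triage idea concrete, here is a minimal sketch of an outcome-based classification pass over legacy records. The field names, thresholds, and the `triage` helper are all invented for illustration; they are not ZWS's actual methodology, and a real project would derive the rules from your ROI analysis and retention policies.

```python
from datetime import date

# Hypothetical triage rule: field names and the five-year threshold are
# illustrative assumptions, not a prescribed methodology.
def triage(record):
    """Classify a legacy record as 'migrate', 'archive', or 'retire'."""
    if record.get("legal_hold"):
        return "archive"   # retention requirement: keep it, but outside PLM
    age_years = (date.today() - record["last_used"]).days / 365
    if record["status"] == "active" or age_years < 5:
        return "migrate"   # still earning its keep: bring it into PLM
    return "retire"        # end-of-useful-life: stop paying to store it

records = [
    {"id": "PN-1001", "status": "active",   "last_used": date(2015, 3, 1), "legal_hold": False},
    {"id": "PN-0042", "status": "obsolete", "last_used": date(2001, 6, 9), "legal_hold": True},
    {"id": "PN-0007", "status": "obsolete", "last_used": date(2002, 1, 5), "legal_hold": False},
]
buckets = {r["id"]: triage(r) for r in records}
```

Even a toy rule like this makes the point: the decision of what enters PLM is made deliberately, record by record, before any import begins.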
- "Limited flexibility of relational databases. Majority of PLM architectures are built on top of relational databases, which provides almost no ways to bring unstructured or semi-structured data such as emails, PDF content, Excel spreadsheets, etc."
Attachments are already a part of PLM. Add a document to a part and then attach whatever you'd like to the document. You can then add or edit attachments without revisioning the part. A good migration matches these external data sources to the corresponding PLM metadata. Furthermore, Agile indexes attachment content and offers robust full-text search across it. Tools like Perception Software's AgileXPLORER can be utilized if your search requirements are particularly complex.
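The matching step above can be sketched in a few lines. This is purely illustrative: the filename convention, the part list, and the `match_attachments` helper are assumptions for the example, not Agile API calls or a real customer dataset.

```python
import re

# Assumed convention: loose files carry a part number like "PN-1001"
# somewhere in the filename. Real migrations use whatever keys exist.
PART_NO = re.compile(r"(PN-\d{4})")

def match_attachments(filenames, known_parts):
    """Pair loose files with the PLM part numbers found in their names."""
    matched, orphans = {}, []
    for name in filenames:
        m = PART_NO.search(name)
        if m and m.group(1) in known_parts:
            matched.setdefault(m.group(1), []).append(name)
        else:
            orphans.append(name)  # needs manual review before import
    return matched, orphans

files = ["PN-1001_rev_B.pdf", "PN-1001_fea_results.xlsx", "meeting_notes.docx"]
matched, orphans = match_attachments(files, {"PN-1001"})
```

The orphan list is the important output: files that can't be paired automatically get a human decision, rather than being silently dumped into (or out of) the new system.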
This is also where a document management system (DMS) can shine, linking together with PLM to present disparate data in a cohesive, functional manner. Again, just because you have a data point doesn't mean you want to keep it, let alone keep it in PLM. A thoughtful migration will separate the grain from the chaff.
- "Absence of data transformation and classification tools. Existing PLM platforms have no tools that can allow you re-structure or re-model data after it is already imported into the system. Think about importing data and “massaging” it afterwards."
Restructuring or remodeling data is not a fault or weakness of PLM systems; keeping track of such changes is what PLM exists to do. In all the Agile migrations that I've conducted, the desired relationships and structures are built as part of the migration process itself. The data is examined and massaged before the import, not after, so the result is a clean and useful PLM solution. Proper data configuration and migration are what make your data visible and accessible in the manner that makes the most sense for your individual needs.
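A "massage before import" step often looks something like the sketch below: rename legacy fields to the target schema, normalize values, and expand coded fields. Every name here (`FIELD_MAP`, `LIFECYCLE_MAP`, the legacy column names) is a hypothetical stand-in; the real mappings come from analyzing your source systems against your PLM data model.

```python
# Hypothetical legacy-to-PLM mappings, invented for illustration.
FIELD_MAP = {"partnum": "number", "desc": "description", "lc": "lifecycle_phase"}
LIFECYCLE_MAP = {"P": "Prototype", "R": "Production", "O": "Obsolete"}

def transform(legacy_row):
    """Rename legacy fields, trim whitespace, and expand coded values."""
    row = {FIELD_MAP.get(k, k): v.strip() for k, v in legacy_row.items()}
    row["lifecycle_phase"] = LIFECYCLE_MAP.get(row.get("lifecycle_phase"), "Unknown")
    row["number"] = row["number"].upper()
    return row

clean = transform({"partnum": " pn-1001 ", "desc": "Bracket ", "lc": "R"})
```

Because this transformation runs before the import, the data lands in PLM already in its final shape, and the PLM system's own change control takes over from there.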
The ultimate goal in draining your data swamp is to wring every last ounce of profitability from your data. PLM is definitely a tool that can help get you there, but it doesn't operate in a vacuum nor is the process a simple one. A comprehensive look into your data, processes, and goals, along with input from all the stakeholders of the various datasets involved will determine the approach that maximizes value in your particular situation.
Don't let this undertaking discourage you. Don't let the fear of the effort to implement a PLM or other solution (ERP, DMS, LIMS, etc.) hold you back from realizing the most value from the data you have already invested in. Above all, don't let the inability to access your own data quickly and easily become a threat to your business. Data swamps are nobody's friend. With the right resources, you may find the right solution is nearer than you think.
Join our Data Migration group to discuss data migration pains and solutions with other members!

We will be conducting a webinar with Perception on this very topic - stay tuned for dates and times!

Learn more about ZWS' data migration methodology.