
The PLM State: Drain Your Data Swamps

Data swamps: every corporation has them, large or small. Not to be confused with a data lake, a data swamp is where data is misplaced, forgotten, or simply ignored. It's where data goes to die. This data may be buried in a report drier than 007's martini, or in one too bulky to read. It may be grouped with other data that is irrelevant to the task at hand, locked in someone else's data silo, trapped in a byzantine bureaucracy, or hidden behind a confusing software interface.

Data swamps are a double-cost liability. First, you paid for the data. You expended resources to acquire the knowledge that is now being ignored. That data can range from whether a particular capacitor is RoHS compliant to how defect rates compare between competing designs. If someone in your organization tested it, recorded it, designed it, or bought it, that data point cost you money.

Second, you are paying to keep that data. While data storage costs are gratifyingly low when considering just the storage medium (currently around $0.03 per gigabyte), the real costs are far higher once you add in infrastructure, bandwidth, and backups. To be sure, these costs are shared with data outside the swamp, but every bit of data adds to your IT group's load, and managing data storage growth is consistently among the most pressing concerns IT managers report.

Data that isn't used is data that isn't profitable. All data should add value, whether by contributing to strategic decisions, marketing, process improvements, or regulatory compliance (data kept under regulatory retention policies adds value by allowing your business to operate in a regulated environment). You paid for it, and you are paying for it still, so why wouldn't you put that data to use?

But how do you drain a data swamp? The first step is to identify what data is likely to have been swamped. Identifying misplaced or forgotten data may seem like an insurmountable task, but look to the processes in your organization that generate data. I'm not talking about just test data or formal analyses, but all the data. That RoHS-compliant capacitor? That's a data point. Device master records (DMRs) and device history records (DHRs) are obvious datasets. That CAD prototype cost a lot to develop, so capture its data where it can be leveraged for other designs rather than letting it be ignored after its initial use.

Enterprise resource planning (ERP), product lifecycle management (PLM), laboratory information management system (LIMS), and document management system (DMS) software, among others, can play a role in draining your data swamp, associating and linking disparate datasets and providing a means for their use. But whatever software or combination of software solutions you employ, all have to follow what I call the three Rs to be successful. That is, the data has to be the Right Information, given to the Right People, at the Right Time. Okay, I know it's really just the same R in "right" being reused, but that's what I'm talking about: leveraging your existing letters, er... data.
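
To make the linking idea concrete, here's a minimal sketch in Python of what associating those disparate datasets amounts to. Everything here (the part number, field names, and values) is hypothetical, not any vendor's actual schema: each system exports what it knows about a part, and a shared key ties the records together.

    # Illustrative only: joins exported records from hypothetical ERP, PLM,
    # and LIMS extracts on a shared part number, so one lookup answers
    # questions that would otherwise require visiting three systems.

    erp_parts = {"CAP-0402-10UF": {"supplier": "Acme Passives", "unit_cost": 0.04}}
    plm_parts = {"CAP-0402-10UF": {"lifecycle": "Released", "rohs_compliant": True}}
    lims_results = {"CAP-0402-10UF": {"last_test": "2014-03-12", "result": "Pass"}}

    def linked_view(part_number):
        """Merge whatever each system knows about a part into one record."""
        record = {"part_number": part_number}
        for source in (erp_parts, plm_parts, lims_results):
            record.update(source.get(part_number, {}))
        return record

    print(linked_view("CAP-0402-10UF"))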

The Right Information

Business teams and leaders need reliable data to execute business strategies. Bad data is often worse than no data; to be useful, data has to be accurate. Bad data leads to bad processes and bad decisions. A properly implemented solution like PLM contains the data of record that can be relied upon for those decisions.

I remember a high school grammar lesson in which a servant was sent to a distant stock exchange to buy 10,000 shares in a particular company (obviously before E*Trade). But when he arrived, he found the price far higher than his employer had expected. So, being a conscientious fellow, he telegraphed his employer, asking if he should still purchase the stock. The employer was horrified by the high price. Unfortunately, he was also a real miser, and since telegraphs charged by the letter, he dropped the punctuation from his reply. What he sent: "NO PRICE TOO HIGH". What he meant to send: "NO, PRICE TOO HIGH." Of course, the servant dutifully bought the stock, with disastrous results.

The point (besides the value of commas) is that the telegraph message sent was bad information, bad data, which led to a bad outcome. Just because data is identified in a swamp doesn't mean you want to keep it. But then the question becomes, "Why are you generating bad data in the first place?"

A properly designed solution like PLM also prevents information overload, where irrelevant information swamps the data that's important. Lifecycle status is maintained, data that is actionable can be grouped separately from data that is less important, and stored searches and reports can be customized to meet a variety of needs.
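
As a rough illustration (with hypothetical statuses and records, not any particular PLM vendor's API), a stored search is essentially a named, reusable filter over your records:

    # Illustrative sketch: a "stored search" is just a named, reusable filter.
    # Statuses and records here are invented, not a real PLM schema.

    records = [
        {"id": "DOC-101", "status": "In Review", "owner": "QC"},
        {"id": "DOC-102", "status": "Released", "owner": "Engineering"},
        {"id": "DOC-103", "status": "Obsolete", "owner": "Engineering"},
    ]

    # Statuses the user considers actionable, separated from the rest.
    ACTIONABLE = {"In Review", "Pending Approval"}

    def stored_search(name, predicate):
        """Pair a human-readable name with a reusable filter function."""
        return name, lambda rows: [r for r in rows if predicate(r)]

    name, run = stored_search("My open items", lambda r: r["status"] in ACTIONABLE)
    print(name, run(records))  # -> My open items [{'id': 'DOC-101', ...}]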

The Right People

The best data in the world is of no use to your business if those who need it never see it. That's what I mean by a data swamp. The strength of a robust data management system is that this hidden data becomes visible and available for use, instantly. Putting the right data in the hands of the right people is like putting the right tools in the hands of a craftsman or artist. Good things result.

Decades ago I worked as a chemist in the QC testing lab of a medical device manufacturer. All of our materials and product testing at that time was managed via triplicate paper copies. Final results were sent to engineering and also filed using these paper copies. One day during an FDA audit, the auditors asked to see the test results for a particular dye used in one of our products. The dye had been tested, but the data could not be found. Frantic searches were made in engineering and QC, but the paperwork was missing from both departments. The FDA took a dim view of the missing data and threatened a product recall.

This failure to put existing data in the hands of those who needed it actually had a silver lining: the very next week, management gave the go-ahead to install a long-overdue LIMS solution to manage our testing data.

On the other side of the coin, if your company's data becomes available to those not authorized to have it, the results can be very, very damaging. A properly designed system ensures that the right people have access to the data they need while keeping that same data locked away from those without the needed permissions. Firewalls, roles and privileges, SSL, and a proper password policy are your friends and should be part of an enterprise-wide security framework.
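
Here's a bare-bones sketch of the roles-and-privileges idea. The role and permission names are invented for illustration; a real enterprise system would tie this to directory services and audit logging:

    # Illustrative sketch of role-based access: users get roles, roles get
    # permissions, and every data request is checked against them.
    # Roles and permissions here are hypothetical.

    ROLE_PERMISSIONS = {
        "qc_analyst": {"read_test_data", "write_test_data"},
        "engineer": {"read_test_data", "read_design_data"},
        "contractor": set(),  # no default access
    }

    def can_access(user_roles, permission):
        """Grant access only if some role of the user carries the permission."""
        return any(permission in ROLE_PERMISSIONS.get(role, set())
                   for role in user_roles)

    print(can_access(["engineer"], "read_test_data"))    # True
    print(can_access(["contractor"], "read_test_data"))  # False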

The Right Time

Everyone knows their business environment isn't slowing down. Snail mail gave way to email, which has given way to texts. Data has to be timely to be profitable. Data received too late to act upon is worthless.

The classic example from history is a telegram sent just before the U.S. entry into World War II. Based on military intelligence intercepts, a telegram was sent on December 7, 1941, from the mainland to warn Pearl Harbor's commanders of a likely attack. The telegram left the mainland that morning at 6:31 and arrived in Hawaii at 7:33. The attack on Pearl Harbor began at 7:53, while a bicycle messenger was still en route with the telegram. Hampered by the attack, the decoded message didn't reach the Pearl Harbor commanders until 2:58 that afternoon, some five and a half hours after the attack had ended.

Draining your data swamps means your data is available and accessible when and where it is needed. Having that data on hand when an audit or manufacturing issue arises means avoiding an interruption to your schedule; having it on hand when the market shifts means capitalizing on the change.

It's your data. You paid for it. You are paying to keep it. Don't abandon it in a data swamp. Wring every last ounce of profitability from your data via your chosen solution(s): ERP, PLM, LIMS, DMS, etc. When the right information gets to the right people at the right time, your business has the information it needs to succeed.
