
The PLM State TBT: Storm Clouds - The Perils of Cloud-Based PLM

Welcome to this week's TBT blog. I selected this one, originally written in 2012, since we are about to head to Oracle's MSCE, where cloud-based PLM will be heavily discussed.


I wrote an article a while back panning Marc Andreessen's prediction that Oracle would falter because of its failure to embrace the cloud. Oracle's purchase of RightNow demonstrates that they can enter the cloud-based software market any time they feel like it. I continue to notice more and more hype around the cloud, and I thought the prolific Oleg Shilovitsky's blog article titled "Will Database in the Cloud supercharge PLM for Small Companies?" did a decent job of outlining recent developments in the cloud space, with infrastructure pieces becoming available on the cloud. Of course, this is still a far cry from being able to run a PLM application in these environments. Currently, the choices customers have for PLM in the cloud are to run a private cloud or to use Arena's or Autodesk's SaaS model. Zero Wait-State has been working with some companies that were early adopters of cloud-based PLM and are now looking to move up to a more robust PLM environment. From my perspective, this is a very natural progression: startups and small companies are drawn to the low cost and easy implementation that cloud-based PLM offers. However, based on one company's experience, there may be more risk than first realized when opting for the cloud/SaaS option. This article explores the downside of using PLM on a public cloud and why it might be more prudent to opt for an in-house or hosted environment where you have full control of your data and application.

Wes Shimanek from Intel wrote an interesting article titled "PLM and Cloud Computing - Is it meant to be?" In the article he does a good job summarizing the concerns around using PLM on the cloud. To quote: "Top concerns in cloud-based computing remain security, predictability and availability." Shimanek goes on to discuss how Intel addresses these concerns with their "superior hardware solutions," but I am not sure that hardware alone can resolve these issues. Security is somewhat of a non-issue in that most data centers – whether part of a company like Autodesk or some sort of hosting provider – typically offer better security than most private companies can manage themselves. Predictability is really a performance issue that depends on numerous factors, including the robustness of the servers, the amount of data involved, the amount of traffic on the net, and the size of the pipes between the user and the source of the data. This one will always be a challenge with cloud-based solutions, and with PLM being data intensive, it will take some time for bandwidth to evolve enough to make it a non-issue. Another element of predictability is the environment itself. Using virtualization and a private cloud allows companies to fully control their PLM application environment. With a vendor-hosted cloud PLM, you are at the mercy of the vendor as to how much capacity they provide and when they choose to upgrade the software. Obviously, you would think they would be extremely sensitive to their customers' requirements in order to keep their business viable, but it is a risk nonetheless. The final concern from Shimanek's article is availability. I feel certain that from Intel's perspective availability is about hardware uptime, but what if you want access to your data to move it to another platform or to snapshot it for safekeeping? Again, if you are on a private cloud this is not really an issue, but apparently it can be a big problem if your data is stored in a cloud PLM environment.

We are working with a client that is one of the larger customers of a cloud-based PLM vendor. They have been using the product for several years and by their estimates have over 100 users on it. The company has been growing rapidly, so they are moving to Agile PLM, which offers capabilities beyond cloud-based PLM in many areas. Normally this would be a great illustration of where cloud-based PLM and Agile PLM fit in the PLM ecosphere: cloud-based PLM is a tool that smaller companies can adopt quickly and cheaply, then trade up for a more mature system as they grow. Unfortunately, our client's cloud-based PLM vendor does not see itself in this role and apparently wants to hold on to its customers as long as it can. According to the client, when they informed the vendor they wanted their data off the system, the vendor told them it would cost $10,000 and take four weeks. This was problematic for the client on several levels. First, the charge was excessive and unexpected. Second, and more of an issue, the client will continue to use the cloud-based PLM in the meantime, so the data will change over those four weeks and they will need the deltas at the point they switch over to Agile for production. The client was faced with a dilemma. If they followed the vendor's required approach, it would be costly and there would be a sizeable gap in their data set. They could request another dump after the migration, but they expected this would cost another $10K, and they would then have to figure out how to populate it into the now-active Agile PLM database, which would be labor intensive. They came up with what I think was a pretty ingenious solution: they wrote a program that simulates a couple of users and used it to extract information from their cloud-based PLM site the same way users would manually (a sketch of what such a program might look like follows below). The program took about a day to build. When the vendor discovered it, they threatened legal action and indicated they would shut down the client's access to the PLM system. Their excuse was that the program could potentially impact other users of their software. To me this is even scarier than the vendor's conduct around the data extraction: if the platform is that vulnerable, anyone on their solution today should be very concerned. I am sure it was just a bluff and that the client's little robot was not a real threat, but the net effect is that the client is now disgruntled with their cloud-based PLM vendor, and what could have been a positive marketing story for this company has become a cautionary tale for companies considering PLM on the cloud.
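Purely for illustration, here is a minimal sketch of what such a user-simulating extraction script could look like. Everything vendor-specific is assumed: the base URL, the form-based login, the /items endpoint, its paging parameters, and the field names are hypothetical stand-ins, since the actual product's interface was never disclosed. Note the deliberate throttling: pacing requests like a human user is exactly what keeps a script like this from behaving like the load threat the vendor claimed it was.

# Hypothetical sketch: simulate a logged-in user paging through item
# records and saving them to CSV. Endpoint paths, parameter names, and
# field names are illustrative assumptions, not any real vendor's API.
import csv
import time

import requests

BASE_URL = "https://plm.example.com"  # assumed vendor URL
PAGE_SIZE = 50                        # assumed page size
THROTTLE_SECONDS = 2                  # pace requests like a human user

def export_items(username: str, password: str, out_path: str) -> None:
    session = requests.Session()
    # Assumed form-based login; a real site may use tokens or SSO instead.
    session.post(f"{BASE_URL}/login", data={"user": username, "pass": password})

    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["item_number", "revision", "description", "modified"])
        page = 0
        while True:
            resp = session.get(
                f"{BASE_URL}/items",
                params={"offset": page * PAGE_SIZE, "limit": PAGE_SIZE},
            )
            resp.raise_for_status()
            items = resp.json()
            if not items:
                break  # no more records to fetch
            for item in items:
                writer.writerow([
                    item.get("number"),
                    item.get("revision"),
                    item.get("description"),
                    item.get("last_modified"),
                ])
            page += 1
            time.sleep(THROTTLE_SECONDS)  # stay well below any rate limit

if __name__ == "__main__":
    export_items("migration_bot", "secret", "plm_items.csv")

Capturing a last-modified timestamp with each record is what makes the delta problem tractable: assuming the system exposes modification dates, the client could rerun the export at cutover, keep only the records changed since the first pass, and reconcile just those into Agile.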

The takeaway from this story is that if you are considering using PLM on the cloud, be aware of your exit strategy up front. If things go well, you may at some point want to move to another platform, and you need to fully understand what is involved in doing so. This is also true of private clouds where you might be hosting a virtual environment at an external hosting company. Get the exit conditions in writing up front from the solution provider so there are no surprises down the line. This will also allow you to anticipate any cost or resource requirements if and when you do decide to make a change. I think the concept of virtualization and cloud-based applications is sound, but it adds a level of complexity to PLM transitions and integrations that is not present with on-premise solutions. When your data is housed with a third party, there needs to be a clear understanding of how you get access to that data if you want to move or if something happens to your provider. In today's environment, as we become more and more dependent on these mission-critical systems, there needs to be better up-front planning and contingencies for keeping control of data. With private clouds it is less problematic, but when you are actually running on a third-party system, the stakes get higher and the path to vendor independence is less clear. To keep your skies clear and avoid storm clouds, go into these arrangements with your eyes open, and you may want to bring some galoshes.

[Edit: repost from 2012]
