Let’s talk about PLM technologies. Err… PLM is not a technology. Even more, PLM is not even a product. So, what is it? A business strategy? Product development politics? For the sake of this conversation, let’s leave these debates aside. I want to speak about PLM technologies that allow you to manage product data, CAD files, bills of materials, a rich set of related information, as well as the processes around it. This technology came to us about 20-25 years ago, first as a very hard-coded set of tools. You literally had to build it differently for every customer. So, it supported only large customers that were able to pay for software, infrastructure and implementation. Later on, PDM/PLM turned into a software toolkit. The next step in PDM/PLM technology evolution was called flexible data modeling. The first flexible (dynamic) PLM data modeling tools were released back in 1995-2000 and… not much has changed since then.
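To make the idea of flexible (dynamic) data modeling a bit more concrete, here is a minimal sketch in Python of the pattern those toolkits introduced: item types and their attributes are defined as data at runtime, instead of being hard-coded into the application for each customer. The names below (ItemType, Item, add_child, etc.) are hypothetical and not taken from any particular PLM product.

```python
# Minimal sketch of flexible (dynamic) data modeling: the data model itself
# is data, so new item types (Part, Document, ECO...) can be defined at
# runtime without changing application code. All names are hypothetical.

class ItemType:
    """Describes a type of item and the attributes its instances may carry."""
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)

class Item:
    """An instance of an ItemType holding attribute values and BOM links."""
    def __init__(self, item_type, **values):
        unknown = set(values) - item_type.attributes
        if unknown:
            raise ValueError(f"Unknown attributes for {item_type.name}: {unknown}")
        self.item_type = item_type
        self.values = values
        self.children = []  # simple product structure (BOM) links

    def add_child(self, item, quantity=1):
        self.children.append((item, quantity))

# The model is defined at runtime -- adding a new type needs no code change.
part = ItemType("Part", ["number", "description", "material"])
document = ItemType("Document", ["number", "title", "cad_file"])

bracket = Item(part, number="P-100", description="Bracket", material="Aluminum")
bolt = Item(part, number="P-200", description="Bolt M6")
drawing = Item(document, number="D-100", title="Bracket drawing", cad_file="bracket.prt")

bracket.add_child(bolt, quantity=4)
```

The point is not the code, but the pattern: the data model itself becomes data, which is exactly what the first flexible PLM modeling tools made possible in the late 1990s.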
So, what has happened since then? PLM vendors moved massively to develop out-of-the-box and vertical industry solutions. David Linthicum’s article Salesforce.com officially is out of ideas reminded me of the old joke comparing a technology play vs. an industry play. Here is the passage:
When you run out of new ways to provide innovative technology, you go vertical. That was the running joke among CTOs back in the day. It usually meant the market had reached the saturation point and you could not find new growth.
I find this message very relevant to what is happening in the PLM industry. PLM vendors are trying to compete by providing a more comprehensive set of data models, best practices, and process templates. By doing so, vendors want to reduce the TCO of PLM implementations. This approach actually brings results, and many customers are using these solutions as a starting point for their PLM implementations.
So, where is the problem? In most situations, PLM implementation is still very expensive. Services may take up to 50% of the cost. Here is the issue – core PLM data and process modeling technology hasn’t changed much over the last 10-15 years. Data models, CAD file management, product structure, process orchestration – all these things are evolving, but very little. The fundamental capabilities are the same, and it is very expensive to develop solutions using these technologies.
You may ask me about cloud technologies. Cloud is the answer. But only partially. It solves problems related to infrastructure, deployment and updates. Cloud provides clear benefits here. However, from the implementation technology standpoint, it is very similar to what non-cloud solutions can offer. Another interesting passage from the InfoWorld cloud computing article explains the problem new SaaS/cloud products can experience when trying to displace existing vendors:
So many companies have tried this approach — many times — but most found limited success. I can’t help but think the same will occur here. Salesforce will soon discover that when you get into vertical industries, the existing foundation of industry-specific applications is difficult to displace. Although Salesforce can always play the SaaS card, most of those industry-specific providers have already moved to SaaS or are in the midst of such a move. That means SaaS won’t be the key differentiator it was when Salesforce first provided its powerful sales automation service more than a decade ago.
What is my conclusion? Efficiency and cost. These are the two most important things to make a PLM implementation successful. So, the technology must be improved. Data and model capturing tools, flexibility and ease of use – everything must become more efficient to support the future of manufacturing processes. How to do so? This is a good topic to discuss with technology leaders and strategists. I’m going to attend COFES 2014 in 10 days. I hope to find some answers there and share them with you.
Best, Oleg