A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

How change complexity is killing PLM

Oleg
27 March, 2017 | 3 min for reading


Lifecycle management is one of the most fundamental value propositions PLM systems are selling. The ability to manage the lifecycle of product information – product data, its states, changes and related processes – is a quintessential characteristic of PLM. And despite many issues related to the complexity of implementations, cost and the need to transform organizations, leading manufacturing companies today are using PLM systems in one form or another. You can find plenty of confirmation of that.

However, there is one thing that comes up often, and it is related to the change management of PLM software and implementations. PLM technologies aren’t the simplest software in the world. Managing product data and changes is hard. The diversity of manufacturing companies and their requirements is high. Together, this creates a lifecycle challenge – how to perform changes as time moves forward, new requirements arrive, changes are demanded, new technologies emerge, etc.

Marc Lind, Aras’ SVP of Marketing, calls this situation the monolithic software problem. Here is a passage I captured from Marc’s comment on my earlier article Next PLM marketing battle – Monolithic Systems.

Where we’ve heard people use it to describe Wc/Tc/En is because it’s near impossible to automate a single process like just change mgt, etc. It’s because of the myriad of dependencies inside those systems. Schema dependencies, redundant biz logic, obfuscated data, partial/proprietary/closed APIs, and on and on. All of these together make the legacy PDM systems an ‘all or nothing’ proposition… even if you only want to use one aspect of the system. Very heavy weight, very complicated, very… Monolithic (even if not the precise definition on wikipedia 🙂

While Marc and I disagree about the usage of “monolithic software” terminology, there is a lot of truth in what Marc is saying about the way existing PLM applications are managed by companies. It made me think about why PLM implementations are killed by the changes organizations need to make in their PLM infrastructure. I can see three reasons why it happens.

1. Vertical integration between the data model and software features. Layers of RDBMS management, the object model and application software are intertwined and exposed to users. The data model was historically part of the PLM software delivered to the user. Even more, a flexible data model is a significant differentiation factor for many PLM vendors. In this case, it backfires and creates a spaghetti architecture (see the sketch after this list).

2. Long implementation cycles. The average PLM implementation, from early planning to rollout and active usage, takes much longer than an average software development cycle these days. Therefore, manufacturing companies are very often stuck with old software and incompatible features.

3. High level of custom development. The amount of customization in a PLM implementation is high and creates multiple dependencies on the data model and related functionality. Altogether, it makes changes take even longer.
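
To illustrate point #1, here is a minimal, purely hypothetical Python sketch. The table, column and status names are invented for illustration and are not taken from any real PLM product. The point is the dependency pattern: when a lifecycle rule is written directly against the physical schema, every schema change or upgrade ripples through the business logic, while putting storage details behind a small abstraction keeps the rule in one place.

```python
# Hypothetical sketch only -- not code from any specific PLM system.
# It contrasts change-management logic written directly against a physical
# schema with the same rule expressed against a small abstraction layer.
import sqlite3


def approve_eco_coupled(conn: sqlite3.Connection, eco_id: int) -> None:
    """Tightly coupled: table names, column names and numeric status codes
    are baked into the business rule, so every schema change ripples here."""
    conn.execute(
        "UPDATE eco_header SET status = 30 WHERE id = ? AND status = 20",
        (eco_id,),
    )
    conn.commit()


class EcoRepository:
    """Decoupled: only this class knows how lifecycle states are stored."""

    _CODES = {"in_review": 20, "approved": 30}

    def __init__(self, conn: sqlite3.Connection) -> None:
        self._conn = conn

    def get_state(self, eco_id: int) -> str:
        row = self._conn.execute(
            "SELECT status FROM eco_header WHERE id = ?", (eco_id,)
        ).fetchone()
        codes_to_state = {v: k for k, v in self._CODES.items()}
        return codes_to_state[row[0]]

    def set_state(self, eco_id: int, state: str) -> None:
        self._conn.execute(
            "UPDATE eco_header SET status = ? WHERE id = ?",
            (self._CODES[state], eco_id),
        )
        self._conn.commit()


def approve_eco(repo: EcoRepository, eco_id: int) -> None:
    """The lifecycle rule is written once, against named states,
    not against schema details."""
    if repo.get_state(eco_id) != "in_review":
        raise ValueError("ECO must be in review before it can be approved")
    repo.set_state(eco_id, "approved")
```

Real PLM systems are orders of magnitude more complex, but the pattern is the same: the more the data model leaks into customizations and business logic, the more expensive every change and every upgrade becomes.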

So, what is the trend in PLM lifecycle management? I can see a future trend towards pushing the problems of PLM implementation lifecycle management back to vendors. Some of them understood it early and use it as a very innovative differentiation (part of the Aras Innovator subscription is a free upgrade guarantee). SaaS software will play a key role in forming new models of sustainable PLM development and services.

What is my conclusion? Monolithic software, or whatever other marketing slogan, cannot hide a real problem – change management in PLM software. Changing a PLM implementation is a very painful process. PLM vendors are developing new technologies and product features. At the same time, bringing these innovations to customers is hard and sometimes mission impossible without breaking existing PLM software or adding additional layers of software. Such a situation is not sustainable – a welcome topic for PLM technologists and architects. I can see a trend towards sustainable PLM implementations supported by PLM vendors. It will become a “must” PLM requirement very soon. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m a co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.
