Why existing standards can fail future PLM platforms?


There is a growing number of discussions related to “platformization” in PLM. A few weeks ago I had a chance to read CIMdata’s article Platformization: The Next Step in PLM’s Evolution by Peter Bilello. It speaks about the reliability of future PLM solutions:

Reliable solutions must be able to withstand multiple system upgrades and platform migrations. In turn, these robust solutions must be adaptable, maintainable, extensible, scalable, reconfigurable, compatible, and stable. And finally, these boundaryless solutions must be free of artificial limitations on functionality that are imposed by the marketplace segmentation of design and engineering systems with conventional architectures. Meeting these characteristics will be a tall order for many of today’s commercially available PLM solutions, but one that must be met for the future of PLM to be successful.

A few days ago, my attention was caught by the following article, How will Platformization affect standards? by Denis Morais. I found the connection between future platforms and standards interesting. The standard approach is one of the dreams the PLM industry relies on to produce open, easily integrated PLM solutions. My favorite passage was related to the slowness of standards:

The major issue I currently see with standards is that they do not iterate very fast. There is technically no reason why they cannot iterate much faster but because there are “many cooks in the kitchen” it does seem to take longer than what is needed in today’s fast paced environment.

I want to connect these two messages, standards and platformization, in an opposite way. Can a future PLM platform leverage existing standards? One of the PLM platform challenges is to be free from artificial limitations driven by existing practices, slowly changing standards, and existing market segments and customers. After two decades of active PLM development, there is huge fragmentation in approaches, data management models, integrations, and tools. PLM vendors made a leap in their ability to come up with flexible products that can be adapted to specific customer needs. At the same time, new manufacturing trends impose a new set of challenges: the growing complexity of relationships between engineering and manufacturing organizations; a growing number of new types of manufacturing companies operating completely differently from existing OEMs in terms of IP control, data management, and collaboration; and new computing infrastructure.

What is my conclusion? Standards have a hard time keeping up with change. One of the potential mistakes PLM vendors can make is to put a heavy chunk of existing standards into the foundation of future PLM platforms. I guess some industry pundits might disagree. But here is my point. In my view, most existing PLM business practices were developed when a single manufacturing company and its processes were the dominant paradigm. That is going to change in the future. The shift of focus from a single company to a network of players forming a new manufacturing ecosystem is a huge change in terms of how a PLM platform should behave. Old standards might not work for a new manufacturing world. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a digital network-based platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.
