Can we liberate product data from PLM vendors and dissolve it with digital thread?

Silos. We’ve heard this word many times. For many years, the PLM sales pitch has been to eliminate silos and create a single version of truth in an organization. To this day, the single version of truth remains the best way to sell PLM.

In my PLMx PI presentation last year, I shared my thoughts about how to extract more business value from PLM. A digital value chain seems to be a good alternative, but after speaking with many people in manufacturing companies, I can see that the road towards future product data intelligence is not a simple one. Organizational silos play a fundamental role in the process of digital transformation.

What should we do with organizational silos? Should we eliminate them? Connect them? One of my proposals last year was to keep the silos, but connect them using data and people.

Connecting data silos should become a new way to think about PLM implementations and architecture. Instead of building new business processes and transforming the organization, PLM systems should focus on intelligence and data, allowing companies to build bridges above the current organizational structure and processes. This is where new data management technologies will play a key role in the transformation from the vision of a no-silo extended enterprise to organized functional silos connected by a common understanding of data. People are key actors in this transformation because they demand connected data and use it in their business processes.
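To make the "bridge above the silos" idea concrete, here is a minimal sketch. It assumes, hypothetically, that each functional silo can expose its records keyed by a shared part number; the silo names and fields below are invented for illustration, not taken from any specific PLM system:

```python
# Sketch: two functional silos keep their own records and their own ownership.
# A shared part-number key lets us assemble a connected view on demand,
# without merging the underlying databases. All names here are hypothetical.

engineering_silo = {
    "PN-1001": {"revision": "B", "cad_file": "bracket_b.step"},
    "PN-1002": {"revision": "A", "cad_file": "housing_a.step"},
}

manufacturing_silo = {
    "PN-1001": {"work_center": "WC-7", "cycle_time_min": 12},
    "PN-1003": {"work_center": "WC-2", "cycle_time_min": 5},
}

def bridge(part_number):
    """Return a connected view of one part across silos; each silo keeps its data."""
    view = {"part_number": part_number}
    for name, silo in [("engineering", engineering_silo),
                       ("manufacturing", manufacturing_silo)]:
        if part_number in silo:
            view[name] = silo[part_number]
    return view

print(bridge("PN-1001"))
```

The point of the sketch is the design choice: nothing moves and nothing is re-owned; the common key is the "common understanding of data" that connects the silos.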

A Smart Industry article written by Mark Reisig proposes to dissolve silos with a digital thread. I admit, it is a cool marketing idea. But let’s think about what it means from a technical, product, and organizational standpoint.

According to Mark, old technologies are the ones to blame:

Unfortunately, despite many digital-transformation efforts in recent years, companies are still overwhelmed by countless data silos and disconnected information. These disconnects, and the resulting lack of visibility and collaboration across the enterprise, lead to a multitude of business inefficiencies.

The reliance on proprietary PLM systems has forced organizations to choose between unsustainable customizations or sub-optimizing their processes to fit out-of-the-box software. Both choices result in a form of vendor lock-in, due to the lack of open APIs and inflexible architecture, and create yet another closed technology stack in the IT architecture.

Compounding this, traditional PLM systems are a compilation of technologies accumulated through acquisition, and integrated with other legacy systems from bygone eras, which weigh down already lethargic infrastructures and increase organizations’ technical debt.

According to the article, a better technology (read: Aras) can solve the problem, one that is end-to-end, open, flexible, upgradeable and… connected to the right data. As much as I love Aras technology, it sounds like heavy marketing.

But… my attention was caught by the last point: connecting to the right data. It is not about one PLM product replacing another. It is interesting because it in fact speaks about data liberation.

To eliminate silos, you first have to liberate your data so it’s available to anyone in the organization at the level of granularity required. You achieve this democratization of data by leveraging common data standards for the right data—the master data. By governing the data flow across the digital thread, you break down the silos and gain invaluable fact-based insights.

It made me think: what is behind this data liberation process? Is it about getting data from one PLM system and placing it in another? Is it about taking data from multiple systems and replacing them with one big-silo PLM (a new system)? Is it enterprise search and indexing technology applied on top of PLM (and other) databases, helping to access all the information? Or maybe it is new knowledge graph technology?

The digital thread sounds like a new magic word to solve all previous problems. However, the devil is in the details. There are two aspects of the liberating and dissolving process: (1) data ownership and (2) data management technology. By dissolving silos, we risk dissolving ownership. People actually like their data, and people like to own their data. So, whatever dissolving or liberating process vendors propose, the first and most important question to answer will be about data ownership.

Data management technology is no less important. It is easy to pump data from vendor A’s SQL database into vendor B’s SQL database. Will it liberate the data? It can liberate the organization from paying licenses to vendor A and move it to a better licensing model. But the data won’t be liberated. It will only change hands. Sometimes that is the right thing to do, but the question of choosing the right hands is still relevant.
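Here is a minimal sketch of that "pumping" exercise, using Python’s built-in sqlite3 as a stand-in for both vendors’ SQL databases. The table and column names are invented for illustration; the point is that the rows move, but they land in another vendor-specific schema, so the data has only changed hands:

```python
import sqlite3

# Two in-memory databases stand in for vendor A and vendor B.
vendor_a = sqlite3.connect(":memory:")
vendor_b = sqlite3.connect(":memory:")

# Vendor A's (hypothetical) proprietary schema.
vendor_a.execute("CREATE TABLE parts (part_no TEXT, descr TEXT)")
vendor_a.executemany("INSERT INTO parts VALUES (?, ?)",
                     [("PN-1001", "Bracket"), ("PN-1002", "Housing")])

# Vendor B's (equally hypothetical) schema: different, but just as closed.
vendor_b.execute("CREATE TABLE item_master (item_id TEXT, item_name TEXT)")

# The "liberation": extract from A, load into B.
rows = vendor_a.execute("SELECT part_no, descr FROM parts").fetchall()
vendor_b.executemany("INSERT INTO item_master VALUES (?, ?)", rows)

count = vendor_b.execute("SELECT COUNT(*) FROM item_master").fetchone()[0]
print(count)  # all rows moved, yet they still sit inside a vendor schema
```

The migration succeeds, but nothing about openness or ownership has changed: the same records are now locked into vendor B’s column names instead of vendor A’s.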

What is my conclusion? Digital thread and data liberation make a very nice and powerful techno-marketing model. However, we need to remember data ownership and database technologies. After all, data is owned by the organization, in silos or as a whole, and stored in databases belonging to a vendor. How to choose the right one is still an open question. Welcome back to enterprise IT and PLM competition. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.


  • Michael Larsen

    I would agree with your assessment, Oleg. I believe one of the key issues is the fact that data standards and common data models do not exist in most corporations. Even if a company had total commitment to a vendor, data interchange/exchange of desired data between applications may not be completely transparent or bidirectional. I think with all of the data swamps, the multitude of IIoT source data types, and the legacy variety of databases and data models, attempting to use a PLM as a single source of truth is admirable, but unachievable. A better approach would be, for the most part, to have data stay at rest and have a distributed data management system that could perform functions similar to ETL (Extract, Transform and Load), domain-specific tagging (metadata), and dynamic search through business rules and machine learning. There is one such technology I have found in recent years called MediaFlux from a company called Arcitecta (I am not affiliated with them in any way; I just feel that their technology is well suited to enterprise data that has significant volume, veracity, variety and velocity).

  • beyondplm

    Michael, thanks for the comment. I agree, but I have to say that believing you can pull all data from enterprise systems and put it at rest in another system would be a bit idealistic. There are a lot of aspects related to enterprise IT, data management, vendors, trust, etc. that will have to be discussed before such a project can come to realization. So, data is a big challenge and the biggest value in enterprise companies. And integration is one of the biggest challenges on the road to PLM adoption. Just my thoughts…
