A Post-Monolithic PLM world – Data And System Architecture

Manufacturing is in the midst of a digital transformation, and the industry is changing as a result. In more than two decades in the industry, I’ve seen a number of PDM and PLM systems, and although there are many successful PLM systems and implementations, the number of problems in PLM continues to grow. PLM vendors are switching to service-oriented architectures and discussing a variety of topics such as big data and artificial intelligence. Still, the fundamentals of PLM systems remain the same. Platformization was a hot word a few years ago, but an unfortunate result of tightly integrated vertical platforms provided by a single vendor is the so-called “Vendor Black Hole”. The following slide was presented by Marc Halpern last year at the PDT Europe 2020 event.

Some experts believe that PLM will eventually move away from its monolithic structure and become more decentralized. This shift would involve changes in both data and system architectures. What does this mean for manufacturers, and how can they prepare for this change? Read on to find out.

Daimler Experience

For the last decade, many eyes have been watching Daimler’s decision to switch their CAD system and following the transformation of their data management and PLM processes. Openness is one of the central themes of that transformation. Check my earlier article. Here is an interesting passage:

No one can live shut off from the rest of the world any more. Openness is crucial. We expect genuinely open systems because we must be able to integrate components from different vendors on the basis of open standards [Helmut Schütt, CIO Commercial Vehicles Trucks Buses & Vans at Daimler]

PLM manufacturers now have the opportunity to offer innovative new products and that’s not a bad thing. But for us as an OEM, the essential thing is that the individual modules can not just be bolted on but are instead highly integrated and that, if necessary, we can replace them by other best-in-class systems. That is the main challenge, and one to which the IT vendors have not yet paid enough attention. That is why we are so vociferous in demanding openness, interoperability and support for international standards [Dr. Siegmar Haasis, Head of Information Technology Management at the Mercedes-Benz Cars Research & Development Department]

As you can read, manufacturing companies are ready, and openness sounds like an inevitable future. However, the vision of openness as articulated by manufacturing companies is not so simple. For example, the idea of using individual PLM modules that are highly integrated, yet easily replaceable with comparable components from other vendors and systems, is profound and solid, but very hard to implement.

Mercedes Benz – Federated Domain and Data Mesh

My attention was caught by the article published by Y. Hooshmand, J. Resch, P. Wischnewski, and P. Patil – From a monolithic PLM landscape to a Federated Domain and Data Mesh. Thanks, Yousef Hooshmand, for sharing it on LinkedIn. The article speaks about the research and experimental work the Mercedes-Benz Cars teams are doing to solve a big problem related to data integration and handover between multiple applications and platforms. Here is the passage that describes the problem.

 For example, in early phases of the Concept stage, Requirements Management (RM) tools are often used to define, to document and to manage the requirements. As development continues, various applications such as Product Data Management (PDM), Application Lifecycle Management (ALM), Enterprise-Resource-Planning (ERP), and Internet of Things (IoT) Platforms come into play. Each of these applications is used to manage a subset of product data that is generated during product lifecycle and pertaining to a product context. The product data is exchanged between these applications realizing business use cases across applications and across departments. Figure 1 shows the product lifecycle phases as well as the PLM landscape of a manufacturing company with more than 100 software tools for data handling and numerous interfaces for data exchange.  The business issues facing manufacturing companies, such as the abrupt electrification of powertrains, the demand for connected devices, and the handling of (big) data, are emerging at an accelerating pace. This is leading to challenges in realizing unconventional business requirements using conventional IT Landscape. 

The article also speaks about the same challenges of existing PLM platforms that Marc Halpern of Gartner described. Here is the passage.

 … independent software vendors are offering platform solutions that promise a homogeneous IT landscape to be competitive in disruptive situations. However, the platform solutions do not cover all aspects of changing business requirements. Furthermore, the challenges related to semantic interoperability, semantic configuration, and data consistency in the IT landscape, as well as data discoverability and usability, are not properly addressed. These are essential for an organization to be both data-driven and user-centric in order to thrive in disruptive situations 

It is not about systems, it is about data

In my earlier article this month, A growing role of data in the future PLM applications, I highlighted a growing need for data aggregation, data connection, and data sharing. There is a demand to turn existing PLM architectures upside down and focus on the data rather than on application functions. Data is growing beyond a single platform, application, or organization, which allows us to redefine the old concept of a Single Source of Truth (SSOT) from a technological and system-architecture standpoint. The question of how to establish an SSOT across multiple systems is quickly rising to the top of industrial companies’ minds, replacing the old vertically integrated PLM platform approach.

The importance of data became obvious in the last decade, when companies focusing on product lifecycle management understood that instead of providing a single system, it would be much better to focus on making data available across multiple systems and companies. Product lifecycle management turned into a business strategy (instead of a single system), and companies started to spend more resources on establishing the right product data management strategy and technologies to make data federated, shared, and transferable to other systems.

Unfortunately, this didn’t fundamentally change the nature of old and large PLM software architectures. Companies agreed that a “single system” is a dream and that they need to find a new paradigm for product lifecycle management capabilities to work across multiple company silos.

Digital Thread – A Vehicle For Future Data Models

The idea of a digital thread is to connect different silos of information belonging to lifecycle stages. While the idea is not really new, it started to accelerate recently as companies adopted broad digital transformation strategies. In my article Digital Thread, Bill of Materials and Multi-Tenant Architecture, I speak about existing PLM architecture gaps such as multi-tenancy and the need to connect multiple organizations and multiple systems in a single logical model that can be used to optimize processes.

Current PLM systems mostly focus on document management and the product lifecycle of a single company. The product lifecycle, however, expands beyond the product development process: it breaks a single organization’s silos, the limits of current product lifecycle management, and the existing paradigms of process management. A digital thread covers an entire product value chain, not a single organization. This is a challenge for existing PLM solutions. The limitations of current platforms are especially visible in supply chain management, where coordination of work between multiple companies is critical. The role of multi-tenant data architecture in future product lifecycle management is to organize a network layer connecting data structures (BOMs) representing multiple elements of multi-disciplinary system development. PLM software capable of organizing such a network layer will be able to provide unique functions and data intelligence capabilities to optimize product lifecycles and business processes.

Such a new software architecture can help realize the value of data sharing in manufacturing and support industrial companies in organizing cross-system and cross-organizational workflows.
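To make the idea of a multi-tenant network layer more concrete, here is a minimal sketch in plain Python. It is an illustration only, not OpenBOM’s or anyone’s actual implementation: all item names, tenant names, and the `BomNetwork` class are made up. The point is that every BOM item carries an owning tenant, so traversing one product structure can cross organizational boundaries.

```python
# Hypothetical sketch: a multi-tenant BOM network where each item
# belongs to a tenant (company) and BOM links can cross tenants.
from collections import defaultdict

class BomNetwork:
    def __init__(self):
        self.owner = {}                    # item id -> owning tenant
        self.children = defaultdict(list)  # parent item -> child items

    def add_item(self, item, tenant):
        self.owner[item] = tenant

    def add_usage(self, parent, child):
        self.children[parent].append(child)

    def tenants_in_thread(self, root):
        """All companies contributing to the product structure under root."""
        seen, stack, tenants = set(), [root], set()
        while stack:
            item = stack.pop()
            if item in seen:
                continue
            seen.add(item)
            tenants.add(self.owner[item])
            stack.extend(self.children[item])
        return tenants

net = BomNetwork()
net.add_item("truck", "OEM")
net.add_item("battery-pack", "SupplierA")
net.add_item("cell", "SupplierB")
net.add_usage("truck", "battery-pack")
net.add_usage("battery-pack", "cell")
print(sorted(net.tenants_in_thread("truck")))  # ['OEM', 'SupplierA', 'SupplierB']
```

A query like `tenants_in_thread` is exactly the kind of cross-organizational question a single-company PLM database cannot answer, because the answer spans data owned by several tenants.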

Semantic Layer – A New Data Foundation

How can we deliver the great vision of digital thread, data sharing, and collaboration? These ideas are not really new, and vendors have been flirting with them for the last 10-15 years (or maybe more). The real difference can only be achieved by re-inventing the data foundation of existing PLM systems, which almost always rely on a single SQL database platform.

In my article Data Models Evolution for Future PLM platforms, I speak about three fundamental differentiators of new data architectures: (1) global flexible data models; (2) semantic data modeling; and (3) polyglot persistence. In my view, a combination of these new architectural principles allows building a new type of PLM architecture capable of answering the demands of industrial companies: supporting multiple applications, connecting data silos, and working across multiple departments, joint ventures, and organizations.
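The third principle, polyglot persistence, simply means using different storage engines for different kinds of product data instead of forcing everything into one SQL schema. The sketch below illustrates the routing idea with plain Python dicts standing in for the stores; the class, method names, and data are invented for illustration, not taken from any real product.

```python
# Illustrative sketch of polyglot persistence: route each kind of
# product data to the store best suited to its shape. In a real
# system the three attributes would be a graph DB (relationships),
# a document DB (flexible item properties), and a relational DB
# (transactional records); plain dicts stand in for each here.
class PolyglotStore:
    def __init__(self):
        self.graph = {}      # BOM links and other relationships
        self.documents = {}  # schema-free item properties
        self.tables = {}     # transactional rows (orders, changes)

    def save_link(self, parent, child):
        self.graph.setdefault(parent, []).append(child)

    def save_properties(self, item, props):
        self.documents[item] = props

    def save_order(self, order_id, row):
        self.tables[order_id] = row

store = PolyglotStore()
store.save_link("bike", "wheel")
store.save_properties("wheel", {"diameter_mm": 622, "material": "alloy"})
store.save_order("PO-001", {"item": "wheel", "qty": 2})
```

The value is that each query hits the engine that is naturally good at it: graph traversal for where-used, document lookup for flexible properties, relational joins for transactions.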

However, just following these architectural principles won’t be enough. A critical element of the solution is openness and new business models related to data. I shared my view on how companies can create an Open Semantic Data Modeling Layer.

A Semantic Data Modeling Layer is a data model that allows you to store and retrieve your data along with the metadata associated with it. It can be used by both humans and machines to retrieve information from your system, which is useful for building a variety of queries and reports, or for answering questions about what you have stored. The technology behind this layer provides a way for people to understand what they are looking at when they view an organization’s dataset. The semantic layer also gives organizations the ability to store their datasets without having to worry about formats or structures. This means more flexibility in how we want our data structured and formatted, so that we don’t need to worry about conversions. The last point is very important when the discussion comes down to how this unified semantic layer can be organized.

Instead of searching for a single database, I suggest choosing a technology and framework for modeling data that is open, standard, and at the same time powerful enough to provide a foundation for the complex data needed for modern design, digital twins, digital threads, and other data modeling abstractions in engineering, product development, and manufacturing. My favorite candidate for this technology is Semantic Web technologies such as RDF / OWL.
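As a minimal illustration of why RDF fits here: it represents everything as subject-predicate-object triples that both humans and machines can query with patterns. The sketch below imitates this in plain Python rather than a real triple store or SPARQL engine; all the identifiers (`ex:bike`, `ex:hasPart`, and so on) are made-up examples.

```python
# RDF-style triples (subject, predicate, object) in plain Python.
# A real semantic layer would use a triple store and SPARQL;
# this only demonstrates the data shape and pattern matching.
triples = {
    ("ex:bike", "rdf:type", "ex:Product"),
    ("ex:wheel", "rdf:type", "ex:Part"),
    ("ex:bike", "ex:hasPart", "ex:wheel"),
    ("ex:wheel", "ex:suppliedBy", "ex:SupplierA"),
}

def match(s=None, p=None, o=None):
    """Return triples matching a pattern; None is a wildcard,
    similar to a variable in a SPARQL query."""
    return {t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)}

# "Which parts does the bike have?"
parts = {o for (_, _, o) in match("ex:bike", "ex:hasPart")}
print(parts)  # {'ex:wheel'}
```

Because new predicates can be added without changing any schema, two companies can extend the same dataset with their own vocabulary, which is exactly the flexibility the semantic layer promises.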

However, the semantic layer itself doesn’t solve the problem. It requires openness and business models that can drive companies to work together instead of focusing on data lock-in and building walled gardens. One of the ideas is to re-invent open source PLM by creating shareable ontologies and data sets that can be used across multiple systems and companies. Check my article Semantic Data Layer and Open Source PLM for more details.

What is my conclusion?

The industry is coming to the point when a new PLM architecture will become inevitable. The Mercedes-Benz article is a great confirmation that existing IT architectures offered by PLM vendors are limited and cannot answer modern manufacturing process challenges. For the last few years, I’ve been focusing on building OpenBOM, a new type of PLM platform that incorporates the principles described in this article: 1/ polyglot data architecture; 2/ multi-tenant data models; 3/ semantic modeling layer; 4/ digital thread support; and 5/ connected user experience. Together, they provide a foundation for an expandable platform approach and the ability to recombine system functions and user experience by mixing online services provided by multiple vendors. This is a path to creating a connected PLM experience for a new manufacturing world. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a digital cloud-native PLM platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.
