Transforming PLM: Breaking Monolithic Architectures Into Open Services

Last year, I published the article 5 Steps To Break up Monolithic PLM Architecture. Please check it out. In the article, I shared my thoughts about the transformation of existing PLM platforms into a scalable set of interoperable services. Here is a passage:

While the discussions about how to break monolithic PLM architectures are not surprising these days, I found that most companies are struggling with the implementations. On the side of industrials, a programmatic strategy and specific technology roadmap is missing. On the side of PLM vendors, I can see how leading PLM vendors are continuously bringing their legacy monolithic architectures as solutions for digital thread implementations.

The five steps I proposed: (1) Identify data sources; (2) Find a platform to manage multiple services; (3) Implement data source services; (4) Test the solution on a small scale; (5) Scale up. While I still believe in these steps, a year later I find them too abstract and not very helpful for companies to apply in the ecosystem of existing PLM applications.

PLM and Vertical Integration Trend

In recent years, the world of Product Lifecycle Management (PLM) has witnessed a significant shift towards vertical integration among vendors. While this approach initially seemed like a logical step forward, it has led to a concerning trend of vendor lock-in and inefficiencies within the industry. Customers find themselves beholden to a single vendor for all their PLM needs, limiting their flexibility and stifling innovation. The reality of mindshare PLM platforms is adding tightly coupled functions, which made the implementation too complex and customer decisions sometimes very hard.

My attention was caught by Prof. Martin Eigner's post about the ProSTEP Symposium 2024. You can check it on LinkedIn by navigating to this link. Here is an interesting passage:

You can’t make a monolith from the 90s cloud-native by adding an x or +. The whole monolith with a few separate services as containers does not help users in terms of scalability or partial deployment. To achieve these goals, we optimally need a microservice architecture, at least a target-oriented containerization, as well as a quantitative and qualitative scalability of the underlying databases (multi-tenant and polyglot persistence). Providers and users will have to realize that there will be no IT solution that runs equally well on-premise and in the cloud. So-called hybrid solutions that offer containerization on a web service platform are emerging as a transitional solution and compromise. It must also be openly stated that customization in a SaaS solution by the customer is only possible if low-code engines are used. My summary, let’s add to the Code for Openness: “Openness, standards and honesty”

Professor Martin Eigner's remarks underscored the urgency of rethinking PLM architectures to foster greater flexibility and interoperability. Inspired by his words, it became apparent that the time had come to chart a new course for PLM.

Open Services and PLM Layers

In response, a proposal emerges to break down monolithic PLM architectures into open services, paving the way for a more adaptable and interconnected ecosystem. This approach advocates for a fundamental shift in mindset, emphasizing the importance of data-driven architectures over application-centric models.

At the heart of this proposal lies the concept of abstraction data layers. By introducing an open design layer, organizations can separate design lifecycle management from other PLM functionalities. This foundational layer allows for the independent management of design data, empowering organizations to select and integrate tools that best suit their needs.

Building upon this foundation, the proposal advocates for the establishment of service layers to manage various aspects of the product lifecycle. These modular services, informed by modern concepts such as graph models, offer enhanced interoperability and flexibility. By breaking down PLM functionalities into smaller, self-contained services, organizations can scale each component independently and integrate them seamlessly with other systems.

The key element of the first step is to introduce an open design layer, making design data management (previously known as TDM) available to manage the design lifecycle. This first data layer can be supported by existing PDMs, modern cloud CAD (including its PDM layer), or any additional SaaS applications managing CAD data. Once it is done, we will have the first service layer.
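To make this more concrete, here is a minimal sketch (in Python, purely for illustration) of the kind of contract an open design-layer service could expose to the layers above it. All names, statuses, and the storage URI are hypothetical and do not represent any existing PDM or CAD API.

```python
from dataclasses import dataclass, field
from typing import Dict, List

# Hypothetical contract for an open design-layer service.
# It manages design (CAD) documents and their revisions only;
# upstream product-model services reference designs by id/revision.

@dataclass
class DesignRevision:
    revision: str            # e.g. "A", "B", "C"
    file_uri: str            # location of the CAD file in this service's own storage
    status: str = "in-work"  # in-work -> released -> obsolete

@dataclass
class Design:
    design_id: str
    name: str
    revisions: List[DesignRevision] = field(default_factory=list)

class DesignLayerService:
    """Minimal design lifecycle operations an open design layer could expose
    (for example over REST) to the product-model layer above it."""

    def __init__(self) -> None:
        self._designs: Dict[str, Design] = {}   # per-service storage

    def create_design(self, design_id: str, name: str) -> Design:
        design = Design(design_id, name)
        self._designs[design_id] = design
        return design

    def check_in(self, design_id: str, revision: str, file_uri: str) -> DesignRevision:
        rev = DesignRevision(revision, file_uri)
        self._designs[design_id].revisions.append(rev)
        return rev

    def release(self, design_id: str, revision: str) -> None:
        for rev in self._designs[design_id].revisions:
            if rev.revision == revision:
                rev.status = "released"

# Usage: an existing PDM, cloud CAD, or SaaS CAD data application plays this role.
svc = DesignLayerService()
svc.create_design("D-100", "Bracket")
svc.check_in("D-100", "A", "s3://design-store/bracket_revA.step")
svc.release("D-100", "A")
```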

The second layer represents a modern digital product model and allows connecting multiple design sources together. This service layer introduces services and components to manage product data. This is where the idea of graph models can shine, because it makes services much easier to interoperate and also provides a mechanism to support any of the possible product models: single BOM, multiple BOMs, or view-based models. The product model creates a foundation for other lifecycle services and applications. The components of this layer represent the core foundation of the product lifecycle.
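As an illustration of how a single graph model can carry single-BOM, multi-BOM, or view-based product structures at the same time, here is a toy sketch. The items, views, and quantities are invented for the example; a real product-model service would expose this over an open API and keep its own storage.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# A toy product-model graph: nodes are items, edges are parent-child usages
# tagged with a "view" (engineering, manufacturing, ...). One graph can thus
# serve a single BOM, multiple BOMs, or view-based product models.

class ProductGraph:
    def __init__(self) -> None:
        self.items: Dict[str, dict] = {}                   # item_id -> attributes
        self.edges: List[Tuple[str, str, str, int]] = []   # (parent, child, view, qty)

    def add_item(self, item_id: str, **attrs) -> None:
        self.items[item_id] = attrs

    def add_usage(self, parent: str, child: str, view: str, qty: int = 1) -> None:
        self.edges.append((parent, child, view, qty))

    def bom(self, root: str, view: str) -> Dict[str, int]:
        """Flattened quantity rollup of `root` for a given view."""
        totals: Dict[str, int] = defaultdict(int)

        def walk(item: str, multiplier: int) -> None:
            for parent, child, v, qty in self.edges:
                if parent == item and v == view:
                    totals[child] += multiplier * qty
                    walk(child, multiplier * qty)

        walk(root, 1)
        return dict(totals)

g = ProductGraph()
g.add_item("ASSY-1", design_ref="D-100/A")      # links back to the design layer
g.add_item("PART-10", description="Bracket")
g.add_item("PART-20", description="Fastener kit")
g.add_usage("ASSY-1", "PART-10", view="engineering", qty=2)
g.add_usage("ASSY-1", "PART-20", view="manufacturing", qty=1)

print(g.bom("ASSY-1", view="engineering"))    # {'PART-10': 2}
print(g.bom("ASSY-1", view="manufacturing"))  # {'PART-20': 1}
```

Because every edge carries its view, adding a new BOM view is just adding edges; no existing consumer of the graph has to change.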

The third layer represents various applications that connect to the product model layer for data. This layer brings application services (e.g., cost management, quality management, supply chain sourcing, manufacturing processes, and many others).
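Here is a hedged sketch of what such an application service could look like: a hypothetical cost-management service that keeps only its own cost data (per-service storage) and reads product structure from the product-model layer through a small, well-defined interface standing in for a REST call. The names and numbers are made up for the example.

```python
from typing import Dict, Protocol

class ProductModelAPI(Protocol):
    """The small subset of the product-model layer this application needs."""
    def bom(self, root: str, view: str) -> Dict[str, int]: ...

class ProductModelStub:
    """Stand-in for a call to the product-model service's open API."""
    def bom(self, root: str, view: str) -> Dict[str, int]:
        return {"PART-10": 2} if view == "engineering" else {"PART-20": 1}

class CostService:
    """Hypothetical application-layer service: owns cost data only."""

    def __init__(self, product_model: ProductModelAPI) -> None:
        self.product_model = product_model
        self.unit_costs: Dict[str, float] = {}   # this service's own storage

    def set_unit_cost(self, item_id: str, cost: float) -> None:
        self.unit_costs[item_id] = cost

    def rollup(self, root: str, view: str = "engineering") -> float:
        """Total cost = sum(quantity * unit cost) over the flattened BOM."""
        flattened = self.product_model.bom(root, view)
        return sum(qty * self.unit_costs.get(item, 0.0) for item, qty in flattened.items())

costs = CostService(ProductModelStub())
costs.set_unit_cost("PART-10", 4.50)
print(costs.rollup("ASSY-1", view="engineering"))   # 9.0
```

The point of the sketch is the dependency direction: the application service depends only on the product model's open interface, never on another application's database.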

How is this model different from everything we see in existing monolithic applications? I can see two elements of modern architecture that will trigger the transformation of monolithic PLM architectures: (1) unification of design (aka the Design Layer) [here is an example of a design digital thread implementation], and (2) an open product model, which allows implementation of the same conceptual model using multiple services. [I will talk about it in my later posts] The granularity of these services, the micro-service architecture, and per-service storage will make them capable of scaling, staying independent, and connecting with many other applications.

Microservices architecture and Digital Product Model

Central to this approach is the adoption of a micro-service architecture. Emphasizing granularity and independence in service architecture enables greater scalability and adaptability. By decoupling PLM functionalities into smaller, more manageable units, organizations can respond more effectively to changing business needs and technological advancements.

What sets this model apart from traditional monolithic PLM applications is its focus on data services. Rather than simply containerizing existing applications, this approach reimagines the underlying data architecture. By disconnecting data from applications, organizations can create a more sustainable and adaptable PLM environment. This shift empowers organizations to choose the best tools for their specific needs without being tethered to a single vendor’s ecosystem.

What is my conclusion?

PLM platforms should step back from building vertical pyramids and switch to re-usable services that can interoperate using a combination of open REST APIs, reusable components, self-discoverable graph-based models, and low-code programming capabilities.
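To illustrate what "self-discoverable" could mean in practice, here is a toy example of a hypermedia-style response in which every item carries links to related resources in other services, so clients can traverse the product graph without hard-coded integrations. The hosts and paths are invented for the example.

```python
import json

# A toy "self-discoverable" item payload: the links tell a client where the
# related design, BOM, and cost resources live, across independent services.
item = {
    "id": "ASSY-1",
    "type": "item",
    "attributes": {"description": "Bracket assembly"},
    "links": {
        "self":    "https://product-model.example.com/items/ASSY-1",
        "design":  "https://design-layer.example.com/designs/D-100/revisions/A",
        "bom":     "https://product-model.example.com/items/ASSY-1/bom?view=engineering",
        "costing": "https://cost-service.example.com/rollup/ASSY-1",
    },
}

print(json.dumps(item, indent=2))
```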

The proposal to break down monolithic PLM architectures into open services represents a transformative step forward for the industry. By embracing this approach, organizations can unlock new levels of flexibility, interoperability, and innovation within their PLM ecosystems. It is time to embrace the power of open services and usher in a new era of PLM excellence.

Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital-thread platform with cloud-native PDM & PLM capabilities to manage product data lifecycle and connect manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.
