A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

Rethinking Monolithic PLM Architecture – Exploring What Comes Next

Oleg
3 May, 2025 | 9 min for reading

Let’s talk about PLM architecture.

A recent post and comments by Andreas Lindenthal triggered a fresh wave of discussion around the merits of monolithic PLM platforms. In response to the growing conversation about federated and composable architectures, Andreas wrote:

“Lately some posts have appeared in PLM discussion forums around ‘monolithic architectures’ and how they allegedly are a thing of the past. The new approach apparently are composable, federated data models that rely on open APIs to exchange data between the different tools that each business function uses. One of the huge benefits of a unified PLM system and database (or monolithic architecture, as some people call it) for all business functions is that there is no need to build custom integrations for each function. So I do not believe at all that integrated PLM tools such as #3DExperience, #Teamcenter, and #Windchill that offer different modules for a large number of business functions are a thing of the past. To the contrary, I believe that the functionality and reach of these unified PLM suites will be further expanded to support additional business areas and use cases.”

This is a strong and thoughtful position—one that reflects the legacy logic of many PLM implementations. However, it also opens the door to a deeper and timely discussion about why traditional PLM systems often underperform.

The concept of a “single source of truth” (SSOT) has long been held as a core tenet of PLM philosophy. In practice, this principle was often translated into building a single, centralized database to manage all product-related information. While the intention was to ensure consistency and eliminate redundancy, the resulting systems became too complex and rigid to scale effectively—especially in today’s fast-moving, interconnected business landscape.

This brings us to an important set of questions: What are the trade-offs of this architecture in 2025? Why do so many companies still struggle with PLM adoption despite its promise? And is there a better, more flexible way to think about PLM architecture in a connected, AI-powered, multi-company world?

In this post, I want to explore why the single-database vision is breaking down, what challenges it creates in practice, and what alternatives we should consider to build a more scalable, collaborative, and intelligent PLM ecosystem.

The Myth of the One-Database Dream

Let’s start with the obvious: it’s impossible to put everything into a single database. Products today are designed and built in distributed environments—by internal teams, external suppliers, manufacturing partners, and customers working together across time zones, disciplines, and systems. Trying to jam all this information into one database is like trying to pour the ocean into a swimming pool.

According to Andreas, the problem with PLM adoption doesn’t stem from technology limitations, but rather from people not knowing how to use the tools correctly. It’s a provocative take—and one that deserves discussion. As he puts it:

“The problem is not the tools, the problem is companies don’t use them properly. Most companies have implemented only a fraction of the available functionality and use PLM more like a glorified engineering data management system, i.e., they manage CAD data, BOMs, and engineering changes. And of course for that these sophisticated PLM suites are way too expensive. It’s like buying an expensive power tool to hang up one picture in your house. A simple hammer would be sufficient. I believe the statement they don’t need PLM is not accurate. No, most companies that design and manufacture products do need PLM. But they don’t need an expensive PLM system to do what they are currently using it for.”

While there is truth in the need for better education and change management, the deeper structural and architectural challenges of monolithic PLM systems cannot be overlooked.

Moreover, no matter how big or vertically integrated a vendor becomes, companies still need to work together. OEMs collaborate with thousands of suppliers. Startups connect with contract manufacturers. Everyone touches multiple systems: CAD, ERP, MES, simulation, compliance, procurement tools, and more. PLM must be an enabler of that networked collaboration, not a walled garden that tries to keep everything in. And if we speak about PLM software for small and medium-size companies, current architectures are too expensive to host and too rigid to fit agile, fast-moving businesses.

Besides, the economic business models of existing enterprise PLM tools are unlikely to support the demand for "inexpensive enterprise PLM" described by Andreas (I probably need to write a separate blog about it).

Unified Product Lifecycle Management Platforms

In the pursuit of growth, traditional PLM vendors haven’t scaled by solving the root problem; they’ve scaled by acquisition. The PLM market has been M&A driven for the last 20+ years. Over time, monolithic platforms evolved into conglomerates of tools connected by integrations (sometimes even implemented by 3rd parties), often with incompatible data models and user experiences that rely on “syncing” data between the tools behind a “seamless facade”. Add to this the historical deployments and customizations that need to be “lifted and shifted” to new versions of the same tools. As a result, these product data management, product lifecycle management, and business process systems are complex to deploy, expensive to maintain, and very hard to adapt to the needs of a modern digital enterprise.

What began as a dream of seamless, scalable platforms has turned into an uphill battle for every upgrade and customization. And despite all this investment, PLM remains siloed and underutilized.

The Real Problem: Adoption Complexity

The adoption complexity of Product Lifecycle Management (PLM) tools remains a real problem. Check my notes from the recent CIMdata conference. The focus of PLM implementations remains heavily on engineering, and only 13% of companies using PLM say they “cannot live without it.” That’s a stunningly low number for a category that presents itself as the foundation of the innovation process. The hard truth is that for many companies, PLM is still seen as an expensive engineering tool: difficult to implement, slow to adapt, and lacking business value outside the design team.

Even the most traditional buyers—large OEMs—already own multiple PLM systems and have poured millions into maintaining legacy IP. Bringing in new systems or extending capabilities across the enterprise has become a high-risk, slow-moving endeavor.

So What’s the Alternative?

This is a huge question for all the companies and people I know. Let’s put aside the companies led by “near-retirement leaders” who believe that what they have built will sustain them for the next 5 years. This is a category by itself, and the arguments here are less technical and more political.

From the PLM architecture perspective, I believe in a strategy that improves product development processes by re-wiring them using new data services and agentic workflows. It can be done with agile platforms, connected data products, online services, and intelligent PLM infrastructure. Not a monolith, but a platform built for scale, adaptability, and collaboration. Here are three pillars of this vision:

  1. Scalable Multi-Tenant Platforms
    Cloud-native platforms must be designed to scale—from a small engineering team in an SMB to global OEMs with thousands of suppliers. Multi-tenancy isn’t just about cost—it’s about speed, upgradeability, and the ability to connect companies in a shared environment while maintaining secure boundaries.
  2. Composable Data Services
    Instead of forcing everyone into one giant database, we need online data services and products—modular, flexible, and API-driven. These services should expose product information when and where it’s needed, across systems and teams, without fragile integrations (see the sketch after this list).
  3. PLM + AI = Orchestration
    The next frontier is AI agents—intelligent services that orchestrate tasks, automate decisions, and connect the dots between tools and data. When PLM becomes a smart layer that interacts with ERP, CAD, sourcing tools, and other AI services, it transforms from a system of record into a system of action.
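
To make the second and third pillars more concrete, here is a minimal sketch in Python of what a composable, API-driven product data service and a small agentic workflow on top of it could look like. All endpoint URLs, payload fields, and function names are illustrative assumptions, not the API of any particular PLM or ERP product.

```python
import requests  # third-party HTTP client: pip install requests

# Hypothetical, illustrative endpoints -- not a real PLM or ERP API.
PLM_API = "https://plm.example.com/api/v1"
ERP_API = "https://erp.example.com/api/v1"

def get_bom(item_id: str, revision: str) -> dict:
    """Fetch a BOM from the PLM data service as a plain JSON document."""
    resp = requests.get(f"{PLM_API}/items/{item_id}/bom", params={"rev": revision})
    resp.raise_for_status()
    return resp.json()

def get_supplier_quotes(part_numbers: list[str]) -> dict:
    """Ask the ERP/sourcing service for current quotes on a list of parts."""
    resp = requests.post(f"{ERP_API}/quotes/search", json={"parts": part_numbers})
    resp.raise_for_status()
    return resp.json()  # assumed to be keyed by part number

def flag_long_lead_items(item_id: str, revision: str, max_weeks: int = 8) -> list[dict]:
    """A tiny 'agentic' workflow: pull the BOM from PLM, enrich it with
    ERP sourcing data, and return the items that put the schedule at risk."""
    bom = get_bom(item_id, revision)
    quotes = get_supplier_quotes([line["part_number"] for line in bom["lines"]])
    at_risk = []
    for line in bom["lines"]:
        quote = quotes.get(line["part_number"], {})
        if quote.get("lead_time_weeks", 0) > max_weeks:
            at_risk.append({"part": line["part_number"],
                            "lead_time_weeks": quote["lead_time_weeks"]})
    return at_risk

if __name__ == "__main__":
    print(flag_long_lead_items("PCA-1001", "B"))
```

The point is not the specific code, but the shape: small services that expose product data over APIs, and a thin orchestration layer that connects PLM and ERP data without forcing them into a shared database.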

This strategy doesn’t mean that investment in existing systems will disappear – it means that new software tools will be applied to organize cross-domain processes. Here is a typical passage from the requests I often see for OpenBOM: “We are looking for a software that support eBOM to mBOM translation. We are an industrial company working with [known PLM] as PLM and just rolling out [new ERP] as ERP -> hence we are discussing the eBOM <>mBOM”. This is only one example. The demand to solve such problems is growing, and existing monolithic software packages don’t provide solutions for these implementations and services. New platforms will play a growing role in organizing these implementations and “rewiring” these processes.
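
To illustrate the kind of “rewiring” such requests imply, here is a minimal sketch of one eBOM-to-mBOM translation step: collapsing phantom assemblies and adding manufacturing-only items. The data structures and rules are simplified assumptions for illustration, not OpenBOM’s actual data model or API; a real implementation would be driven by each company’s restructuring rules and the specific PLM and ERP systems involved.

```python
from dataclasses import dataclass, field

@dataclass
class BomLine:
    part_number: str
    quantity: float
    phantom: bool = False            # engineering grouping with no physical assembly step
    children: list["BomLine"] = field(default_factory=list)

def ebom_to_mbom(line: BomLine) -> list[BomLine]:
    """Flatten phantom assemblies so the mBOM reflects how the product is built,
    not how it was structured for design."""
    result = []
    for child in line.children:
        if child.phantom:
            # Pull the phantom's children up one level, multiplying quantities.
            for grandchild in ebom_to_mbom(child):
                grandchild.quantity *= child.quantity
                result.append(grandchild)
        else:
            result.append(BomLine(child.part_number, child.quantity,
                                  False, ebom_to_mbom(child)))
    return result

def add_manufacturing_items(mbom: list[BomLine]) -> list[BomLine]:
    """Append items that exist only in the mBOM (e.g., packaging, labels)."""
    return mbom + [BomLine("PKG-BOX-01", 1.0), BomLine("LBL-SERIAL-01", 1.0)]

# Usage: an eBOM where "SUB-PHANTOM" exists only as an engineering grouping.
ebom = BomLine("TOP-100", 1.0, children=[
    BomLine("SUB-PHANTOM", 2.0, phantom=True, children=[
        BomLine("SCREW-M3", 4.0),
        BomLine("BRACKET-7", 1.0),
    ]),
    BomLine("PCBA-200", 1.0),
])
for mline in add_manufacturing_items(ebom_to_mbom(ebom)):
    print(mline.part_number, mline.quantity)
```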

From Applications to Data

Another important shift is happening across industries: companies are moving away from focusing purely on applications and beginning to treat data as a long-term business asset. Applications come and go. The average lifecycle of enterprise software is shrinking, and companies are realizing that locking data into rigid, monolithic systems makes it harder to adapt and scale.

By contrast, flexible data models and services allow organizations to repurpose and extend the value of their data across workflows, tools, and business functions. This mindset supports interoperability and fosters resilience as businesses evolve.

Focusing on the data itself is becoming even more important in the context of AI development. As AI technologies mature, the ability to tap into structured, connected, and well-managed data becomes a critical enabler. Clean and accessible data serves as the foundation for powerful AI applications and analytics, helping organizations drive smarter decisions, automate processes, and uncover insights that would otherwise remain hidden.

The Opportunity Ahead

As companies seek to adopt more flexible, scalable PLM strategies, it becomes essential to rethink the way systems are layered and architected. One effective approach to de-risking PLM implementations is to separate functionality into three distinct but connected layers: systems of record (SOR), systems of engagement (SOE), and systems of intelligence (SOI).

  • System of Record (SOR) is where authoritative data is stored—such as items, BOMs, revisions, and compliance artifacts.
  • System of Engagement (SOE) focuses on how users interact with that data through workflows, collaboration tools, and user experiences tailored to different stakeholders.
  • System of Intelligence (SOI) represents the analytical and AI-driven layer that helps businesses derive insight, support decisions, and optimize operations.

This layered approach brings granularity, flexibility, and resilience, allowing companies to evolve each layer independently as business needs change—without the burden of re-architecting the entire system. Check my article – The Battle for Master Data Supremacy Between PLM, ERP, CRM, MES and Others: Contenders and Approach.
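
As a rough sketch of this separation, each layer can be defined by its own narrow interface, so the engagement and intelligence layers can evolve or be replaced without touching the record layer. The class and method names below are illustrative assumptions, not a reference to any specific product.

```python
from typing import Protocol

class SystemOfRecord(Protocol):
    """Authoritative data: items, BOMs, revisions, compliance artifacts."""
    def get_item(self, item_id: str, revision: str) -> dict: ...
    def save_revision(self, item_id: str, payload: dict) -> str: ...

class SystemOfEngagement(Protocol):
    """How users interact with the data: workflows, reviews, collaboration."""
    def start_change_review(self, item_id: str, reviewers: list[str]) -> str: ...

class SystemOfIntelligence(Protocol):
    """Analytics and AI on top of the record layer: insight, not authority."""
    def predict_change_impact(self, item_id: str, revision: str) -> float: ...

def route_change(item_id: str, revision: str,
                 sor: SystemOfRecord,
                 soe: SystemOfEngagement,
                 soi: SystemOfIntelligence) -> str:
    """A change flows across the layers: read the record, score the risk,
    then decide how heavy a review workflow to start."""
    item = sor.get_item(item_id, revision)
    risk = soi.predict_change_impact(item_id, revision)
    reviewers = ["engineering", "manufacturing", "quality"] if risk > 0.7 else ["engineering"]
    return soe.start_change_review(item["id"], reviewers)
```

Because each layer only sees a small interface of the others, a company can swap the intelligence layer for new AI services or add a new engagement experience without re-architecting the system of record.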

What Is My Conclusion?

While legacy PLM vendors struggle to modernize their aging platforms, the opportunity for agile, networked, multi-tenant systems is wide open, and the market for small and medium-size companies is especially so. Companies are asking harder questions about cost, scalability, and adoption, and the market is ready for a new kind of PLM: one that connects rather than controls, scales rather than stagnates, and enables intelligence rather than enforcing rigidity.

The future of PLM isn’t monolithic. It’s modular, connected, and intelligent.

Just my thoughts…

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative and integration services between engineering tools, including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
