
From PLM to xLM – My Take After Prof. Fischer’s Universal BOM Comment

Oleg
19 October, 2025 | 9 min for reading

A week ago, Prof. Dr. Jörg W. Fischer commented on my provocative article – Are We Finally Getting Closer to a Universal BOM Standard – which discussed the opportunity to find a universal data model for all BOMs – EBOM, MBOM, SBOM, etc. His comment came with a strong proposal to stop the pointless discussion and read his articles instead:

“A universal BOM does not exist 😅 and will not. BOMs are living from transforming into each other. I’m happy to explain sometime to stop this useless discussion. I recommend reading my posts 😊 and watching the YouTube channel. The idea of a universal BOM comes from the perspective of product development. As soon as you start to understand how BOMs are really handled downstream, your view changes. BOMs feed each other through a structured supply sequence. What you call a universal BOM is just one element of this sequence — and therefore not universal.”

It was an opportunity that was hard to miss. In my article today, I want to share my thoughts after re-watching Prof. Fischer’s lectures (especially those that speak about BOM structures) and reading some of his recent posts about CTO+ approaches.

I found them insightful, not only because they reflect Prof. Fischer’s years of research and industrial experience, but also because they connect directly to the trends I observe in manufacturing today. Despite the strong message in the comment, I believe we’re in violent agreement when it comes to the relationship between the two terms — xBOM and xLM. But let’s get into more details.

We are living through a fascinating moment. Manufacturing companies implementing engineering and manufacturing software are struggling with four converging challenges:

  1. A growing demand for integration between systems — PLM, ERP, MES, CRM — that were originally never designed to work together.
  2. The need for structured data models that coordinate the entire lifecycle — what I call xBOM, and what Prof. Fischer calls XLM (Extended Lifecycle Management).
  3. The pressure of complexity and configurability as products become more individualized, software-driven, and cross-disciplinary.
  4. The challenge of transforming legacy PLM architectures and the fragile PLM–ERP status quo that continues to slow innovation.

At the same time, as Jos Voskuil noted in his recent article “There is more than THE BOM!”, our industry is being forced to rethink how we model information itself. BOMs are no longer simple hierarchies — they are evolving data structures that mirror the way companies actually work.

Together, these perspectives form a clear message: PLM must evolve. And that evolution may not be about extending what we already have — but rethinking how lifecycle information is represented, connected, and used.

Observations: The Convergence of XLM and Universal BOM

Prof. Fischer’s work builds on a few foundational insights that I believe are essential to understanding where we are heading.

First, structures define processes. The Bill of Materials is the foundation of a company’s operation. It defines how products are designed, manufactured, and delivered. Fischer calls BOMs “partial models” — the seed crystals of all processes. Change the BOM structure, and you change the way the company works.

Second, ERP systems are built on structure, not just process logic. The core of ERP is master data — materials, routings, work centers, suppliers — and this data forms the skeleton of execution. But as Fischer points out, master data isn’t stable; it evolves. It has its own lifecycle, with versions, effectivity dates, and contextual dependencies. Historically, ERP systems were never designed to handle this level of change. (I’ll sketch what such a lifecycle looks like in data right after these four points.)

Third, BOMs are lifecycle engines. The MBOM is the “general of the supply chain” — commanding production, sourcing, and logistics. Managing it properly means managing the entire execution layer of the enterprise.

Finally, AI cannot compensate for poor structure. Without well-formed, versioned, and connected data, AI and automation become noise generators rather than intelligence amplifiers.
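To make the second point more tangible — master data with its own lifecycle — here is a minimal illustrative sketch. The structure and names are my assumptions, not any ERP’s actual schema: each revision of a material record carries an effectivity date, and execution systems need to resolve which revision applies on a given day.

```python
from datetime import date

# Hypothetical master data: each revision carries its own effectivity start date.
material_revisions = [
    {"number": "P-200", "rev": "A", "effective_from": date(2024, 1, 1)},
    {"number": "P-200", "rev": "B", "effective_from": date(2025, 6, 1)},
]

def effective_revision(revisions: list[dict], on: date) -> dict | None:
    """Return the latest revision whose effectivity has started by the given date."""
    candidates = [r for r in revisions if r["effective_from"] <= on]
    return max(candidates, key=lambda r: r["effective_from"]) if candidates else None

print(effective_revision(material_revisions, date(2025, 10, 19)))  # -> rev "B"
```

Trivial on its own — but once versions, effectivity, and context multiply across thousands of items, this resolution logic is exactly the kind of evolving data that ERP systems were never designed to own.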

I found these ideas aligned with what I’ve been exploring through my article about the Universal BOM and also with the xBOM graph-based architecture I developed with OpenBOM. In my view, the Universal BOM is not a single artifact but a graph of interconnected structures that capture the relationships between all lifecycle views — engineering, manufacturing, service, supply chain, and more. It’s a living network, not a hierarchy. It allows each domain to maintain its perspective while staying synchronized through shared semantics and data references.
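To illustrate what “a graph of interconnected structures” can look like in data, here is a minimal sketch. It is my simplified illustration (class and field names are assumptions, not OpenBOM’s actual data model): one shared set of item nodes, with each lifecycle view contributing its own typed relationships over them.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Item:
    """A single item shared across lifecycle views (part, assembly, material)."""
    number: str
    name: str

@dataclass
class BomGraph:
    """Items plus typed, per-view relationships: one network, many perspectives."""
    items: dict = field(default_factory=dict)   # item number -> Item
    edges: list = field(default_factory=list)   # (parent, child, view, attributes)

    def add_item(self, item: Item) -> None:
        self.items[item.number] = item

    def link(self, parent: str, child: str, view: str, **attrs) -> None:
        self.edges.append((parent, child, view, attrs))

    def children(self, parent: str, view: str) -> list:
        return [(c, a) for p, c, v, a in self.edges if p == parent and v == view]

# The same items appear in the engineering and manufacturing views,
# but each view keeps its own relationships and attributes.
g = BomGraph()
for number, name in [("A-100", "Drive Unit"), ("P-200", "Motor"), ("P-300", "Housing")]:
    g.add_item(Item(number, name))

g.link("A-100", "P-200", view="EBOM", qty=1)
g.link("A-100", "P-300", view="EBOM", qty=1)
g.link("A-100", "P-200", view="MBOM", qty=1, plant="Plant-1")

print(g.children("A-100", "EBOM"))
print(g.children("A-100", "MBOM"))
```

The point of the sketch is the shape of the model: views are not copies of each other, they are different sets of edges over the same nodes — which is what keeps them synchronized without forcing a single hierarchy.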

And as far as I was able to understand, this is precisely what Fischer calls XLM — the cross-domain lifecycle model that governs all evolving master data, feeding execution systems like ERP, MES, and CRM.

The Architectural Challenge: Building XLM on Structured BOMs

The difficulty lies not in defining the concept, but in implementing it. Our current enterprise stack was never designed for this.

PLM evolved from PDM — document-centric systems built to manage CAD files, revisions, and approvals. ERP evolved from financial systems focused on accounting, transactions, and organizational control. Each serves its purpose, but neither was designed to manage the lifecycle of an interconnected, multi-disciplinary product that includes mechanical, electrical, electronic, and software components — all developed in different systems, at different speeds, by different teams.

Here, Fischer and I are in full agreement: The future of lifecycle management depends on structured, lifecycle-aware data.

Without it, traceability, reuse, and automation break down. Structured BOMs — including Plant BOMs and MBOMs — must move closer to PLM/XLM to enable agility and maintain coherence across design and execution.

Fischer describes this shift through his concept of the “flipped automation pyramid.” Instead of ERP and MES driving master data downward, PLM/XLM should become the source of truth for the data that feeds execution. ERP and MES then focus on what they do best — orders, schedules, and transactions. PLM/XLM governs the data itself.
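A hedged sketch of what that flipped direction of flow could mean in software terms (all class and method names are hypothetical, not any vendor’s API): the lifecycle model owns released master data and publishes it, while ERP and MES adapters consume it.

```python
# Illustrative only: the lifecycle model (PLM/XLM) is the source of truth
# for released master data; execution systems subscribe to release events.

class LifecycleModel:
    """Owns versioned master data and notifies consumers when a record is released."""
    def __init__(self) -> None:
        self.consumers = []

    def subscribe(self, consumer) -> None:
        self.consumers.append(consumer)

    def release(self, record: dict) -> None:
        # The release flows downward to execution systems, not the other way around.
        for consumer in self.consumers:
            consumer.on_master_data(record)

class ErpAdapter:
    def on_master_data(self, record: dict) -> None:
        print(f"ERP: update material master {record['number']} rev {record['rev']}")

class MesAdapter:
    def on_master_data(self, record: dict) -> None:
        print(f"MES: refresh routing for {record['number']} rev {record['rev']}")

xlm = LifecycleModel()
xlm.subscribe(ErpAdapter())
xlm.subscribe(MesAdapter())
xlm.release({"number": "P-200", "rev": "B", "effective_from": "2025-11-01"})
```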

This is the key insight: data architecture, not process mapping, defines the future of digital integration. Processes can only be as good as the data they act upon. And without a consistent data model, even the most well-documented processes collapse under their own complexity.

The Core Question: How to Build XLM — Bottom-Up or Top-Down?

The big question is how to get there. Prof. Fischer introduced a diagram that demonstrates the shift in system architecture.

I found this picture interesting and a bit reminiscent of the conversation about rethinking EBOM and MBOM.

For decades, data and process development have been treated as separate exercises — design the process first, then collect the data. But that logic no longer works. Data and process are interdependent. Without data, no process can run; without process, data has no meaning. Building XLM requires their co-development.

Top-Down Expansion (The Last 10–15 Years)

Over the last decade, major PLM vendors have pursued a top-down strategy. They’ve added more modules — requirements management, quality, project management, manufacturing planning, and service — to create large, vertically integrated suites.

The intent was clear: to model the entire process. The assumption was that data would follow.

But in practice, the opposite happened. These systems became complex, expensive, and difficult to evolve. Process frameworks designed in the abstract couldn’t adapt to changing organizational realities. Instead of delivering integration, they created layers of configuration debt.

The lesson is simple: you can’t fix process problems without fixing data problems first.

Bottom-Up Evolution

The alternative is a bottom-up approach — starting from the data while evolving the process in parallel.

This begins with creating flexible, federated data models — like xBOM — that connect existing systems rather than replace them. Such models form the information backbone that makes processes executable and traceable.

Once the data structure exists, processes can grow naturally around it. Engineering, manufacturing, and supply chain can align through shared data semantics and feedback loops. This creates a continuous data–process co-evolution cycle: data informs process; process shapes data.

This approach favors open APIs, data federation, and composable services over monolithic integration. It aligns with modern software thinking: build small systems that work and connect them through well-defined interfaces.
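As a sketch of what “small systems connected through well-defined interfaces” can mean here, consider a thin federation layer that reads from existing systems through narrow adapters instead of replacing them. Again, the names and fields are assumptions for illustration only:

```python
from typing import Protocol

class ItemSource(Protocol):
    """The narrow, read-only contract each existing system exposes to the federation layer."""
    def get_item(self, number: str) -> dict: ...

class PdmAdapter:
    def get_item(self, number: str) -> dict:
        # In reality this would call a PDM API; hard-coded here for illustration.
        return {"number": number, "cad_model": f"{number}.step", "design_rev": "C"}

class ErpAdapter:
    def get_item(self, number: str) -> dict:
        # In reality this would call an ERP API; hard-coded here for illustration.
        return {"number": number, "std_cost": 12.40, "supplier": "ACME"}

def federate(number: str, sources: list[ItemSource]) -> dict:
    """Merge views of the same item coming from several systems of record."""
    merged: dict = {"number": number}
    for source in sources:
        merged.update(source.get_item(number))
    return merged

print(federate("P-200", [PdmAdapter(), ErpAdapter()]))
```

Each adapter stays small and replaceable; the shared semantics live in the item identity and the merged record, not inside any single monolith.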

Or, as Gall’s Law reminds us: “A complex system that works is invariably found to have evolved from a simple system that worked.”


The path to XLM isn’t about imposing new process layers. It’s about letting working data models and adaptive process frameworks emerge through real use — not through top-down process architectures.

Looking Ahead: From PLM to XLM — Data Models, Graphs, and New Business Models

The shift from PLM to XLM is unlikely to happen through marketing slogans. It will happen through better data modeling, graph-based architectures, and AI-assisted reasoning built on solid, connected data.

Graph-based lifecycle models offer a powerful way to represent complex relationships across BOMs, configurations, and effectivity. They provide the contextual richness that relational databases struggle to maintain.

AI can augment reasoning across this graph — suggesting compatible configurations, identifying change impacts, or predicting supply disruptions — but only if the data foundation exists.
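Change impact is a good example of why the graph matters: once “where-used” relationships exist across views, impact identification becomes a plain traversal that an AI (or a human) can trust. A minimal sketch with hypothetical data:

```python
from collections import deque

# Hypothetical where-used links: item -> the parents that consume it, across any view.
where_used = {
    "P-200": ["A-100"],                # motor used in the drive unit
    "A-100": ["PRD-1", "SVC-KIT-7"],   # drive unit used in a product and a service kit
}

def change_impact(changed_item: str) -> list[str]:
    """Walk upward through where-used links to list everything a change may affect."""
    impacted, seen, queue = [], {changed_item}, deque([changed_item])
    while queue:
        current = queue.popleft()
        for parent in where_used.get(current, []):
            if parent not in seen:
                seen.add(parent)
                impacted.append(parent)
                queue.append(parent)
    return impacted

print(change_impact("P-200"))  # -> ['A-100', 'PRD-1', 'SVC-KIT-7']
```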

This is what I call product memory — the persistent, structured knowledge of how a product evolves, behaves, and connects to every other element in its lifecycle.

Without that memory, AI is guessing. With it, AI can reason.

At the same time, business models are changing. Traditional enterprise licensing creates friction — long implementations, heavy customization, and slow time-to-value. In contrast, consumption-based SaaS and modular subscription models lower the barrier to entry.

They enable companies of all sizes to adopt lifecycle intelligence gradually, scaling from one process to many without a massive upfront investment.

These models also encourage openness: APIs, data sharing, and cross-system collaboration become natural economic incentives rather than integration burdens.

What is my conclusion? (And The 10x PLM Question)

Every major industry transformation has a 10x moment — a breakthrough that changes everything. The PC replaced the mainframe. The iPhone redefined mobile computing. In the engineering world, tools like AutoCAD, Pro/ENGINEER, and SolidWorks each created their own 10x moment by dramatically simplifying how people design. I shared my thoughts about a 10x for PLM in 2025 in yesterday’s article.

The question now is: where will the 10x moment for lifecycle management come from?

Will it emerge from data intelligence — the realization that connected, structured information is more valuable than any single application?

Will it come from AI reasoning, enabling systems to understand and act on lifecycle data autonomously?

Or will it come from new business models, where lifecycle services are delivered as consumable, composable digital capabilities instead of monolithic systems?

We can sense the momentum. The convergence of PLM, ERP, MES, and CRM — the rise of structured data models and knowledge graphs — is already reshaping how manufacturers think about data ownership and collaboration.

But the 10x leap will require more than technology; it will require rethinking the business of lifecycle management itself — how value is created, shared, and consumed across the product ecosystem.

So I leave this question to you:

As we move from PLM to XLM, can we find that 10x catalyst — the combination of architecture and business model — that finally makes lifecycle management as natural, connected, and intelligent as it was always meant to be?

Just my thoughts… 

Best, Oleg 

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative and integration services between engineering tools, including PDM, PLM, and ERP capabilities. Interested in the OpenBOM AI Beta? Reach out to me to talk about the future of Agentic Engineering Workflows.

With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
