From Excel part lists to digital decision artifacts
A few days ago, Michael Finocchiaro shared a LinkedIn invite announcing an upcoming webinar titled "To BOM or Not to BOM: A BOMversation." I'm delighted to join a panel of amazing speakers who bring a diverse set of PLM backgrounds: Dr. Patrick Hillberg – Oakland University, Brion Carroll – DSG, Jim Brown – Tech-Clarity, Rob Ferrone – Quick Release, Jonathan Scott – Razorleaf.

The title immediately triggered me. Not because it is controversial, but because it points to something that has been sitting in the middle of PLM for decades and still refuses to settle down.
Bills of Materials sit at the core of how we design, build, buy, sell, maintain, and change products. And yet, after 40+ years of practice, we still argue about fundamentals: EBOM vs MBOM, PLM vs ERP ownership, “single source of truth” vs “many versions of truth,” how to propagate changes, how to do traceability, what to do about alternates and substitutes, where cost belongs, how quality and compliance should connect to structure, and why the simplest “part list” still turns into political debate as soon as the product becomes real.
Preparing for this BOMversation made me step back and think about why we keep circling the same discussions, and why, in my opinion, this is not a sign that BOM is outdated. It is a sign that our world changed and we didn't fully update our mental model of what BOM is supposed to represent.
This article is not about defending BOMs and it is not about proposing some “better BOM.”
It is about explaining why BOMs remain unsettled after decades, and why this discomfort is actually a signal that something deeper is changing in how products are designed, built, and governed. I’m also going to reference themes I’ve been writing about for a long time on Beyond PLM and on OpenBOM blog, because for me these topics are connected: product structure, change, context, collaboration, and what happens when we pretend a static list is enough for dynamic decision-making.
Here are some of my questions and thoughts ahead of the webinar.
Starting from a Part List and MRPII planning
BOMs have been around for a very long time, and they started as part lists. PLM did not invent BOMs; they existed long before PLM systems, appearing first on drawings and later in ERP planning, long before many companies even had formal engineering data management. For a long time, BOM was literally what the name says: a list of parts, or a planning BOM for MRP.
You can still see this origin in many companies today. The “official” BOM might be inside some enterprise system, but the operational habit is the same: someone exports it to Excel, adds a few columns, maybe changes formatting, and sends it to the manufacturing floor or purchasing. People do it not because they want to break the process, but because the original role of BOM was never to be a perfect representation of a product, it was to be a useful communication artifact.
Historically, BOMs lived on drawings, in documents, sometimes on paper. When spreadsheets became common, Excel became a natural home for BOMs. Not because Excel is the best data platform, but because it is simple, flexible, and allows quick adaptation. It supports the “part list” mental model perfectly: rows, columns, quantities, description, maybe vendor, maybe cost, maybe revision.
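The "part list" mental model is easy to sketch. Here is a minimal, illustrative example (all part numbers and vendors are hypothetical) showing how little structure a spreadsheet-era BOM actually carries: rows, columns, quantities, and a human to interpret them.

```python
# A minimal sketch of the spreadsheet-era "part list" mental model:
# rows and columns, nothing more. Part numbers and vendors are
# hypothetical, not from any real system.
part_list = [
    {"part": "PN-1001", "description": "Bracket, steel", "qty": 4, "vendor": "ACME"},
    {"part": "PN-1002", "description": "M6 bolt",        "qty": 8, "vendor": "ACME"},
    {"part": "PN-2001", "description": "Motor, 12V",     "qty": 1, "vendor": "Globex"},
]

# A human interprets the list: "here is what you need to build it."
for row in part_list:
    print(f'{row["part"]:8} x{row["qty"]:<3} {row["description"]}')
```

Everything beyond this (intent, alternates, impact of change) lives in people's heads or in email, which is exactly where the trouble starts.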
And this is important: early BOMs were not meant to optimize anything. They were not meant to drive automation or analytics. Their primary role was: “here is what you need to build it.” A human would interpret the list, place orders, check inventory, and deal with exceptions. If something changed, the list changed. If manufacturing had a different reality, they made their own list. It was messy, but it worked because products were simpler and organizational expectations were different.
What is striking to me is how durable this mental model remains. Even now, when products are complex, supply chains are volatile, and collaboration spans multiple companies, many organizations still treat BOM as a static report. A lot of PLM and ERP deployments still revolve around the same assumption: if we can generate the correct part list, we are done.
This is where the core tension starts: our tools changed faster than our mental model. We moved from drawings to CAD, from paper to databases, from local files to cloud. But in many cases, BOM still carries the expectations of a spreadsheet-era artifact. It is as if we upgraded the infrastructure but kept the old definition of what the artifact is.
And when you keep the old definition, you keep old limitations, even if you don’t notice them at first.
The Misframed Question Behind “To BOM or Not to BOM”
The title “To BOM or Not to BOM” sounds like a question about relevance. Almost like we need to decide if BOM is still needed. And in 2026, with all the talk about Digital Twins, Model-Based Engineering, and 3D-first processes, it is easy to see why this question comes up.
If you live inside CAD all day, geometry feels “real.” It is rich, detailed, precise. Simulation gives you behavior, not just structure. Digital Twin gives you operational data, not just design intent. In comparison, a BOM looks flat and old. A list of parts is not as impressive as a full 3D model with configuration logic, embedded metadata, and linked requirements.
So the misframed question becomes: maybe BOM is obsolete, because the model is the product.
But models alone do not answer most questions companies actually need to answer. A CAD model does not tell you how many units to order next quarter. It does not tell you whether purchasing can replace a part with an alternate supplier. It does not tell you how to group items for production planning. It does not tell the service team what to stock in the depot. It does not tell sales what configuration was shipped to this customer. It does not tell compliance what was used in the batch that is now under recall. It does not tell operations what is impacted if we change one component because the supplier is out of stock.
This is why BOMs persist. BOM lives in different semantic spaces. It is not geometry and not simulation. BOM is the artifact that translates design into execution and lifecycle. It is a bridge between domains. It connects engineering, manufacturing, procurement, operations, sales, service, and compliance.
I’ve written many times on Beyond PLM about “systems of record” and why they struggle when reality is changing fast. BOM is a classic example. People want it to be a single source of truth, but they also want it to serve everyone, across all lifecycle stages, under constant change. That is not a simple “truth” problem. It is a context problem.
So, for me, the real question behind the BOMversation is not “do we need BOM.” We clearly do. The real question is: what is BOM supposed to represent in a world where products are models, but business is decisions?
And if we keep treating BOM as a competitor to the CAD model, we will always be disappointed. BOM is not a competitor. BOM is a complement. The problem is that we often don't design BOM as a complement; we design it as a replacement or as a simplified "extract."
Should We Rethink BOMs as Digital BOMs?
If BOM is not going away, the next question becomes more interesting: what does it mean for BOM to be digital?
For many years, "Digital BOM" meant "BOM stored in the system." We took a spreadsheet, moved it into a database, added permissions, lifecycle states, and revisions. But the object stayed basically the same: a part list with quantities. The digitization happened in storage and access control, not in meaning.
But digital products require more than digital storage.
When people say “digital transformation,” they often underestimate that digital is not just “online.” Digital implies relationships and behavior. It implies multiple views. It implies context. It implies that data can be connected and interpreted depending on the lifecycle stage.
A Digital BOM, to me, is not a static table saved in a database. It is a representation that can adapt as a product moves through stages. Engineering view, manufacturing view, service view, sales view. Same product, different intent. Digital means you can preserve these views without forcing them to fight each other.
Digital also implies lifecycle awareness beyond just revision. Revision is “what changed in design.” The lifecycle stage is “how the product is used right now.” You can have the same revision used in different stages, or different revisions in different service contexts, or temporary manufacturing deviations. The real world is messy, and digital representation should not pretend it is not.
This is where my earlier writing about context graphs and product memory comes in. I keep returning to the idea that systems capture states, but lose reasoning. Digital should not only store end results, it should store context of decisions. If you digitize only the list, you still have a list. If you digitize the decision process, you have something else: you have an artifact of knowledge.
I’m not trying to define Digital BOM too precisely, because once we do that, we will again reduce it to “new best practice.” But I think we need to ask: are we digitizing BOM as a list, or are we digitizing BOM as a decision artifact?
Because if we keep digitizing the list, we will keep repeating the same debates.
Why Structural Debates (EBOM vs MBOM) Keep Failing, and What ERP vs PLM Has to Do with It
EBOM vs MBOM is probably the most famous BOM argument, and it is also the one that shows why we keep looping.
On the surface, it looks like a structural problem: engineering has one structure, manufacturing needs another, so we must create a mapping. Some organizations try to enforce a single structure and force everyone to use it. Others create separate structures and spend energy synchronizing them. Tools offer workflows, mappings, conversions, "release to manufacturing," and all kinds of transfer processes.
And yet, the debate never ends. Which tells me it is not structural.
EBOM and MBOM debates are symptoms, not root causes. The root cause is that each system optimizes for different moments and different types of work.
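A toy example makes the point concrete. In the sketch below (groupings, operations, and part numbers are hypothetical), the EBOM groups parts by design function while the MBOM regroups the same parts by assembly operation. The parts never change; only the grouping does, which is why forcing one structure on everyone keeps failing.

```python
from collections import Counter

# Sketch: the same parts, two structures. The EBOM groups by design
# function; the MBOM regroups by assembly operation. Groupings,
# operations, and part numbers are hypothetical.
ebom = {
    "Frame": ["PN-1001", "PN-1002"],
    "Drive": ["PN-2001", "PN-1002"],  # the same bolt appears in two design groups
}

# Manufacturing assigns each part to a line operation (routing step).
routing = {
    "PN-1001": "OP-10",  # frame assembly station
    "PN-1002": "OP-10",
    "PN-2001": "OP-20",  # drive install station
}

def to_mbom(ebom, routing):
    """Flatten the EBOM and regroup total quantities by operation."""
    qty = Counter(pn for parts in ebom.values() for pn in parts)
    mbom = {}
    for pn, count in qty.items():
        mbom.setdefault(routing[pn], {})[pn] = count
    return mbom

print(to_mbom(ebom, routing))
```

Note that the bolt's quantity is consolidated at OP-10 even though engineering split it across two design groups: the transformation is lossy in one direction and ambiguous in the other, which is where synchronization pain comes from.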
ERP seeks stability and execution. It needs numbers that can run production, place orders, calculate cost, plan inventory. For ERP, change is expensive. Variability is dangerous. Exceptions are pain.
PLM seeks collaboration, flexibility, and change management across lifecycle stages. For PLM, change is normal. Variability is reality. Exceptions are part of the engineering process.
Conflict arises when those worlds collide, and especially when data ownership becomes the same thing as control. In many enterprise setups, ownership means “my system is master, your system receives a copy.” This is not just an integration pattern. It is a power pattern. It implies locking concepts: once data is in a system, it becomes harder to move, and changing the system becomes expensive.
So the EBOM vs MBOM debate becomes not only about structure, but about ownership and locking. Who owns the truth? Who controls updates? Who approves? Who takes responsibility for errors? And because these questions are political, technical solutions alone don’t solve them.
This is why I think the real conversation should be about misaligned expectations. We ask static structures to behave dynamically. We ask execution systems to support exploration. We ask collaboration systems to enforce rigidity.
If we don’t address this mismatch, EBOM vs MBOM discussion will keep repeating, regardless of which “best practice” we choose.
Change Is the Stress Test That Breaks Traditional BOM Thinking
If there is one place where BOM limitations become obvious, it is change.
Traditional BOM thinking assumes reporting: the BOM is basically a "part list output." It answers the question "what is in the product now?" That is useful, but it is limited. And as soon as you face real change, the limitations show immediately.
Change management and collaboration expose missing rationale, assumptions, and tradeoffs. ECOs require impact analysis, not just an updated list. Form-Fit-Function decisions require understanding intent, not just replacement. Late supplier changes require evaluating tradeoffs and documenting why we accepted risk. Analytics requires connections: which assemblies, which customers, which service parts, which regulatory impacts.
And here is the part that hurts: revision history usually shows only outcomes. It shows “rev A changed to rev B.” It might show who approved and when. But it rarely captures why. It rarely preserves alternatives that were rejected. It rarely captures the discussion between engineering and manufacturing, or between procurement and quality, that led to the final outcome.
This is where many teams recognize their own experience. They follow the process, but later still end up guessing. They have approval records, but no decision context. They have data traceability, but no reasoning traceability.
And when pressure comes, people rebuild reasoning from emails, meeting notes, spreadsheets, or memory. Which is expensive and risky. In some of my earlier Beyond PLM posts, I called it “archaeology.” We dig through artifacts after the fact, trying to understand why the decision was made, because the system stored only the final snapshot.
So change becomes a stress test. If BOM is only a static list, it breaks under change. And because modern product development is basically continuous change, we should not be surprised that BOM debates are not disappearing.
What is my conclusion?
So where does this leave us? Here is what I think we need to explore next: structure, openness, context, and decision memory.
Structure still matters. Without structure, nothing scales. Without identifiers, relationships, quantities, configurations, you cannot run any serious operations. But structure alone is not sufficient anymore.
Openness becomes required when products are built across organizations. Contractors, suppliers, partners, distributed teams, service networks. If BOM is locked inside one system, collaboration becomes expensive, and people fall back to spreadsheets and email, because these are the only “open” tools they have.
Context turns part lists into understanding. Context means not only attributes, but intent: why this option, why this supplier, why this configuration, why this change now. Context is also a lifecycle: engineering, manufacturing, service, sales. Same product, different meaning.
Decision memory connects data with the human activities that created it. Comments, tasks, discussions, alternatives, rejected paths, exceptions. This is where my earlier writing about product memory becomes relevant: if we want systems to help us make decisions, they need to remember decisions as they were made, not after they were approved.
Maybe BOMs need to evolve from being mostly “lists of parts” into richer records of decisions, connected with structure and lifecycle. Not only what was approved, but why. Not only the final structure, but the path, including options that were considered and rejected.
And this is where I want to end with questions, because I don’t think we have clean answers yet:
- What does BOM look like when it remembers intent, not just state?
- How open does it need to be to support real collaboration across companies, not only inside one enterprise?
- How do we keep context useful, without turning it into noise?
- How do we connect structure with human reasoning in a way that is searchable, auditable, and still practical for daily work?
- What happens when AI joins the process and needs more than static snapshots, because it must reason about tradeoffs, alternatives, and consequences?
These are not questions that can be solved in a single webinar, and that is fine. But they are exactly questions worth exploring together.
Here is the real question I want to ask in this BOMversation: are our BOMs still aligned with how products are designed, built, and governed today, and what do we need to change in our thinking before we change anything else?
Just my thoughts…
Best, Oleg
Disclaimer: I’m the co-founder and CEO of OpenBOM, a collaborative digital thread platform that helps engineering and manufacturing teams work with connected, structured product data, increasingly augmented by AI-powered automation and insights.
