
The Future of PLM Webinar Reflection: From Systems to Strategic Solutions

Oleg
2 November, 2025 | 9 min for reading

Earlier this week, I had the pleasure of joining a panel with some of my favorite industry peers — Michael Finocchiaro, Jos Voskuil, Martin Eigner, Brion Carroll, Brion Carroll (II), Juliann Grant, and Rob Ferrone — for another episode of the Future of PLM Podcast. The conversation was vibrant, sometimes provocative (especially for those with voices from a balcony), and deeply insightful. My take on the conversation is that we collectively tackled one of the most uncomfortable but important questions in the PLM industry: is PLM still solving problems, or is it creating new ones?

The End of PLM as We Know It

What became clear during the discussion is that PLM is no longer just a system category. It’s not a box of software sitting in IT’s portfolio. It’s an evolving discipline — one that must adapt to a new digital reality shaped by cloud, data, and AI.

As I said during the panel, “No one wakes up wanting PLM; they want to solve a problem.”
And that statement, simple as it is, captures the heart of the transformation we’re experiencing. The companies that will succeed in the next decade are those that treat PLM not as a technical solution to manage CAD files, but as a strategic framework for solving real business challenges — connecting design, manufacturing, supply chain, and customer value.

So, this article is both a reflection on the last 30 years of PLM and a forward look at how it must evolve. The real question is no longer “What PLM system do you have?” but “What strategic problems is your PLM approach solving?”

The 30+ Year Legacy of PLM Systems

My own journey in PLM began in the mid-1990s. Back then, I was developing architecture libraries and early PDM systems — long before “PLM” became a common acronym. Later, I joined SmarTeam, which became part of Dassault Systèmes; built semantic search technology for engineering data that was later acquired by Autodesk; worked on Autodesk’s PLM 360 and data management platforms; and eventually started OpenBOM — a cloud-native platform born from the idea that data intelligence should empower manufacturing and construction companies and flow freely between design, manufacturing, and supply chains.

During that time, I witnessed multiple waves of evolution:

  • From PDM to PLM
  • From on-premise to SaaS
  • From document management to data-driven processes
  • And now, toward AI and intelligent automation

As Martin Eigner reflected during the panel, his own 40-year career mirrors this arc — from the early dreams of integrated engineering systems to the pragmatic reality of today’s fragmented enterprise environments. We’ve built incredible technologies, but we’re still chasing the same dream: a connected digital thread across the entire product lifecycle.

Today’s landscape reflects both progress and stagnation. Many enterprises operate multiple PLM systems, sometimes even from the same vendor, each covering a different business function or division. In traditional PLM strongholds — automotive, aerospace, and defense — these systems have matured but are still primarily systems of record, not systems of innovation.

Meanwhile, as I discussed recently in “Where PLM Goes Next: 7 Expansion Markets No One Is Defending Yet”, vast new industries remain underserved. The opportunity for PLM to expand beyond its legacy boundaries has never been greater — but doing so requires a fundamental shift in how we define it.

Expanding the PLM Footprint Across Industries

If we look beyond traditional manufacturing, we can see that PLM principles are quietly at work everywhere. Electronics, industrial equipment, pharmaceuticals, consumer packaged goods (CPG), fashion, and even food industries now rely on PLM-like concepts — even if they don’t call it that. Whether managing formulations, colorways, packaging, or product variants, the core challenge is the same: connecting data, people, and processes across the lifecycle.

As Brion Carroll put it during our panel, “PLM should not be a single system but a broader set of tools and technologies.”

That statement resonates deeply with the way modern companies operate. In many organizations, ERP, MES, or even Excel-based systems perform “PLM-like” functions because they solve local problems effectively. The name matters less than the purpose — managing change, maintaining traceability, and aligning product data with business decisions.

Every company today has its own “version” of PLM. Some implement a commercial system; others build a bespoke platform or rely on integrated cloud services. But the underlying principle remains constant: managing data, decisions, and dependencies across the lifecycle. This diversity shows that PLM is not a product you buy; it’s a philosophy of connected product thinking.

The Problem with “PLM System-Centric” Thinking

One of the most important themes raised in the panel was the persistence of system-centric thinking. Too many organizations still approach PLM as a technology purchase — a system they can deploy and “check off” on a digital transformation checklist.

As Juliann Grant wisely observed, companies must “look at what they are trying to accomplish, not at the technology itself.” The question should always start with why, not what system.

This system-first mentality is the reason many PLM programs fail. The result is often a complex network of tools that do not communicate, mounting technical debt, and user frustration. Legacy monoliths, once intended to connect engineering and manufacturing, now trap data behind walls of customization and proprietary formats.

But the problem isn’t purely technical. It’s also human. Resistance to PLM rarely stems from a dislike of technology — it stems from misaligned goals and poorly managed change. People resist what doesn’t make sense to their daily work. Organizational Change Management (OCM) must be woven into every PLM initiative, aligning strategy, process, and user experience from day one. Without it, even the best software will fail.

The New Paradigm — PLM as a Strategic Solution

We’ve reached the point where incremental upgrades aren’t enough. PLM must now evolve into what I call a strategic solution — a hybrid of business strategy, connected software, and shared data models that enable decision-making across the entire enterprise.

Martin Eigner introduced a great metaphor during our conversation: PLM as an umbrella. Instead of replacing every legacy system, we need a unifying layer that connects them — creating continuity, not conformity. This umbrella concept fits naturally with today’s architectural trends: composable systems, federated data services, and cloud-native deployment models.

The goal isn’t to build another monolith but to connect what we already have — enabling data to flow intelligently and contextually across functions. Cloud-native platforms like OpenBOM and others are built around this principle. They create a shared product data backbone that integrates with CAD, ERP, MES, and supplier systems through open APIs.
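
To make this “connect, don’t replace” idea more tangible, here is a minimal sketch of what such an integration could look like at the API level. Everything in it (endpoints, field names, and the token) is a hypothetical placeholder, not a description of any specific vendor API, OpenBOM’s included.

```python
# Minimal sketch of an open-API integration in the "umbrella" spirit:
# read a BOM from a (hypothetical) product data service and push item
# costs to a (hypothetical) ERP endpoint. All URLs, field names, and
# credentials are illustrative placeholders.

import requests

PRODUCT_API = "https://product-data.example.com/api/v1"  # hypothetical service
ERP_API = "https://erp.example.com/api/v1"               # hypothetical service
HEADERS = {"Authorization": "Bearer <token>"}            # placeholder credential

def sync_bom_costs(product_id: str) -> None:
    """Read a product's BOM from the data backbone and update ERP item costs."""
    bom = requests.get(
        f"{PRODUCT_API}/products/{product_id}/bom",
        headers=HEADERS,
        timeout=30,
    ).json()

    for line in bom.get("items", []):
        payload = {
            "part_number": line["part_number"],
            "quantity": line["quantity"],
            "unit_cost": line.get("unit_cost"),
        }
        requests.post(
            f"{ERP_API}/items/cost-updates",
            json=payload,
            headers=HEADERS,
            timeout=30,
        ).raise_for_status()

if __name__ == "__main__":
    sync_bom_costs("PRD-1001")
```

The code itself is trivial; the architecture is the point. Each system keeps its own responsibility, and a thin, open integration layer moves structured product data between them instead of forcing everything into one monolith.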

This approach forms the foundation for the next era of PLM — one that is AI-enabled, data-centric, and adaptive.

AI and Product Data as a Strategic Asset

Artificial Intelligence is now the defining force of digital transformation — and PLM is no exception. But as I wrote in How to Make AI Work for PLM: It’s All About Structured BOM Data, AI alone is not a silver bullet. Without structured, accessible, and connected data, AI is blind.

The future of PLM depends on how effectively we can structure and contextualize product data — across design, manufacturing, supply, and service. AI will act as an intelligent layer, operating across silos to translate, validate, and correlate data in real time. Imagine AI agents automatically detecting inconsistencies in a multi-level BOM, predicting cost variances, or recommending alternative suppliers — all within the same product context.
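
As a small thought experiment, here is a toy sketch of the kind of structural check such an agent could run over a multi-level BOM. The data shape and the rules are invented purely for illustration; they do not describe how any particular product implements this.

```python
# Toy illustration: walk a multi-level BOM and flag simple inconsistencies
# (non-positive quantities, missing costs, the same part number carrying
# conflicting descriptions). A real AI agent would combine structural checks
# like these with learned models; the sketch only shows why structured,
# connected data is a precondition for any of it.

def find_bom_issues(item, path="", seen=None, issues=None):
    seen = {} if seen is None else seen
    issues = [] if issues is None else issues
    here = f"{path}/{item['part_number']}"

    if item.get("quantity", 1) <= 0:
        issues.append(f"{here}: non-positive quantity")
    if item.get("unit_cost") is None:
        issues.append(f"{here}: missing unit cost")

    previous = seen.setdefault(item["part_number"], item.get("description"))
    if previous != item.get("description"):
        issues.append(f"{here}: conflicting descriptions for the same part number")

    for child in item.get("children", []):
        find_bom_issues(child, here, seen, issues)
    return issues

bom = {
    "part_number": "ASM-100", "description": "Top assembly",
    "quantity": 1, "unit_cost": None,
    "children": [
        {"part_number": "PRT-7", "description": "Bracket",
         "quantity": 2, "unit_cost": 3.5},
        {"part_number": "PRT-7", "description": "Bracket, steel",
         "quantity": 0, "unit_cost": 3.5},
    ],
}

for issue in find_bom_issues(bom):
    print(issue)
```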

This is not science fiction. It’s the direction we’re heading, driven by advances in graph-based data models and AI-driven knowledge systems. As I explored in Product Memory for PLM: Building Agentic AI in Reverse, the concept of product memory — a digital thread that “remembers” relationships, dependencies, and intent — is becoming central to how future PLM systems reason and learn.
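
“Product memory” is a conceptual term, but the underlying idea of a graph that retains relationships and intent is easy to sketch. The node names and edge types below are invented for illustration and do not describe any existing data model.

```python
# Conceptual sketch of "product memory" as a typed relationship graph:
# nodes are lifecycle entities, edges record the relationship plus a note
# about intent, and a simple traversal answers a traceability question
# such as "what depends on this supplier part?". All names are illustrative.

from collections import defaultdict

edges = defaultdict(list)  # node -> list of (relation, target, note)

def remember(source, relation, target, note=""):
    """Store the relationship in both directions so it can be traced later."""
    edges[source].append((relation, target, note))
    edges[target].append((f"inverse:{relation}", source, note))

remember("REQ-12", "satisfied_by", "ASM-100", "range requirement")
remember("ASM-100", "uses", "PRT-7", "qty 2")
remember("PRT-7", "supplied_by", "Supplier-A", "primary source")

def affected_by(node, depth=3):
    """Walk inverse relationships to find entities that depend on `node`."""
    found, frontier = set(), {node}
    for _ in range(depth):
        next_frontier = set()
        for current in frontier:
            for relation, target, _note in edges[current]:
                if relation.startswith("inverse:") and target not in found:
                    found.add(target)
                    next_frontier.add(target)
        frontier = next_frontier
    return found

print(affected_by("Supplier-A"))  # PRT-7, ASM-100, REQ-12 (set order may vary)
```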

In this model, AI doesn’t replace PLM — it amplifies it. It turns data into knowledge, and knowledge into actionable intelligence.

PLM’s Expanding Business Mandate

The most strategic implication of this transformation is that PLM is no longer confined to engineering. Its role is expanding to become the operational backbone for the entire product-driven enterprise.

Modern PLM must reconcile traditionally conflicting goals:

  • Finance wants to control cost and reduce risk.
  • Sales wants to increase revenue and shorten delivery cycles.
  • Engineering wants to innovate and push the boundaries of design.

These vectors no longer need to compete. When product data is unified and accessible, they can align. A well-implemented PLM strategy connects the dots — turning engineering decisions into cost insights, linking supply chain risk to design choices, and informing customer-facing decisions with real-time product intelligence.

AI will accelerate this convergence. By analyzing lifecycle data holistically, AI can identify opportunities to optimize cost, quality, and time-to-market simultaneously. PLM becomes not just an engineering tool, but a strategic nervous system for business performance.

And as companies pursue broader digital transformation — embracing sustainability, traceability, and customer experience — PLM becomes the execution layer for those strategies. It’s where business intent meets product reality.

What is my conclusion? 

Let me close with a few thoughts about the strategic future of PLM.

After three decades of evolution, PLM stands at a crossroads. The systems we built in the past 25 years served their purpose, but they also created silos, complexity, and inertia. The next generation of PLM must rise above this — not as another tool, but as a strategic, AI-enabled discipline that treats product data as a core business asset.

The companies that will lead the next wave of innovation are those that stop asking, “Which PLM system do we use?” and start asking, “How do we turn product data into strategy?”

This shift requires reimagining PLM as a connected solution layer — blending strategy, open architectures, and intelligent automation. It’s about using data to support strategic decision-making, not just document control. The organizations that embrace this mindset will gain an unprecedented ability to adapt, predict, and innovate.

In our next panel discussion, we’ll continue this journey by exploring the emerging PLM Manifesto — a vision for how data management, AI, and human collaboration will define the next decade of PLM.

Until then, I’ll leave you with a question I often ask myself and others:

What would it take for your organization to treat product data as a strategic asset?

Just my thoughts…

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaboration and integration services across engineering tools, including PDM, PLM, and ERP capabilities. Interested in the OpenBOM AI Beta? Check with me to learn about the future of Agentic Engineering Workflows.

With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
