I’m getting ready for the next round of PLM podcast discussions later today. If you haven’t seen it before, check the following link to listen and learn. As part of this preparation, I want to capture some of my recent thoughts about PLM development and opportunities. Here are a few earlier articles worth checking out as well.
From PLM to xLM – thoughts after Prof. Jorg Fischer comments
PLM is Dead? Comments after the previous discussion hosted by Michael Finocchiaro to debate this provocative question.
Beyond Platforms: The Next Chapter in the PLM Story
I also recommend catching up on the writing from Jos Voskuil about The Future of PLM in 2025 and 2050.
Let’s discuss where product lifecycle management, product data management, PLM software, and the product lifecycle are heading, how the new wave of PLM technology can change business processes, and where the low-hanging fruit for expansion is.
There are many debates these days about engineering data, product-related data, product data management (PDM), supply chain management, product quality, the product development process, supplier collaboration, how to help project managers, how to centralize data management, how to improve data quality, and how to improve business systems overall. There are teams thinking deeply about how to reinvent computer-aided design (CAD), improve quality management, and provide universal PLM tools. There are tools that extend PLM on the left (design and engineering data) and on the right (service lifecycle management). There are companies working to improve data integrity and document management. There are opportunities to focus on a holistic change management process. Existing CAD systems and CAD tools are evolving to support the entire lifecycle.
From the previous wave of “cloud innovators,” before AI kicked in as a major market trend, we can see companies focused on supplier collaboration, new cloud PLM tools, AI search tools that provide up-to-date information across a broad scope of data, tools to get to market faster, improved supply chain collaboration, and holistic product lifecycle management (PLM).
There are discussions about what a customer needs, what provides significant cost savings, how to simplify engineering change processes, how to help production planning, and how to give manufacturing engineers access to the right information with real-time data. Engineering processes are getting more complicated with MCAD, ECAD, and software, and that changes the product lifecycle. Product complexity leads to the need for cross-functional collaboration. Companies are changing their business models and business strategies to sell “services” rather than “products.” In some markets, it is extremely important to capture customer expectations and customization in real time to shorten the development cycle. Global manufacturers are looking at how to position their products in the market and stand up to regulatory compliance. Earlier planning with a digital twin gives a better perspective and a better balance between engineering effort and requirements management, connecting the design phase to the final product.
Finally, manufacturing companies are looking at how to develop competitive advantages and how to build products under globalized supply chain constraints.
All of these topics sound very important and can drive an enormous amount of debate. But where to start? In my article today, I want to step back and talk about market opportunities first.
Why Investors Start with the Market — and Why PLM Should, Too
When investors evaluate a startup, they almost never begin with the product.
They start with the market.
Because no matter how brilliant the technology, how inspired the founder, or how elegant the architecture — a bad market will kill it. A great product in a small, stagnant, or shrinking market is still a bad investment. The market defines the ceiling for success; the product, team, and timing only determine how close you can get to that ceiling.
I’ve seen this pattern over and over — in technology startups, enterprise software, and manufacturing. Every successful business story begins with understanding where demand is heading. The same is true for PLM. No amount of genius engineering can rescue a product built for a market that has already stopped growing.
A century ago, the American Tobacco Company faced exactly that dilemma. In the 1920s, nearly every man in America who wanted to smoke already did. The market was saturated. Growth stalled. Then the company turned to public relations pioneer Edward Bernays, who proposed something radical — expand the market by convincing women to smoke. He reframed cigarettes as a symbol of female empowerment and called them “Torches of Freedom.”
That campaign didn’t invent a better cigarette. It invented a new market. Within a decade, women’s smoking rates soared, and American Tobacco effectively doubled its total addressable market.
I wrote about this story in my earlier article — What American Tobacco Can Teach Us About PLM Markets — because it perfectly illustrates the point: innovation often begins with market redefinition, not product redesign.
And once the market becomes clear, then it’s much easier to talk about the technology and the product needed to serve it. In American Tobacco’s case, the company began producing smaller, lighter cigarettes and packaging tailored to women — design following market insight.
The same logic applies to PLM. For decades, vendors have competed in the same, well-defined space — CAD integrations, revision control, compliance management for large industrials. It’s a proven market, but a saturated one.
When investors today ask about PLM, their first question isn’t “what’s your tech?” It’s “what’s your market?” — because the existing one limits growth.
The truth is simple but uncomfortable: no genius PLM technology can win in a bad market.
But PLM as a discipline — as a set of ideas about how product data connects design, manufacturing, and operations — is much larger than the current market allows.
The next wave of opportunity will come from identifying and solving new product data problems — in industries and workflows that don’t even call what they do “PLM” yet.
1. Regulated NPI “Speed Cells” Inside Big Companies
In large regulated companies (medical, pharma, aerospace, energy), the “official” PLM systems enforce heavy processes and gated compliance. But new product introduction (NPI) teams — usually small, independent groups — operate differently. They are chartered to move faster, explore new ideas, and deliver prototypes quickly.
These teams often sit at the edge of the formal organization, acting as speed cells for innovation. They can’t wait for multi-step approvals or rigid workflows, so they end up working outside the system — in Excel, shared drives, or ad hoc cloud tools — just to get things done.
They don’t need full-scale governance; they need flexibility, traceability, and collaboration in early design — tools that allow them to iterate rapidly while maintaining enough structure to hand off to compliance later.
Why this is a market: Medical and life sciences PLM is growing fast (≈9% CAGR through 2030), but that growth is almost entirely compliance-driven. The “shadow NPI workspace” — agile innovation inside regulated environments — remains greenfield.
Whoever builds PLM that truly serves these fast-moving, small regulated innovation teams will own the emerging layer between Excel and Windchill — where real product innovation actually happens.
2. Component Economics and Procurement Intelligence
For decades, PLM stopped at the “approved parts list,” leaving everything about cost, availability, and sourcing to ERP. That wall is collapsing — and it’s collapsing fast.
As product lifecycles accelerate, design and production teams can no longer afford the old hand-offs between engineering and operations. Decisions about components, alternates, and suppliers must now happen while the product is still being designed. Supply chain visibility is no longer a post-release concern — it’s a design constraint.
Designers and procurement teams need a continuous, shared view into supply risk, lead time, vendor performance, and cost trends across the entire product lifecycle — from concept to sourcing to production.
Why legacy PLM can’t do it: Traditional PLM assumes static suppliers and predictable sourcing. It was built for stable, linear product cycles. Today’s environment is the opposite — volatile, global, and constantly changing. Static data models cannot represent multi-source, time-sensitive information about real components flowing through manufacturing and procurement systems.
Why this is a market: AI-enabled supply and cost intelligence — the ability to understand how parts, vendors, and costs evolve across the lifecycle — is becoming existential for manufacturers.
This is not “more PLM.” It’s a new kind of PLM, one that unifies engineering, supply chain, and procurement into a single, living dataset — where design choices are made with full visibility into cost, lead time, and production reality.
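To make this idea more tangible, here is a minimal sketch of what such a living component record could look like. It is an illustration under my own assumptions (the Component and SourcingOption names are hypothetical, not any vendor's schema), showing how sourcing options can sit next to the engineering definition so alternates can be compared while the part is still in design.

```python
# A minimal sketch (assumptions, not a real product schema): a component record
# that carries sourcing context next to the engineering definition, so a design
# decision can be evaluated against cost and lead time at design time.
from __future__ import annotations
from dataclasses import dataclass, field
from datetime import date

@dataclass
class SourcingOption:
    supplier: str
    unit_cost: float          # in the purchasing currency
    lead_time_days: int
    quoted_on: date           # sourcing data is time-sensitive, so we keep the date

@dataclass
class Component:
    part_number: str
    description: str
    lifecycle_state: str                      # e.g. "in design", "released"
    sources: list[SourcingOption] = field(default_factory=list)

    def best_option(self, max_lead_time_days: int) -> SourcingOption | None:
        """Cheapest source that still meets the required lead time."""
        candidates = [s for s in self.sources if s.lead_time_days <= max_lead_time_days]
        return min(candidates, key=lambda s: s.unit_cost) if candidates else None

# Usage: compare alternates while the part is still "in design".
cap = Component("CAP-100uF-25V", "Electrolytic capacitor", "in design", [
    SourcingOption("Supplier A", 0.12, 45, date(2025, 1, 10)),
    SourcingOption("Supplier B", 0.15, 10, date(2025, 1, 12)),
])
print(cap.best_option(max_lead_time_days=14))  # Supplier B wins on lead time
```

The point is not the code itself, but the shape of the data: cost and lead time become attributes a designer can query, not numbers that show up in ERP after release.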
3. Contract Manufacturers and Build-to-Print Ecosystems
Think about all the contract manufacturers and specialized suppliers who receive customer BOMs, fix data, and manage production — but have no access to the OEM’s PLM.
Today, this is handled by email, Excel, and PDF. No persistent connection exists between OEMs and their build partners.
Why legacy PLM can’t do it: Traditional PLM is single-tenant, enterprise-owned, and license-gated. It was never designed for secure, selective, inter-company data sharing.
Why this is a market: Whoever creates a multi-tenant PLM network that enables real-time, graph-based data exchange between OEMs and suppliers will own the “digital thread between companies.” That’s a $20B+ opportunity hiding in plain sight.
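To illustrate what selective, permissioned sharing between companies could look like, here is a small sketch. The data model and policy names are hypothetical assumptions, not a real product's API; the point is that each partner sees a projection of one source record instead of an emailed Excel copy.

```python
# A minimal sketch of permissioned, inter-company BOM sharing (hypothetical model):
# the OEM decides which attributes each partner can see, and the shared view is
# derived from one source record rather than a detached file.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class BomLine:
    part_number: str
    quantity: int
    target_cost: float      # internal attribute the OEM may not want to expose
    notes: str

# Per-partner policy: which fields this partner is allowed to read.
SHARE_POLICY = {
    "contract-manufacturer": {"part_number", "quantity", "notes"},
    "design-partner": {"part_number", "quantity", "target_cost", "notes"},
}

def shared_view(lines: list[BomLine], partner: str) -> list[dict]:
    """Project the OEM's BOM down to the fields this partner is permitted to see."""
    allowed = SHARE_POLICY[partner]
    return [{k: v for k, v in vars(line).items() if k in allowed} for line in lines]

bom = [BomLine("PCB-001", 1, 42.0, "rev C"), BomLine("ENC-010", 1, 7.5, "anodized")]
print(shared_view(bom, "contract-manufacturer"))  # no target_cost exposed
```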
4. PLM for High-Mix, Medium-Size Hardware Companies
These are companies with 20 to 200 employees building products with around 2,000 parts: robotics, drones, industrial automation, medical devices, high-tech equipment.
These companies are too complex for spreadsheets and too small for enterprise PLM.
Why legacy PLM can’t do it: Cost, complexity, and deployment friction. Six-month implementations and million-dollar consulting fees are impossible for these teams. Traditional PLM also looks increasingly outdated in a world where new technologies are dramatically improving how companies manage data, search information, and collaborate. Modern cloud tools, real-time communication platforms, and now AI assistants are transforming expectations for speed, usability, and intelligence.
In this new environment, legacy PLM systems look like dinosaurs — rigid, slow, and isolated from the way modern product teams actually work. Yet, despite the general rise in digital capabilities, these companies still need specialized tools designed specifically for managing product data, configurations, and manufacturing relationships. Generic collaboration or AI tools can’t replace structured lifecycle management — they need a PLM built for their scale and agility.
Why this is a market: This is the cloud-native PLM wave — easy onboarding, SaaS pricing, integrations with CAD and ERP, and collaboration built for teams, not admins.
It’s the Tesla effect: 100-person companies building products that once required 5,000 engineers. They need PLM that moves at software speed, speaks the language of modern tools, and fits seamlessly into AI-driven, connected workflows.
5. Service, Maintenance, and Post-Sale Configuration History
Every shipped product diverges from its “engineering BOM” almost immediately. Parts get replaced, service swaps occur, and configurations evolve in the field. Over time, every individual unit becomes its own version of the truth — an as-maintained product that no longer perfectly matches the original engineering model.
At the same time, product lifecycles are getting longer, and business models are shifting. Companies no longer just sell a product and move on — they maintain it, upgrade it, and increasingly sell the ongoing service as part of the value proposition. In aerospace, medical devices, industrial equipment, and robotics, “product as a service” or “maintenance as a service” is quickly becoming the norm. Visibility into what was built, when it was serviced, and what’s inside each specific unit is critical for profitability and compliance.
Why legacy PLM can’t do it: Most PLM systems stop at “as-designed” or “as-built,” not “as-maintained.” Once the product leaves the factory, the data trail is lost — dispersed across service systems, customer portals, or spreadsheets maintained by local field teams.
Why this is a market: Post-sale configuration data is no longer an afterthought; it’s monetizable. Predictive maintenance, upgrade programs, warranty optimization, and sustainability reporting all depend on it.
A PLM that can capture per-unit product memory across its full lifecycle enables entirely new business models in service, subscription, and circularity — turning every maintained product into a continuous source of operational and business intelligence.
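Here is a minimal sketch of what per-unit product memory could look like. The record and event names are my own illustrative assumptions; the idea is that each serial number replays its service history on top of the as-built configuration to produce an as-maintained view.

```python
# A minimal sketch (illustrative names only) of "per-unit product memory":
# each shipped serial number accumulates its own service history, so the
# as-maintained configuration can diverge from the original as-built BOM.
from __future__ import annotations
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ServiceEvent:
    when: date
    removed_part: str
    installed_part: str

@dataclass
class UnitRecord:
    serial_number: str
    as_built: dict[str, str]                      # position -> part number at shipment
    history: list[ServiceEvent] = field(default_factory=list)

    def as_maintained(self) -> dict[str, str]:
        """Replay service events over the as-built configuration."""
        config = dict(self.as_built)
        for event in self.history:
            for position, part in config.items():
                if part == event.removed_part:
                    config[position] = event.installed_part
        return config

unit = UnitRecord("SN-0042", {"pump": "PUMP-A1", "controller": "CTRL-7"})
unit.history.append(ServiceEvent(date(2026, 3, 2), "PUMP-A1", "PUMP-A2"))
print(unit.as_maintained())  # {'pump': 'PUMP-A2', 'controller': 'CTRL-7'}
```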
6. The New Intelligence Layer: Product Memory, Validation, and Compliance
A completely new category is emerging inside the product lifecycle — an intelligence layer that sits on top of data.
For the first time, product information itself has become dynamic: analyzed, validated, and interpreted continuously by AI systems.
We’re entering an era where PLM data becomes training data — not static records but live context for reasoning, prediction, and decision-making. Engineers are already interacting with copilots and agents that ask the right questions:
- “Is this BOM ready for release?”
- “Are all compliance certificates up to date?”
- “What will happen to cost and lead time if we swap this supplier?”
This intelligence layer didn’t exist before. It’s not another workflow or dashboard — it’s a new kind of capability that connects data quality, validation, readiness, and compliance into one continuous reasoning loop.
Why legacy PLM can’t do it: Traditional systems were built as structured databases for documentation. Relationships are hidden, data is fragmented, and logic is hard-coded into processes rather than expressed in the data itself. They cannot reason over context because they were never designed for intelligence — only control.
Why this is a market: The next opportunity in PLM isn’t about managing more files or revisions. It’s about creating trusted, intelligent data — systems that understand completeness, accuracy, compliance, and readiness before human approval.
This transforms PLM from a system of record into a system of intelligence — where validation, quality checks, and regulatory assurance become automated, explainable, and continuously improving.
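To show what an automated, explainable readiness check could look like in practice, here is a minimal sketch. The rules and field names are hypothetical assumptions, not a real validation engine; the point is that "is this BOM ready for release?" returns findings with reasons instead of waiting for a manual review.

```python
# A minimal sketch of an automated, explainable readiness check (hypothetical rules):
# each check produces a finding, so the answer to "is this BOM ready for release?"
# comes with reasons attached.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Part:
    part_number: str
    has_approved_supplier: bool
    compliance_cert_valid: bool
    drawing_released: bool

def release_findings(bom: list[Part]) -> list[str]:
    findings = []
    for p in bom:
        if not p.has_approved_supplier:
            findings.append(f"{p.part_number}: no approved supplier")
        if not p.compliance_cert_valid:
            findings.append(f"{p.part_number}: compliance certificate missing or expired")
        if not p.drawing_released:
            findings.append(f"{p.part_number}: drawing not released")
    return findings

bom = [Part("BRKT-12", True, True, True), Part("PCB-001", True, False, True)]
issues = release_findings(bom)
print("Ready for release" if not issues else issues)
```

In a real system the rules would be far richer and the reasoning AI-assisted, but the output stays the same: an explainable answer rather than a gate that someone has to open by hand.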
7. Beyond Intelligence: Sustainability, Traceability, and Circularity
As the intelligence layer of PLM evolves — where data becomes validated, contextual, and self-aware — a new challenge is rising just beyond it: sustainability and traceability.
Regulation, social expectation, and business accountability are converging to make product transparency a core requirement. Customers, regulators, and investors now demand to know what’s inside a product, where it came from, how it was made, and how it will be reused or recycled.
This isn’t a marketing trend — it’s structural. Changes in environmental legislation (such as EU digital product passports, carbon-footprint disclosures, PFAS restrictions, and circular-economy directives) are forcing companies to maintain deep visibility into every material, component, supplier, and production process. At the same time, society is redefining value: being sustainable and transparent is no longer optional, it’s competitive differentiation.
Why legacy PLM can’t do it: Traditional PLM systems were never designed for this level of traceability. They stop at part numbers and revisions, with no native ability to represent material composition, supply-chain lineage, or regulatory attributes that change over time.
Why this is a market: Sustainability will not be a slide deck — it will be a database query on intelligent product data.
“Show me all assemblies shipped to the EU containing non-compliant surface finish X.”
“Calculate the embodied carbon footprint across our top 20 assemblies.”
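Here is a small sketch of how such a question could become a query over connected product data. The data model is a hypothetical assumption for illustration only, but it shows the shift from a manual audit to a filter over structured data.

```python
# A minimal sketch of "sustainability as a database query" (hypothetical data model):
# "which shipped assemblies contain a non-compliant finish?" becomes a filter over
# connected product data rather than a manual audit.
from __future__ import annotations
from dataclasses import dataclass

@dataclass
class Shipment:
    assembly: str
    region: str
    surface_finishes: set[str]

NON_COMPLIANT_FINISH = "finish-X"   # placeholder for the regulated finish

def non_compliant_eu_shipments(shipments: list[Shipment]) -> list[str]:
    return [s.assembly for s in shipments
            if s.region == "EU" and NON_COMPLIANT_FINISH in s.surface_finishes]

shipments = [
    Shipment("ASM-100", "EU", {"finish-X", "powder-coat"}),
    Shipment("ASM-200", "US", {"finish-X"}),
    Shipment("ASM-300", "EU", {"anodized"}),
]
print(non_compliant_eu_shipments(shipments))  # ['ASM-100']
```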
This is where the intelligence layer grows into systemic responsibility — connecting product data to real-world outcomes.
The companies and PLM platforms capable of linking design, sourcing, and lifecycle data into verifiable sustainability metrics will become indispensable partners for every manufacturer navigating the next regulatory and social transformation.
Coming Back to Industry Assumptions: A Common Set of Product and Technology Requirements
Now, coming back to the broader industry assumptions — after looking at these seven market directions — what I see emerging is a common set of product and technology requirements that define the next generation of PLM.
Across all of these markets, the same foundational needs repeat:
- Multi-tenant, network-native data sharing. Value is moving beyond the single-enterprise boundary. The future PLM platform must operate as a network system — connecting OEMs, suppliers, and partners with shared, permissioned access to product data.
- Graph-based product memory. The product is no longer just a hierarchy; it’s a living, connected graph that links design, suppliers, cost, compliance, manufacturing processes, and service history. Graph data models are becoming essential to represent these complex relationships (see the sketch after this list).
- Procurement and cost context inside design. The historical divide between PLM and ERP is dissolving — and this is now one of the central topics in every CTO+ discussion I hear across the industry. Engineering leaders, CIOs, and supply chain executives are all realizing that disconnected systems create blind spots between design intent and production reality. Modern PLM platforms must integrate supply chain intelligence — surfacing cost, lead time, and part availability as first-class design parameters, not downstream afterthoughts. The next generation of tools will unify engineering and procurement into a continuous, data-driven loop, allowing teams to design with real-time awareness of sourcing, risk, and manufacturability.
- AI agents in the loop. Validation, readiness checks, and compliance enforcement are shifting from manual review to automated, explainable intelligence. PLM is gaining a continuous reasoning layer that ensures data quality and release readiness.
- Speed-first experience. SMBs, startups, and agile product teams won’t wait months for deployment. They expect instant onboarding, real-time collaboration, and cloud-native APIs that connect to the rest of their digital stack from day one.
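To make the graph-based product memory idea concrete, here is a minimal sketch under my own assumptions (ProductGraph and the relationship names are illustrative, not a specific product's model). It shows how design, sourcing, compliance, and per-unit links can live in one connected structure instead of a single BOM hierarchy.

```python
# A minimal sketch of "graph-based product memory" (illustrative only): the product
# is a set of typed relationships rather than one BOM tree, so design, sourcing,
# compliance, and service links live in a single connected structure.
from collections import defaultdict

class ProductGraph:
    def __init__(self):
        # adjacency list of (relationship, target) pairs per node
        self.edges: dict[str, list[tuple[str, str]]] = defaultdict(list)

    def link(self, source: str, relationship: str, target: str) -> None:
        self.edges[source].append((relationship, target))

    def related(self, node: str, relationship: str) -> list[str]:
        return [t for rel, t in self.edges[node] if rel == relationship]

g = ProductGraph()
g.link("ASM-100", "contains", "PCB-001")
g.link("PCB-001", "sourced_from", "Supplier B")
g.link("PCB-001", "certified_by", "RoHS-2024-118")
g.link("SN-0042", "instance_of", "ASM-100")

print(g.related("PCB-001", "sourced_from"))   # ['Supplier B']
print(g.related("ASM-100", "contains"))       # ['PCB-001']
```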
These principles describe the architecture of the next PLM wave — a system that’s open, intelligent, and built for data continuity across companies, products, and lifecycles.
What is my conclusion?
For the last two decades, PLM vendors have sold “control” — revision control, change control, access control.
The next decade will be about “context” — understanding how design, manufacturing, cost, and sustainability connect across the network of companies building physical things.
That shift — from control to context — will define the next $20B of PLM’s growth.
And as every investor knows, the opportunity doesn’t start with the technology.
It starts with the market.
Just my thoughts…
Best, Oleg
Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative and integration services between engineering tools including PDM, PLM, and ERP capabilities. Interested in OpenBOM AI Beta? Check with me to learn about the future of Agentic Engineering Workflows.
With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
