In two recent articles, I reflected on the lifecycle of engineering software. The first, 30 Years of PDM Evolution: What Changed and What Didn't?, explored how PDM moved from a CAD add-on feature to a product data backbone and then, in many ways, back again. In the second, How Immortal Are Today's PLM Leaders?, I asked about the longevity of existing PDM and PLM providers. As much as we love the leaders, the answer, based on history, is no: even dominant platforms eventually fade.
In my article today, I want to help manufacturing companies and PDM/PLM practitioners think about their software and data strategies. We live in a transformative time. What will happen to my IP, data, projects, and other information? How should companies think about the long-term survival of their data? How can they operate today while preserving it for the future?
Engineering and manufacturing teams depend on software that is, in many cases, decades old. CAD systems developed in the 1990s (SOLIDWORKS, AutoCAD, CATIA, NX, Creo, Autodesk Inventor, and many others) still run mission-critical operations. Many companies rely on heavily customized PDM and PLM platforms built 20–30 years ago, or sometimes even on homegrown legacy tools.
Vendors, meanwhile, promote “new platforms” that often amount to hosted versions of the same old architectures, rebranded as cloud. At the same time, truly cloud-native platforms are emerging, but adoption is partial and cautious. And now, AI has entered the scene, changing the conversation yet again by promising new insights—if only the data is ready.
The core challenge is this: CAD design and product lifecycle data must live for decades, long after individual software platforms have been retired, rebranded, or sunset. This isn't just a technical problem; it's a business continuity issue. From legal compliance to customer support to the preservation of intellectual property, companies must ensure their product data survives the lifecycle of any particular system.
And there is another dimension: knowledge preservation. As experienced engineers retire, much of the tacit knowledge about how products were designed and managed risks being lost. Unless the data is structured, connected, and accessible, future teams may struggle to understand decisions, trace designs, or reuse valuable know-how.
The current CAD/PDM/PLM landscape is a patchwork of old and new:
- Desktop CAD systems from the 1990s still running daily production work.
- Heavily customized legacy PDM/PLM platforms, some of them homegrown.
- Hosted versions of old architectures rebranded as cloud.
- Emerging cloud-native platforms with partial, cautious adoption.
The lesson is simple: applications come and go, but data must endure.
If data longevity is the goal, how do we get there? Here are five strategies to consider.
The first principle is independence. Data should not be locked inside the proprietary schema of one vendor’s application.
Practical steps:
- Export models and metadata to neutral formats (STEP for geometry, PDF for drawings, JSON or CSV for structured data).
- Keep the master record in open, documented schemas rather than only inside a vendor database.
- Make regular, vendor-independent exports part of normal operations.
Think of it as “owning your data” rather than “renting it through an application.”
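To make this concrete, here is a minimal sketch in Python (the item records and file names are made up for illustration) of what a vendor-neutral export might look like: metadata serialized to open formats such as JSON and CSV that any future tool can read, regardless of which PDM application produced it.

```python
import csv
import json
from pathlib import Path

# Hypothetical item records as they might come out of a PDM query.
# In a real system these would be fetched through the vendor's API.
items = [
    {"part_number": "PN-1001", "revision": "B", "title": "Bracket",
     "cad_file": "bracket.sldprt", "neutral_file": "bracket.step"},
    {"part_number": "PN-1002", "revision": "A", "title": "Housing",
     "cad_file": "housing.sldprt", "neutral_file": "housing.step"},
]

export_dir = Path("neutral_export")
export_dir.mkdir(exist_ok=True)

# JSON keeps the full structure; any future tool can parse it.
(export_dir / "items.json").write_text(json.dumps(items, indent=2))

# CSV is an even lower common denominator, readable for decades.
with open(export_dir / "items.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=items[0].keys())
    writer.writeheader()
    writer.writerows(items)
```

The details will vary by system; the principle is that the export exists outside the application and uses formats with no single owner.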
In the past, companies looked for one system to rule them all. But monolithic solutions rarely age well. Instead, resilience comes from connections, not silos.
This means investing in:
- Open APIs and integrations between engineering, manufacturing, and business systems.
- Persistent identifiers, so records can be linked across applications.
- Data models that capture relationships, not just documents.
The more semantically connected your data is, the easier it becomes to replace or upgrade applications without breaking the chain of knowledge.
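As a rough illustration (the identifiers below are hypothetical), a semantic connection layer can start as nothing more than typed links keyed by persistent IDs, stored outside any one application:

```python
# A minimal link store: (source_id, relation, target_id) triples.
# The IDs are assumed to be persistent, application-neutral identifiers,
# so the links survive even if the CAD, PDM, or ERP system is replaced.
links = [
    ("cad:bracket.step", "describes", "item:PN-1001"),
    ("item:PN-1001", "used_in", "bom:ASSY-9000"),
    ("item:PN-1001", "procured_as", "erp:MAT-55-0172"),
]

def related(entity_id, relation=None):
    """Follow links from an entity, optionally filtered by relation type."""
    return [t for s, r, t in links
            if s == entity_id and (relation is None or r == relation)]

print(related("item:PN-1001"))             # all connections
print(related("item:PN-1001", "used_in"))  # just the BOMs it appears in
```

A real implementation would use a graph database or similar, but the design choice is the same: the relationships live in the data, not in any one application's schema.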
Cloud adoption is inevitable, but companies should approach it carefully. SaaS platforms reduce IT burden but can create the ultimate lock-in if data is only usable within the vendor’s ecosystem.
Key questions to ask:
- Can we export all of our data, including metadata and relationships, in a usable form?
- In what formats, and how often?
- What happens to the data if the contract ends or the vendor is acquired?
The guiding principle: cloud is good for operations, but your exit plan must always exist.
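One way to keep the exit plan real is to exercise it regularly. The sketch below is one possible shape of such a job (the export endpoint and token are hypothetical; substitute whatever bulk-export mechanism your vendor actually provides): pull a full export through the API and store a dated snapshot under your own control.

```python
import datetime
import json
from pathlib import Path

import requests  # third-party: pip install requests

# Hypothetical endpoint and token; replace with your vendor's
# actual bulk-export mechanism.
EXPORT_URL = "https://plm.example.com/api/v1/export/items"
TOKEN = "<your-api-token>"

snapshot_dir = Path("snapshots") / datetime.date.today().isoformat()
snapshot_dir.mkdir(parents=True, exist_ok=True)

resp = requests.get(
    EXPORT_URL,
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=60,
)
resp.raise_for_status()

# Store the raw payload; the whole point is a copy that exists
# outside the vendor's ecosystem, on storage you control.
(snapshot_dir / "items.json").write_text(json.dumps(resp.json(), indent=2))
```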
AI is transforming the conversation around engineering software. Yet AI models are only as good as the data foundation. Garbage in, garbage out.
Companies need to:
- Clean and structure product data before layering AI on top of it.
- Establish traceability, so answers can be linked back to their sources.
- Connect data across silos, so models see the full context.
In other words: don’t rush to build AI copilots on top of weak, fragmented, or untraceable data.
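Before building copilots, it is worth testing whether the data can carry them. A toy readiness check like the one below (the required fields are illustrative assumptions) flags records that would otherwise surface as incomplete or untraceable answers:

```python
# Toy data-readiness check: AI answers are only as traceable as the
# records underneath them. The fields below are illustrative.
REQUIRED_FIELDS = ["part_number", "revision", "title", "source_system"]

def readiness_report(items):
    """Return (part, missing_fields) pairs for every incomplete record."""
    issues = []
    for item in items:
        missing = [f for f in REQUIRED_FIELDS if not item.get(f)]
        if missing:
            issues.append((item.get("part_number", "<unknown>"), missing))
    return issues

items = [
    {"part_number": "PN-1001", "revision": "B", "title": "Bracket",
     "source_system": "pdm"},
    {"part_number": "PN-1002", "title": "Housing"},  # no revision, no source
]

for part, missing in readiness_report(items):
    print(f"{part}: missing {', '.join(missing)}")
```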
Historically, companies treated data preservation as a migration project—something painful but necessary once a decade. That mindset is risky.
Instead, treat data longevity as a continuous practice:
- Export and archive data on a regular schedule, not only during migrations.
- Verify periodically that archives are still complete and readable.
- Document data structures and dependencies while the knowledge is still in-house.
Think of this as a backup strategy for product data—not just for disaster recovery, but for long-term resilience.
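A backup only counts if it can still be read years later. A small recurring job, sketched here with SHA-256 checksums over an export directory, is one way to turn preservation from a once-a-decade event into a routine habit:

```python
import hashlib
import json
from pathlib import Path

def checksum_manifest(export_dir):
    """Record a SHA-256 checksum for every file in an export snapshot."""
    return {
        str(p.relative_to(export_dir)): hashlib.sha256(p.read_bytes()).hexdigest()
        for p in Path(export_dir).rglob("*") if p.is_file()
    }

def verify(export_dir, manifest_path):
    """Re-hash the snapshot and report anything changed or missing."""
    recorded = json.loads(Path(manifest_path).read_text())
    current = checksum_manifest(export_dir)
    return [name for name, digest in recorded.items()
            if current.get(name) != digest]

# First run: write the manifest. Later runs: verify the archive still matches.
# manifest = checksum_manifest("neutral_export")
# Path("manifest.json").write_text(json.dumps(manifest, indent=2))
# print(verify("neutral_export", "manifest.json"))
```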
In the past, the playbook was simple: pick the “right vendor” and commit to their ecosystem. But history shows that no vendor is truly safe from disruption. The shift must be from software-first to data-first thinking.
That means building semantically connected datasets that can survive independently of any single application.
Technical approaches include:
- Neutral-format exports stored alongside native files.
- Graph-based representations that preserve relationships between items, BOMs, and documents.
- Redundant copies of key datasets held outside any single application.
Importantly, redundancy should not be seen as inefficiency. Just as companies keep financial backups and redundant IT systems, redundant data storage is insurance. It ensures that product knowledge can be re-used, re-linked, and re-interpreted in the future, even as applications evolve.
This is also the best way to prepare for AI. By creating semantically rich, connected data, companies ensure that future AI copilots and agents can query, contextualize, and reason over product data without being trapped by the limitations of past systems.
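Building on the link-store sketch above, a future AI agent could answer lifecycle questions by traversing application-neutral links, even if none of the original systems are still running. A minimal multi-hop "where used" query (same hypothetical identifiers as before) might look like this:

```python
# Same hypothetical (source, relation, target) triples as in the earlier sketch.
links = [
    ("cad:bracket.step", "describes", "item:PN-1001"),
    ("item:PN-1001", "used_in", "bom:ASSY-9000"),
    ("bom:ASSY-9000", "used_in", "bom:ASSY-9900"),
]

def where_used(entity_id, seen=None):
    """Walk 'used_in' links transitively: every assembly this entity ends up in."""
    if seen is None:
        seen = set()
    for s, r, t in links:
        if s == entity_id and r == "used_in" and t not in seen:
            seen.add(t)
            where_used(t, seen)
    return seen

print(where_used("item:PN-1001"))  # {'bom:ASSY-9000', 'bom:ASSY-9900'}
```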
We need to move from software selection to data strategy.
History makes one lesson clear: no software is immortal. CAD, PDM, and PLM systems rise, dominate, and eventually fade. The winners are not those who chose the “perfect system,” but those who built strategies that allowed their data to outlive the applications managing it.
The mindset must shift. Instead of asking: What is the best PLM software to buy today?
Companies should ask: What is my long-term data strategy, and how will it ensure continuity for decades?
Resilience does not come from locking into the biggest vendor or adopting the flashiest new technology. It comes from building a future-proof data layer—independent, connected, redundant, and semantically rich—that can support today’s applications and tomorrow’s innovations.
The message is simple: software dies, but data can live. The companies that embrace this truth will not only preserve their engineering knowledge but also unlock the full potential of AI and digital transformation in the decades ahead.
Just my thoughts…
Best, Oleg
Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaboration and integration services across engineering tools, including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD, PDM, and PLM architectures, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.