A few days ago, my attention was caught by a picture shared by Jeff Winter – optimization vs. transformation. I applied some GPT art and got the picture above. In my view, it perfectly illustrates the stark contrast between PLM software vendors today.
Many existing Product Lifecycle Management (PLM) platforms are like a speedy caterpillar—boosted by new IT infrastructure but still fundamentally operating within the same technological framework. Meanwhile, the new digital transformation in PLM software comes from new technologies and platforms that break away from legacy constraints.
This distinction is critical: optimization improves what already exists, while transformation redefines the way we work.
In my article today, I’d like to explore five aspects that demonstrate the optimization vs. transformation happening these days in PLM software development. They help to differentiate mature, but soon-to-be legacy, traditional PLM optimizations from new digital PLM transformations.
Single-Tenant SQL Database vs. Multi-Tenant Platforms
Problem: Collaboration between teams and companies, cost savings, easy administration
Most legacy PLM systems are built on single-tenant, SQL-based architectures. While these databases have served engineering teams well in the past, they create silos that make collaboration between different teams—and especially across companies—extremely difficult.
A multi-tenant PLM platform, on the other hand, is built from the ground up to support seamless data sharing across organizations. Instead of dealing with disconnected systems and costly integrations, companies can collaborate in real-time, sharing structured product data securely while maintaining control over access and permissions.
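To make the idea concrete, here is a minimal sketch of tenant-scoped data sharing. It assumes a hypothetical Item record that carries an owning tenant and an explicit sharing list; this is not any particular vendor’s data model, only an illustration of how one platform can serve many companies while keeping access under control.

```python
from dataclasses import dataclass, field

# Hypothetical multi-tenant item record: every record carries its owning
# tenant plus an explicit set of tenants it is shared with.
@dataclass
class Item:
    item_id: str
    tenant_id: str                                        # owning company
    shared_with: set = field(default_factory=set)         # partner companies
    data: dict = field(default_factory=dict)

def can_read(item: Item, requesting_tenant: str) -> bool:
    """A single platform-level check replaces per-company integrations."""
    return requesting_tenant == item.tenant_id or requesting_tenant in item.shared_with

# Example: an OEM shares a BOM item with a contract manufacturer.
bom_item = Item("PN-1001", tenant_id="oem-co", shared_with={"cm-partner"}, data={"rev": "B"})
print(can_read(bom_item, "cm-partner"))   # True  – shared instantly, no export/import
print(can_read(bom_item, "random-co"))    # False – access remains controlled
```

The point is that sharing becomes a permission on the same record, rather than an export and re-import between two separate single-tenant databases.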
Relational Model vs. Graph Data Model
Problem: Robustness, flexibility, and semantics
Traditional PLM systems rely on a relational data model, where data is stored in tables with fixed relationships. This is the foundation of most “object-relational engines” used by PLM software. This structure struggles with complex product relationships: schema changes become too complex, data relationships are rigid, and capturing data semantics is limited.
A graph-based data model offers a fundamentally different approach. It allows a flexible data model with dynamic relationships between objects to represent the variety of product components, suppliers, requirements, and lifecycle stages. The ability to traverse relationships in real time provides far greater flexibility, enabling better product data management, improved traceability, and richer insights.
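Here is a minimal sketch of the idea using a hypothetical in-memory property graph (the node and edge names are invented for illustration): parts, suppliers, and requirements become nodes, and typed edges carry the semantics, so adding a new relationship type requires no schema migration.

```python
from collections import defaultdict

# Nodes: parts, assemblies, suppliers, requirements (illustrative data only).
nodes = {
    "asm-100": {"type": "assembly", "name": "Drive Unit"},
    "prt-220": {"type": "part", "name": "Motor"},
    "sup-7":   {"type": "supplier", "name": "Acme Motors"},
    "req-42":  {"type": "requirement", "name": "Torque >= 5 Nm"},
}

# Typed edges carry the meaning of each relationship.
edges = defaultdict(list)                       # node -> [(relationship, node), ...]
edges["asm-100"] += [("uses", "prt-220")]
edges["prt-220"] += [("supplied_by", "sup-7"), ("satisfies", "req-42")]

def traverse(start: str, depth: int = 0) -> None:
    """Walk relationships from a node; new edge types need no schema change."""
    for rel, target in edges.get(start, []):
        print("  " * depth + f"{start} -[{rel}]-> {target} ({nodes[target]['name']})")
        traverse(target, depth + 1)

traverse("asm-100")
```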
RDBMS vs. Graph Database and Polyglot Persistence
Problem: Speed, analytics and data science, AI-readiness
Relational databases (SQL) require complex queries to pull meaningful product data insights, leading to performance bottlenecks—especially in large-scale, multi-layered product structures.
A graph database (GraphDB) is designed for fast, contextual data retrieval. Because it inherently understands connections between data points, it excels at AI and data science applications. With manufacturing organizations increasingly looking to leverage AI for predictive insights, a GraphDB-powered PLM system provides a strong data foundation for AI-driven decision-making.
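Here is a rough sketch of the polyglot persistence idea, with hypothetical stand-in store classes rather than real database clients: the same product change fans out to a document store for fast record retrieval, a graph store for relationship traversal, and a flat feature table that is ready for data science.

```python
# Hypothetical in-memory stand-ins for different persistence engines (illustration only).
class DocumentStore:
    """Fast retrieval of whole item records."""
    def __init__(self):
        self.docs = {}
    def put(self, key, doc):
        self.docs[key] = doc

class GraphStore:
    """Relationship traversal for where-used and impact analysis."""
    def __init__(self):
        self.edges = []
    def link(self, src, rel, dst):
        self.edges.append((src, rel, dst))
    def neighbors(self, node):
        return [(rel, dst) for src, rel, dst in self.edges if src == node]

class FeatureStore:
    """Flat feature rows, ready for AI and analytics workloads."""
    def __init__(self):
        self.rows = []
    def append(self, row):
        self.rows.append(row)

def save_item(item_id, attributes, parents, docs, graph, features):
    """One logical write fans out to the store that fits each query pattern."""
    docs.put(item_id, attributes)
    for parent in parents:
        graph.link(parent, "uses", item_id)
    features.append({"item": item_id, "n_parents": len(parents), **attributes})

docs, graph, features = DocumentStore(), GraphStore(), FeatureStore()
save_item("prt-220", {"cost": 41.5, "mass_kg": 1.2}, parents=["asm-100"],
          docs=docs, graph=graph, features=features)
print(graph.neighbors("asm-100"))   # [('uses', 'prt-220')] – contextual retrieval without joins
```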
Check-In/Check-Out vs. Real-Time Collaboration
Problem: Simultaneous communication and multi-dimensional change processes
Legacy PLM systems still use old-fashioned check-in/check-out mechanisms and row locking, where only one user can modify a record (with a file) at a time. This method blocks real-time collaboration and creates inefficiencies, as engineers must wait to update critical design data. It also creates logical problems for change management when multi-disciplinary data (e.g., mechanical, electronics, software) needs to be updated and revised.
Modern PLM platforms leverage real-time collaboration, similar to how cloud-based productivity tools (e.g., Google Docs) allow multiple users to work on the same document simultaneously. With multi-dimensional change tracking (e.g., OpenBOM Collaborative Workspace), teams can work in parallel while maintaining full traceability of modifications.
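As a rough illustration (and not OpenBOM’s actual implementation), granular change events can replace whole-record locks: each edit is applied immediately and logged, so engineers from different disciplines can modify the same item in parallel without waiting for a check-out.

```python
import time
from collections import defaultdict

bom = defaultdict(dict)     # item_id -> {attribute: value}
history = []                # full traceability: who changed what, and when

def apply_edit(user: str, item_id: str, attribute: str, value):
    """Two engineers can edit different attributes of the same item concurrently."""
    bom[item_id][attribute] = value
    history.append({"ts": time.time(), "user": user, "item": item_id,
                    "attr": attribute, "value": value})

apply_edit("mech-engineer", "prt-220", "mass_kg", 1.3)
apply_edit("ee-engineer",   "prt-220", "firmware", "v2.4")   # no check-out, no waiting
print(bom["prt-220"])    # {'mass_kg': 1.3, 'firmware': 'v2.4'}
print(len(history))      # 2 – every change stays traceable
```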
Data Control vs. Data Consumption
Problem: Business Transformation, Data Accessibility and Data Interoperability
Traditional PLM vendors focus on data locking as the foundation of their business. This is how enterprise software was developed for the last 30-40 years – create a SOR (system of record), lock it into a proprietary data model, and sell vertical applications (features) to monetize the data. This approach forces customers into proprietary ecosystems, making it difficult to extract and use data across different enterprise systems (ERP, MES, CRM) or to share data with external partners.
New-generation PLM platforms embrace openness and data consumption. A business is larger than a single SQL database, and there is an urgent need to disconnect data from applications and turn it into “Data Products”. Instead of locking data into a rigid system, new platforms let data flow easily between systems using APIs, integrations, instant data sharing, and flexible data access, ensuring product information moves efficiently across supply chains, partners, and customers.
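A minimal sketch of the data consumption pattern, with a hypothetical REST endpoint and payload (not a real vendor API): downstream systems pull product records directly instead of relying on exports or point-to-point integrations.

```python
import requests

# Hypothetical endpoint and token, shown only to illustrate pulling PLM data
# into other systems (ERP, MES, analytics) through an open API.
API_URL = "https://plm.example.com/api/v1/items/PN-1001"
API_TOKEN = "replace-with-your-token"

response = requests.get(API_URL,
                        headers={"Authorization": f"Bearer {API_TOKEN}"},
                        timeout=10)
response.raise_for_status()
item = response.json()

# Downstream systems consume the same record without file exports or custom connectors.
print(item.get("part_number"), item.get("revision"), item.get("cost"))
```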
What is my conclusion: Solving the Right Problems
Companies are not just buying software—they are looking for solutions to their problems. Manufacturing companies are looking for a new type of business system to solve problems of product development process optimization, document management, quality management, supply chain management, and orchestration of the entire product lifecycle, allowing everyone to access up-to-date information.
Traditional PLM optimizations may improve existing workflows, but they cannot fix the fundamental inefficiencies of outdated architectures or meet the modern business-process needs of engineering and manufacturing teams.
By contrast, next-generation PLM and other business systems platforms—built on multi-tenant cloud architectures, graph-based data models, and real-time collaboration—are solving problems that were previously considered unsolvable in product development and manufacturing.
The key takeaway? Start with the problem in mind. If your PLM platform is simply a “faster caterpillar,” it’s time to consider a real digital transformation—because the future belongs to those who can fly.
Just my thoughts…
Best, Oleg
Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing a Collaborative Workspace with PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.