A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Evolution: Single Source of Truth, and Eventual Consistency

Oleg
18 March, 2025 | 5 min for reading

In some of my recent articles, I discussed the transformation of one of the main principles of PLM development, one that has been around since the beginning of the PLM vision and system architecture: Single Source of Truth (SSOT). Check out some of my earlier articles:

Navigating the Evolution of Single Source of Truth

Rethinking Change Management: Collaborative Workspace Technical Architecture

One of my readers recently posed an insightful question: if you have a single point of change, doesn’t it inherently become the only reliable source of truth? This question is particularly relevant today as PLM moves toward a distributed systems future. Understanding this shift requires a closer look at traditional PLM architectures, the challenges of modern distributed data management, and the implications of concepts like eventual consistency and the CAP theorem.

Before moving forward, let me briefly recap the CAP theorem and its trade-offs.

CAP Theorem Trade-offs

Let me start with the main principle of the CAP theorem: in a distributed system, it is impossible to simultaneously guarantee consistency, availability, and partition tolerance. Different system models prioritize different trade-offs:

  • CP (Consistency + Partition Tolerance): Ensures data consistency across nodes, even if some nodes are unreachable. However, it sacrifices availability, meaning some requests may be rejected during network failures.
  • AP (Availability + Partition Tolerance): Ensures the system remains operational despite network failures. This comes at the cost of consistency, as some nodes may return stale or divergent data.
  • CA (Consistency + Availability) – Theoretical Only: Guarantees strong consistency and high availability but sacrifices partition tolerance. This model is not practical for distributed systems, because a network partition would cause the system to fail.
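To make the CP/AP trade-off concrete, here is a minimal sketch (not from any real PLM system; the `Replica` class and the BOM revision values are illustrative) showing how a CP-style write refuses service during a partition while an AP-style write stays available but lets replicas diverge:

```python
class Replica:
    """A node holding one copy of a product record."""
    def __init__(self):
        self.value = None
        self.reachable = True  # False simulates a network partition

def write_cp(replicas, value):
    """CP behavior: reject the write unless every replica is reachable."""
    if not all(r.reachable for r in replicas):
        raise RuntimeError("write rejected: partition detected (availability sacrificed)")
    for r in replicas:
        r.value = value

def write_ap(replicas, value):
    """AP behavior: accept the write on reachable replicas; unreachable ones go stale."""
    for r in replicas:
        if r.reachable:
            r.value = value
    # reachable and unreachable replicas may now disagree (consistency sacrificed)

nodes = [Replica(), Replica()]
write_cp(nodes, "BOM rev A")           # succeeds: no partition, all replicas agree
nodes[1].reachable = False             # partition one node
write_ap(nodes, "BOM rev B")           # still available, but replicas now diverge
print(nodes[0].value, nodes[1].value)  # prints "BOM rev B BOM rev A"
try:
    write_cp(nodes, "BOM rev C")
except RuntimeError as e:
    print(e)                           # CP mode refuses to serve the request
```

The same partition produces opposite behaviors: CP preserves a single truth by rejecting work, AP keeps working by tolerating multiple truths.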

The Historical Perspective: SQL-Centric PLM

When PLM systems first emerged, consolidating all data into a single SQL database was a logical approach. It provided a centralized, structured way to manage product information, ensuring data integrity and consistency within an organization. Most traditional PLM platforms still operate on this SQL-based architecture, treating a single database as the “Single Source of Truth” (SSOT). The fundamental assumption was that all changes would be made within this monolithic structure, ensuring absolute consistency and traceability.

However, this model was designed for a different era—when organizations were smaller, data volumes were more manageable, and global collaboration was limited. As businesses scale and operate in increasingly complex environments, the limitations of this approach have become evident. This doesn't mean SQL databases won't be used at all, but the architecture of PLM systems will keep shifting and evolving.

The Shift Toward Distributed PLM Systems

Today, we live in a world of large-scale, distributed organizations that generate and consume vast amounts of data across multiple platforms. The nature of product development and manufacturing now requires real-time collaboration across various locations, systems, and stakeholders. This reality necessitates a fundamental change in how PLM architectures are designed.

Distributed systems prioritize availability, scalability, and resilience—qualities that are often at odds with the traditional PLM model of a single SQL database. As a result, modern PLM platforms are embracing new architectural principles, including:

  • Polyglot Persistence: Using multiple database technologies (SQL, NoSQL, GraphDB) to optimize different types of data storage and retrieval.
  • Microservices and APIs: Enabling modular, loosely coupled services to manage different aspects of product data.
  • Eventual Consistency: Allowing data to propagate across systems asynchronously, ensuring high availability while tolerating temporary inconsistencies.
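The polyglot persistence idea above can be sketched as a simple routing layer that sends each kind of product data to the store best suited for it. This is a conceptual illustration, not any vendor's implementation; the data kinds and store names are assumptions for the example:

```python
class PolyglotRouter:
    """Sketch of polyglot persistence: route each kind of product data
    to a store suited to its access pattern."""
    def __init__(self):
        # In a real system these would be clients for SQL, document, and graph databases.
        self.stores = {
            "bom_structure": "graph_db",    # relationships, where-used traversals
            "part_master": "sql_db",        # transactional, structured records
            "cad_metadata": "document_db",  # flexible, schema-light documents
        }

    def store_for(self, data_kind):
        # Fall back to the relational store for anything unclassified.
        return self.stores.get(data_kind, "sql_db")

router = PolyglotRouter()
print(router.store_for("bom_structure"))  # prints "graph_db"
print(router.store_for("unknown_kind"))   # prints "sql_db"
```

The point is not the routing code itself but the architectural decision it encodes: no single database technology serves BOM graphs, transactional records, and CAD metadata equally well.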

The Role of Eventual Consistency in Modern PLM

Eventual consistency is a widely adopted approach in cloud computing, NoSQL databases, and large-scale web applications. It ensures that while data may not be immediately synchronized across all systems, it will eventually converge to a consistent state. This principle allows distributed PLM platforms to prioritize system availability and performance while still ensuring reliable data management.
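One common way to achieve this convergence is a last-write-wins merge, where each update carries a timestamp and replicas keep the newest value per field. The sketch below (the `Update` structure and field names are illustrative assumptions) shows two replicas receiving the same updates in different orders and still converging:

```python
from dataclasses import dataclass

@dataclass
class Update:
    """A timestamped change to one field of a product record."""
    field: str
    value: str
    ts: int  # logical timestamp from the originating service

def merge(state, updates):
    """Last-write-wins merge: keep the newest update per field.
    Replicas that eventually see the same update set converge."""
    for u in updates:
        current = state.get(u.field)
        if current is None or u.ts > current.ts:
            state[u.field] = u
    return state

updates = [
    Update("material", "aluminum", ts=1),
    Update("material", "titanium", ts=3),
    Update("supplier", "Acme", ts=2),
]

# Two replicas receive the same updates in different orders...
replica_a = merge({}, updates)
replica_b = merge({}, list(reversed(updates)))

# ...but converge to the same state: arrival order does not matter.
assert {f: u.value for f, u in replica_a.items()} == \
       {f: u.value for f, u in replica_b.items()}
print(replica_a["material"].value)  # prints "titanium"
```

Between the moment an update is issued and the moment every replica has seen it, readers may observe stale data—that window is the "temporary inconsistency" the eventual consistency model accepts.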

However, this raises a fundamental conflict in traditional PLM thinking: if PLM is expected to be the Single Source of Truth, how do we reconcile this with a distributed system that permits temporary inconsistencies? The answer lies in distinguishing between Single Source of Truth (SSOT) and Single Source of Change (SSOC).

SSOT vs. SSOC: The Key Distinction

Traditional PLM systems were built on the assumption that SSOT means having a single database where all product data is stored and modified. However, in a distributed environment, this assumption no longer holds. Instead, modern PLM architectures should focus on Single Source of Change (SSOC)—ensuring that changes originate from a controlled and authoritative source, even if the data itself is distributed.

For example, a cloud-native PLM system may allow different services to store and retrieve product data independently, but changes to critical product information (e.g., CAD models, BOMs, or compliance data) should be managed through well-defined workflows, version control, and event-driven synchronization mechanisms. This approach ensures traceability and control while embracing the realities of distributed systems.
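The SSOC pattern can be sketched as a single change service that versions every modification and publishes it as an event to downstream stores. This is a minimal illustration of the idea, not a real product's API; the class names, event shape, and store types are assumptions:

```python
import itertools

class ChangeService:
    """Single Source of Change: every modification is versioned here and
    published as an event; downstream stores apply events asynchronously."""
    def __init__(self):
        self._version = itertools.count(1)
        self.log = []          # authoritative, ordered change log
        self.subscribers = []  # e.g. a BOM store, a search index

    def subscribe(self, store):
        self.subscribers.append(store)

    def commit(self, item_id, field, value):
        event = {"version": next(self._version),
                 "item": item_id, "field": field, "value": value}
        self.log.append(event)          # the change is recorded exactly once
        for store in self.subscribers:  # then propagated; stores may lag
            store.apply(event)

class Store:
    """A downstream replica that applies change events idempotently."""
    def __init__(self):
        self.data = {}
        self.applied = 0  # highest version applied so far

    def apply(self, event):
        if event["version"] <= self.applied:
            return  # already seen; re-delivery is safe
        self.data.setdefault(event["item"], {})[event["field"]] = event["value"]
        self.applied = event["version"]

ssoc = ChangeService()
bom_store, search_index = Store(), Store()
ssoc.subscribe(bom_store)
ssoc.subscribe(search_index)

ssoc.commit("part-100", "material", "aluminum")
ssoc.commit("part-100", "material", "titanium")

# Both stores hold distributed copies, but every change has one origin.
print(bom_store.data["part-100"]["material"])  # prints "titanium"
```

Note that the data lives in multiple stores (no single source of truth in the storage sense), yet the ordered change log gives full traceability—the essence of the SSOT-to-SSOC shift.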

The Future of PLM: Rethinking Core Principles

Given the challenges and opportunities presented by distributed architectures, PLM vendors and practitioners must rethink fundamental aspects of PLM technology, including:

  • Collaboration Models: Moving beyond file-based sharing to data-driven, real-time collaboration across multiple systems.
  • Change Management: Implementing robust mechanisms for managing updates, conflicts, and approvals in a distributed environment.
  • Revision Control: Ensuring that different versions of product data are managed effectively, even when stored across various platforms.

What is my conclusion?

It is becoming increasingly clear that managing a digital thread in a single SQL database using a 1990s-era PLM architecture is impractical. Instead, Single Source of Change is emerging as the dominant model for modern PLM applications, allowing for flexibility, scalability, and resilience. However, achieving this requires a shift in mindset—from monolithic, tightly controlled databases to distributed, event-driven, and API-first architectures.

Just my thoughts… What is your take on this transformation? Let’s discuss!

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative services including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
