A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Evolution: Single Source of Truth, and Eventual Consistency

Oleg
18 March, 2025 | 5 min for reading

In some of my recent articles, I discussed the transformation of one of the main principles of PLM development, a principle that has been around since the beginning of the PLM vision and system architecture: the Single Source of Truth (SSOT). Check out some of my earlier articles:

Navigating the Evolution of Single Source of Truth

Rethinking Change Management: Collaborative Workspace Technical Architecture

One of my readers recently posed an insightful question: if you have a single point of change, doesn’t it inherently become the only reliable source of truth? This question is particularly relevant today as PLM moves toward a distributed systems future. Understanding this shift requires a closer look at traditional PLM architectures, the challenges of modern distributed data management, and the implications of concepts like eventual consistency and the CAP theorem.

Before moving forward, I want to recap the trade-offs of the CAP theorem.

CAP Theorem Trade-offs

Let me start with a reminder of the main principle of the CAP theorem: in a distributed system, it is impossible to simultaneously guarantee consistency, availability, and partition tolerance. Different system models prioritize different trade-offs, as the short sketch after this list illustrates:

  • CP (Consistency + Partition Tolerance): Ensures data consistency across nodes, even if some nodes are unreachable. However, it sacrifices availability, meaning some requests may be rejected during network failures.
  • AP (Availability + Partition Tolerance): Ensures the system remains operational despite network failures. This comes at the cost of consistency, as some nodes may return stale or divergent data.
  • CA (Consistency + Availability), theoretical only: Guarantees strong consistency and high availability but sacrifices partition tolerance. This model is not practical for distributed systems because a network partition would cause the system to fail.
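
To make these trade-offs concrete, here is a minimal Python sketch, illustrative only and not modeled on any particular PLM product, of how a CP-style and an AP-style replica respond to a write during a network partition. The Replica class and the quorum_reachable flag are hypothetical names for this example.

```python
class Replica:
    def __init__(self, name, mode):
        self.name = name
        self.mode = mode              # "CP" or "AP"
        self.data = {}
        self.quorum_reachable = True  # flips to False during a partition

    def write(self, key, value):
        if self.mode == "CP" and not self.quorum_reachable:
            # CP choice: preserve consistency by refusing the write
            # (availability is sacrificed during the partition).
            raise RuntimeError(f"{self.name}: partition detected, write rejected")
        # AP choice: accept the write locally; replicas may now diverge
        # (consistency is sacrificed until they re-synchronize).
        self.data[key] = value
        return "ok"

cp_node = Replica("cp-node", "CP")
ap_node = Replica("ap-node", "AP")
cp_node.quorum_reachable = ap_node.quorum_reachable = False  # simulate a partition

print(ap_node.write("bom-rev", "B"))   # "ok" -- available, possibly stale elsewhere
try:
    cp_node.write("bom-rev", "B")
except RuntimeError as error:
    print(error)                       # consistent, but the request was rejected
```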

The Historical Perspective: SQL-Centric PLM

When PLM systems first emerged, consolidating all data into a single SQL database was a logical approach. It provided a centralized, structured way to manage product information, ensuring data integrity and consistency within an organization. Most traditional PLM platforms still operate on this SQL-based architecture, treating a single database as the “Single Source of Truth” (SSOT). The fundamental assumption was that all changes would be made within this monolithic structure, ensuring absolute consistency and traceability.

However, this model was designed for a different era, when organizations were smaller, data volumes were more manageable, and global collaboration was limited. As businesses scale and operate in increasingly complex environments, the limitations of this approach have become evident. This doesn't mean SQL databases will disappear entirely, but the architecture of PLM systems is shifting and will continue to evolve.

The Shift Toward Distributed PLM Systems

Today, we live in a world of large-scale, distributed organizations that generate and consume vast amounts of data across multiple platforms. The nature of product development and manufacturing now requires real-time collaboration across various locations, systems, and stakeholders. This reality necessitates a fundamental change in how PLM architectures are designed.

Distributed systems prioritize availability, scalability, and resilience—qualities that are often at odds with the traditional PLM model of a single SQL database. As a result, modern PLM platforms are embracing new architectural principles, including:

  • Polyglot Persistence: Using multiple database technologies (SQL, NoSQL, GraphDB) to optimize different types of data storage and retrieval.
  • Microservices and APIs: Enabling modular, loosely coupled services to manage different aspects of product data.
  • Eventual Consistency: Allowing data to propagate across systems asynchronously, ensuring high availability while tolerating temporary inconsistencies.

The Role of Eventual Consistency in Modern PLM

Eventual consistency is a widely adopted approach in cloud computing, NoSQL databases, and large-scale web applications. It ensures that while data may not be immediately synchronized across all systems, it will eventually converge to a consistent state. This principle allows distributed PLM platforms to prioritize system availability and performance while still ensuring reliable data management.
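
As a simple illustration of how independently updated replicas can converge, here is a minimal Python sketch using last-writer-wins (LWW) merging, one common convergence strategy among several (version vectors and CRDTs are others). The site names and part attributes are hypothetical, and a single in-process counter stands in for synchronized timestamps.

```python
from itertools import count

_clock = count(1)  # stands in for a synchronized timestamp in this one-process sketch

class Node:
    """A replica that accepts writes locally and synchronizes later."""
    def __init__(self, name):
        self.name = name
        self.store = {}  # key -> (logical timestamp, value)

    def write(self, key, value):
        self.store[key] = (next(_clock), value)

    def merge(self, other):
        # Anti-entropy pass: for each key, keep the most recent write.
        for key, (ts, value) in other.store.items():
            if key not in self.store or ts > self.store[key][0]:
                self.store[key] = (ts, value)

site_a, site_b = Node("site-a"), Node("site-b")
site_a.write("part-123/material", "steel")
site_b.write("part-123/material", "aluminum")  # the later write wins under LWW

# Before synchronization the sites disagree; after merging in both
# directions they converge to the same state -- eventual consistency.
site_a.merge(site_b)
site_b.merge(site_a)
assert site_a.store == site_b.store
print(site_a.store["part-123/material"][1])  # aluminum
```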

However, this raises a fundamental conflict in traditional PLM thinking: if PLM is expected to be the Single Source of Truth, how do we reconcile this with a distributed system that permits temporary inconsistencies? The answer lies in distinguishing between Single Source of Truth (SSOT) and Single Source of Change (SSOC).

SSOT vs. SSOC: The Key Distinction

Traditional PLM systems were built on the assumption that SSOT means having a single database where all product data is stored and modified. However, in a distributed environment, this assumption no longer holds. Instead, modern PLM architectures should focus on Single Source of Change (SSOC)—ensuring that changes originate from a controlled and authoritative source, even if the data itself is distributed.

For example, a cloud-native PLM system may allow different services to store and retrieve product data independently, but changes to critical product information (e.g., CAD models, BOMs, or compliance data) should be managed through well-defined workflows, version control, and event-driven synchronization mechanisms. This approach ensures traceability and control while embracing the realities of distributed systems.
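
Here is a minimal sketch of the SSOC idea, with hypothetical names that do not represent OpenBOM or any vendor's API: data is replicated into several downstream stores, but every change flows through one authoritative change service that assigns revision numbers and publishes events.

```python
from collections import defaultdict

class ChangeService:
    """The single place where changes originate -- the SSOC."""
    def __init__(self):
        self.revisions = defaultdict(int)  # item id -> latest revision number
        self.subscribers = []              # downstream event handlers

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def submit_change(self, item_id, payload):
        # Versioning happens exactly once, at the authoritative source.
        self.revisions[item_id] += 1
        event = {"item": item_id, "rev": self.revisions[item_id], "data": payload}
        for handler in self.subscribers:   # a message bus would do this asynchronously
            handler(event)
        return event["rev"]

class ReadReplica:
    """Any downstream store: a search index, an ERP cache, analytics, etc."""
    def __init__(self):
        self.items = {}

    def on_event(self, event):
        # Apply only newer revisions; late or duplicated events are ignored.
        current_rev = self.items.get(event["item"], {}).get("rev", 0)
        if event["rev"] > current_rev:
            self.items[event["item"]] = {"rev": event["rev"], **event["data"]}

ssoc = ChangeService()
erp_cache, search_index = ReadReplica(), ReadReplica()
ssoc.subscribe(erp_cache.on_event)
ssoc.subscribe(search_index.on_event)

ssoc.submit_change("bom-42", {"part": "bolt-M6", "qty": 4})
print(erp_cache.items["bom-42"], search_index.items["bom-42"])
```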

The Future of PLM: Rethinking Core Principles

Given the challenges and opportunities presented by distributed architectures, PLM vendors and practitioners must rethink fundamental aspects of PLM technology, including:

  • Collaboration Models: Moving beyond file-based sharing to data-driven, real-time collaboration across multiple systems.
  • Change Management: Implementing robust mechanisms for managing updates, conflicts, and approvals in a distributed environment.
  • Revision Control: Ensuring that different versions of product data are managed effectively, even when stored across various platforms (one way to detect conflicting revisions is sketched below).
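
As one possible mechanism for the revision-control point above, here is a minimal sketch of conflict detection with version vectors. It is an illustration under assumed semantics, not a prescribed PLM mechanism; the site names and counters are hypothetical.

```python
def descends(vv_a, vv_b):
    """True if version vector vv_a has seen every event that vv_b has."""
    return all(vv_a.get(node, 0) >= counter for node, counter in vv_b.items())

def compare(vv_a, vv_b):
    a_covers_b, b_covers_a = descends(vv_a, vv_b), descends(vv_b, vv_a)
    if a_covers_b and b_covers_a:
        return "equal"
    if a_covers_b:
        return "a is newer"
    if b_covers_a:
        return "b is newer"
    return "conflict"  # concurrent edits -- route to a change-management workflow

# Two sites edited the same item concurrently: neither vector covers the other.
site_a_vector = {"site-a": 3, "site-b": 1}
site_b_vector = {"site-a": 2, "site-b": 2}
print(compare(site_a_vector, site_b_vector))  # "conflict"
```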

What is my conclusion?

It is becoming increasingly clear that managing a digital thread in a single SQL database using a 1990s-era PLM architecture is impractical. Instead, Single Source of Change is emerging as the dominant model for modern PLM applications, allowing for flexibility, scalability, and resilience. However, achieving this requires a shift in mindset—from monolithic, tightly controlled databases to distributed, event-driven, and API-first architectures.

Just my thoughts… What is your take on this transformation? Let’s discuss!

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative services including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
