A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Evolution: Single Source of Truth, and Eventual Consistency

Oleg
18 March, 2025 | 5 min read

In some of my recent articles, I discussed the transformation of one of the main principles of PLM development, one that has been around since the beginning of the PLM vision and system architecture: Single Source of Truth (SSOT). Check some of my earlier articles:

Navigating the Evolution of Single Source of Truth

Rethinking Change Management: Collaborative Workspace Technical Architecture

One of my readers recently posed an insightful question: if you have a single point of change, doesn’t it inherently become the only reliable source of truth? This question is particularly relevant today as PLM moves toward a distributed systems future. Understanding this shift requires a closer look at traditional PLM architectures, the challenges of modern distributed data management, and the implications of concepts like eventual consistency and the CAP theorem.

Before moving forward, let me briefly recap the CAP theorem and its trade-offs.

CAP Theorem Trade-offs

Let me start by recalling the main principle of the CAP theorem: in a distributed system, it is impossible to simultaneously guarantee consistency, availability, and partition tolerance. Different system models prioritize different trade-offs:

  • CP (Consistency + Partition Tolerance): Ensures data consistency across nodes, even if some nodes are unreachable. However, it sacrifices availability, meaning some requests may be rejected during network failures.
  • AP (Availability + Partition Tolerance): Ensures the system remains operational despite network failures. This comes at the cost of consistency, as some nodes may return stale or divergent data.
  • CA (Consistency + Availability, theoretical only): Guarantees strong consistency and high availability but sacrifices partition tolerance. This model is not practical for distributed systems because a network partition would cause the system to fail.
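The CP/AP trade-off can be illustrated with a toy model: two replicas of one PLM record during a simulated network partition. This is a deliberately simplified sketch, not a real PLM API — all class and field names are illustrative.

```python
# Toy model of CAP trade-offs: two replicas of one record.
# CP mode refuses writes during a partition; AP mode accepts them
# locally, which keeps the system available but lets replicas diverge.

class Replica:
    def __init__(self, name):
        self.name = name
        self.data = {}

class Cluster:
    def __init__(self, mode):
        self.mode = mode            # "CP" or "AP"
        self.partitioned = False    # True = replicas cannot talk
        self.a = Replica("site-A")
        self.b = Replica("site-B")

    def write(self, replica, key, value):
        if self.partitioned:
            if self.mode == "CP":
                # CP: reject the write rather than risk divergence.
                raise RuntimeError("unavailable during partition")
            # AP: accept locally; the other replica now has stale data.
            replica.data[key] = value
        else:
            # Healthy network: replicate synchronously to both sites.
            self.a.data[key] = value
            self.b.data[key] = value

cluster = Cluster(mode="AP")
cluster.write(cluster.a, "bom-rev", "A")   # replicated to both sites
cluster.partitioned = True
cluster.write(cluster.a, "bom-rev", "B")   # only site-A sees revision B
print(cluster.a.data["bom-rev"], cluster.b.data["bom-rev"])  # B A
```

Switching the mode to "CP" would make the second write fail instead of diverge — the same failure, two different trade-offs.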

The Historical Perspective: SQL-Centric PLM

When PLM systems first emerged, consolidating all data into a single SQL database was a logical approach. It provided a centralized, structured way to manage product information, ensuring data integrity and consistency within an organization. Most traditional PLM platforms still operate on this SQL-based architecture, treating a single database as the “Single Source of Truth” (SSOT). The fundamental assumption was that all changes would be made within this monolithic structure, ensuring absolute consistency and traceability.

However, this model was designed for a different era—when organizations were smaller, data volumes were more manageable, and global collaboration was limited. As businesses scale and operate in increasingly complex environments, the limitations of this approach have become evident. This doesn’t mean SQL databases won’t be used at all, but the architecture of PLM systems is shifting and evolving.

The Shift Toward Distributed PLM Systems

Today, we live in a world of large-scale, distributed organizations that generate and consume vast amounts of data across multiple platforms. The nature of product development and manufacturing now requires real-time collaboration across various locations, systems, and stakeholders. This reality necessitates a fundamental change in how PLM architectures are designed.

Distributed systems prioritize availability, scalability, and resilience—qualities that are often at odds with the traditional PLM model of a single SQL database. As a result, modern PLM platforms are embracing new architectural principles, including:

  • Polyglot Persistence: Using multiple database technologies (SQL, NoSQL, GraphDB) to optimize different types of data storage and retrieval.
  • Microservices and APIs: Enabling modular, loosely coupled services to manage different aspects of product data.
  • Eventual Consistency: Allowing data to propagate across systems asynchronously, ensuring high availability while tolerating temporary inconsistencies.

The Role of Eventual Consistency in Modern PLM

Eventual consistency is a widely adopted approach in cloud computing, NoSQL databases, and large-scale web applications. It ensures that while data may not be immediately synchronized across all systems, it will eventually converge to a consistent state. This principle allows distributed PLM platforms to prioritize system availability and performance while still ensuring reliable data management.
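The mechanics of convergence can be sketched with a last-writer-wins (LWW) merge — a deliberately naive conflict-resolution strategy I use here only for illustration; real systems often rely on vector clocks or CRDTs. All names below are hypothetical:

```python
# Sketch of eventual consistency with last-writer-wins (LWW) merge.
# Each node stores (value, timestamp); an anti-entropy sync exchanges
# states until every node converges on the newest write.

class Node:
    def __init__(self):
        self.state = {}  # key -> (value, timestamp)

    def write(self, key, value, ts):
        self.state[key] = (value, ts)

    def merge(self, other):
        # Keep whichever write carries the higher timestamp.
        for key, (val, ts) in other.state.items():
            if key not in self.state or ts > self.state[key][1]:
                self.state[key] = (val, ts)

n1, n2 = Node(), Node()
n1.write("part-123/material", "aluminum", ts=1)
n2.write("part-123/material", "titanium", ts=2)  # later concurrent write

# Before sync the nodes disagree (the "temporary inconsistency")...
assert n1.state != n2.state

# ...after anti-entropy runs in both directions, they converge.
n1.merge(n2)
n2.merge(n1)
assert n1.state == n2.state == {"part-123/material": ("titanium", 2)}
```

The key property is that both nodes stay available for reads and writes throughout, and the inconsistency is bounded in time, not permanent.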

However, this raises a fundamental conflict in traditional PLM thinking: if PLM is expected to be the Single Source of Truth, how do we reconcile this with a distributed system that permits temporary inconsistencies? The answer lies in distinguishing between Single Source of Truth (SSOT) and Single Source of Change (SSOC).

SSOT vs. SSOC: The Key Distinction

Traditional PLM systems were built on the assumption that SSOT means having a single database where all product data is stored and modified. However, in a distributed environment, this assumption no longer holds. Instead, modern PLM architectures should focus on Single Source of Change (SSOC)—ensuring that changes originate from a controlled and authoritative source, even if the data itself is distributed.

For example, a cloud-native PLM system may allow different services to store and retrieve product data independently, but changes to critical product information (e.g., CAD models, BOMs, or compliance data) should be managed through well-defined workflows, version control, and event-driven synchronization mechanisms. This approach ensures traceability and control while embracing the realities of distributed systems.
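The SSOC idea can be sketched as an authoritative change service with an append-only log, fanning changes out to downstream read models. This is a minimal illustration with hypothetical names — a real system would publish events asynchronously over a message bus rather than call subscribers in-process:

```python
# Sketch of "Single Source of Change": every edit flows through one
# authoritative change service, which versions it, appends it to a
# change log, and notifies downstream stores (search index, ERP
# cache, ...), which catch up asynchronously.

class ChangeService:
    def __init__(self):
        self.version = 0
        self.log = []            # append-only change log (the SSOC)
        self.subscribers = []

    def subscribe(self, handler):
        self.subscribers.append(handler)

    def apply_change(self, item, field, value):
        self.version += 1
        event = {"version": self.version, "item": item,
                 "field": field, "value": value}
        self.log.append(event)
        for handler in self.subscribers:  # async fan-out in a real system
            handler(event)
        return event

class ReadModel:
    """Eventually consistent projection built from the change log."""
    def __init__(self):
        self.data = {}

    def on_event(self, event):
        self.data[(event["item"], event["field"])] = event["value"]

ssoc = ChangeService()
erp_cache = ReadModel()
ssoc.subscribe(erp_cache.on_event)

ssoc.apply_change("BOM-42", "qty", 8)
print(erp_cache.data[("BOM-42", "qty")])  # 8
```

Note that the data lives in many places (the log, the ERP cache), but the change itself has exactly one origin and one ordering — which is what preserves traceability in a distributed setup.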

The Future of PLM: Rethinking Core Principles

Given the challenges and opportunities presented by distributed architectures, PLM vendors and practitioners must rethink fundamental aspects of PLM technology, including:

  • Collaboration Models: Moving beyond file-based sharing to data-driven, real-time collaboration across multiple systems.
  • Change Management: Implementing robust mechanisms for managing updates, conflicts, and approvals in a distributed environment.
  • Revision Control: Ensuring that different versions of product data are managed effectively, even when stored across various platforms.

What is my conclusion?

It is becoming increasingly clear that managing a digital thread in a single SQL database using a 1990s-era PLM architecture is impractical. Instead, Single Source of Change is emerging as the dominant model for modern PLM applications, allowing for flexibility, scalability, and resilience. However, achieving this requires a shift in mindset—from monolithic, tightly controlled databases to distributed, event-driven, and API-first architectures.

Just my thoughts… What is your take on this transformation? Let’s discuss!

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative services including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinions can be unintentionally biased.
