How to Build a Product Model for Digital Thread

In today’s increasingly interconnected world, managing product information efficiently has become a complex challenge. Traditionally, Product Data Management (PDM), Product Lifecycle Management (PLM), and Enterprise Resource Planning (ERP) systems have been the key systems managing product data. However, as businesses accumulate data from more and more sources, including PDM, Manufacturing Resource Planning (MRP), project management systems, and others, the need for a comprehensive and interconnected data model has never been more apparent. This is where the concept of the Digital Thread (DT) comes into play, aiming to provide traceability and impact analysis across the product development lifecycle.

The problem with DT is that information is managed in silos. Although PLM+ERP scenarios are the most common, in reality the number of data sources (databases, Excel files, etc.) is much bigger, spanning many PDM, PLM, MRP, ERP, project, and other data management systems. The question of how to connect this information in a meaningful way to support DT functions such as traceability and impact analysis comes up more and more often. A typical PLM system developed back in the 1990s was not built for this level of complexity and cannot support a modern business strategy (e.g., CTO manufacturing process use cases, building interconnected engineering and manufacturing BOMs, the complexity of supply chain management analysis, and others).

Prof. Martin Eigner started a very important discussion about new PLM technology for Digital Thread in his LinkedIn post. Here is a passage that summarizes the discussion:

1: The implementation of the Digital Thread (DT) over the entire product life cycle (PLc) in graphs is essentially agreed.
2: The set of information elements to be considered essentially refers to the configuration items (CI), i.e. the items to be considered according to EN DIN ISO in the event of a change.
3: The objectives of the DT include reconfiguration in the event of damage and support for ECM in order to find the potentially impacted items (PII).
4: Since an essential element of DT is the linking of different product structures (BOMs), another functional subset of DT is the Multi-BOM implementation.
5: The extension of the DT understanding is to include the consideration of an information item not only in relation to other CIs along the PLC but also the link to processes is supported. So in which other ECM or QMS processes is the item to be changed involved?
6: With a polyglot DB approach, it is unclear how the functions and therefore also the synchronization distribution between the DBs takes place. Assuming we use an RDB for the list-like display of items and a graph DB for linking the items, are the links stored redundantly or only in the graph DB? Will there then be a linguistic layer above the DBs that supports the structure of a polyglot schema?
7: The effort involved in synchronizing several DBs, e.g. for ECM processes, was also discussed.
8: Last but not least, there was a discussion about the use of AI in the processing of graphs. My opinion is that the sole evaluation of the DT in the event of a claim or in impacted item analysis is a quantitative diligence task and not AI. However, my dream is to use AI to find the really affected items from the set of PIIs evaluated from the DT.
9: My next post will deal with the sub-topic of multi-BOM with the subtitle ‘we have far too many multiple redundant product structures’. I will refer to my esteemed colleagues Jörg Fischer and Oleg Shilovitsky and to documents from CIMdata. So how do we find practicable, efficient and redundancy-free solutions for the popular topic of DBOM, EBOM, MBOM, BOP or MPP + variants (:-).

Prof. Eigner’s comments made me think about what product model can be used to support the complexity of the digital thread, as well as why existing systems provide only limited capabilities to manage it.

Why Do We Need a New Model?

The first question that comes to mind is why we need a new model when we already have established PLM systems. While existing PLM platforms can technically be used for DT purposes, their efficiency in handling diverse data sources, flexibility, and query capabilities leave much to be desired. This becomes evident when trying to integrate data from multiple systems. The challenge of mapping data between these systems is a debate in itself, one that we will address separately.

The two most important gaps in existing models are (1) data model flexibility (existing PLM systems are mostly designed for BOM and change management use cases) and (2) query and analytical capabilities (existing PLM systems are not designed to run a variety of structured queries efficiently, and they completely lack the graph data analysis and other analytical capabilities available in modern data management systems).

Core Elements of the Digital Thread Model

To successfully support DT functions like traceability and impact analysis, a product model should encompass core elements such as:

  • Configuration Items (CIs): These are the foundational elements of the model and must be managed with flexibility, granularity, and query efficiency in mind. NoSQL/document databases or RDBs can be valuable tools for managing CI records. Scale and compatibility with historical data can create issues when using RDBs, so NoSQL databases can provide a better solution.
  • Bill of Materials/BOM (Product Structure): product structures (especially multiple BOM types) can be complex due to the number of relationships and data structures involved. Although all PLM systems designed in the 1990s-2000s used RDBs, the efficiency of these systems and their queries degrades as they scale. A growing number of examples confirm the efficiency of graph databases for querying and traversing complex structures.
  • Links (or general relationships between data): Building links between CIs and other entities is crucial for expressing relationship semantics. The information stored in CI records and the graph database facilitates efficient query processing and helps develop analytical queries and find dependencies (see the sketch after this list).
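
To make this concrete, here is a minimal sketch of the three elements above, using plain Python dictionaries as a stand-in for a document store and the networkx library as a stand-in for a graph database. All CI identifiers, attributes, and relationship names are invented for illustration.

```python
# Minimal sketch: CI records as schema-light documents, relationships as a
# typed graph. networkx stands in for a real graph database; all identifiers
# and attributes below are invented for illustration.
import networkx as nx

# Document side: each CI is a flexible record keyed by its identifier.
ci_store = {
    "CI-100": {"type": "assembly", "name": "Pump Unit", "rev": "B"},
    "CI-200": {"type": "part", "name": "Impeller", "rev": "A"},
    "CI-300": {"type": "part", "name": "Seal Kit", "rev": "C"},
}

# Graph side: typed edges express BOM structure and other link semantics.
g = nx.DiGraph()
g.add_edge("CI-100", "CI-200", rel="bom", qty=1)
g.add_edge("CI-100", "CI-300", rel="bom", qty=2)
g.add_edge("CI-200", "CI-300", rel="reference")

def potentially_impacted(ci_id):
    """Naive impact analysis: every CI that directly or indirectly
    uses the changed item (all upstream nodes in the graph)."""
    return sorted(nx.ancestors(g, ci_id))

print(potentially_impacted("CI-300"))  # ['CI-100', 'CI-200']
```

The same traversal, expressed against a real graph database, is exactly the kind of query that becomes expensive to emulate with recursive joins in an RDB.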

Using Multiple Databases and Microservice Architecture

Embracing a polyglot data management architecture can enhance data management efficiency, flexibility, and processing capabilities. Just as we use different programming languages to improve the efficiency of product development, different databases can be employed to manage different characteristics of the product model more efficiently. So-called polyglot persistence is an approach that allows us to create more efficient and robust data management platforms. An example could be the use of NoSQL document databases for CIs and graph databases to manage product structures and other relationships.
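
As a sketch of what such a polyglot facade might look like (and one possible answer to the question raised in point 6 above about where links should live), the service below keeps CI payloads only in a document-style store and relationship semantics only in a graph-style store, so links are not stored redundantly. Both stores are in-memory stand-ins, and the class and method names are assumptions for illustration.

```python
# Minimal sketch of polyglot persistence behind one service facade. Both
# stores are in-memory stand-ins; in practice they would be, e.g., a NoSQL
# document database and a graph database. Names are invented for illustration.
class ProductModelService:
    def __init__(self):
        self.documents = {}   # document store: full CI payloads
        self.links = []       # graph store: (source, target, relationship)

    def create_ci(self, ci_id, attributes):
        # CI payload lives only in the document store.
        self.documents[ci_id] = attributes

    def link(self, source, target, rel):
        # Relationship semantics live only in the graph store, so links
        # are not stored redundantly in both databases.
        self.links.append((source, target, rel))

svc = ProductModelService()
svc.create_ci("CI-100", {"name": "Pump Unit", "rev": "B"})
svc.create_ci("CI-200", {"name": "Impeller", "rev": "A"})
svc.link("CI-100", "CI-200", "bom")
```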

The overall architecture of the system can use microservices with specific database configurations as storage. Together, they can deliver a platform to organize the semantic structure of the data model, using a combination of eventual consistency and database transactions to maintain data changes across different operations and services.
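
One common way to combine the two is sketched below, assuming an event queue between microservices: the document store is updated transactionally as the system of record, and the graph service converges asynchronously by consuming change events. Event names and handlers are illustrative, not a specific product’s API.

```python
# Minimal sketch of eventual consistency between stores, assuming an event
# queue between microservices. Event names and handlers are illustrative.
from collections import deque

event_queue = deque()

def commit_ci_change(documents, ci_id, new_attributes):
    # 1. Transactional write to the system of record (document store).
    documents[ci_id] = new_attributes
    # 2. Emit a change event; the graph service consumes it asynchronously.
    event_queue.append({"type": "ci_updated", "ci": ci_id})

def graph_consumer(graph_index):
    # Runs independently of the original transaction; the graph converges
    # shortly after the write rather than inside it.
    while event_queue:
        event = event_queue.popleft()
        if event["type"] == "ci_updated":
            graph_index.add(event["ci"])  # e.g., re-index the node's links

docs, index = {}, set()
commit_ci_change(docs, "CI-100", {"name": "Pump Unit", "rev": "C"})
graph_consumer(index)     # in production this runs as a separate service
assert "CI-100" in index  # the graph view has converged
```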

Flexibility on the Customer Level

Building an out-of-the-box model for DT is a near-impossible task due to the diversity of models and information at the customer level. Instead, the basic elements of the model must be highly configurable to suit each customer and their unique data systems. Administrators should have the capability to manage configurations dynamically to adapt to changing requirements.
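
A minimal sketch of what such configurability could look like, assuming administrators define CI types and their attributes as data rather than code; all type and field names here are invented:

```python
# Minimal sketch of customer-level configurability: CI types and attributes
# are defined as data that admins can change. All names are invented.
ci_type_config = {
    "electronic_part": {
        "required": ["part_number", "manufacturer"],
        "optional": ["rohs_status", "lifecycle_stage"],
    },
    "mechanical_part": {
        "required": ["part_number", "material"],
        "optional": ["surface_finish"],
    },
}

def validate_ci(ci_type, record):
    """Validate a CI record against the customer's configuration."""
    config = ci_type_config[ci_type]
    missing = [f for f in config["required"] if f not in record]
    if missing:
        raise ValueError(f"missing required attributes: {missing}")
    return True

validate_ci("mechanical_part", {"part_number": "P-42", "material": "steel"})
```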

Adaptivity and Changes Over Time

In the world of DT, change is constant. The data model must provide mechanisms to evolve over time while ensuring compatibility with old data. As the DT system matures, the data model will evolve on the CI, BOM, and data levels.
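
One possible mechanism, sketched under the assumption that each stored record carries a schema version: readers upgrade old records lazily through a chain of migrations, so new code stays compatible with old data. Versions and field names are invented for illustration.

```python
# Minimal sketch of schema evolution with lazy, on-read migration.
def upgrade_v1_to_v2(record):
    # Illustrative change: v2 adds a "description" field alongside "name".
    record.setdefault("description", "")
    record["schema_version"] = 2
    return record

MIGRATIONS = {1: upgrade_v1_to_v2}  # version -> upgrade function

def read_ci(record):
    """Upgrade a stored record to the current schema on read."""
    while record.get("schema_version", 1) in MIGRATIONS:
        record = MIGRATIONS[record.get("schema_version", 1)](record)
    return record

old = {"name": "Impeller", "schema_version": 1}
print(read_ci(old))  # {'name': 'Impeller', 'description': '', 'schema_version': 2}
```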

UX and Query Building

User experience (UX) is pivotal in a successful DT system. Providing a variety of user-friendly interfaces for data definition, import, and reporting/queries is essential. A system must support easy data import, integration with other systems, and user-level data querying. Configurable UI elements combined with robust API capabilities are indispensable.
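
As an illustration of the query-building side, the sketch below assumes the UI produces a declarative list of filter conditions that the service evaluates against CI records; the filter format and operators are assumptions, not an existing API.

```python
# Minimal sketch of user-level query building: the UI emits declarative
# (field, operator, value) conditions; the service evaluates them.
def run_query(records, filters):
    """Apply a list of (field, op, value) conditions to CI records."""
    ops = {
        "eq": lambda a, b: a == b,
        "contains": lambda a, b: b in str(a),
    }
    result = []
    for rec in records.values():
        if all(ops[op](rec.get(field), value) for field, op, value in filters):
            result.append(rec)
    return result

records = {
    "CI-100": {"name": "Pump Unit", "rev": "B"},
    "CI-200": {"name": "Impeller", "rev": "A"},
}
print(run_query(records, [("rev", "eq", "A")]))  # [{'name': 'Impeller', 'rev': 'A'}]
```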

Tenant Model

In scenarios where data from multiple companies is involved, a tenant model is crucial for segregating customer and supplier data. While the immediate necessity of a multi-tenant model might vary, having a foundational tenant model from the start is wise, given its fundamental importance and the high complexity of introducing multi-tenancy after the system is already implemented.
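
A minimal sketch of what baking tenancy in from day one can mean: every record and every read is keyed by a tenant identifier, so isolation is structural rather than bolted on later. Tenant names and the store API are invented for illustration.

```python
# Minimal sketch of a tenant model: all data access is scoped by tenant.
class TenantScopedStore:
    def __init__(self):
        self._data = {}  # (tenant_id, ci_id) -> record

    def put(self, tenant_id, ci_id, record):
        self._data[(tenant_id, ci_id)] = record

    def get(self, tenant_id, ci_id):
        # A tenant can never read another tenant's key.
        return self._data.get((tenant_id, ci_id))

store = TenantScopedStore()
store.put("acme", "CI-100", {"name": "Pump Unit"})
store.put("supplierco", "CI-100", {"name": "Casting Blank"})
assert store.get("acme", "CI-100")["name"] == "Pump Unit"
```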

What is my conclusion?

Building a product model for the Digital Thread is a critical function for providing answers to important questions in product development. Although some current PLM systems can be repurposed for this role, their capabilities are often limited. To make a DT system work effectively, flexibility in data modeling, robust functional capabilities, and advanced query and data analytics are the three key elements that should be prioritized. In an age of increasing product data complexity and interconnectivity, mastering the Digital Thread is a step towards more efficient and streamlined product development processes.

A robust product data model for DT can serve as a foundation for business services that operate across multiple systems (PLM, MRP, and others). These services can provide solutions that cannot be implemented today because of data model complexity, the analytical and other complex queries required, and the complex mapping of data between systems. These services will become a new layer that supports the product development process and the holistic product lifecycle, supports the transformation of business processes (e.g., a move from ETO to CTO), and complements existing PLM software. New services with foundations in this product data model can be used to solve complex PLM queries (e.g., impact analysis and supply chain queries). Moreover, this product model can be used for building new computer-aided design software and modern PLM software.

Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital-thread platform with cloud-native PDM & PLM capabilities to manage product data lifecycle and connect manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.
