Digital twin of an entire value chain – new PLM mantra, same technologies?

Photo credit: Tesla Robot Dance by Steve Jurvetson

It is hard to argue with the value proposition behind “PLM projects”. The PLM story is rock solid these days. However, selling PLM is still a very hard job. Do you remember how, a decade ago, PLM salespeople were obsessed with the ‘single version of truth’ story? The term is not unique to PLM and is used in many business software applications. It is a concept describing a data management solution that uses one central database, or multiple synchronized databases, to store all product-related information in a consistent and non-redundant form.

If you have never heard about “single version of truth” before, you can easily Google it. Here is an example from a ten-year-old publication in Aerospace Manufacturing magazine – A single version of truth. Here is a key passage:

“With Teamcenter, you can establish a single source of product and process knowledge that connects all global team members everywhere, all the time. Your teams can access this single source to find needed information quickly, reducing the time it takes to search for information by up to 65%, resulting in R&D costs being dramatically reduced and enabling companies to maintain an acceptable profit. Teamcenter also provides decision makers with better visibility into the most up-to-date product lifecycle information to make faster, more informed decisions.”

Another article in Manufacturing Business Technology Alert, published in 2007, gives a similar explanation:

UGS Teamcenter not only creates and maintains those links it also publishes them to the factory floor in a ready-to-use, operator-friendly format that is error-free and up-to-date. “Without waste or duplication, we deliver the right work instructions, at the right time, direct to the point of use, and with full flexibility to refine and change them,” says Thomas. “We’re entering data just once, and then reusing it many times.” By capturing, integrating, and controlling all product, process, manufacturing, and service knowledge in a single environment, UGS Teamcenter expands product knowledge management and process automation capabilities across an extended enterprise of departmental and divisional operations, partners, and suppliers.

Utilizing data collected from the shop floor provides useful information to both the management team and line operators with real-time key performance indices. That provides a single version of the truth to the staff and enables a Lean Six Sigma focus on performance improvement.

I also found the following picture to help you visualize the PLM single version of truth story.

That was the PLM engineering and manufacturing story of single version of truth, circa 2007.

Fast forward to 2017. The Economist article Millions of things will soon have digital twins is a fascinating story about how manufacturing companies are turning into factories of robots building robots. Read the article. My favorite passage is about tools and, actually, the story of the term “digital twin”.

The digital twin is not a new invention. The concept of pairing traces its roots to the early days of space travel, when NASA built models to help monitor and modify spacecraft that, once launched, were beyond their physical reach. As computer power increased, these analogue models turned into digital ones.

The powerful systems that have since emerged bring together several elements—software services in computer-aided design and engineering; simulation; process control; and product life cycle management. Some digital twins are gaining artificial intelligence and virtual-reality capabilities, too. They can also help to monitor remotely and provide after-service for products that have been sold. “It is a digital twin of the entire value chain,” says Jan Mrosik, the chief executive of Siemens’s Digital Factory Division.

A Cadalyst article from a few days ago – Siemens PLM Software Sharpens Focus on Digital Factory – echoes the same new definition: digital twin of an entire value chain.

The intelligent model, also known as a digital twin, represents the systems within a complex, modern product, which can comprise mechanical parts, software, electronic and electrical systems, sensors, communications/networking, and other components, as well as the unique environmental conditions in which it operates. “We need intelligent solutions to model these complexities,” Mrosik said, not only to capture all aspects of the design and how they interrelate, but also to enable design simulation and to collect data from real-world use to help inform future design decisions.

“And we can create a digital twin of the entire value chain,” Mrosik continued, so “the hope that [a design] will work can be replaced by the certainty that it will work.” He noted that the digital twin delivers the same benefits for machine builders as for product developers.
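
To make that description a bit more tangible, here is a minimal sketch of a digital twin data structure – my own illustration with made-up class and field names, not Siemens’ actual model – linking the mechanical, software, and electronic components of a product with sensor data coming back from the field:

```python
# A minimal, illustrative sketch of a digital twin data model.
# All class and field names are hypothetical - they only show how design data
# and real-world sensor data can live in one linked structure.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Component:
    name: str
    domain: str           # "mechanical", "software", "electrical", "sensor", ...
    revision: str

@dataclass
class SensorReading:
    component: str        # which component reported the value
    metric: str           # e.g. "temperature_C"
    value: float

@dataclass
class DigitalTwin:
    product_id: str
    components: List[Component] = field(default_factory=list)
    field_data: List[SensorReading] = field(default_factory=list)

    def readings_for(self, component_name: str) -> List[SensorReading]:
        """Collect real-world data for one component to inform the next design cycle."""
        return [r for r in self.field_data if r.component == component_name]

twin = DigitalTwin("robot-arm-A7")
twin.components.append(Component("gearbox", "mechanical", "B"))
twin.field_data.append(SensorReading("gearbox", "temperature_C", 71.4))
print(twin.readings_for("gearbox"))
```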

The following slide from the Driving the Digital Enterprise Siemens PLM presentation helps you visualize the new model and what is behind a digital value chain.

As you can see from the following set of slides, the backbone of the digital twin is the same Teamcenter integrated data model that was the foundation of single version of truth back in 2007.

The entire value chain of a large ecosystem is a big deal. You can think about it as a single, inseparable place digitally represented as a single set of data. But the reality is different. It is made up of multiple players – OEMs, suppliers, contractors, etc. The problem of modeling and managing data across an entire value chain isn’t simple. The data handover between companies is very complex and distributed by nature. It is actually a network of companies working together in a connected environment. The bigger the network, the greater the challenge. Check my earlier article – Challenges of distributed data handover in digital manufacturing. The ugly truth about this environment can be presented in the following way:
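
As a toy illustration (company and system names below are entirely hypothetical), the value chain can be modeled as a network of companies, each running its own system of record, where every handover edge between them is a potential point of data loss, duplication, or version mismatch:

```python
# A toy model of a value chain as a network of companies (hypothetical names).
# Each node keeps its own "single version of truth"; data handover happens on the edges.
value_chain = {
    "OEM":          {"system": "PLM-A",        "hands_over_to": ["Supplier-1", "Supplier-2"]},
    "Supplier-1":   {"system": "ERP-B",        "hands_over_to": ["Contractor-1"]},
    "Supplier-2":   {"system": "PDM-C",        "hands_over_to": []},
    "Contractor-1": {"system": "Spreadsheets", "hands_over_to": []},
}

def handover_hops(chain, start):
    """Walk the network and list every handover hop - each hop is a place where
    data can be lost, duplicated, or go out of sync."""
    hops = []
    for partner in chain[start]["hands_over_to"]:
        hops.append((start, chain[start]["system"], partner, chain[partner]["system"]))
        hops.extend(handover_hops(chain, partner))
    return hops

for src, src_sys, dst, dst_sys in handover_hops(value_chain, "OEM"):
    print(f"{src} ({src_sys}) -> {dst} ({dst_sys})")
```

The bigger the network, the more hops like these you have to keep consistent.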

A factory of robots making robots is possible, but in my view, it requires a data management paradigm shift. You need to switch towards a connected data and event-triggered paradigm – a vision where the factory only produces a product when there is customer demand, or an operation is only performed when there is a “data” signal. In this model, the factory responds to events as and when they occur. One example is an order for a product configured in a specific way. It requires connecting many environments and a lot of data processing. But the reality of data management is ugly – multiple databases, each claiming to be the “single version of truth”.
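
To make the event-triggered idea concrete, here is a minimal sketch (hypothetical event names and handlers, not any vendor’s implementation): nothing happens in the factory until a “data” signal – in this case a customer order in a specific configuration – arrives and triggers the corresponding operation.

```python
# Minimal event-triggered sketch: operations run only when a data signal arrives.
from queue import Queue

events = Queue()
handlers = {}

def on(event_type):
    """Register a handler that reacts to a specific type of event."""
    def register(fn):
        handlers[event_type] = fn
        return fn
    return register

@on("order_placed")
def start_production(order):
    # In a real connected environment this would fan out to MES/MOM, suppliers, logistics...
    print(f"Trigger production of {order['product']} in configuration {order['config']}")

# A configured customer order arrives - only now does the factory respond.
events.put(("order_placed", {"product": "robot-arm-A7", "config": "long-reach"}))
while not events.empty():
    event_type, payload = events.get()
    handlers[event_type](payload)
```

The point is the inversion: data events drive operations, instead of operations polling disconnected databases.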

What is my conclusion? Modern manufacturing is shifting paradigms and creating greater challenges than existing technologies can manage. The PLM vision is shifting from the single version of truth mantra towards something better – a model of the entire manufacturing network, including all participants of the enterprise value chain – OEMs, contractors, suppliers, etc. It will require a fundamental technological and paradigm shift in the way data is managed across multiple systems. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.

  • Heinz Eisenbeiss

    Congratulations on this very good article. The headline “new mantra, same technologies” literally asked me to add that the Digital Enterprise vision not only involves better (and more connected) management of traditional PLM data but is also supposed to include real-time manufacturing KPIs out of the production process in order to optimise the overall value chain. The Digital Enterprise merges PLM technologies and production automation into a fascinating new symbiosis.

  • beyondplm

    Heinz, thank you for your kind words and your comment! Very good point. From your experience, what technologies can connect production automation to PLM?

  • Heinz Eisenbeiss

    In my experience this very often works through a Manufacturing Operations Management (MOM) platform. Simple example: get product dimensions and granted tolerances from PLM (product design), schedule production through MOM, get actual production data including quality data from production automation, and check tolerances against the PLM specification. There are various communication and IT technologies involved. The main challenge is to (de-)couple the real-time production world from the IT processes, which is where technologies like SCADA or edge computing come into play.
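
    As a simplified illustration of that loop (all values and field names below are invented): pull nominal dimensions and tolerances from the PLM specification, take the measured values coming back from production automation via MOM, and flag anything out of tolerance.

    ```python
    # Illustrative only: close the loop between PLM tolerances and shop-floor data.
    plm_spec = {"shaft_diameter_mm": {"nominal": 25.00, "tolerance": 0.05}}  # from PLM / product design
    measurements = [                                                          # from production automation via MOM
        {"serial": "SN-001", "shaft_diameter_mm": 25.02},
        {"serial": "SN-002", "shaft_diameter_mm": 25.09},
    ]

    for m in measurements:
        for feature, spec in plm_spec.items():
            deviation = abs(m[feature] - spec["nominal"])
            status = "OK" if deviation <= spec["tolerance"] else "OUT OF TOLERANCE"
            print(f"{m['serial']} {feature}: {m[feature]} ({status})")
    ```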

  • beyondplm

    Can you see possible web-service-like communication between edge computing devices and cloud-based IT systems (such as PLM)?

  • Kamal Salwan

    Oleg, nice article summarising the past, present, and future trend of PLM positioning in the digitised value chain. Combining this with the distributed data handover articulation in the above diagram, I can see another need emerging: “digitised value webs rather than chains” for intra- and inter-enterprise service exchanges in the wider ecosystem, literally making everything an as-a-service (XaaS) model. What do you think?

  • beyondplm

    Kamal, I agree. The “web” notion should replace existing “chains” and “single version of truth” databases. Check my OpenBOM blog for more articles about digital transformation, manufacturing networks, and BOM web services: https://medium.com/@openbom/bill-of-materials-manufacturing-and-digital-transformation-c52dde7b9b62. Best, Oleg

  • Heinz Eisenbeiss

    I think the OPC Foundation moves in that direction with their OPC OA standard.

  • Kamal Salwan

    Thanks Oleg, it seems like we share a number of common interests. I read through your OpenBOM blog. A common information model across different participants/stakeholders would definitely help. I have seen constraints on adoption in other industries, such as SID in telco, agreed by various participants (refer to http://www.tmforum.org). What is the adoption/awareness rate of OpenBOM among participants in the manufacturing industry?

  • beyondplm

    Kamal, thanks for sharing the link! OpenBOM is still very young – it has been in production for only 16 months since the very first production prototype. But you can see a map of our users on our openbom.com website.

    The trick is not to apply a “common” model. In my view, that was a mistake made by many products “forcing” collaboration. OpenBOM gives you the flexibility to design your own model and share it with others. There are mechanisms that drive standards and sharing in OpenBOM, such as public property definitions and real-time collaboration, that facilitate data sharing without forcing it.

    Is there anything I can do to help you?
    Best, Oleg

  • beyondplm

    I’m sorry, what is OPC OA? Is it https://opcfoundation.org/?

  • Heinz Eisenbeiss

    Sorry I misspelled it. It is OPC UA.
    Your link is correct.

  • beyondplm

    Thank you for the clarification! It is encouraging to see companies joining standards initiatives like this. Standards are very important for broader adoption. But, in general, standards are very hard when there are no financial incentives and correct business models, or when it is not clear why companies need to support them.