Photo credit: Tesla Robot Dance by Steve Jurvetson
It is hard to argue with the value proposition behind “PLM projects”. The PLM story is rock solid these days. However, selling PLM is still a very hard job. Do you remember how, a decade ago, PLM salespeople were obsessed with the “single version of truth” story? The term is not unique to PLM and is used in many business software applications. It is a concept describing a data management solution that uses one central database, or multiple synchronized databases, to store all product-related information in a consistent and non-redundant form.
If you have never heard of “single version of truth” before, you can easily Google it. Here is an example from a decade-old publication in Aerospace Manufacturing magazine – A single version of truth. Here is a key passage:
“With Teamcenter, you can establish a single source of product and process knowledge that connects all global team members everywhere, all the time. Your teams can access this single source to find needed information quickly, reducing the time it takes to search for information by up to 65%, resulting in R&D costs being dramatically reduced and enabling companies to maintain an acceptable profit. Teamcenter also provides decision makers with better visibility into the most up-to-date product lifecycle information to make faster, more informed decisions.”
Another article, published in Manufacturing Business Technology Alert in 2007, offers a similar explanation:
UGS Teamcenter not only creates and maintains those links it also publishes them to the factory floor in a ready-to-use, operator-friendly format that is error-free and up-to-date. “Without waste or duplication, we deliver the right work instructions, at the right time, direct to the point of use, and with full flexibility to refine and change them,” says Thomas. “We’re entering data just once, and then reusing it many times.” By capturing, integrating, and controlling all product, process, manufacturing, and service knowledge in a single environment, UGS Teamcenter expands product knowledge management and process automation capabilities across an extended enterprise of departmental and divisional operations, partners, and suppliers.
Utilizing data collected from the shop floor provides useful information to both the management team and line operators with real-time key performance indices. That provides a single version of the truth to the staff and enables a Lean Six Sigma focus on performance improvement
I also found the following picture to help you visualize the PLM single-version-of-truth story.
That was the PLM engineering and manufacturing story of single version of truth circa 2007.
Fast forward to 2017. The Economist article Millions of things will soon have digital twins is a fascinating story of how manufacturing companies are turning into factories of robots building robots. Read the article. My favorite passage is about tools and, actually, the story of the term “digital twin”.
The digital twin is not a new invention. The concept of pairing traces its roots to the early days of space travel, when NASA built models to help monitor and modify spacecraft that, once launched, were beyond their physical reach. As computer power increased, these analogue models turned into digital ones.
The powerful systems that have since emerged bring together several elements—software services in computer-aided design and engineering; simulation; process control; and product life cycle management. Some digital twins are gaining artificial intelligence and virtual-reality capabilities, too. They can also help to monitor remotely and provide after-service for products that have been sold. “It is a digital twin of the entire value chain,” says Jan Mrosik, the chief executive of Siemens’s Digital Factory Division.
A Cadalyst article published a few days ago – Siemens PLM Software Sharpens Focus on Digital Factory – echoes the same new definition: a digital twin of an entire value chain.
The intelligent model, also known as a digital twin, represents the systems within a complex, modern product, which can comprise mechanical parts, software, electronic and electrical systems, sensors, communications/networking, and other components, as well as the unique environmental conditions in which it operates. “We need intelligent solutions to model these complexities,” Mrosik said, not only to capture all aspects of the design and how they interrelate, but also to enable design simulation and to collect data from real-world use to help inform future design decisions.
“And we can create a digital twin of the entire value chain,” Mrosik continued, so “the hope that [a design] will work can be replaced by the certainty that it will work.” He noted that the digital twin delivers the same benefits for machine builders as for product developers.
The following slide from the Driving the Digital Enterprise Siemens PLM presentation helps you visualize the new model and what is behind a digital value chain.
As you can see from the following set of slides, the backbone of the digital twin is the same Teamcenter integrated data model that was the foundation of single version of truth back in 2007.
The entire value chain of a large ecosystem is a big deal. You can think of it as a single, inseparable place represented digitally by a single set of data. But the reality is different. It is combined from multiple players – OEMs, suppliers, contractors, etc. The problem of modeling and managing data across an entire value chain isn’t simple. Data handover between companies is very complex and distributed by nature. It is actually a network of companies working together in a connected environment. The bigger the network, the greater the challenge. Check my earlier article – Challenges of distributed data handover in digital manufacturing. The ugly truth about this environment can be presented in the following way:
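The “bigger the network, the greater the challenge” point can be made concrete with a quick back-of-the-envelope sketch. Assuming (as a simplification) that any pair of value-chain participants may need a data handover channel, the number of potential channels grows quadratically with the number of participants. The company names below are hypothetical placeholders.

```python
from itertools import combinations

# Hypothetical value-chain participants; real networks have many more.
companies = ["OEM", "Supplier-A", "Supplier-B", "Contractor", "Logistics"]

# If every pair of participants may need a data handover, the number of
# potential handover channels is n * (n - 1) / 2 -- quadratic growth.
handovers = list(combinations(companies, 2))

print(len(handovers))  # 5 participants -> 10 potential handover channels
```

Double the number of participants and the potential handover channels roughly quadruple, which is why a value chain cannot be treated as one neat, centrally owned database.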
Having a factory of robots making robots is possible, but in my view, it requires a data management paradigm shift. You need to switch to a connected-data and event-triggering paradigm – a vision where the factory produces a product only when there is customer demand, or an operation is performed only when there is a “data” signal. In this model, the factory responds to events as and when they occur. One example is an order for a product configured in a specific way. It requires connecting many environments and processing data across them. But the reality of data management is ugly – multiple databases, each claiming to be the “single version of truth”.
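To illustrate the event-triggering idea, here is a minimal publish/subscribe sketch: an operation runs only when a data signal (a customer order) arrives, rather than on a fixed schedule. This is an illustrative toy, not any vendor’s implementation; the event names and order fields are assumptions.

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class EventBus:
    """Minimal publish/subscribe bus: operations run only when a signal arrives."""
    handlers: Dict[str, List[Callable]] = field(default_factory=dict)

    def subscribe(self, event: str, handler: Callable) -> None:
        self.handlers.setdefault(event, []).append(handler)

    def publish(self, event: str, payload: dict) -> None:
        for handler in self.handlers.get(event, []):
            handler(payload)

bus = EventBus()
log = []

# The factory performs a build operation only in response to an order event.
bus.subscribe("order.placed", lambda order: log.append(f"build {order['config']}"))

# A customer order for a specific configuration triggers the operation.
bus.publish("order.placed", {"config": "robot-arm-6dof"})
print(log)  # ['build robot-arm-6dof']
```

Nothing happens until an event fires; the factory is demand-driven rather than plan-driven. Scaling this model across companies is exactly where the multiple-databases problem bites.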
What is my conclusion? Modern manufacturing is shifting paradigms and creating greater challenges than existing technologies can manage. The PLM vision is moving from the single-version-of-truth mantra toward something better – a model of the entire manufacturing network, including all participants of the enterprise value chain – OEMs, contractors, suppliers, etc. It will require a fundamental technological and paradigm shift in the way data is managed across multiple systems. Just my thoughts…
Want to learn more about PLM? Check out my new PLM Book website.
Disclaimer: I’m co-founder and CEO of OpenBOM developing cloud based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chain. My opinion can be unintentionally biased.