As the manufacturing industry continues its digital transformation journey, Product Lifecycle Management (PLM) systems are becoming more critical for industrial companies to support their digital initiatives, manage processes, and serve customers. Companies need to leverage technology to accelerate product design, planning, and development, so there is growing demand for better and smarter PLM solutions. Customers want to do everything better, cheaper, and faster in new product development and introduction (NPDI), maintenance of existing products, and optimization of their supply chains.
We’re witnessing an evolution of PLM environments, technologies, and applications. In recent years, advancements in technology have revolutionized the way businesses manage their product data. From traditional Computer-Aided Design (CAD) files to connected devices offering real-time analytics and predictive analysis, the trajectory of PLM toward much richer possibilities is clear.
For the last few years, I have been involved in the development of OpenBOM, a modern digital thread platform for manufacturing companies. I have learned many of the technologies and new data management patterns available today. I have also had the opportunity to talk to many industrial companies (from very small engineering firms and startups to large industrial enterprises) about the problems experienced by engineers, production planners, supply chain managers, and their organizations overall in product development and manufacturing activities.
In this blog post, I want to discuss how modern data management and AI technologies can help manufacturers take advantage of this opportunity by evolving from CAD files to a comprehensive source of product data with knowledge graphs and AI/GPT-driven insights. I will also outline key questions that manufacturers need to ask themselves when exploring new technological opportunities for Product Lifecycle Management.
CAD/PDM – The Frenemies Engineers Can’t Live Without?
When all you have is a hammer, every problem looks like a nail. This is what happened to PDM/PLM over the past 20 years. CAD systems were initially designed to produce documents (drawings), which was a natural thing for them to do. Data management systems such as PDM and PLM followed this paradigm, providing an environment to manage CAD files (and related documents). While document-oriented systems gained popularity over the last decades, they are one of the root causes of the complexity and problems of PDM/PLM environments: CAD files are the root cause of PDM nightmares. PDM systems are complex, do not allow data to be managed in a granular way, and ultimately limit digital transformation. Let’s dig into this a bit more.
How and Why To Leave Document Management Systems Behind?
One of the main limitations of document-oriented systems is that they are designed to store design data as unstructured pieces of proprietary information rather than as structured data that can be semantically understood. This makes it difficult to perform complex queries or analyses on product data and may require additional processing steps to transform the data into a usable format, such as the typical databases used by enterprise applications today.
Another limitation of document-oriented systems is that they are not well suited for managing relationships between different types of product data. They may struggle to represent relationships between parts, configurations, and other data elements, which leads to limitations, data duplication, and other inconsistencies.
In addition, CAD file/document-oriented systems lack the flexibility and granularity to support specific workflows such as costing, product configurations, supply chain analysis, and many others. Document-driven systems are also a poor fit for many data governance tasks: regulation, sustainability analysis, consistency of data over time, etc.
While document management (PDM) systems have been the unavoidable frenemies of the engineering environment for the last few decades, it is time for them to step aside if companies want to follow the digital path. PLM must evolve beyond traditional CAD documents, and organizations must adopt new data models and paradigms to manage product data. Let’s talk about them.
Many people will stand up now and say: we cannot design and do our work without files. I agree; managing files is an important function. We are getting better at file management, including cloud file storage and systems capable of managing and streaming large files. But those systems are becoming infrastructure within a growing number of polyglot-persistence data management architectures.
Moving from Documents To Data Paradigm
To move beyond CAD document management, organizations should adopt more granular product data management approaches that focus on data (not files) as the central element of the system. Here are some ideas on how these systems can differ from traditional PDM systems:
- Storage of data as objects with attributes
- Granular representation of design elements, product structures, and bill of materials
- Connections across different disciplines (mechanical, electronics, software)
- Granular revision and change control to maintain product data consistency
- Seamless collaboration and granular data sharing beyond the “document” level
Stepping beyond the limits of folders and files will allow PLM systems to advance toward true system design and to support complex product development and manufacturing processes. A minimal sketch of what such a data-centric model can look like is shown below.
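To make the idea of “objects with attributes” more concrete, here is a minimal sketch in Python. The class names (`Item`, `BomLine`), attribute names, and part numbers are made up for illustration; the point is that every item and every relationship is its own piece of data that can be queried and revised independently of any CAD file.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Item:
    """A granular product object: identified by number and revision, described by attributes."""
    number: str
    revision: str
    attributes: Dict[str, str] = field(default_factory=dict)  # cost, material, supplier, ...

@dataclass
class BomLine:
    """A parent-child relationship managed as data, not as a row buried inside a CAD document."""
    parent: str    # item number of the assembly
    child: str     # item number of the component
    quantity: int

# Items from different disciplines live in one granular model
items: List[Item] = [
    Item("PCB-001", "A", {"discipline": "electronics", "cost": "12.50"}),
    Item("ENCL-001", "B", {"discipline": "mechanical", "material": "ABS"}),
    Item("FW-001", "1.0", {"discipline": "software"}),
    Item("DEV-100", "A", {"discipline": "assembly"}),
]

bom: List[BomLine] = [
    BomLine("DEV-100", "PCB-001", 1),
    BomLine("DEV-100", "ENCL-001", 1),
    BomLine("DEV-100", "FW-001", 1),
]

# A granular query that a file-oriented PDM cannot answer without opening CAD documents:
electronics = [i.number for i in items if i.attributes.get("discipline") == "electronics"]
print(electronics)  # ['PCB-001']
```

Each object and each BOM line can carry its own revision, be shared on its own, and be connected to costing, sourcing, or compliance data without touching a single file.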
Network Model and Semantic Web
Traditional PLM data management architecture is limited, and one of its main limitations is the underlying data management model. Check my article – What is wrong with existing PLM systems? When you look at the product and organizational data stored in PLM (and in many other connected systems), you are likely to see that it is scattered across a set of isolated tables, including Excel spreadsheets and various databases: thousands of rows and tons of tables in every PLM system. Heavily customized PLM systems produced enormous amounts of data and turned into monsters holding companies hostage. Flexible PLM data management allowed us to parameterize these data tables, but it still created data silos: tables, IDs, and numerous relationships managed as yet another set of tables.
One of the promising data models for the future of the PLM discipline is the network data model (or graph). What can help is bringing network thinking to data management problems: we need to turn the “connection” into a first-class citizen of future PLM data architectures. Future product lifecycle management must embrace graph data models and switch to a new way of building product data management and digital tools around them.
Networks and graphs are powerful paradigms that can help manage more data in a granular way. Read some of my articles about data management trends and semantic models for product development in the post-monolithic PLM world. A small illustration of the idea follows below.
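Here is a small sketch of what “connection as a first-class citizen” means in practice, assuming the networkx Python library and made-up part numbers: relationships carry their own attributes and can be traversed directly, instead of being reconstructed from foreign keys scattered across tables.

```python
# A minimal network-model sketch: edges (connections) are first-class objects
# that carry their own attributes, instead of IDs hidden in relational tables.
import networkx as nx

g = nx.DiGraph()

# Nodes: product objects with attributes
g.add_node("DEV-100", type="assembly")
g.add_node("PCB-001", type="part", supplier="Acme Electronics")
g.add_node("CAP-10uF", type="part", supplier="PassiveCo")

# Edges: connections with their own attributes (quantity, effectivity, ...)
g.add_edge("DEV-100", "PCB-001", rel="uses", qty=1)
g.add_edge("PCB-001", "CAP-10uF", rel="uses", qty=4, effectivity="rev B and later")

# "Where used" becomes a natural graph traversal instead of a multi-table join
print(list(g.predecessors("CAP-10uF")))    # ['PCB-001']
print(nx.ancestors(g, "CAP-10uF"))         # every assembly affected by a change to CAP-10uF
```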
Product Knowledge Graph and Graph Sciences
Knowledge graphs offer a powerful means to represent and organize product data, relationships, and dependencies. They can be instrumental in creating a holistic view of your product lifecycle. The development of knowledge graphs has been progressing for the last decade: it started with the Semantic Web and the RDF/OWL standards, continued as the linked data approach, and materialized in the recent development of graph databases and other related technologies.
Research and development of knowledge graphs and graph-based PLM systems is still in its early days, but it is promising, with some systems already available in production online using graph databases such as Neo4j (check OpenBOM and Ganister).
Here are some examples of how future PLM systems will leverage knowledge graphs (a small sketch follows the list):
- Create a structured ontology that defines the product and its relationships.
- Incorporate domain-specific knowledge (industry standards, regulations, and best practices)
- Utilize graph-based analytics and visualization tools to reveal insights, identify dependencies, and facilitate decision-making.
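As an illustration of what these ideas can look like in practice, here is a sketch that loads a tiny product ontology into Neo4j and asks a graph question about regulatory impact. The connection details, node labels, and data are assumptions made for the example (it assumes the Neo4j Python driver 5.x and a local database), not a description of any particular product or system.

```python
# Sketch: a tiny product knowledge graph in Neo4j and a graph query over it.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def load_sample(tx):
    # A minimal ontology: Part and Regulation nodes, USES and MUST_COMPLY_WITH relationships
    tx.run("MERGE (:Part {number: 'DEV-100', name: 'Device'})")
    tx.run("MERGE (:Part {number: 'PCB-001', name: 'Main board'})")
    tx.run("MERGE (:Regulation {name: 'RoHS'})")
    tx.run("""
        MATCH (a:Part {number: 'DEV-100'}), (b:Part {number: 'PCB-001'})
        MERGE (a)-[:USES {qty: 1}]->(b)
    """)
    tx.run("""
        MATCH (b:Part {number: 'PCB-001'}), (r:Regulation {name: 'RoHS'})
        MERGE (b)-[:MUST_COMPLY_WITH]->(r)
    """)

def impacted_parts(tx, regulation):
    # Which parts and assemblies are impacted by a regulation, however deep in the structure?
    result = tx.run("""
        MATCH (p:Part)-[:USES*0..]->(:Part)-[:MUST_COMPLY_WITH]->(:Regulation {name: $reg})
        RETURN DISTINCT p.number AS part
    """, reg=regulation)
    return [record["part"] for record in result]

with driver.session() as session:
    session.execute_write(load_sample)
    print(session.execute_read(impacted_parts, "RoHS"))  # e.g. ['PCB-001', 'DEV-100']
driver.close()
```

With a model like this, “change impact”, “where used”, and compliance questions become plain graph traversals rather than hand-written joins.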
In the future, the development of knowledge graphs and their application in graph science will play a crucial role in the advancement of PLM. Knowledge graphs represent a fundamental departure from traditional SQL-based PLM architecture and have the potential to address many of the challenges that legacy PLM systems struggle to tackle. By enabling advanced data modeling and analysis, knowledge graphs can provide insights and answers to complex questions that were previously inaccessible. Therefore, the adoption of knowledge graphs is likely to be a key driver in the evolution of PLM and the improvement of product development processes.
AI, GPT, LLMs, and Other Technologies
Large language models, like GPT-4, can analyze and generate natural language text, enabling a more human-like understanding of product data. These models can be integrated into PLM processes to:
- Automate the generation of product documentation, such as manuals, reports, and specifications
- Assist in the identification and resolution of design and engineering issues
- Enhance collaboration and communication by offering intelligent suggestions and recommendations
Contextualizing AI systems such as LLM (GPT) models with specific customer data can be an interesting opportunity for PLM research. I see an opportunity to integrate contextualized LLMs with knowledge graphs, and I can see PLM vendors in their early research work, learning how to combine the power of large language models and knowledge graphs for future PLM systems. So, what are the next steps?
- Develop a framework that connects the large language model with your knowledge graph, enabling context-aware analysis and automatic generation of product data (e.g., a BOM copilot; a minimal sketch follows this list).
- Develop applications that leverage the capabilities of large language models and knowledge graphs to optimize workflows and support decision-making (e.g., change impact analysis).
- Develop technologies to update and refine the knowledge graph and to train the large language model based on new data available publicly and internally in an organization.
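To make the first step a bit more tangible, here is a minimal sketch of a BOM copilot, assuming the OpenAI Python client and an API key in the environment. The model name, the prompt, and the `build_bom_context()` helper (which in a real system would query the knowledge graph) are illustrative assumptions, not a description of how any existing product works.

```python
# Sketch of a "BOM copilot": knowledge-graph context + LLM question answering.
# Assumes the openai Python package (>= 1.0) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

def build_bom_context() -> str:
    # In a real system this would come from the knowledge graph (for example,
    # the Cypher query shown earlier); here it is a hard-coded toy structure.
    return (
        "DEV-100 (Device) uses 1x PCB-001 (Main board).\n"
        "PCB-001 uses 4x CAP-10uF. PCB-001 must comply with RoHS."
    )

def ask_copilot(question: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name; any chat-capable model works
        messages=[
            {"role": "system",
             "content": "Answer questions about the product using only the BOM context provided."},
            {"role": "user",
             "content": f"BOM context:\n{build_bom_context()}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(ask_copilot("Which assemblies are affected if CAP-10uF becomes obsolete?"))
```

The pattern is the important part: the knowledge graph grounds the answer in actual product data, and the language model provides the natural-language interface on top of it.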
What is my conclusion?
The decades-old practice of using CAD and other document-oriented data management systems, such as PDM and legacy PLM systems relying on SQL databases, is coming to an end. The inefficiency of the document-driven approach is becoming obvious to manufacturing companies and PLM professionals. Complex system design, combined with the lack of capabilities to analyze data, provide intelligent decision support, and manage data in a granular way, has triggered new innovative thinking in the PLM industry. Innovative approaches such as network data models, knowledge graphs, and contextualized large language models can open new doors for future PLM technologies already today. Integrating these cutting-edge technologies into your PLM process can enhance collaboration, streamline workflows, and drive innovation.
In the next few years, we will witness an evolution in PLM from traditional CAD file management to powerful software systems that provide knowledge graph and AI/GPT capabilities, allowing companies to collect and share complex product data across the entire organization. Just my thoughts…
Best, Oleg
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital-thread platform with cloud-native PDM & PLM capabilities to manage product data lifecycle and connect manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.