Exploring the Potential Impact of GPT-3 and AI on Product Lifecycle Management (PLM)

ChatGPT and AI topics are trending in the industry. The technology is fascinating, and everyone is experimenting with it to see how it can impact and improve existing technologies, products, and processes. Earlier this week, I attended a TEDx AGI event in Boston, which was fully dedicated to the Generative AI topic. Check the event online; I hope the presentations will be published soon. You can subscribe to the TEDx Boston YouTube channel.

Product Data Management (PDM) and Product Lifecycle Management (PLM) tools have existed for quite a long time, but the technology and its applications haven’t changed much over the last 10-15 years. Still, these products are a foundation of the product development process and are required in many business process, document management, and supply chain management applications. The recent cloud revolution triggered changes in product data management, and companies are looking at new technology as a way to improve the PDM/PLM systems used by businesses today. Designing and building products is a complex process, and how to make it more efficient is a question many professionals in the product lifecycle industry are asking these days. Which PLM system will offer something new is an interesting question.

What impact could this new technology have if applied to product lifecycle management (PLM) and the broader scope of PLM applications? Which PLM solution will live up to the hype around AI, GPT, and related technologies? We are now poised on the edge of discovering all that harnessing AI for PLM can potentially do; but first, let us take a moment to explore exactly what this means and just how profound its implications may be.

GPT-3 and ChatGPT

If you have never heard about GPT-3 and ChatGPT, you need to catch up immediately.

GPT-3 (Generative Pre-trained Transformer 3) is a state-of-the-art language model developed by OpenAI, which uses deep learning techniques to generate human-like text. With 175 billion parameters, GPT-3 is the largest language model created to date and can perform a wide range of natural language tasks, including language translation, question-answering, summarization, and even creative writing. It has the ability to understand natural language input and generate complex outputs with little to no human intervention, making it a highly versatile tool for a variety of applications in fields such as education, healthcare, and business.

ChatGPT is an AI-based conversational agent that is trained on vast amounts of text data to enable it to understand and respond to user queries in a natural and coherent manner. It can answer a wide range of questions, carry out tasks, provide recommendations, and engage in free-flowing conversations on various topics, making it a versatile tool for businesses, individuals, and developers. Its ability to learn from new inputs and adapt to different contexts makes it an exciting development in the field of natural language processing.
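For readers who want to try this hands-on, here is a minimal sketch of calling the model behind ChatGPT through OpenAI’s Python library (the pre-1.0 interface available at the time of writing). The API key and the prompt are placeholders you would substitute with your own:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder; use your own OpenAI API key

# Ask the chat model a simple product-data question.
response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",  # the model family behind ChatGPT at the time of writing
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "Explain in two sentences what a Bill of Materials is."},
    ],
)

print(response.choices[0].message["content"])
```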

As the pace of technological advancement continues to accelerate, the potential for Artificial Intelligence (AI) to revolutionize the entire product lifecycle is becoming increasingly clear. GPT-3 has been touted as a ‘generalist’ AI tool, capable of completing a wide range of tasks, from composing written works such as blog posts or essays to powering conversational agents and friendly chatbots within user interfaces.

GPT and LLM: Understanding the Logic and Use Cases

Before jumping into magic dreaming, let’s first understand GPT and Large Language Models (LLMs). Here is how ChatGPT explains itself.

A large language model is an AI-based model that uses deep learning algorithms to process and understand natural language. It is trained on vast amounts of text data and is designed to generate human-like text in response to user inputs. The model uses a neural network architecture with millions or even billions of parameters to learn patterns and relationships in the input text and generate coherent outputs. Large language models can perform a variety of natural language processing tasks, including language translation, sentiment analysis, text summarization, question answering, and even creative writing. They have the potential to transform how humans interact with technology and open up new possibilities in areas such as education, healthcare, and business.

So, simply said, an LLM is capable of providing you with the best match based on its pre-trained dataset. For the largest GPT-3 model created by OpenAI, you can think about it as a model that has read the entire internet. This is what ChatGPT is using.
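To make the “best match” idea concrete, here is a toy illustration of the core loop inside an LLM: repeatedly pick the most likely next token given the text so far. The hard-coded probability table is a stand-in for a real neural network that scores tens of thousands of tokens; this shows the mechanism, not real model behavior:

```python
# Toy stand-in for a trained model: maps a context to next-token probabilities.
toy_model = {
    "the bill of": {"materials": 0.9, "rights": 0.1},
    "bill of materials": {"lists": 0.7, "is": 0.3},
}

def next_token(context: str) -> str:
    """Greedy decoding: return the most likely next token for a given context."""
    probs = toy_model.get(context, {"<end>": 1.0})
    return max(probs, key=probs.get)

print(next_token("the bill of"))  # -> "materials", the best match in the training data
```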

A different use case, created by OpenAI and Microsoft (GitHub), is called GitHub Copilot. GitHub Copilot is a code auto-completion tool that uses a variant of GPT (Generative Pre-trained Transformer) called Codex, which was developed by OpenAI specifically for code-related tasks. Codex is a deep learning model that has been trained on a diverse range of codebases, including popular open-source projects, to learn the structure and syntax of programming languages. It can understand natural language descriptions of code tasks and generate code snippets that match the desired functionality. The model is designed to improve over time as it is trained on more codebases and feedback from users, making it a promising tool for developers seeking to improve their productivity and efficiency.
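To illustrate the interaction model (a made-up example, not Copilot’s actual output): a developer writes a function signature and a natural-language description, and a Codex-style model proposes the body:

```python
# Developer writes the signature and a plain-English description of the task...
def total_bom_cost(items):
    """Return the total cost of a BOM: the sum of quantity * unit_price per line."""
    # ...and the model proposes a completion along these lines:
    return sum(item["quantity"] * item["unit_price"] for item in items)

print(total_bom_cost([
    {"quantity": 2, "unit_price": 3.5},
    {"quantity": 1, "unit_price": 10.0},
]))  # -> 17.0
```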

Product Lifecycle Management (PLM) Use Cases

While the potential of LLM and GAI (Generative AI) is huge, there is still a need to understand what can be done and how to differentiate between hype and practical use cases. The number of publications is skyrocketing and it is sometimes hard to get down to specific ideas and pragmatic use cases that can improve existing tools and provide new capabilities.

In my article today, I want to bring up three use cases that, in my view, have the potential to become “workable solutions” in the foreseeable future.

Requirement Translation

The ability to create summaries is one of the strongest advertised capabilities of GPT-3 and ChatGPT applications. I have tried it multiple times and found it very useful. One interesting application of these capabilities is to automatically create a structured set of requirements based on a textual description. While it raises many questions about accuracy and traceability, I think it is a valid use case that vendors can explore.
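Here is a sketch of what such a requirement-translation call could look like, again using the OpenAI Python library with a placeholder key and an invented schema (the field names are my own assumption, and the output would still need human review; the model is not guaranteed to return well-formed JSON):

```python
import json
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

description = (
    "The bracket must support a 25 kg load, be machined from 6061 aluminum, "
    "and fit within a 100x50x20 mm envelope."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "Extract product requirements from the user's text as a JSON "
                    "array of objects with 'id', 'parameter', 'value', and 'unit' "
                    "keys. Return only JSON."},
        {"role": "user", "content": description},
    ],
)

# The model may return malformed JSON, so parsing can fail; a real tool would
# validate the result and route it to a human for review (traceability matters).
requirements = json.loads(response.choices[0].message["content"])
print(requirements)
```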

Design Process Flow

Processes are at the heart of PLM applications. More specifically, workflow processes are traditionally complex and demand a lot of work to create and maintain. Transforming the process of workflow creation using AI tools can be another interesting opportunity. Having the ability to do it with knowledge of a specific organization can be even more interesting. The latter triggers the question of what LLM needs to be created to support a specific organization and/or industry.
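As a sketch of the idea (the prompt and the expected structure are my assumptions, not an existing product feature), one could ask a general-purpose model to draft a process flow that a PLM administrator then refines:

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

prompt = (
    "Propose an engineering change order (ECO) workflow for a small electronics "
    "manufacturer. Return a numbered list of steps; for each step give the step "
    "name, the responsible role, and the required approval."
)

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": prompt}],
)

# The output is a draft flow to review and adapt, not a finished process; a model
# trained on the organization's own processes would be needed for real accuracy.
print(response.choices[0].message["content"])
```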

Streamline BOM Management and Make BOMs More Accurate

This one is very fascinating. BOM creation is a tedious process that requires defining many elements and validating information with suppliers (e.g., item descriptions, models, etc.), online catalogs, enterprise systems, and more. The accuracy of the Bill of Materials is an important element in improving the quality of the process and streamlining process management. It raises the question of creating a specific BOM LLM, which could be trained similarly to the Codex deep learning model. Check out more about it in my OpenBOM article (BOM Copilot Research).
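As a rough sketch of the “BOM autocomplete” idea (the part number, prompt, and fields are illustrative; a general-purpose model can hallucinate part data, which is exactly the argument for a purpose-trained BOM model):

```python
import openai

openai.api_key = "YOUR_API_KEY"  # placeholder

# A partially filled BOM line; the description and unit of measure are missing.
partial_line = {"part_number": "RC0603FR-0710KL", "quantity": 4}

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "system",
         "content": "You autocomplete BOM lines. Given a manufacturer part number "
                    "and quantity, suggest a likely description and unit of measure, "
                    "and flag anything you are unsure about."},
        {"role": "user", "content": str(partial_line)},
    ],
)

# Suggestions must be validated against supplier data and catalogs before use;
# a BOM-specific model trained on verified catalog data would reduce that risk.
print(response.choices[0].message["content"])
```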

What is my conclusion?

The use of GPT and Large Language Models in enterprise software applications is just at the beginning. There are a lot of improvements that can be made to Product Lifecycle Management tools. The opportunity is fascinating, but whether it will bring immediate success and improvement is not clear yet. GPT models live up to the level of the materials they are trained on, and therefore the most important question is where to get the data to train models the way we need (e.g., creating structured product requirements, proposing a recommended process flow, or autocompleting BOMs). Just my thoughts…

Interested in the research? Reach out to me to discuss more.

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a global digital thread platform providing PDM, PLM, and ERP capabilities and a new experience to manage product data and connect manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.
