In the age of digital transformation, enterprises are constantly striving to harness the potential of emerging technologies to stay ahead in an ever-competitive marketplace. For many years, product lifecycle management (PLM) and product data management (PDM) have been among the leading technologies helping manufacturing companies manage product development processes, support supply chain management, and find the best way to design and build products. Now, new technologies such as machine learning, natural language processing, neural networks, and artificial intelligence are arriving, and they represent an opportunity to improve business processes and to build better PLM software.
Among these technologies, the capabilities of advanced language models, like ChatGPT, are revolutionizing how businesses manage and interpret the complex sea of data that underscores their operations. As product lifecycle management (PLM) is intrinsically linked to large data sets, the challenge lies in comprehending and utilizing this data efficiently. Enter ChatGPT. With its profound prowess in data analysis, it offers a fresh, intelligent perspective, bridging the gap between data complexity and actionable insights, setting a new benchmark for PLM in the modern era.
ChatGPT Enterprise
AI news is coming fast these days. A few days ago, my attention was caught by the news that OpenAI launched ChatGPT Enterprise, the company’s biggest announcement since ChatGPT’s debut. ChatGPT and other AI applications are taking the world by storm, causing fear and excitement at the same time. Here is an interesting data point:
Two months after ChatGPT’s launch in November, it surpassed 100 million monthly active users, breaking records for the fastest-growing consumer application in history: “a phenomenal uptake – we’ve frankly never seen anything like it, and interest has grown ever since,” Brian Burke, a research vice president at Gartner, told CNBC in May.
The article gives you some interesting perspectives on the development of GPT solutions for enterprises, the problems the OpenAI team experienced, and the cost of the operation.
- The key difference between ChatGPT and enterprise solutions is the ability to input company data
- There are a lot of debates about features, priorities, and what people really want and need.
- ChatGPT is expensive to operate
Here is the most important passage from my perspective:
One key differentiator between ChatGPT Enterprise and the consumer-facing version: ChatGPT Enterprise will allow clients to input company data to train and customize ChatGPT for their own industries and use cases, although some of those features aren’t yet available in Monday’s debut. The company also plans to introduce another tier of usage, called ChatGPT Business, for smaller teams but did not specify a timeline.
PLM AI capabilities
Monica Schnitger, a leading analyst and researcher in the field of PLM, engineering, manufacturing, and construction applications, published an article asking how close we really are to AI in PLM. Check it out. Monica speaks about different use cases where AI can be beneficial. Here is one of them (very PLMish):
We see similar projects across the PLMish space, so far creating targeted solutions to specific problems. For example, creating knowledge graphs to help designers analyze bills of material for patterns like most-used/least-used. Or combining physics-based simulation with AI-driven design to surface more and better design alternatives earlier in a project. Or helping construction planners figure out the most logical way to stage their project. The PLMish vendor community is hard at work combining AI technology with their vertical-specific offerings. They have so far stayed away from generic AI solutions that users can apply to generic problems.
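To make the BOM pattern analysis Monica mentions concrete, here is a minimal sketch of building a small knowledge graph from bills of material and surfacing most-used/least-used components. The part numbers and BOM structures are hypothetical, invented for illustration, and a real implementation would pull this data from the PLM backbone rather than hard-coded dictionaries.

```python
from collections import Counter

# Hypothetical bills of material: assembly -> list of component part numbers.
# In a real PLM system these would come from the product data backbone.
boms = {
    "bike-100": ["frame-a", "wheel-26", "wheel-26", "brake-std", "seat-x"],
    "bike-200": ["frame-b", "wheel-26", "wheel-26", "brake-pro", "seat-x"],
    "bike-300": ["frame-a", "wheel-29", "wheel-29", "brake-std", "seat-y"],
}

# A simple graph representation: edges from each assembly to its components,
# weighted by the quantity used.
edges = Counter()
for assembly, parts in boms.items():
    for part in parts:
        edges[(assembly, part)] += 1

# Aggregate component usage across all assemblies to surface
# most-used / least-used patterns for designers.
usage = Counter()
for (assembly, part), qty in edges.items():
    usage[part] += qty

print("Most used:", usage.most_common(1))   # wheel-26 appears 4 times
```

Even this toy version shows the idea: once BOM lines become graph edges, "which components are reused everywhere, and which are one-offs" becomes a trivial query instead of a spreadsheet exercise.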
Another interesting comment comes from Stan Przybylinski, who speaks about the opportunity for a ChatGPT-like tool to understand the complexity of the digital thread.
In our spring PLM Market & Industry Forum I gave a presentation on the digital thread, a concept that has been around a long time but is now on the radar of many industrial firms and the software and services providers that support them. If people build the digital threads they “describe” in their survey responses to us, they will contain massive amounts of data. I think that something like ChatGPT, trained in PLM-ish, could be a great tool to help interpret all of this information and help the digital thread reach one of its objectives: learning from past development efforts to improve those in the future.
The question of how to use AI capabilities in PLM implementations triggers many ideas. Over the last few months since ChatGPT was released, I have used it for different applications and research, alongside other AI tools, both those available before ChatGPT and those released earlier this year.
Here are my observations about PLM AI capabilities and what we can expect to see sooner rather than later.
- Content generation and summarization in marketing and sales was the largest driver of AI tools such as ChatGPT, and it triggered many questions about the future of content creation. Separating AI-generated content from content created by people can also be challenging.
- Data is the key element of a good AI tool. Having access to data is the first step before an AI-driven solution can do anything meaningful. Building an LLM for an enterprise organization or for broad usage can be complex.
- Enterprises are potential buyers of expensive AI solutions dealing with complex data. Identifying a real problem for an enterprise (e.g., data quality validation such as BOM completeness checks or impact analysis) can be a complex process, and many enterprises are just taking their first steps.
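As a sketch of the kind of data-quality validation mentioned above, here is a minimal BOM completeness check. The required field names are assumptions for illustration; every PLM system defines its own release criteria.

```python
# Hypothetical rule: every BOM line must carry a part number, a quantity,
# and a supplier before the BOM can be released. Field names are
# illustrative, not taken from any specific PLM system.
REQUIRED_FIELDS = ("part_number", "quantity", "supplier")

def completeness_issues(bom_lines):
    """Return (line_index, missing_field) pairs for incomplete BOM lines."""
    issues = []
    for i, line in enumerate(bom_lines):
        for field in REQUIRED_FIELDS:
            if not line.get(field):  # missing key, empty string, or zero
                issues.append((i, field))
    return issues

bom = [
    {"part_number": "frame-a", "quantity": 1, "supplier": "Acme"},
    {"part_number": "wheel-26", "quantity": 2, "supplier": ""},  # no supplier
    {"part_number": "", "quantity": 4, "supplier": "Acme"},      # no part number
]

print(completeness_issues(bom))  # prints [(1, 'supplier'), (2, 'part_number')]
```

Rule-based checks like this are the easy part; the interesting AI opportunity is in the fuzzier validations, such as spotting a supplier name that is probably a duplicate or a quantity that looks anomalous compared to similar products.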
Data – From Excel to Knowledge Graphs and LLMs
In today’s data-driven era, companies are more focused on how to put data at the center of their business and operations. This is where the limitations of homegrown Excel-based solutions are becoming increasingly apparent. Companies that transition from spreadsheet-centric methods to leveraging Knowledge Graphs and Large Language Models (LLMs) are positioning themselves at the forefront of data organization and intelligent solution design. Knowledge Graphs offer a structured and interconnected way to visualize and navigate complex relationships within data sets, turning silos of information into interconnected webs of insights. When combined with the natural language processing capabilities of LLMs, these structures not only organize data more efficiently but also interpret and generate actionable insights from it.
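To make the Excel-to-knowledge-graph transition concrete, here is a minimal sketch that lifts flat, spreadsheet-style rows into subject-predicate-object triples, the simplest knowledge-graph representation. The column names, predicates, and supplier names are all hypothetical.

```python
# Flat rows, as they might be exported from an Excel-based BOM.
rows = [
    {"assembly": "bike-100", "component": "frame-a", "supplier": "Acme"},
    {"assembly": "bike-100", "component": "wheel-26", "supplier": "Wheels Inc"},
    {"assembly": "bike-200", "component": "frame-b", "supplier": "Acme"},
]

# Lift the flat rows into subject-predicate-object triples: the same data,
# but now as an interconnected graph instead of isolated cells.
triples = []
for row in rows:
    triples.append((row["assembly"], "hasComponent", row["component"]))
    triples.append((row["component"], "suppliedBy", row["supplier"]))

# The graph form supports questions a flat sheet answers poorly,
# e.g. "which assemblies depend on supplier Acme?"
acme_parts = {s for s, p, o in triples if p == "suppliedBy" and o == "Acme"}
affected = sorted({s for s, p, o in triples
                   if p == "hasComponent" and o in acme_parts})
print(affected)  # prints ['bike-100', 'bike-200']
```

The same triples can also be serialized into text and fed to an LLM as context, which is one practical way the "knowledge graph plus LLM" combination described above can work in practice.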
What is my conclusion?
The main question companies should ask is how to ensure a more holistic understanding of their data, paving the way for smarter decision-making and innovative solutions. AI, and more specifically LLMs, introduces new technology that can be applied to gain a better understanding of information. Neural networks and machine learning can play a significant role in the process of designing and building products. But how do we integrate these technologies into existing processes of document management, product lifecycle management, and other systems?
The challenge for many companies is to translate the technology into the product development process. This is where much PLM software has failed. Most PLM software vendors offer a solution I’d call “a single source of truth”. While no one argues about the importance of a single source of truth, for many companies it is poorly translated into “place all data in a single database”. Such a strategy doesn’t focus on how to get business benefits.
Learning from past PLM software mistakes, the question is how to identify business problems that can be solved by new AI solutions. I can see that most PLM AI researchers are focusing on this type of activity these days.
Just my thoughts…
Best, Oleg
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital thread platform including PDM/PLM and ERP capabilities that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.