A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

Manufacturing Knowledge Graph, Why It Matters And What Is PLM Role?

Oleg
6 November, 2020 | 4 min for reading

Unless you've lived under a rock for the last decade, you've heard about Knowledge Graphs. In a nutshell, a knowledge graph acquires and integrates information into an ontology and applies reasoning algorithms to derive knowledge. It uses information extracted from systems, subject matter experts, data links, and machine learning algorithms. Sounds complex? Let me give you some examples.

One of the oldest and most mature implementations of a knowledge graph is the Google Knowledge Graph. It was introduced in 2012 and applies semantic search techniques to organize knowledge or, in other words, to turn words into things and connect them semantically. The outcome is better query results. Check my old article – Why PLM Need To Learn About Google Knowledge Graph?

So, what the Google Knowledge Graph does is create a connected graph of data and associated metadata that can be used to model, integrate, and access an organization's information assets. The knowledge graph represents real-world entities, facts, concepts, and events, as well as all the relationships between them, yielding a more accurate and more comprehensive representation of an organization's data.

I like knowledge graphs because, in my view, they bring power to data management. The graph model is powerful and allows us to create a model that integrates and accesses information assets in the company and beyond. A knowledge graph can represent concepts, entities, products, customers, and events, as well as the relationships between them. Altogether, it can deliver a more accurate and more powerful representation of an organization's data.
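
To make the idea of turning words into things and connecting them concrete, here is a minimal sketch in plain Python. The entity and relationship names (products, suppliers, regions) are purely illustrative and do not come from any particular system.

```python
# Minimal illustration (hypothetical names): entities become typed "things"
# and relationships become explicit, queryable statements between them.
from collections import defaultdict

class KnowledgeGraph:
    def __init__(self):
        self.by_subject = defaultdict(set)   # subject -> {(predicate, object)}

    def add(self, subject, predicate, obj):
        """Record a single subject-predicate-object statement."""
        self.by_subject[subject].add((predicate, obj))

    def related(self, subject):
        """Return every (relationship, entity) pair connected to a subject."""
        return self.by_subject[subject]

kg = KnowledgeGraph()
kg.add("Product:BatteryPack", "hasSupplier", "Supplier:CellCo")
kg.add("Product:BatteryPack", "usedIn", "Product:ModelX100")
kg.add("Supplier:CellCo", "locatedIn", "Region:EMEA")

print(kg.related("Product:BatteryPack"))
# e.g. {('hasSupplier', 'Supplier:CellCo'), ('usedIn', 'Product:ModelX100')}
```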

I see the knowledge graph as a powerful model that can be used to solve many engineering and manufacturing problems. Here are a few knowledge graph ideas I have in mind.

1- All the data generated by a car production line, including information about the products it builds, suppliers, contractors, and failures, as well as all time-series information about production.

2- Multiple years of data collected from a fleet of vehicles, including all maintenance information, combined with all repairs, suppliers, and operations performed by maintenance contractors.

3- An integrated data set of product information, documents, and relationships across multiple tiers of suppliers, from OEM to Tier 1, 2, 3, and down, including supplier relationships, contracts, and dependencies.
Building such a complex knowledge graph requires a modern data management infrastructure. The technologies used to build knowledge graphs include graph databases, semantic databases, RDF/OWL, and triple stores. Even so, questions about semantic capabilities, scale, and capacity are important and require more discussion and validation.
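
As a rough illustration of what a triple-store representation of idea 3 above could look like, here is a small sketch using the open-source rdflib library in Python. The namespace, parts, suppliers, and contract number are all made up, and a real implementation would need a proper ontology and a production-grade triple store.

```python
# Sketch only: made-up namespace, parts, and suppliers.
from rdflib import Graph, Namespace, Literal

EX = Namespace("http://example.org/mfg/")
g = Graph()

# OEM product built from a Tier 1 assembly, which depends on a Tier 2 part
g.add((EX.ProductA, EX.hasComponent, EX.AssemblyB))
g.add((EX.AssemblyB, EX.suppliedBy, EX.Tier1SupplierX))
g.add((EX.AssemblyB, EX.hasComponent, EX.PartC))
g.add((EX.PartC, EX.suppliedBy, EX.Tier2SupplierY))
g.add((EX.Tier1SupplierX, EX.hasContract, Literal("CN-2020-001")))

# SPARQL with a property path: which suppliers does ProductA depend on,
# directly or through lower tiers?
query = """
PREFIX ex: <http://example.org/mfg/>
SELECT DISTINCT ?supplier WHERE {
    ex:ProductA ex:hasComponent+ ?part .
    ?part ex:suppliedBy ?supplier .
}
"""
for row in g.query(query):
    print(row.supplier)   # Tier 1 and Tier 2 supplier URIs
```

The same data could be modeled in a property graph database instead; the point is that multi-tier relationships become first-class, queryable data rather than rows scattered across separate systems.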

How is this related to PLM? Conceptually, PLM systems are very close to the concept of a knowledge graph, because PLM is supposed to connect and manage various silos of data and include information about the lifecycle of a product over time. However, the biggest challenge of PLM technologies is the age of existing PLM platforms and their underlying technologies, mostly SQL databases that were never built to hold such information.

Another aspect of complexity is data ownership, which very often spans multiple organizations. Think about chains of suppliers, contractors, manufacturers, and customers. It is hard to believe that companies will allow all of their information to live in a single database. To solve such a problem, a special network platform architecture will be needed.

This is an opportunity for modern PLM systems built on two fundamental differentiators: (1) polyglot persistence and (2) SaaS multi-tenancy. The first allows the creation of a semantically oriented backend capable of storing all semantic information and scaling horizontally using virtual computing resources. The second brings a new network layer capable of managing information from multiple organizations while at the same time providing access to data across multiple domains.
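
Here is a hypothetical sketch of how these two differentiators could fit together: product relationships are routed to a graph store, documents and attributes to a document store, and every operation is scoped by a tenant identifier. The store classes are simplified in-memory stand-ins, not the API of any real PLM platform.

```python
# Hypothetical sketch: polyglot persistence plus tenant isolation.
class GraphStore:
    """Holds relationships (edges) between product data entities."""
    def __init__(self):
        self.edges = []
    def add_edge(self, tenant_id, source, relation, target):
        self.edges.append((tenant_id, source, relation, target))
    def neighbors(self, tenant_id, source):
        return [(r, t) for tid, s, r, t in self.edges
                if tid == tenant_id and s == source]

class DocumentStore:
    """Holds documents and attribute payloads keyed per tenant."""
    def __init__(self):
        self.docs = {}
    def put(self, tenant_id, key, document):
        self.docs[(tenant_id, key)] = document
    def get(self, tenant_id, key):
        return self.docs.get((tenant_id, key))

class ProductDataService:
    """Routes each kind of data to the store best suited for it."""
    def __init__(self):
        self.graph = GraphStore()
        self.documents = DocumentStore()
    def link(self, tenant_id, part, relation, other_part):
        self.graph.add_edge(tenant_id, part, relation, other_part)
    def attach_spec(self, tenant_id, part, spec):
        self.documents.put(tenant_id, part, spec)

svc = ProductDataService()
svc.link("oem-1", "PartC", "suppliedBy", "Tier2SupplierY")
svc.attach_spec("oem-1", "PartC", {"material": "aluminum", "rev": "B"})
print(svc.graph.neighbors("oem-1", "PartC"))   # [('suppliedBy', 'Tier2SupplierY')]
print(svc.graph.neighbors("oem-2", "PartC"))   # [] -- tenants stay isolated
```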

What is my conclusion?

The knowledge graph is a promising technology capable of boosting PLM performance and business value. However, PLM systems built on old architectures are not capable of delivering the data management and scalability that a knowledge graph demands. In my view, the opportunity behind knowledge graphs in manufacturing is huge, but the systems capable of making it happen are not available yet. SaaS PLM systems have the potential to grow into knowledge graphs using a proper data management architecture and leveraging cloud scale. It is the future of network platforms in manufacturing. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network platform that manages product data and connects manufacturers and their supply chain networks.
