Why does PLM need to learn about Google Knowledge Graph?

Last week was clearly Facebook week. However, if you managed to take your head out of the Facebook IPO and the Mark Zuckerberg and Priscilla Chan wedding, you probably noticed an interesting piece of news coming out of Google. It is called Google Knowledge Graph.

If you have never heard of it, start with Google’s blog post – Introducing the Knowledge Graph: things, not strings. The following Google video can give you an initial idea of what it is.

What is behind Google Knowledge Graph?

Google Knowledge Graph (GKG) is a database of information about what Google calls “things”. The underlying ideas connect to multiple areas – semantic web, internet of things, semantic search, linked data, etc. In my view, Google’s acquisition of Freebase two years ago was an important step toward building the Knowledge Graph. Here is an interesting passage from the Google blog:

Google’s Knowledge Graph isn’t just rooted in public sources such as Freebase, Wikipedia and the CIA World Factbook. It’s also augmented at a much larger scale—because we’re focused on comprehensive breadth and depth. It currently contains more than 500 million objects, as well as more than 3.5 billion facts about and relationships between these different objects. And it’s tuned based on what people search for, and what we find out on the web.
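To make the “things, not strings” idea a bit more concrete, here is a minimal sketch of how facts about objects and the relationships between them can be stored as subject-predicate-object triples. This is my own illustration, not Google’s actual data model, and the entities and predicates are chosen just for the example:

```python
# A minimal, illustrative triple store: each fact is (subject, predicate, object).
# This is a sketch of the "things, not strings" idea, not Google's implementation.

facts = [
    ("Marie Curie", "born_in", "Warsaw"),
    ("Marie Curie", "field", "Physics"),
    ("Marie Curie", "awarded", "Nobel Prize in Physics"),
    ("Warsaw", "capital_of", "Poland"),
]

def describe(entity, facts):
    """Collect everything known about a single 'thing'."""
    return [(p, o) for (s, p, o) in facts if s == entity]

print(describe("Marie Curie", facts))
# [('born_in', 'Warsaw'), ('field', 'Physics'), ('awarded', 'Nobel Prize in Physics')]
```

The point is that “Marie Curie” is a thing with attributes and relationships, not just a string to match in a search index.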

It is not entirely clear how GKG was built and organized. Google clearly mixed information collected from Freebase, the CIA World Factbook and Wikipedia. You can read the Deconstructing Google Knowledge Graph blog post for more details on how it was done, although it is still at the level of guesses. I’m sure in the next few months we will see more examples and explanations.
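As a rough guess at what mixing those sources might look like, here is a hedged sketch of reconciling records about the same “thing” coming from different public sources. The source names are real, but the record structure, field names and the naive matching-by-name rule are entirely hypothetical:

```python
# A guess at what merging entity records from multiple public sources could look like.
# The record fields and values below are illustrative, not actual source data.

freebase_record = {"name": "France", "type": "country", "capital": "Paris"}
factbook_record = {"name": "France", "population": 65_000_000}  # approximate, illustrative
wikipedia_record = {"name": "France", "official_language": "French"}

def merge_records(*records):
    """Naively merge records that share the same name into one entity."""
    merged = {}
    for record in records:
        merged.update(record)
    return merged

france = merge_records(freebase_record, factbook_record, wikipedia_record)
print(france)
# {'name': 'France', 'type': 'country', 'capital': 'Paris',
#  'population': 65000000, 'official_language': 'French'}
```

The real entity reconciliation problem is obviously much harder than matching names, which is part of why the “How?” remains a guessing game.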

Why PLM vendors should care?

PLM vendors and products are struggling with a high level of complexity. It comes from the semantic richness of the data as well as from the complexity of user interaction. Google Knowledge Graph shows an interesting way to simplify knowledge representation and the way end users interact with that knowledge. Another aspect is related to large-scale information modeling. Information about products and the product lifecycle is getting more complicated every day. PLM products running on SQL databases will have to find a better technological foundation for the scale of the future.
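For PLM, the same idea could translate into a product knowledge graph. The sketch below is purely hypothetical (the part names and relationship types are mine), but it shows how product data that normally lives in SQL tables might be expressed as a graph of things and relationships, and how a typical question such as “where is this part used?” becomes a graph traversal:

```python
# Hypothetical product knowledge graph: parts, assemblies and their relationships
# expressed as triples instead of rows in relational tables.

product_graph = [
    ("Bike-100", "has_part", "Frame-200"),
    ("Bike-100", "has_part", "Wheel-300"),
    ("Wheel-300", "has_part", "Tire-310"),
    ("Tire-310", "supplied_by", "Acme Rubber"),
    ("Frame-200", "made_of", "Aluminum"),
]

def where_used(part, graph):
    """Walk the graph upward to find every assembly that directly or indirectly uses a part."""
    parents = {s for (s, p, o) in graph if p == "has_part" and o == part}
    result = set(parents)
    for parent in parents:
        result |= where_used(parent, graph)
    return result

print(where_used("Tire-310", product_graph))
# {'Wheel-300', 'Bike-100'}
```

This is only a thought experiment, but it hints at why a graph-oriented foundation may scale better for product lifecycle data than yet another set of joined tables.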

What is my conclusion? Web technologies are moving forward at the speed of light. I cannot say the same about enterprise software. For the majority of people in manufacturing companies, life is a bunch of Excel spreadsheets and databases with applications that have been running for years. The cost of existing IT environments is skyrocketing. Trends like BYOD show that people cannot tolerate outdated IT anymore. So, how do you build a product knowledge graph in your company? This is a question PLM managers need to ask these days. Just my thoughts…

Best, Oleg

Picture credit: Mashable’s GKG can change search forever article
