One of my favorite topics in technology over the last decade has been graph data models. As I have worked on multiple product lifecycle applications over that time, the importance of graphs and of connecting knowledge and information has kept growing. Back in 2009, I was digging into semantic networks and triple stores. Search was one way to think about graphs and data representations. You probably remember some of my articles from those days.
Search as an interaction model is an obvious outcome of graphs and the simplest user experience you can build on top of graph models to discover information. A recent publication, "Sex, drugs and rock-n-roll," about a new book sheds some light on the early days of Google – Valley of Genius: The Uncensored History of Silicon Valley (As Told by the Hackers, Founders, and Freaks Who Made It Boom). Forget the sex; the important story I captured is what, according to Larry Page and Sergey Brin, was originally the end game for Google. It was AI.
The book explores a deep-rooted and intense desire that’s driven Google’s founders all these years. Brin and cofounder Larry Page stumbled onto the magic search business but that was never their main interest. From the start, Google was always intended to be an AI company. They’re now closer to that vision than they have ever been, and what comes next could make search look like a footnote in Google’s history.
In the last few years, I have shared some of my thoughts about the future development of knowledge graph representations and the growing potential of graph models in enterprise organizations and the global manufacturing business. Check out some of my blogs – Why PLM should learn about Google Knowledge graph; PLM knowledge graph and future decision support.
Existing PLM architectures stick with relational databases, and they are past due for a change to support the complexity of data management and the scale of data usage.
PLM vendors and products are struggling with a high level of complexity. It comes both from the semantic richness of the data and from the complexity of user interaction. Google's Knowledge Graph shows an interesting way to simplify knowledge representation and how end users interact with knowledge. Another aspect is large-scale information modeling. Information about products and product lifecycles is getting more complicated every day. PLM products running on SQL databases will have to find a better technological foundation for future scale.
In some of my recent articles I discussed how graph-aware architecture can play the role of a new information model – PLM graph aware architecture; Why graphs are important for social PLM strategy and how graphs can help to organize a future global manufacturing vault (disclaimer – to remind you, I'm co-founder and CEO of openbom.com, developing an online service that helps manufacturing companies collaborate across the globe between engineering, contractors and suppliers).
Watching manufacturing companies change over the last decade, two major trends have become very obvious to me: (1) the siloed data problem; (2) the rise of graph models.
Manufacturing data silos
The data in manufacturing companies is their most strategic asset going forward, yet its state is painful – it lives in multiple systems, heterogeneous and distributed. Local data silos allow control and governance of the data in a way that keeps it valuable. But at the same time, a data silo is a disconnected island of data that prevents larger structures from being composed and organized. Silos hamper everything in a manufacturing company – application development, reporting, analytics, compliance and AI-driven decision making. Unconnected data is bad.
A graph data structure and model is the only realistic way to manage manufacturing data in a connected way, at full scope and scale – to build a world where connections make everything possible. Two important things happen here: virtualization of data resources, and connecting everything into a graph data model.
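To make the idea concrete, here is a minimal sketch of what connecting two silos through a graph model could look like. All names, record IDs and attributes below are hypothetical examples, not an actual OpenBOM or PLM vendor API – just a labeled directed graph linking a PDM record and an ERP record that describe the same part.

```python
# Hypothetical sketch: a labeled, directed graph that links records
# from separate data silos (PDM, ERP) into one connected model.
from collections import defaultdict


class ManufacturingGraph:
    """Nodes are records from any silo; edges carry a relation name."""

    def __init__(self):
        self.nodes = {}                  # node_id -> attribute dict
        self.edges = defaultdict(list)   # node_id -> [(relation, target_id)]

    def add_node(self, node_id, **attrs):
        self.nodes[node_id] = attrs

    def add_edge(self, source, relation, target):
        self.edges[source].append((relation, target))

    def neighbors(self, node_id, relation=None):
        """Targets reachable from node_id, optionally filtered by relation."""
        return [t for r, t in self.edges[node_id]
                if relation is None or r == relation]


# Two records about the same part, each living in its own silo.
g = ManufacturingGraph()
g.add_node("pdm:PN-100", silo="PDM", description="Bracket assembly")
g.add_node("erp:PN-100", silo="ERP", supplier="Acme Corp", cost=12.5)
g.add_edge("pdm:PN-100", "same_part", "erp:PN-100")

print(g.neighbors("pdm:PN-100", "same_part"))  # -> ['erp:PN-100']
```

Once records are nodes and cross-silo links are edges, the "disconnected island" problem turns into a traversal problem: any application can walk from the engineering view of a part to its supply-chain view without copying data between systems.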
The rise of graphs
Large global web companies (Google, Facebook and even LinkedIn) are the best examples of knowledge graph systems. If manufacturing wants to operate at that scale, there is only one way to make it happen: capture data from multiple silos, then query, analyze and reuse global manufacturing data, including monetizing that data across the full spectrum. The last point will change the business models of PLM software as we know it today.
The value proposition of a manufacturing graph is the transformation of data sources and silos of every type into the form of a knowledge graph. At the beginning you can count on a subset of data (e.g., engineering and manufacturing bills of materials and connected information). As the system is further developed, the full spectrum of data will be explored and transformed into a graph model.
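The BOM subset mentioned above is a natural first graph, because a bill of materials is already a graph: assemblies point to subassemblies and parts with quantities on the edges. Here is a hedged sketch of a quantity roll-up over such a structure; the part numbers and the two-level product are invented for illustration.

```python
# Hypothetical sketch: an engineering BOM as a graph (parent ->
# list of (child, quantity)) and a depth-first roll-up of leaf parts.
from collections import Counter

bom = {
    "bike":  [("frame", 1), ("wheel", 2)],
    "wheel": [("rim", 1), ("spoke", 32)],
}


def rollup(bom, root, multiplier=1, totals=None):
    """Accumulate total quantities of leaf parts under `root`."""
    if totals is None:
        totals = Counter()
    for child, qty in bom.get(root, []):
        if child in bom:                      # subassembly: recurse deeper
            rollup(bom, child, multiplier * qty, totals)
        else:                                 # leaf part: accumulate quantity
            totals[child] += multiplier * qty
    return totals


print(rollup(bom, "bike"))  # spokes: 2 wheels * 32 spokes = 64
```

This kind of traversal is awkward to express over normalized SQL tables (it needs recursive joins), but it is the native operation of a graph model – which is exactly why a BOM is a good starting point for a manufacturing knowledge graph.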
What is my conclusion? Building graphs and connected data structures is the only way for manufacturing companies to move toward future process optimization and rationalized decision making. Full data exposed as a data graph will be converted into knowledge representations that empower the business of small and large manufacturing companies. Manufacturing networks will grow bottom-up, enabling simplified bill of materials management, lifecycle support and intelligence. Just my thoughts…
Want to learn more about PLM? Check out my new PLM Book website.
Disclaimer: I'm co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.