Connected PLM transformation is one of the strongest trends I can see in the Product Lifecycle Management (PLM) market. The main drivers of this process go deep into manufacturing transformations – new connected products, global markets, competition, supply chain complexity, changes in business models such as switching to services, and many others. Together, these drivers create the need for new PLM technology, architecture, and capabilities to support modern manufacturing companies, from small businesses, contractors, and suppliers all the way to very large global enterprises.
Data is becoming extremely important, and everything in PLM revolves around the ability of software to provide robust and scalable data management capabilities. Earlier this month I posted about Data Model Evolution for Future PLM Platforms and Connected PLM Transformation. Check this article to better understand what data management capabilities can be delivered by modern technologies, as well as what functions and value can be delivered by transforming existing monolithic PLM systems into a network-based, collaborative, connected environment.
Is it still hard to get from point A to point B?
While most companies tend to agree about the “connected future”, the question of how to get “from point A to point B” is still very hard for most of them. Besides the traditional “change is hard” argument, there is a real technological challenge: how to find the right path from existing systems with multi-million dollar investments, tons of legacy databases, Excel files, and other enterprise systems to a new environment. For most PLM architects, it is still a big question. In my article today, I want to talk about how the “connected PLM” transformation can be made and what elements of technology can help manufacturing companies create open, robust, and scalable PLM architectures.
The need and value of open semantic data modeling in connected PLM
A Semantic Data Modeling Layer is a data model that allows you to store and retrieve your data as well as the metadata associated with it. A Semantic Data Modeling Layer can be used by both humans and machines to retrieve information from your system, which is useful for building a variety of queries and reports, or for answering questions about what you have stored in your system. The technology behind this layer of software provides a way for people to understand what they are looking at when they view an organization’s dataset.
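To make this idea concrete, here is a minimal sketch of what such a layer looks like underneath: data and the metadata describing it stored uniformly as subject-predicate-object triples that both people and programs can query with the same mechanism. All identifiers (`part:123`, `plm:material`, etc.) are invented for illustration, not taken from any real vocabulary.

```python
# A minimal subject-predicate-object triple store: data and its
# metadata live side by side in the same uniform structure.
triples = [
    ("part:123", "rdf:type", "plm:Part"),        # data
    ("part:123", "plm:name", "Bracket"),
    ("part:123", "plm:material", "Aluminum 6061"),
    ("plm:material", "rdfs:label", "Material"),  # metadata about the model itself
]

def query(s=None, p=None, o=None):
    """Return all triples matching a pattern; None acts as a wildcard."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# A human (or a machine) can ask: what do we know about part:123?
print(query(s="part:123"))
# ...or: which things in the system are Parts?
print(query(p="rdf:type", o="plm:Part"))
```

The point of the sketch is that queries, reports, and questions all reduce to the same pattern-matching operation over one uniform representation.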
The semantic layer also gives organizations the ability to store their datasets without having to worry about formats or structures. This means more flexibility in how we want our data structured and formatted, so that we don’t need to worry about conversions. The latter is very important when the discussion comes down to how this unified semantic layer can be organized.
Technologies for Semantic Data Layer
What technologies can be used to organize this semantic data layer? Data management technologies made a huge leap over the last decade by developing multiple ways to manage data using different data modeling abstractions. In my earlier article about PLM data management for the 21st century, as well as in the article Data Architecture Evolution in PLM, I explained how databases are becoming tools rather than data platforms. It is a big difference from what the PLM industry was doing for the last 15-20 years, when a single database was the foundational platform for PLM systems. Still, I can see some people looking for a database to become a magic silver bullet that solves all PLM problems.
Instead of searching for a single database, I suggest choosing a technology and framework to model data that is open, standard, and at the same time powerful enough to provide a foundation for data as complex as is needed for modern design, digital twins, digital threads, and the other data modeling abstractions needed in engineering, product development, and manufacturing. My favorite candidate for this is Semantic Web technology such as RDF / OWL.
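What makes RDF / OWL attractive is that the schema is expressed in the same triple form as the data, so the model itself can travel with the data and be reasoned over. Below is a hedged sketch of that idea: an OWL-style class hierarchy and an instance described with it, living in one graph. The `plm:` vocabulary and part identifiers are my own invention for illustration; only `rdf:type`, `rdfs:subClassOf`, and `owl:Class` are standard terms.

```python
RDF_TYPE = "rdf:type"
RDFS_SUBCLASS = "rdfs:subClassOf"

graph = set()

def add(s, p, o):
    graph.add((s, p, o))

# Ontology: the schema itself is just more triples.
add("plm:Part", RDF_TYPE, "owl:Class")
add("plm:Assembly", RDFS_SUBCLASS, "plm:Part")

# Instance data described with the same vocabulary.
add("part:wing-42", RDF_TYPE, "plm:Assembly")
add("part:wing-42", "plm:hasComponent", "part:rib-7")

def types_of(resource):
    """Direct and inherited types, following rdfs:subClassOf until stable."""
    found = {o for s, p, o in graph if s == resource and p == RDF_TYPE}
    changed = True
    while changed:
        changed = False
        for cls in list(found):
            for s, p, o in graph:
                if s == cls and p == RDFS_SUBCLASS and o not in found:
                    found.add(o)
                    changed = True
    return found

print(types_of("part:wing-42"))  # the assembly is also a plm:Part by inheritance
```

A real implementation would use an RDF library and a reasoner rather than this hand-rolled loop, but the mechanics of inference over an open model are the same.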
Below is an example of Wikidata, one of the most advanced linked data sets.
If you have never heard of it, check it out – these technologies have become more and more mature over the course of the last 10-15 years and today provide a solid foundation for open data standardization efforts.
How will an Open Semantic Layer work with other technologies and systems?
An obvious question is what this means for all other PLM technologies, databases, platforms, and systems. Don’t panic – existing platforms and technologies will not disappear and will not immediately become obsolete. The power of the new and open semantic layer is to provide a connected semantic environment to align information and build a future open infrastructure. As I described in my Connected PLM Transformation article, existing systems will provide interfaces to plug into the new open infrastructure. These plug-ins will make it possible to semantically annotate product lifecycle data and connect it to the new data elements representing the semantic layer.
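As a sketch of what such a plug-in could look like: a small adapter that takes a record from a legacy system (here a plain dict standing in for a database row) and emits semantically annotated triples, including provenance. The legacy field names, the mapping, and the `plm:` predicates are all hypothetical.

```python
# Hypothetical mapping from a legacy PLM schema to a shared vocabulary.
FIELD_MAP = {
    "PART_NO": "plm:partNumber",
    "DESCR": "plm:description",
    "REV": "plm:revision",
}

def annotate(legacy_record, source_system):
    """Turn one legacy row into triples, tagging provenance along the way."""
    subject = f"part:{legacy_record['PART_NO']}"
    triples = [(subject, "plm:sourceSystem", source_system)]
    for field, predicate in FIELD_MAP.items():
        if field in legacy_record:
            triples.append((subject, predicate, legacy_record[field]))
    return triples

row = {"PART_NO": "AX-100", "DESCR": "Hinge bracket", "REV": "B"}
for t in annotate(row, "legacy-pdm"):
    print(t)
```

The legacy system keeps running untouched; the adapter only projects its data into the shared semantic layer.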
Open Source and Semantic Data Layer
Each time the words “open data layer” are mentioned, the question about open source comes up. While open source created some noise in the PLM industry over the course of the last decade, no big open source PLM initiative was ever delivered. I’d say open source PLM was a good exercise in both business models and marketing, so now is a good time to make a second attempt at an open-source product lifecycle management initiative, and a connected PLM open semantic data layer is a good foundation.

The Resource Description Framework (RDF) combined with ontologies can build semantic layers that overlay existing product lifecycle management systems and combine metadata as well as actual data sources into collections of product data, creating a semantic web of product information. They will facilitate transforming raw data (existing PLM) into a knowledge graph and data that can be used across multiple systems and data sources. The data models themselves can be open-sourced and shared in the industry, where every company and vendor can contribute to this activity. Moreover, I can envision how modern PLM vendors will be able to deliver building blocks (open source components) capable of working with this open-source data. It is a separate topic, and I will come back to it in this blog as well as in my OpenBOM blog.
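A shared, open data model is what makes contributions from different vendors composable. Here is a minimal sketch under the assumption that two systems publish triples about the same part using the same open vocabulary; the URIs then act as the join keys, and merging becomes a trivial set union. All system names, identifiers, and predicates are invented.

```python
# Two hypothetical systems publish triples about the same part
# using a shared, open vocabulary; URIs act as the join keys.
design_system = [
    ("part:AX-100", "plm:material", "Steel"),
    ("part:AX-100", "plm:mass_kg", "0.4"),
]
supply_system = [
    ("part:AX-100", "plm:supplier", "Acme Corp"),
    ("part:AX-100", "plm:leadTimeDays", "14"),
]

# Merging is trivial precisely because the data model is shared.
knowledge_graph = set(design_system) | set(supply_system)

def describe(subject):
    """Collect everything the merged graph knows about one resource."""
    return {p: o for s, p, o in knowledge_graph if s == subject}

print(describe("part:AX-100"))
```

No point-to-point integration code was written here; the openness of the model is what does the integration work.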
How to benefit from the semantic layer today and what to expect in the future?
The transformation process to build open semantic data modeling can take time and involve multiple steps. As with the driving paradigm shift I used in my examples, the changes won’t happen overnight. The elements of the new semantic layer will be created by vendors and companies, consolidated, and turned into mature, re-usable data elements and services.
The biggest opportunity is to build a linked data set that combines pieces of semantically annotated data using the basic Resource Description Framework (RDF), and to create a data processing chain to form domain knowledge and enable interoperability already today.
Future steps will bring a layer for the entire product lifecycle that a PLM system of the future can use to build a manufacturing business strategy, connecting the separate domain knowledge of each lifecycle phase into a knowledge graph and creating a holistic data set that can be connected to multiple systems.
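As a sketch of what querying such a cross-phase knowledge graph could feel like: following links from a requirement through design to manufacturing and quality records in one traversal. The phase names, identifiers, and link predicates are all invented for illustration.

```python
# A toy digital-thread graph: each lifecycle phase contributes edges,
# and a traversal walks the thread end to end.
edges = {
    "req:R-9":       [("plm:implementedBy", "part:AX-100")],
    "part:AX-100":   [("plm:producedBy", "mfg:order-555")],
    "mfg:order-555": [("plm:inspectedBy", "qa:report-3")],
}

def thread(start):
    """Follow the first outgoing link from each node until the chain ends."""
    path = [start]
    node = start
    while node in edges:
        _, node = edges[node][0]
        path.append(node)
    return path

print(thread("req:R-9"))
# A requirement is now connected to design, manufacturing, and quality data.
```

Today that chain typically spans three or four disconnected systems; in a connected semantic layer it is a single walk over one graph.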
What is My Conclusion?
For the last 25-30 years, PLM systems were developed into big siloed platforms focusing on how to control the data elements of a specific company and combine them together to perform lifecycle operations – data management, change management, etc. While it is valuable by itself for each system to have such an information model, there is a super important task to go beyond relational databases and create an open physical product ontology model that can be shared across multiple systems. The semantic web enables the building of such a model, but the challenge, as always, is more about the willingness of people and organizations to change. Such data, together with search and discovery mechanisms to produce open-source data and data integration with existing systems using online services, can be a step beyond the existing single-vendor black hole of monolithic platforms. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network-based platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.