In one of my last blog posts last year, I talked about the evolution of PLM system architecture. I got many questions, offline and online, asking me about specific aspects of architecture, vendors, and approaches. While the cloud seems inevitable, cloud adoption, cloud architecture, and how vendors will support it raise the most questions.
I attended the PDT 2020 conference a couple of months ago. One slide that caught my attention was presented by Marc Halpern of Gartner; I believe credit for the “single vendor black hole” name goes to him.
For a very long time, vendors pitched single-platform (single-vendor) selection as the most efficient option and the answer to all questions about integrations. The opposite extreme is a best-of-breed mix of applications from multiple vendors. I believe companies are gravitating between these two extremes, trying to set the right balance. This raises many questions about PLM integrations (a separate topic I hope to cover later on my blog).
Cloud technologies have passed the point of technical viability. Cloud solutions are expanding, and I can see rapid adoption of cloud applications and platforms; this future is inevitable. In such a context, one of the most interesting questions is about multi-cloud. As a result of this adoption, the reality for companies is operating more than one cloud application. For PLM specifically, this raises the question of how manufacturing businesses connected through supply chains and other types of relationships will operate using multiple cloud solutions, potentially from different vendors. Multi-cloud solutions are in high demand, as is the openness of existing platforms.
What technologies and architectures can support multi-cloud integrations, and how might it all work together? In the past, PLM applications ran on top of SQL databases, and such a database was an obvious integration point. Direct SQL access to the data was the universal solution to most PLM integration problems. Even though PLM openness has significantly improved over the last decade, integration is still a big challenge.
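To make the old-school approach concrete, here is a minimal sketch of direct-database integration, using an in-memory SQLite database as a stand-in for a vendor's PLM database. The ITEMS table and its columns are my own hypothetical example, not any real vendor's schema; a real integration would connect to the vendor's SQL server, but the coupling problem is the same.

```python
import sqlite3

# Stand-in for a vendor's PLM database; the ITEMS table and its
# columns are hypothetical, used here only to illustrate the pattern.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE ITEMS (
        ITEM_ID     INTEGER PRIMARY KEY,
        PART_NO     TEXT,
        REVISION    TEXT,
        DESCRIPTION TEXT
    )
""")
conn.execute(
    "INSERT INTO ITEMS VALUES (1, 'PN-1001', 'A', 'Bracket, aluminum')"
)

# The "integration": a direct SQL query against internal tables.
# Any schema change in the next PLM release silently breaks this code.
for part_no, rev, desc in conn.execute(
    "SELECT PART_NO, REVISION, DESCRIPTION FROM ITEMS"
):
    print(part_no, rev, desc)

conn.close()
```

The fragility is built in: the integration knows the vendor's internal table and column names, so every upgrade or customization is a potential breaking change.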
Microservices are the right way to think about modern architecture and integrations. This architecture, together with the rise of Kubernetes as the cloud OS, allows modern applications to be distributed and optimized. Microservices architecture enabled the replacement of on-premise servers with virtual machines, and applications are now turning into globally distributed, logically centralized systems. Most importantly, they are integrated and connected. Such services can be integrated in a much easier way, through their APIs, and stop depending on old-school direct database hacking.
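By contrast, here is a sketch of what a service-to-service integration looks like when the contract is an API rather than a database schema. The base URL, the /items path, the token, and the JSON fields are all hypothetical, for illustration only; every vendor's actual API will differ.

```python
import requests

# Hypothetical cloud PLM endpoint and token; illustrative assumptions,
# not any real vendor's API.
BASE_URL = "https://plm.example.com/api/v1"
TOKEN = "replace-with-a-real-access-token"

def get_item(part_number: str) -> dict:
    """Fetch an item through the service's public REST API.

    The integration depends only on the documented API contract,
    not on the internal database schema behind it.
    """
    response = requests.get(
        f"{BASE_URL}/items/{part_number}",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    item = get_item("PN-1001")
    print(item.get("revision"), item.get("description"))
```

The service behind the API can move between clouds, change its database, or be re-implemented entirely; as long as the contract holds, the integration keeps working, and that is what makes multi-cloud landscapes manageable.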
What is my conclusion?
The industry is gravitating towards multi-cloud applications, with the architecture moving from legacy on-premise servers towards global and distributed applications. Such architecture allows much higher levels of granularity and integration, which by itself becomes a way to escape the single-vendor black hole. Just my thoughts…
Best, Oleg
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network-based platform that manages product data and connects manufacturers and their supply chain networks. My opinion can be unintentionally biased.