PLM magicians and future of data management

Magic is one of the oldest professions in the world. Some articles even nominate magic as the second-oldest profession. What I find interesting is how magic has changed from generation to generation. Many centuries ago, magicians were advisers to kings and emperors. Long before advanced science and medicine, magicians influenced scientists and doctors to turn illusion into reality. Magicians used a very simple set of effects such as productions and appearances, disappearances and vanishes, transpositions, transformations, penetrations, identification and others. What all magicians have in common is making what seems humanly impossible happen. A typical magician is a very good performer who understands staging, costumes, theatrical effects and storytelling. Magic is universal; it has no borders and no language. It is pure mystery. And it makes people smile…

I got a good smile reading Peter Schroer’s article – PLM without limits is Magic. However, let me take a step back…

Peter Schroer is the CEO and founder of Aras, a company I respect a lot. Over the last 20 years, he has created one of the most interesting and capable PLM platforms available on the market today – Aras Innovator. He has vision, execution, and technology. Under his leadership, Aras broke into the high society PLM club of the three top vendors – Dassault Systemes, PTC and Siemens PLM. Aras also created a very interesting PLM modeling engine.

Now, let me get back to the article. What made me chuckle is his claim that Aras PLM technology was created independent of any other technology in the world. Here is the passage:

The point is to build a PLM Platform that is independent of Oracle vs Microsoft, of Cloud vs. On-Prem, or Multi-Tenant vs Single-Tenant, and also of SQL vs No-SQL. The Aras architecture is not built on any of these and is not limited by any of the known limitations of any of these.  We don’t care where the services run and where the data store is. All that can (and will) change over time while the customer’s highly customized PLM keeps running, and remains flexible. It’s the Modeling Engine not the underlying IT technology that is the breakthrough.

As much as I like the idea of building universal products and technologies, as we know, the devil is in the details. So, to get into the details, here are two slides I captured from an Aras architecture presentation – Aras Architecture Principles and Aras Scalability Vision.

A picture is worth a thousand words. The Aras architecture is mature, scalable and capable of holding a significant load of data and users because it relies on a mature and capable infrastructure of databases, load-balanced application servers, replication and the rest of the enterprise IT technology stack. The foundation of this stack is a relational database, which every IT organization in the world knows how to scale and manage. Being in the market for almost 20 years, Aras got all these things right. Combined with a brilliant business model of free downloads and subscriptions, it created what I called in my earlier article – Aras PLM momentum.

But here is the place where changes are coming. And these changes are in data management. For several decades we lived in a database nuclear winter. When I implemented my first PDM/PLM system, the choice was pretty much obvious – use the database approved by the IT department. So, all PLM systems created in the 1990s and 2000s come with the same concept of building layers on top of relational databases. You can see the very successful and flexible modeling engine built by Aras in the picture above.

But by the end of the 2000s, we entered the Database Thaw period, and by the beginning of the 2010s a new concept was coined – Polyglot Persistence.

In 2006, Neal Ford coined the term “polyglot programming” to express the idea that applications should be written in a mix of languages, taking advantage of the fact that different languages are suitable for tackling different problems. Complex applications combine different types of problems, so picking the right language for each job may be more productive than trying to fit all aspects into a single language. The same concept can be applied to databases: an application can talk to different databases, using each for what it is best at to achieve an end goal, thus giving birth to polyglot persistence.
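To make the idea concrete, here is a minimal sketch (not tied to any specific PLM product; all names are illustrative) of a single application facade routing each kind of data to the store that handles it best. Plain in-memory dictionaries stand in for a relational database, a document store and a graph database:

```python
# Polyglot persistence sketch: one repository facade, several
# specialized stores behind it. Dicts stand in for a relational DB
# (structured part records), a document store (free-form metadata),
# and a graph store (BOM / where-used relationships).

class PolyglotRepository:
    def __init__(self):
        self.relational = {}   # part_number -> row-like tuple
        self.documents = {}    # part_number -> free-form metadata dict
        self.graph = {}        # part_number -> set of child part numbers

    def save_part(self, part_number, description, metadata, children=()):
        # Structured fields go to the "relational" store...
        self.relational[part_number] = (part_number, description)
        # ...flexible metadata to the "document" store...
        self.documents[part_number] = metadata
        # ...and BOM links to the "graph" store.
        self.graph[part_number] = set(children)

    def where_used(self, part_number):
        # Graph-style query: which assemblies reference this part?
        return {p for p, kids in self.graph.items() if part_number in kids}

repo = PolyglotRepository()
repo.save_part("BOLT-10", "M10 bolt", {"material": "steel"})
repo.save_part("ASM-1", "Bracket assembly", {"rev": "B"}, children=["BOLT-10"])
print(repo.where_used("BOLT-10"))  # {'ASM-1'}
```

In a real system, each dictionary would be replaced by a dedicated database client, but the routing decision – which store for which shape of data – stays the same.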

A growing specialization in data management, combined with advanced global web platforms, created a wave of data management innovation. These platforms are not controlled by enterprise IT organizations, and many of them are driven by open-source and web projects. This is where SQL dominance is cracking.

Polyglot persistence doesn’t mean SQL servers are disappearing. It means a new type of solution is coming. You might ask why it is needed. Why can’t the PLM world run on an old-fashioned SQL database? It surely can, but as organizations drown in data, integration complexity, high-availability demands, heavy read/write traffic and complex data relationships, a new data management approach can play a key role in future PLM innovation.

From a technological standpoint, the use of multiple databases can help (1) to scale data storage and (2) to break the existing monolithic application stack into a set of micro-services.

In the picture below, you can see a high-level vision of a PLM application built on the principles of polyglot data persistence, scalable databases, micro-services and REST web services. You can check my article about PLM data management to get an idea of the unique characteristics of each database, including pros and cons for specific use cases.

Another aspect is the translation of monolithic applications (data, backend logic, user-facing layers) into a set of independently scalable micro-services.
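A minimal sketch of that decomposition, under the assumption of two hypothetical services (the names ItemService and BomService are illustrative, not taken from any product): each service owns its own private data store and the other service reaches it only through a narrow API, which is what makes independent scaling and deployment possible.

```python
# Monolith-to-microservices sketch: each service owns its own store;
# BomService never touches ItemService's data directly, only its API.

class ItemService:
    def __init__(self):
        self._items = {}  # this service's private store

    def create(self, part_number, description):
        self._items[part_number] = description

    def exists(self, part_number):
        return part_number in self._items

class BomService:
    def __init__(self, item_service):
        self._links = {}            # separate store, separate scaling
        self._items = item_service  # in production: a REST call, not a reference

    def add_line(self, parent, child):
        # Validate against the item service through its public API only.
        if not (self._items.exists(parent) and self._items.exists(child)):
            raise ValueError("unknown part")
        self._links.setdefault(parent, []).append(child)

    def lines(self, parent):
        return list(self._links.get(parent, []))

items = ItemService()
items.create("ASM-1", "Bracket assembly")
items.create("BOLT-10", "M10 bolt")
bom = BomService(items)
bom.add_line("ASM-1", "BOLT-10")
print(bom.lines("ASM-1"))  # ['BOLT-10']
```

In a real deployment, the in-process reference between the two services would be an HTTP/REST call, and each service could run on its own infrastructure with its own database.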

Every PLM platform depends on a specific architecture and technology stack. The level of dependency can differ. Relational databases can be swapped: Aras can replace Microsoft SQL Server with Oracle, MySQL, or even a managed cloud database from Amazon or Microsoft. Applications built on the AWS technology stack can be ported to Microsoft Azure. Can Dassault, Siemens and PTC apply Aras’ flexible open-source data modeling and change their architectures? Probably yes, but it would be costly and take time. Similarly, can Aras apply its flexible modeling principles and turn into a microservice-based, scalable, global data platform? Of course it is possible. However, cost and time are factors, just as in the transformation of the older PLMs. Some of these transformations aren’t easy and can be expensive. As one software architect I’ve worked with always said – there is no magic in software, only bugs. And the transformation of an existing PLM solution to a new platform, new databases and a new tech stack is no different; it will bring new bugs.

What is my conclusion? Each product has its own architecture and is built using a specific technology stack. PLM is no different. But technology is changing and coming to manufacturing. Data management is changing, and it will change PLM. A combination of databases can be used to optimize for scale and cost. An overall system architecture oriented toward a diverse set of data tools will allow future PLM platforms to grow beyond a single SQL database and answer the demands of manufacturing IT: high data scale, integration complexity, high availability, intensive traffic and complex data relationships. Cost optimization will enable new business models in PLM. The timing of these changes is another discussion. But as the famous Wayne Gretzky said – “Skate to where the puck is going, not where it has been.” Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.

