PLM System Architecture Evolution For Dummies

Technology has always played a significant role in engineering and manufacturing software. Early CAD systems depended heavily on specific graphics workstations and their computing power. Early PDM systems were built on proprietary databases and later standardized on SQL databases. During the last decade of web and cloud development, we’ve seen a massive introduction of new data management systems, new databases, and cloud architectures. It was also the time when many CAD and PLM developers experimented with various hosted and cloud technologies and architectures. The PLM industry is now taking the next step into SaaS applications, which is causing tons of discussions, debates, and opinions about how these technologies will be combined, re-architected, implemented, and used by vendors and customers.

The recent PTC announcement about the acquisition of Arena Solutions triggered many questions about how platforms can be combined. Similar questions have come up in the context of other systems and applications developed over the course of the last two decades. It made me think about how to present the evolution of PLM system architecture in a way that can help in the debates about possible synergies and integrations between these systems.

The picture below shows four different PLM system architectures. Check it out.

The first PDM systems were invented to manage CAD files and metadata. They used proprietary databases and file management systems. There are almost no such systems still in production. However, the most successful and stable of them, such as SolidWorks Workgroup PDM, are still around even though Dassault Systèmes discontinued the product back in 2018.

PLM systems evolved mostly from early PDM, but also as independent database systems to manage Bills of Materials and related information. The majority of these systems rely on SQL databases, which scale well. The most successful were able to scale to thousands of users and served large OEM manufacturers and enterprise organizations. All mature PLM systems use this architecture today. Some of them use web application servers, but all of them rely on SQL databases. Examples of such applications are Siemens Teamcenter, Aras, and many others.
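To make the SQL-based data model concrete, here is a minimal sketch of a relational BOM: a parts table plus a link table connecting parent assemblies to child components, with a recursive query for quantity rollup. The table names, part numbers, and schema are my own illustrative assumptions, not any vendor's actual design.

```python
import sqlite3

# In-memory SQLite stands in for the enterprise SQL database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE part (
    id INTEGER PRIMARY KEY,
    number TEXT NOT NULL,
    description TEXT
);
CREATE TABLE bom_line (
    parent_id INTEGER REFERENCES part(id),
    child_id  INTEGER REFERENCES part(id),
    quantity  INTEGER NOT NULL
);
""")
conn.executemany("INSERT INTO part VALUES (?, ?, ?)", [
    (1, "ASM-100", "Bicycle"),
    (2, "PRT-200", "Wheel"),
    (3, "PRT-300", "Spoke"),
])
conn.executemany("INSERT INTO bom_line VALUES (?, ?, ?)", [
    (1, 2, 2),   # a bicycle uses 2 wheels
    (2, 3, 36),  # a wheel uses 36 spokes
])

# Recursive CTE: roll up total quantities from the top assembly (id = 1).
rows = conn.execute("""
WITH RECURSIVE rollup(id, qty) AS (
    SELECT 1, 1
    UNION ALL
    SELECT b.child_id, r.qty * b.quantity
    FROM bom_line b JOIN rollup r ON b.parent_id = r.id
)
SELECT p.number, SUM(r.qty) FROM rollup r JOIN part p ON p.id = r.id
GROUP BY p.number ORDER BY p.number
""").fetchall()
print(rows)  # spokes roll up to 2 * 36 = 72
```

This kind of single-database, join-heavy model is exactly what made SQL a natural fit for BOM management, and also why these systems scale vertically with the database.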

Hosting of SQL-based architectures became a popular option over the last decade. Some of these applications, especially those with a web application architecture, were successfully hosted on the growing IaaS infrastructure and virtual computing. These systems became the first generation of cloud PLM systems. Most of them use a single-tenant hosting architecture, but some use multi-tenant application servers. Nevertheless, almost all of them still use SQL databases as the foundation of their data architecture, which is a bottleneck to scaling these applications. Also, single-tenant data models still prevent data from being easily shared between users, teams, and companies. PTC Windchill is an example of such a system that can be successfully hosted in data centers and used by multiple companies.
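The tenancy distinction can be sketched in a few lines. In a single-tenant deployment each customer gets its own database; in a shared multi-tenant model every row carries a tenant identifier and every query must be scoped by it. The `tenant_id` column and tenant names below are illustrative assumptions, not any vendor's schema.

```python
import sqlite3

# Shared multi-tenant table: one database, rows tagged per tenant.
conn = sqlite3.connect(":memory:")
conn.execute("""
CREATE TABLE part (
    tenant_id   TEXT NOT NULL,
    number      TEXT NOT NULL,
    description TEXT,
    PRIMARY KEY (tenant_id, number)
)
""")
conn.executemany("INSERT INTO part VALUES (?, ?, ?)", [
    ("acme",   "PRT-1", "Bracket"),
    ("acme",   "PRT-2", "Bolt"),
    ("globex", "PRT-1", "Housing"),  # same part number, different tenant
])

def parts_for(tenant):
    # Every query must filter by tenant_id; forgetting this filter is the
    # classic multi-tenant data-leak bug.
    return [n for (n,) in conn.execute(
        "SELECT number FROM part WHERE tenant_id = ? ORDER BY number",
        (tenant,))]

print(parts_for("acme"))    # ['PRT-1', 'PRT-2']
print(parts_for("globex"))  # ['PRT-1']
```

The single-tenant alternative avoids the per-row filtering but makes cross-company data sharing hard, which is the limitation the paragraph above points at.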

Modern PLM systems were driven by new data management technologies and new web application architectures. The first and most important development is the use of NoSQL databases (document, graph, search, etc.) combined with existing SQL databases. Polyglot persistence has been growing fast in web applications capable of scaling globally, thanks to micro-service architectures and modern IaaS platforms enabling global deployments. Even though these systems might look similar from a hosting perspective, they are in fact very different in deployment and implementation. Global platforms like Autodesk Forge, Onshape, and OpenBOM (disclosure – I’m CEO and co-founder) are examples of applications using modern databases and web architectures.
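Polyglot persistence simply means routing each kind of data to the store best suited for it: flexible item metadata to a document store, where-used relationships to a graph store, text to a search index. The toy class below illustrates the idea with in-memory dicts standing in for real services; the class, method names, and item ids are my own assumptions, not how any of the platforms mentioned above are implemented.

```python
from collections import defaultdict

class PolyglotItemStore:
    """Toy router sending each data type to a purpose-built store."""
    def __init__(self):
        self.documents = {}            # stand-in for a document DB (metadata)
        self.graph = defaultdict(set)  # stand-in for a graph DB (uses links)
        self.index = defaultdict(set)  # stand-in for a search index (word -> ids)

    def save(self, item_id, metadata, uses=()):
        self.documents[item_id] = metadata
        for child in uses:
            self.graph[item_id].add(child)
        for word in metadata.get("description", "").lower().split():
            self.index[word].add(item_id)

    def where_used(self, item_id):
        # Graph-style reverse traversal: which parents use this item?
        return {p for p, children in self.graph.items() if item_id in children}

    def search(self, word):
        # Search-index lookup over item descriptions.
        return self.index[word.lower()]

store = PolyglotItemStore()
store.save("ASM-100", {"description": "Bicycle assembly"}, uses=["PRT-200"])
store.save("PRT-200", {"description": "Front wheel"})
print(store.where_used("PRT-200"))  # {'ASM-100'}
print(store.search("wheel"))        # {'PRT-200'}
```

Each query goes to the store that answers it cheaply, which is the scaling advantage over forcing every access pattern through one SQL schema.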

Although I gave a few examples of systems and applications, the architecture of many systems is not publicly known, and vendors use very fuzzy, marketing-oriented architecture pictures. While I can understand the competitive pressure, it creates tons of confusion among customers and industry analysts.

What is my conclusion?

It is time to clarify PLM system architectures in everything related to databases, web applications, tenancy, deployment, and implementation. How systems can be implemented and deployed will have a major impact on customers, ROI, and the future ability to integrate these systems. The time when all PLM systems had a similar SQL-based architecture is over. The next decade will show how some technologies and solution architectures can outperform others and provide unique functions and features to share data and provide analytics and intelligence. It is also an opportunity for PLM architects to learn the details of modern architectures in order to avoid costly mistakes. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a digital network-based platform that manages product data and connects manufacturers and their supply chain networks. My opinion can be unintentionally biased.

