
Future-proof PLMs

Oleg
30 July, 2019 | 6 min for reading

One of the biggest competitions is the competition with the status quo. Exactly one year ago, I wrote about legacy PLM. I found it interesting to see how PLM vendors are attacking so-called “legacy” platforms. The definition of a legacy platform is somewhat vague. Here is the one from last year:

Legacy PLM systems consist of a mix of individual products accumulated from years of acquisitions. At best, they may have been loosely stitched together but more often, the result is silos within the PLM system architecture that impede collaboration. Even while legacy PLM systems may boast full functionality across disciplines, the individual pieces of software often do not work well together and don’t facilitate communication and collaboration.

A CIMdata commentary about Platform Architecture caught my attention earlier this week and brought me back to the topic of legacy PLM. Check this article. CIMdata is warning about two problematic situations – (1) existing large enterprise solutions and (2) future native cloud platforms.

Here is the first passage, telling us that large existing solutions are bad: they were built over decades, assembled from acquisitions, and depend on specific software stacks and databases.

Most large enterprise solution providers in the ERP and PLM marketplace have grown through acquisition and often have to stitch their acquisitions together with integration technology. Each technology has its own architecture, often with differing technology stack elements (Oracle DB vs SQL Server, .NET vs. Java, etc.) It is difficult to re-engineer what wasn’t engineered in the first place. Ultimately, grafting a web service interface onto a monolithic architecture does not make a platform.

The second passage tells us that future cloud systems, when they grow up, will be as bad as existing enterprise software because they will be tied to a specific cloud provider; CIMdata hasn’t seen a solution that can run on multiple cloud platforms.

Once cloud-based PLM solutions mature beyond running in virtual machines, their software services or microservices become tied to the underlying platforms such as AWS, Azure, and Google. Cloud native solutions may already have this issue as CIMdata is unaware of cloud native solutions running on multiple cloud platforms. A well-defined abstraction layer is required to enable a solution to be portable across cloud platforms. 
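
To make the abstraction-layer idea concrete, here is a minimal sketch in Python – my own illustration, not CIMdata’s recommendation or any vendor’s actual code. The BlobStore interface, InMemoryBlobStore backend, and save_cad_revision function are hypothetical names I made up for this example; the point is that application logic talks only to the interface, so an S3- or Azure-backed implementation could be swapped in without touching it.

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """Provider-neutral storage interface that application code targets."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryBlobStore(BlobStore):
    """Demo backend. A hypothetical S3BlobStore or AzureBlobStore would
    implement the same two methods using the provider's SDK."""

    def __init__(self) -> None:
        self._data: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._data[key] = data

    def get(self, key: str) -> bytes:
        return self._data[key]


def save_cad_revision(store: BlobStore, part_id: str, rev: str, payload: bytes) -> None:
    # Application logic depends only on the abstraction, so moving from
    # AWS to Azure becomes a configuration change, not a rewrite.
    store.put(f"{part_id}/{rev}.step", payload)


store = InMemoryBlobStore()
save_cad_revision(store, "PRT-1001", "A", b"...STEP data...")
print(store.get("PRT-1001/A.step"))
```

With a structure like this, cloud portability becomes the cost of implementing one interface per provider instead of re-engineering the whole service layer.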

So, what is the solution, you may ask? The CIMdata article is clear about the answer:

The Aras PLM Platform provides a resilient platform for digital transformation initiatives and is currently deployed at leading industrial companies such as General Motors, Airbus, and Microsoft.

The commentary is about Aras, so it is no surprise that Aras is recommended – and for a good reason. I will come back to this later.

What caught my special attention is the way CIMdata describes the differentiation: future-proof architecture. CIMdata does not provide a definition of future-proof architecture, but it does give characteristics for an architecture assessment. Does the solution use a modern software development approach? Does the solution have a single data model, common IDs, extendable data types, and links? Does the solution have a common workflow engine? How easy is it to integrate with competitors’ products? Is it portable across different platforms? What is the effort to upgrade the system? (A small sketch of what a single data model with common IDs and extendable types can look like follows below.)
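
Here is a toy illustration of that idea – my own assumptions, not how Aras or any other platform actually stores data. Every object shares one ID scheme and one generic Item shape, and new types, properties, and links are added as data rather than as schema changes:

```python
import uuid
from dataclasses import dataclass, field

@dataclass
class Item:
    """One generic shape for everything: a common ID scheme plus
    properties and links added as data, not as schema changes."""
    item_type: str                                   # e.g. "Part", "Document", "ECO"
    id: str = field(default_factory=lambda: uuid.uuid4().hex)
    properties: dict = field(default_factory=dict)   # extendable attributes
    links: list = field(default_factory=list)        # (relation, target item id) pairs

part = Item("Part", properties={"number": "PRT-1001", "material": "AL-6061"})
drawing = Item("Document", properties={"title": "PRT-1001 Drawing"})
part.links.append(("documented_by", drawing.id))
print(part)
```

The attraction of such a model is that adding an ECO or a Requirement type does not require a database migration; the trade-off is that more validation logic moves into the application.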

Aras PLM has made very impressive progress over the last 10-12 years since its enterprise open-source pivot. The architecture assessment made by CIMdata reflects many characteristics that can be attributed only to Aras, since Aras is probably one of the very few traditional (not cloud) PLM platforms today that hasn’t been affected by M&A activity since its creation. You can check my blog about the new way Aras proposes to manage its own M&A – Will Aras lose their mojo or develop a new way to acquire PLM companies.

Reading the CIMdata commentary made me think about future-proofing as a strategy in software development. In the past, I developed a PDM system that was required to support future versions of CAD packages. While it might sound like a very sophisticated task, the ugly truth was that it boiled down to reaching an agreement about backward compatibility of future tables and API interfaces. Sometimes it backfired, and the “future” tables and data structures were never used.
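
One common tactic for that kind of agreement is a tolerant, versioned record format. Here is a minimal sketch under my own assumptions – the field names and read_part_record are hypothetical, not the actual tables from that PDM system:

```python
import json

def read_part_record(raw: str) -> dict:
    """Tolerant reader: ignore unknown fields and default missing ones,
    so records written by a newer version of the system still load."""
    record = json.loads(raw)
    known = {"schema_version", "number", "revision", "material"}
    part = {k: v for k, v in record.items() if k in known}
    part.setdefault("material", None)  # field added later; older records lack it
    return part

# A record written by a hypothetical future version with an extra field:
future_record = '{"schema_version": 3, "number": "PRT-1001", "revision": "B", "mass_kg": 1.2}'
print(read_part_record(future_record))  # mass_kg is ignored, material defaults
```

The reader ignores fields it does not know and defaults the ones it misses, so records written by a newer version still load – exactly the kind of backward-compatibility agreement I mentioned above.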

One of the biggest problems in software development is imaginary problems. I’ve seen many projects fail because software engineers worked to solve a problem that doesn’t exist. There are tons of examples: how to create the most flexible workflow engine (when 80% of workflows are the same), how to support multiple databases (when 80% of customers are running the same database), how to support multiple browsers (when 80% of customers are using the same browser).

So, what is future-proof PLM according to CIMdata? The future-proof hypothesis is built on the assumption that the next 20 years of manufacturing enterprise IT will be the same as the last 20 years. However, the truth is that we don’t know what will come during the coming two decades. Existing PLM systems were built not by following future-proof blueprints, but by acquiring customers who settled for a specific set of functions in the absence of alternatives. Some of these systems are still in use even though vendors stopped developing them almost a decade ago.

Future-proof differentiation focuses on how to replace existing systems at a time when PLM adoption is extremely low. Meanwhile, 70% of companies aren’t using any PLM system because they cannot wrap their heads around existing PDM/PLM paradigms; they are looking for something completely different from existing PLM systems.

In my view, Aras PLM history is actually a demonstration of how bad future-proofing can be as a strategy. The beginning of Aras was bumpy. Check this article:

Aras Innovator was first released in 2000 using the latest IT innovations and the well-established Microsoft enterprise stack. But it got lost in the shuffle after the dotcom collapse. Despite pleasing a few key customers, sales were sluggish until the company decided to make a radical shift. The company fired the sales staff and released Aras Innovator as open source, looking to capitalize on what they saw as the long-term trend of growth for open source enterprise solutions. In an unusual move, Microsoft embraced the open source Aras, giving it special treatment usually reserved for the largest proprietary vendors.

Aras’ innovation was a solid architecture, a free license, an open-source community, and the time to build awareness. Combined with aging existing PLM platforms, it created the Aras opportunity of today. Can you call that future-proof architecture? I doubt it…

The key to future-proofing is flexibility – not only technical flexibility, but the overall flexibility to adapt economically and technically to a specific market moment while addressing key pain points in the market. The key success factor was the growing adoption of the Aras platform by unhappy customers of existing PLM players.

The next decades of computing can be significantly different from the past 20 years. Look at the speed of change and you will see that we are going through a paradigm shift that actually started 20 years ago. New brands have demonstrated how modern infrastructure can help to acquire customers differently. The future of innovation can be different.

What is my conclusion? Flexibility and customer adoption are the keys to success. Breaking the existing status quo is the biggest problem. Computing architectures will evolve fast and can pleasantly surprise us. However, the customers using PLM solutions today and tomorrow will define the sustainability of platforms and the future-proofing of the systems. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.
