Almost 10 years ago, Gartner defined IT obsolescence management as a major emerging issue. Here is a passage from the old article about it:
By year-end 2010, more than one-third of all application projects will be driven by the need to deal with technology or skills obsolescence. “Our research with thousands of clients across multiple geographic locations and industries shows that most CIOs are struggling to cope with a set of portfolios in which an overwhelming percentage of the artifacts need to be retired and replaced within a comparatively short period of time — between 2008 and 2015,” said Dale Vecchio, research vice president at Gartner. “The scale of obsolescence in the set of portfolios is a major problem in its own right, but it is compounded by the lack of integrated planning capability within many IT management teams.”
And if you think about the main contributing factors — the lack of agility of IT systems and services in responding to business requests, increasing demand for integration, and the aging of deployed assets — you can see why obsolescence is a huge issue for manufacturing companies. The amount of old enterprise PLM software assets is growing. Consider the major PLM software providers: all of their PLM platforms were originally developed more than 10 years ago.
The problem of obsolescence in PLM software is hitting large companies — the companies that have invested large amounts of resources in PLM development over the last one to two decades. The Engineering.com article Boeing, Airbus and the Hardship of Dealing with PLM Obsolescence – TV Report gives an interesting perspective on the level of complexity of PLM obsolescence at the major commercial aircraft OEMs, Airbus and Boeing. Large manufacturing organizations are actively looking for a formula to maintain the existing environment while at the same time getting access to new development, technologies and projects. And this is an area full of conflicts and very costly mistakes for future development.
The dream scenario is explained by CIMdata’s Peter Bilello: “A key question is how these industrial organizations can cost-effectively move to newer technological capabilities. Have they installed and developed a PLM environment that can be upgraded reasonably quickly, at a reasonable cost and ensure that the data can come forward as well?”
As simple as it sounds, the problem is extremely complex. Participants in Verdi Ogewell’s TV report highlighted some of the elements that, in their view, can solve the problem — standards, a data-driven approach and layers. The last one appears to be Airbus’ strategy. Here is an interesting quote from Airbus’ Anders Romare:
So interoperability and openness are key, and Airbus’ Anders Romare is clear about the strategy they have to meet the harmonization requirements on the one hand, and the need to integrate new technologies on the other. “We try to work on two dimensions,” he said, adding that, “For the core PLM backbone, yes, we need still to harmonize across the company. It’s important to have one single backbone harmonized across the company. Then we try to build on top of those backbones, more agile layers where we can have faster development, faster cycles.” Basically, Airbus uses PTC’s Windchill as its PDM backbone, Aras PLM for supplier collaboration and Dassault Systèmes’ CATIA for CAD. SAP is the solution on the ERP and MES (Manufacturing Execution System) side.
I find these two passages very much in conflict. If the desire of the IT organization is to find an architecture that supports easy upgrades and data migration, then building horizontal layers is probably a mistake. As much as it seems like a way to unblock the Airbus IT organization to develop new agile solutions, the architecture is fundamentally wrong and will lock in Airbus product data even more in the future. Both layers are built on existing PLM backbones — PTC Windchill and Aras Innovator. Aras was developed a decade after Windchill and has some interesting technological elements, such as an open, XML-driven data model schema. But from a fundamental architecture perspective, both systems are perfect examples of monolithic PLM backbones that will follow a similar trajectory of development and obsolescence. While standards like PLCS and JT are very promising for making point-to-point integration easier, they will be helpless against the problem of two aging product data layers. In the future, the so-called “stable layer” will become a dead layer, and PLM architects will have to take lessons in data archeology to learn how to extract product data and reuse it in new applications.
What is my conclusion? Obsolescence is a big problem in PLM. While all PLM systems are old, solving the problem by combining older and newer applications can be a dead end. An agile product development layer on top of an existing “stable” model is a temporary patch that will solve the problem for a few years, or maybe a decade. Airbus and Boeing are talking about a 50-100 year data lifecycle, so 10 years is a very short period of time. By creating layers of relational data models and applications, Airbus, Boeing and other manufacturing companies can potentially create an even bigger problem in the future. A new agile facade of data layers will not solve the problem of the old data spaghetti called “stable data”. To solve the problem, the data architecture itself needs to change. There is no silver bullet solution. However, moving away from traditional horizontal layers toward a micro-service architecture and vertical components can be one of the first steps. Just my thoughts…
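To make the contrast concrete, here is a minimal Python sketch of the two approaches. All names (`SHARED_BACKBONE`, `PartRevisionService`, part numbers) are hypothetical illustrations, not taken from any real PLM product: a horizontal layer reads the backbone's shared schema directly and is coupled to it, while a vertical component owns its private storage and exposes only a narrow contract, so its internals can evolve (or be replaced) without breaking everything above it.

```python
# Hypothetical sketch: horizontal layer vs. vertical component.
# Not a real PLM API -- names and data are illustrative only.

# --- Horizontal layer: an "agile" app reading the backbone's tables.
SHARED_BACKBONE = {
    # shared relational model that every layer on top depends on
    "parts": {"P-100": {"rev": "B", "mass_kg": 12.4}},
}

def horizontal_lookup(part_id):
    # Coupled to the backbone's schema: if the "parts" table or the
    # "rev" column changes, every layer built on top breaks together.
    return SHARED_BACKBONE["parts"][part_id]["rev"]

# --- Vertical component: owns its data, exposes a narrow contract.
class PartRevisionService:
    """A vertical slice: private storage plus a small stable API."""

    def __init__(self):
        self._store = {}  # private; the schema can evolve independently

    def put(self, part_id, rev):
        self._store[part_id] = rev

    def current_revision(self, part_id):
        return self._store[part_id]
```

Usage: `PartRevisionService()` can swap its `_store` for any database later without callers noticing, whereas `horizontal_lookup` must migrate in lockstep with the shared schema — which is the lock-in the post argues against.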
Want to learn more about PLM? Check out my new PLM Book website.
Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.