What to learn from Ericsson PLM failure?

Verdi Ogewell, the Editor-in-Chief of VerkstadsForum PLM Magazine and ENGINEERING.com’s European correspondent, has published his next PLM failure bombshell – Telecom Giant Ericsson Halts Its PLM Project with Dassault’s 3DEXPERIENCE. If, like me, you read Verdi’s publications often, you’re not surprised. You can easily find other news similar to this one. Here is a short list:

Volkswagen’s Epic Challenge to synchronize PLM for its Truck Brands

Is Jaguar Land Rover About to Stumble on the Final iPLM Stretch?

A Big Win as Yamaha “Does a Daimler” and Chooses Siemens PLM

PLM at Jaguar Land Rover – The Moment of Truth for Dassault’s 3DEXPERIENCE Platform

The world of “big PLM” is full of drama. A novice reader unfamiliar with the PLM industry might think this is a big deal. It is indeed a big deal for the people involved in the project, but for the PLM industry it is not a surprise. Here are a few comments I captured from an open LinkedIn discussion:

Kevin Prendeville, Principal – Product Strategy & Lifecycle Management at Deloitte – Great learnings from this – it’s not the vision nor the SW – it’s how to transform the processes, people and data to get there.

Tim McLellan, Director, Systems Design – Entrepreneur, Engineer, Author, & Strategic PLM Leader – Not really a surprise. People, process, technology. People 1st, with a strong commitment and a truly shared long-term vision at all levels of the organization, are a must!

The failure rate of PLM projects is high. An Aras Corp white paper suggests a number of reasons why PLM projects fail. Read more here – PLM Frustrations – Why Do Many PLM Projects Fail? Aras suggests the biggest reasons for PLM project failures are a lack of flexibility, limited customization, and the fact that projects get stuck with no upgrade path.

The Engineering.com article notes that the complexity of customization and legacy data migration was one of the reasons the project was derailed from its original goals:

In general, it can be stated that the toughest problems associated with such a system swap are about migrating legacy data – information that is already in existing systems. This is a delicate and difficult task that partly concerns the quality of existing data and partly requires extensive translation and consulting efforts in connection with the migration of the legacy data.

“Every time we have tested the migrated data, all use cases failed,” says my source within Ericsson. “Not once has it been possible to migrate old data. This, the CIO [had] been fully aware of when he repeatedly confirmed that ’the timetable holds, we will deliver …’ This is an extremely important piece [of the project] and taking care of [data] history has always been high on the agenda.”

Legacy PLM Data Challenge

There is a real challenge behind the legacy data accumulated by existing systems. A business system such as PLM that has been in production for 10-15 years has produced a huge amount of data. Cleaning it and transferring it to a new system can be a life-threatening experience, which was confirmed by the “death toll” of the Ericsson/3DS implementation. While flexibility is key, I’m not sure pushing existing data into a new platform is a good step, because the functionality might not match.
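To make the data-quality side of this concrete, below is a minimal sketch of the kind of pre-migration audit that surfaces duplicate records and missing attributes before anything is moved. It is a toy example: the CSV export and the field names are assumptions for illustration, not a real PLM schema.

```python
import csv
from collections import Counter, defaultdict

# Hypothetical legacy PLM export: one row per item revision.
# The field names are illustrative assumptions, not a real schema.
REQUIRED_FIELDS = ["part_number", "revision", "description", "owner"]

def audit_legacy_export(path):
    """Report duplicate keys and missing attributes before migration."""
    key_counts = Counter()
    missing = defaultdict(int)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            key_counts[(row.get("part_number"), row.get("revision"))] += 1
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    missing[field] += 1
    duplicates = {key: n for key, n in key_counts.items() if n > 1}
    print(f"{len(duplicates)} duplicated part/revision keys")
    for field, n in missing.items():
        print(f"{n} records missing '{field}'")
    return duplicates, missing

# audit_legacy_export("legacy_plm_export.csv")  # hypothetical file
```

An audit like this does not fix anything, but it tells you early how much cleaning a migration will actually require.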

PLM Flexibility vs Changing People

One of the biggest challenges of a PLM implementation is to introduce a company to a business transformation path by changing the way the company works. So, can changing people be a more efficient way than over-customizing PLM? I captured an interesting comment in the LinkedIn discussion:

A common route when replacing legacy systems is to attempt to re-program the old system in the new platform. None of the PDM/PLM systems I have witnessed are optimal platforms for such an approach. Either the programming environment is not conducive to enhancement or you risk making upgrades virtually impossible. Best keep to “OOTB” – or as my previous CIO repeatedly reminded us “change the people, not the system”. Easy said, but sometimes, politically, not so easy to carry out …

Changing people can have consequences, but it might be a more efficient step than trying to reprogram a new PLM with old features. I’m sure not everyone will agree with such a statement. I have mixed feelings as well.

What is my conclusion? There is no simple conclusion today. Flexibility vs. out-of-the-box products – which one do you prefer? Over-customize a new PLM to follow old processes, or use a new system as an opportunity to clean up existing processes? Moving 25,000 people from one database to another is not a simple job. It is time to think about no-upgrade PLM systems. While a cloud environment is not an option for mega-size OEMs like Ericsson, there is an opportunity for OEM IT, together with the PLM vendor, to run a migration path. The latter is a very expensive step. But without this step, the current database-oriented, single-version-of-truth PLM paradigm is doomed. It can save a few mega-size PLMs, but it will do nothing to help 90% of manufacturing companies succeed with their PLM projects. There is a need for a new Beyond PLM paradigm. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.

  • Jim McKinney

    Did Ericsson have a 3rd-party PLM consultancy to help with this project? PLM vendors are notoriously poor at helping with large PLM migration activities. Planning for cultural change and data migration are two of the items at the top of any list; it is likely no one really gave them enough attention until it was too late. I think it is correct to consider whether old data should be migrated into the new system at all. There are other ways to access this data, and a full-fledged migration is often not a good idea.

  • Agree with Jim. In the past four years, I have been explaining that migrating old data – in many of the old OEMs this means relational-database/document-driven architectures migrating to object-oriented, data-driven environments – is mission impossible.

    Any PLM vendor will fail.

    I had this experience more than 10 years ago, when a mainframe application based on custom relational tables had to be migrated to SmarTeam (a highly customizable data model). Due to inconsistencies in the data – double data sets, inconsistent attributes – the migration took two years instead of two months. The technology was not the issue.

    Next, moving from document-driven to data-driven creates similar issues, as it is not possible to migrate the content of a document to data records. These topics have been discussed during PLM conferences (and ignored by vendors).

    My latest presentation at PI PLMx London addresses this topic (to be found on SlideShare) – it is about the difference between coordinated and connected.

    As long as we keep focusing on vendor capabilities/technologies only, instead of on the real potential of the transformation as a whole, we will keep making people happy with this kind of story, missing the real point.

  • David Martin

    Does anyone know what Ericsson was using before?

    From what I know about 3DEXPERIENCE, I would find it extremely challenging to migrate data from traditional systems to a database without files or folders.

  • beyondplm

    David, very good point. DS products are usually very specific. Although, I think, Ericsson ran Matrix before, and it is the foundation of 3DX.

  • beyondplm

    Jim, thanks for the comment! What is the alternative to a “full-fledged migration”?

  • beyondplm

    @josvoskuil:disqus indeed, migration between incompatible systems is hard, although Ericsson ran MatrixOne before. At the same time, Aras is demonstrating that the key to successful migration is a focus on technology and total flexibility, allowing the organization to do what it needs, instead of the out-of-the-box migration suggested by some commenters (change people!).

  • @beyondplm:disqus MatrixOne was one of the many systems – the majority are mainframe applications. I cannot disclose the details. I do not see Aras as being successful in migration – they create an overlay to connect to legacy systems, a more clever approach to create traceability of data, but you do not solve end-to-end digital continuity, as the data in the legacy systems is not reliable for sharing outside the silo.

    And of course – never change people! Change processes and ways of working, with the same or other people.

  • Ericsson is running a collection of mainframe applications combined with other systems – two Enovia instances in a highly customized mode – see my comment below on the incompatibility with any modern data model.

  • beyondplm

    @josvoskuil:disqus why do you think sharing data outside of the silo is not reliable? This approach is used in many enterprise systems. Actually, data lakes, federation, and many others do exactly the same – extract data, give it context, and share it. To a certain degree, you can think of Google Search as doing the same.
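    Here is a quick toy sketch of what I mean by “extract data, give it context, and share it” – index records in place and keep pointers back to the silos instead of migrating anything. The system names and record IDs below are invented for illustration:

```python
from collections import defaultdict

# Toy "connect, don't migrate" index: every search hit is a pointer
# back to the source system, not a migrated copy. Names are invented.
index = defaultdict(set)  # term -> {(system, record_id), ...}

def register(system, record_id, text):
    """Index a legacy record without moving it out of its silo."""
    for term in text.lower().split():
        index[term].add((system, record_id))

def search(term):
    """Return live pointers into the legacy systems."""
    return sorted(index.get(term.lower(), set()))

register("mainframe-A", "DOC-1042", "antenna bracket drawing rev C")
register("enovia-1", "PRT-7781", "bracket assembly for 5G antenna")
print(search("bracket"))
# [('enovia-1', 'PRT-7781'), ('mainframe-A', 'DOC-1042')]
```

    Of course, an index like this only provides a potential context; it does not make the silo data itself accurate.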

  • @beyondplm:disqus the main point I try to make in all my discussions is that data created in silos has, most of the time, not been designed for sharing. Content in a document does not have the same meaning as content in a database. Documents are managed as containers, and indexing, the way Google Search works, does not bring the unique context – it provides a potential context.
    In the connected approach you need accurate data to build a federation of information. If the data is not accurate, automation or a combined view of the data is not reliable.
    As there was no data-quality governance in the PLM domain in the past, this is where the pain of upgrades is the biggest.
    Current and future data sets are not compatible – therefore forget migration and try to connect as well as possible – this could be search indexing.

  • David Martin

    They had trouble migrating their data from Enovia to 3DEXPERIENCE? Wow. It’s worse than I thought.

  • beyondplm

    @josvoskuil:disqus I can see your point. But the chances you bring all silos in order are zero. A system can extract data from the silos, take the extracted data (accurate data, in your terms), connect it, and make valuable data sets, while keeping connections to the silos. This approach has proven valuable online. The best example I can think of is airline booking systems. Each airline booking service is a complex system, but Kayak, Expedia, and others can operate on top of these silos.
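    To illustrate the booking-site analogy, here is a toy federation layer: each silo is queried through its own adapter, and the merged view keeps the source of every record while the silos remain the systems of record. The adapters and data are invented:

```python
# Toy federation in the spirit of Kayak/Expedia: query each silo
# through an adapter, merge the answers, keep the source reference.
def query_erp(part):  # stand-in for a real ERP connector
    return [{"source": "erp", "part": part, "stock": 14}]

def query_pdm(part):  # stand-in for a real PDM connector
    return [{"source": "pdm", "part": part, "revision": "C"}]

ADAPTERS = [query_erp, query_pdm]

def federated_lookup(part):
    """One combined view; the silos stay authoritative."""
    records = []
    for adapter in ADAPTERS:
        records.extend(adapter(part))
    return {"part": part, "records": records}

print(federated_lookup("PRT-7781"))
```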

  • @creowindchill:disqus I don’t think it is worse. Enovia instances with a highly customized data model are in no way compatible with any standardized PLM system. For example, the highly customized Prisma environment at Daimler, based on old Teamcenter, is also an environment you do not want to migrate. Daimler even moved from CATIA to NX, as this was “easier” than touching Prisma.

  • @beyondplm:disqus here there is no problem, because these silos are already designed to be transactional. Data in these systems is reliable. The challenge in PLM comes, for example, where one drawing document describes various variants of a product. Whether everything defined in the drawing is correct and consistent is not certain.

    Imagine that during manufacturing of a variant an error is discovered and can be fixed during manufacturing. The drawing is redlined in the plant (disconnected from engineering), and most of the time the source is not updated – too expensive / time-consuming, as the problem has been fixed.
    I have plenty of real examples illustrating that PLM data is not 100% accurate – as long as you do not “translate” the information to database records, these errors will not be identified.

  • beyondplm

    Yes, I guess the trouble was converting from Mx1 to 3DX.

  • beyondplm

    @josvoskuil:disqus completely agree – data is not updated and clean. However, I can tell you that web data, in many cases, is also not transactional. So, I’m not saying it is an easy task, but the assumption that data will one day be cleaned and organized is a false one, IMO. Solutions are too big and too complex. So, actually, learning how to live with bad data can be an interesting opportunity.
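    Following Jos’s redline example above, “living with bad data” could start with simply detecting it. Here is a toy check that compares what the plant recorded against the engineering master and flags every mismatch; the part numbers and attributes are invented:

```python
# Toy discrepancy check: compare plant (redlined) values against the
# engineering master record. All identifiers are invented.
engineering_master = {
    "PRT-7781": {"hole_diameter_mm": 6.0, "material": "AL-6061"},
}
plant_redline = {
    "PRT-7781": {"hole_diameter_mm": 6.5, "material": "AL-6061"},
}

def find_discrepancies(master, redline):
    """Yield (part, attribute, master_value, plant_value) mismatches."""
    for part, attrs in redline.items():
        for attr, plant_value in attrs.items():
            master_value = master.get(part, {}).get(attr)
            if master_value != plant_value:
                yield part, attr, master_value, plant_value

for issue in find_discrepancies(engineering_master, plant_redline):
    print(issue)  # ('PRT-7781', 'hole_diameter_mm', 6.0, 6.5)
```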