Why don’t PLM vendors care much about integrating disparate tools and data?

This is the time of year when PLM analysts and journalists gaze into the crystal ball for predictions and analysis. Over the long weekend, my attention was caught by Joe Barkai’s 2016 trends, predictions and opportunities. For the last few years, PLM vendors have made a significant effort to expand the traditional portfolio of PLM capabilities with new functions, domains and industries. Therefore, the following passage about future PLM trajectories from Joe’s blog was quite refreshing:

Leading PLM companies will continue to redefine and stretch the traditional definition of PLM, breathing a new life into PLM concepts that until now were mostly on paper, and technologies that were unable to articulate a credible use case. I am afraid, however, that this will produce more of those dreaded PLM conference talks in which the opening sentence is “let me give you my definition of PLM.”

While 2016 will bring new functionality into the PLM portfolio, one of the most critical gaps in product innovation and development is not going to be closed. The rush of PLM companies to acquire functionality (often in the form of overvalued early stage companies with no customer base to speak of) only adds to the fragmentation of an already complex product development process, fueled by myriad tools and many Excel spreadsheets. This is an opportunity for PLM and ERP companies to establish leadership by integrating the disparate tools and synthesizing data from multiple enterprise tools and data repositories to optimize product related decisions.

While I agree with Joe that integrating disparate data and tools is a very painful problem, it made me think about why most PLM vendors haven’t done much about it. Clearly, from my experience, enterprise application integration is one of the most hated categories of software projects, even within enterprise software itself. It is something you don’t do because you want to; you do it because you have to.

PLM implementation projects are complex, and “integration” is the most complicated part of them. The traditional PLM implementation approach is holding back the PLM business. I’ve come to believe that the solution to enterprise integration problems is typically bottom-up as much as top-down. One programmer or a small implementation services team that knows the customer and the enterprise software involved in the project can do things that would be a science-fiction project for any PLM vendor trying to apply a top-down approach, especially when it comes to cost and delivery time.
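To make the bottom-up point concrete, here is a minimal sketch (in Python, with hypothetical field names such as `part_number` and `revision`, and in-memory lists standing in for real PLM and ERP exports) of the kind of one-off reconciliation script a small implementation team typically writes: compare part records between two systems and report what is missing or out of date.

```python
def diff_part_records(plm_parts, erp_items, key="part_number"):
    """Compare PLM part records against ERP items, keyed by part number.

    Returns (missing_in_erp, mismatched), where `mismatched` lists parts
    whose revision disagrees between the two systems.
    """
    # Index ERP items by part number for O(1) lookup.
    erp_by_key = {item[key]: item for item in erp_items}
    missing_in_erp, mismatched = [], []
    for part in plm_parts:
        item = erp_by_key.get(part[key])
        if item is None:
            missing_in_erp.append(part[key])
        elif part.get("revision") != item.get("revision"):
            mismatched.append((part[key], part.get("revision"), item.get("revision")))
    return missing_in_erp, mismatched

# Toy data standing in for exports from the two systems.
plm = [
    {"part_number": "P-100", "revision": "B"},
    {"part_number": "P-200", "revision": "A"},
]
erp = [
    {"part_number": "P-100", "revision": "A"},  # stale revision in ERP
]
missing, stale = diff_part_records(plm, erp)
print(missing)  # ['P-200']
print(stale)    # [('P-100', 'B', 'A')]
```

A script like this is trivial for a team that knows both data models, but generalizing it into a vendor-supplied, top-down integration product is where cost and delivery time explode.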

It made me think about most of the failed PLM and enterprise software integration projects I’ve seen before. There is one characteristic all these projects have in common: the buyers were not (going to be) the users of the software, especially when it comes to data and application integrations. Think about it: integration of disparate data sources, transactions between PLM, ERP and other databases, record updates. These are important, but unfortunately invisible and less glamorous parts of any implementation. At the same time, delivering a dashboard that shows development progress, or a cool product visualization, are examples of visible projects. The same happens with investment in PLM verticals and additional applications: new domains or applications are cool, bring visibility to project owners, and bring new customers and opportunities. In contrast, connecting broken data pipes between multiple databases is not very sexy.

What is my conclusion? Data and application integration is a very hard problem. When it comes to PLM integration projects, it seems much easier to solve the problem using a bottom-up approach and an implementation services team. It works… or at least has worked until now for all the large manufacturing companies spending mega dollars on PLM projects. It doesn’t work for smaller customers, but these customers are still not on board with the decision to implement PLM at all. It can be an even bigger problem for new cloud PLM implementations, especially those that are SaaS-based and have limited architecture and budget to invest in expensive integration initiatives. Just my thoughts…

Best, Oleg

  • Hi Oleg,

    Thanks for adding your perspective. As you point out, this is a difficult, unglamorous task, and quite often the challenge is more organizational than technical. To benefit from the integration of PLM-ERP data and taming the other tools and data sources (e.g. ALM) and the myriad spreadsheets, more stakeholders need to be involved, and the product development process & culture have to change. This is a hard sell, and from a PLM vendor’s point of view, just complicates the selling process…

    Product organizations need to do a better job in reducing fragmentation and in aggregating and synthesizing product lifecycle information from multiple sources to make better product lifecycle decisions. They need to get over the resistance (which you also mention) to make early investments in the process, knowing they will reap the benefits downstream from the point of investment. And they need to do it soon. Case in point is IoT, which must be integrated with the PLM process. See for example http://joebarkai.com/design-for-iot-or-design-by-iot.

  • beyondplm

    Joe, thanks for your comment! Yes, IoT clearly can be a driver for PLM companies to reconsider their current data integration approach. However, most of IoT data integration efforts I’ve seen are focusing on product usage and maintenance without real involvement into product design. Have you seen projects integrating IoT data into design process? I’d love to see one, especially if you have public references.

    Getting back to my original point about integration, it is much more practical for PLM vendors today to offload integration to the PLM implementation phase, done by service providers. That way it doesn’t slow down the sales process and removes complexity. But it can backfire… and it does for many PLM implementations. As long as the implementation has enough budget to cover the additional expenses of integration, it is not a problem. But as soon as it comes to SMB or cloud/SaaS, it might be an issue.

  • We are very early in the IoT journey. Many good ideas and anecdotal projects, the vast majority in service and maintenance (http://joebarkai.com/2016-trends-predictions-opportunities/). I am researching concepts in both “design for IoT” and “design by IoT” and am talking with major manufacturers exploring these areas. But admittedly this is early pioneering work…

  • beyondplm

    Joe, thanks for sharing the link! I agree – it is still very early in IoT development.

  • recPLM

    Great post Oleg.

    Current (leading) traditional PLM solutions are built by adding more and more functionalities, interfaces, import/export, synchronisations, … at the pace of their big $$$$$$ acquisitions.

    They usually didn’t take the time to step back and rethink their solutions. Maybe, a little shortsighted, they rush forward until they hit a wall?

    Best wishes for 2016.

  • beyondplm

    Thanks for your comment! I’d say it differently: large companies usually have a lot of time and money to experiment. They can acquire, try, throw away and bring in something new. There is no guarantee of not hitting the wall, but the rules of enterprise business are different from consumer. So, technically, it takes longer for enterprise software companies to crash. A large install base and maintenance revenues will keep them afloat.

  • The large companies have the resources on paper, but when you stop to think of the quantity, complexity, and frequency of these acquisitions and integrations (remember most happen quietly behind closed doors) the mind boggles. Then there’s the wrinkle of additional feature sets thanks to directed development from the largest customers. There are surprisingly few resources left to look at the overall flow and integrity of the product. The criticism lies in how the overall suites are marketed as fully integrated products, but often under the hood are anything but. Usually by the time a customer realizes the bitter reality, they’re already in too deep to do anything about it except keep pushing through and working some science fiction projects.

    Here’s a fun analogy. Think of it like a cruise ship, where you have multiple teams adding things to the boat during the voyage many of which change the direction of the boat. The captain should still keep the ship on course, but let’s assume he’s gotten lazy and is taking a nap in his suite. It’s advertised as a big vacation destination, but ultimately it’s a cruise to nowhere in particular.

    So a better question is: why does it have to be this way? It’s about the new shiny thing making sales of additional licenses. The vendors know that and act in the way the market demands. So the PLM market has kind of done this to itself, sadly.

  • beyondplm

    Ed, you reminded me of one of my old blog articles:

    PLM, viral sales and enterprise old-schoolers.

    http://beyondplm.com/2013/01/28/plm-viral-sales-and-enterprise-old-schoolers/

    Large companies rarely develop. They mostly maintain, integrate and consolidate.