This is the time of year when PLM analysts and journalists look into their crystal balls for future predictions and analysis. Over the long weekend, my attention was caught by Joe Barkai’s 2016 trends, predictions and opportunities. For the last few years, PLM vendors have made a significant effort to expand the traditional portfolio of PLM capabilities with new functions, domains and industries. Therefore, the following passage about future PLM trajectories from Joe’s blog was quite refreshing:
Leading PLM companies will continue to redefine and stretch the traditional definition of PLM, breathing a new life into PLM concepts that until now were mostly on paper, and technologies that were unable to articulate a credible use case. I am afraid, however, that this will produce more of those dreaded PLM conference talks in which the opening sentence is “let me give you my definition of PLM.”
While 2016 will bring new functionality into the PLM portfolio, one of the most critical gaps in product innovation and development is not going to be closed. The rush of PLM companies to acquire functionality (often in the form of overvalued early stage companies with no customer base to speak of) only adds to the fragmentation of an already complex product development process, fueled by myriad tools and many Excel spreadsheets. This is an opportunity for PLM and ERP companies to establish leadership by integrating the disparate tools and synthesizing data from multiple enterprise tools and data repositories to optimize product related decisions.
While I agree with Joe that integrating disparate data and tools is a very painful problem, it made me think about why most PLM vendors haven’t done much about it. Clearly, from my experience, enterprise application integration is one of the most hated categories of software projects, even within enterprise software itself. It is something you don’t do unless you have to.
PLM implementation projects are complex, and integration is the most complicated part of them. The traditional PLM implementation approach is holding back the PLM business. I’ve come to believe that the solution to enterprise integration problems is typically bottom-up as much as top-down. One programmer or a small implementation services team that knows the customer and the enterprise software involved in the project can do things that would be a real science-fiction project for any PLM vendor trying to apply a top-down approach, especially when it comes to cost and delivery time.
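To make this concrete, here is a minimal sketch of the kind of point-to-point script such a small team typically writes to push released PLM items into ERP. It is written in Python against hypothetical REST endpoints; the URLs, field names and authentication are placeholders, since every real PLM/ERP pair exposes its own API. The customer-specific field mapping in the middle of the loop is exactly the part that is hard for a vendor to generalize top-down.

```python
# A minimal sketch of a "bottom-up" PLM-to-ERP sync script.
# The endpoints, field names and auth token are hypothetical placeholders.
import requests

PLM_URL = "https://plm.example.com/api/items"      # hypothetical PLM endpoint
ERP_URL = "https://erp.example.com/api/materials"  # hypothetical ERP endpoint
HEADERS = {"Authorization": "Bearer <token>"}      # replace with real credentials

def sync_released_items():
    """Copy newly released PLM items into ERP as material master records."""
    # 1. Pull items that reached the 'Released' state in PLM.
    released = requests.get(
        PLM_URL, params={"state": "Released"}, headers=HEADERS, timeout=30
    ).json()

    for item in released:
        # 2. Map the PLM record onto the fields ERP expects -- the
        #    customer-specific part of any integration project.
        material = {
            "number": item["part_number"],
            "description": item["title"],
            "revision": item["revision"],
            "uom": item.get("unit_of_measure", "EA"),
        }
        # 3. Create or update the ERP record (upsert keyed on part number).
        resp = requests.put(
            f"{ERP_URL}/{material['number']}",
            json=material, headers=HEADERS, timeout=30,
        )
        resp.raise_for_status()
        print(f"Synced {material['number']} rev {material['revision']}")

if __name__ == "__main__":
    sync_released_items()
```

A script like this can be written and deployed in days, but it encodes knowledge about one specific customer environment, which is why it rarely turns into a generic vendor product.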
It made me think about most of the failed PLM and enterprise software integration projects I’ve seen before. There is one characteristic all these projects have in common: the buyers were not (going to be) the users of the software, especially when it comes to data and application integrations. Think about it: integration of disparate data sources, transactions between PLM, ERP and other databases, record updates. These are important, but unfortunately invisible and less glamorous parts of any implementation. At the same time, bringing a dashboard that shows development progress or showing cool product visualization are examples of visible projects. The same happens with investment in PLM verticals and additional applications: new domains or applications are cool, can bring visibility to project owners, and can bring in new customers and opportunities. In contrast, connecting broken data pipes between multiple databases is not very sexy.
What is my conclusion? Data and application integration is a very hard problem. When it comes to PLM integration projects, it seems much easier to solve the problem with a bottom-up approach and an implementation services team. It works… or at least has worked until now for the large manufacturing companies spending mega dollars on PLM projects. It doesn’t work for smaller customers, but those customers are mostly not yet on board with the decision to implement PLM at all. It can be an even bigger problem for new cloud PLM implementations, especially those that are SaaS-based and have limited architecture and budget to invest in expensive integration initiatives. Just my thoughts…
Best, Oleg