Why the future of PLM Integration will go beyond moving data from A to B

A ProSTEP article caught my attention with a very interesting title – How do you efficiently implement the digital integration of PLM, ALM, MES, ERP, and other legacy systems? Amid the overall hype about moving existing technologies to “digital” technologies, I was curious to learn what is so special about digital integrations between PLM, ALM, MES, and ERP systems. These systems have existed for at least two to three decades, and most large enterprise manufacturing companies use them. Integration is one of the most painful topics for these large companies. I’ve been following ProSTEP’s work for quite some time, and OpenPDM has been doing integration work for years.

The ProSTEP article talks about the challenges of unsynchronized data and the problems of point-to-point integrations:

Integrating product data is a challenge for even the most proficient company. Full centralization ‘locks’ you in and prevents future viability of your PLM. Productivity suffers when an end-user is limited to only one tool, which usually requires cumbersome manual integrations.

Point-to-point integrations do not scale properly and eventually leave you with budget constraints and limited ability to respond to change. When data is duplicated throughout multiple systems, compatibility issues and unsynchronized processing time will most likely occur.

Incompatible, unsynchronized data leads to inconsistencies in product development and the rework of incorrectly manufactured parts. This additional work means that you will spend more money, more time and more resources on an integration.

As PLM vendors work to keep your data siloed in their system, how can you work efficiently when your data is stored elsewhere: in other business units, other departments – perhaps even other countries?

An interesting twist is bringing connected products and blockchain technology into the integration problem, but the article doesn’t say exactly why they have an impact on the integration of data between PLM systems developed 20 years ago.

Smart connected products and blockchain technology have revolutionized product development. These tools provide first-rate security and real-time data access, but they also add a level of data management to your product development processes.

Check out this video provided by ProSTEP. I think it is a superb marketing job, demonstrating moving concentric nuts with data plugs. Fantastic marketing.


This video made me think about the new challenges of data integration these days. Here are some aspects of technological change in IT as we move into the world of public and private cloud systems. I’m curious how ProSTEP and other vendors will answer these challenges.

1- Data distribution

I can see an increased level of data distribution across data centers, public and private cloud storage systems, and online services, and it increases the level of complexity. Centralized data mapping can become a challenge, as access to data storage will be limited to specific REST APIs and services. Even in situations where data is still accessible in the old-fashioned way, synchronizing a large amount of data between multiple systems can hardly be achieved.
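To make this concrete, here is a minimal Python sketch of what API-gated access looks like in this distributed world: instead of reading a vendor database directly, an integration pages through a REST API and maps records into a neutral schema. The endpoint URL, token, and field names are hypothetical assumptions for illustration, not any real vendor’s API.

```python
import requests  # pip install requests

# Hypothetical cloud PLM endpoint and token -- illustration only.
PLM_API = "https://plm.example.com/api/v1/items"
TOKEN = "YOUR_TOKEN"

def fetch_items(page_size=100):
    """Page through the REST API; direct database access is no longer an option."""
    offset = 0
    while True:
        resp = requests.get(
            PLM_API,
            headers={"Authorization": f"Bearer {TOKEN}"},
            params={"offset": offset, "limit": page_size},
            timeout=30,
        )
        resp.raise_for_status()
        batch = resp.json().get("items", [])
        if not batch:
            return
        yield from batch
        offset += page_size

def to_neutral(item):
    """Map vendor-specific fields into a neutral schema other systems can consume."""
    return {
        "part_number": item["number"],
        "revision": item.get("rev", "A"),
        "description": item.get("name", ""),
    }

if __name__ == "__main__":
    for record in fetch_items():
        print(to_neutral(record))
```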

2- Real-time data

The demand to maintain real-time data accuracy is another challenge. Traditional PLM-PDM data batching and synchronization transfers might become a thing of the past. While there are many scenarios where traditional data exchange can work, we are going to see a growing demand for maintaining a new level of data availability across silos.
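As a contrast to overnight batch transfers, here is a minimal sketch of an event-driven alternative, assuming the PLM system can emit a webhook on every item change. The endpoint path and payload shape are assumptions for illustration.

```python
from flask import Flask, request  # pip install flask

app = Flask(__name__)

@app.route("/plm-events", methods=["POST"])
def on_plm_change():
    """Hypothetical webhook: the PLM system calls this on each item change,
    so consumers see updates in near real time instead of a nightly batch."""
    event = request.get_json()
    propagate(event["item_id"], event["changed_fields"])
    return "", 204

def propagate(item_id, fields):
    # Placeholder: forward the single changed record to ERP, MES, analytics, etc.
    print(f"propagating {item_id}: {fields}")

if __name__ == "__main__":
    app.run(port=8080)
```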

3- Data ownership and speed of change

Back in the good old days of PLM, the IT departments of large enterprise companies owned the entire data management stack. However, I can see some changes in the way data is owned by a single company. More distribution in manufacturing and a growing number of online platforms will make data integration scenarios more complicated and intertwined.

What is my conclusion? In the early days of point-to-point integrations, data flowed freely from point A to point B. Then more centralization came into place, and multiple EAI (enterprise application integration) options were introduced, such as data hubs and integration middleware systems capable of centralizing the routing of data from A to B using mapping and other services. I think we are coming to the next level of integration demands, where data can be efficiently shared and consumed by multiple services. New demands, new requirements, and potentially new technologies and providers. Everyone else will have to catch up. Just my thoughts…
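To illustrate the hub idea mentioned above, here is a toy Python sketch of centralized mapping and routing: each source system’s fields are mapped to a neutral schema, and the hub fans changes out to any number of subscribers. All system names and field mappings are made up for the example.

```python
# Toy EAI-style hub: map source records to a neutral schema, then fan out.
FIELD_MAPS = {
    "PLM": {"number": "part_number", "rev": "revision"},
    "ERP": {"material": "part_number", "version": "revision"},
}

SUBSCRIBERS = []  # downstream callbacks registered with the hub

def subscribe(callback):
    SUBSCRIBERS.append(callback)

def publish(source_system, record):
    """Translate a source record via its field map, then deliver to all subscribers."""
    mapping = FIELD_MAPS[source_system]
    neutral = {dst: record[src] for src, dst in mapping.items() if src in record}
    for deliver in SUBSCRIBERS:
        deliver(neutral)

# Usage: an MES consumer receives every mapped change, whatever the source system.
subscribe(lambda rec: print("MES received:", rec))
publish("PLM", {"number": "P-100", "rev": "B"})
publish("ERP", {"material": "P-100", "version": "C"})
```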

Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.
