What are top 3 PLM Hadoop Use Cases?

If you are technology savvy these days, you probably know what Apache Hadoop is. It originally came to us from the magic world of Google and was derived from Google MapReduce and the Google File System. In a nutshell, Hadoop is a framework that allows you to split the processing of huge amounts of data. It consists of two parts – the Hadoop File System (HDFS) and MapReduce. The role of HDFS is to take a large file and split it into small chunks of data stored on many servers. MapReduce is a framework that distributes the work between many processes and, by doing so, processes vast amounts of data in parallel by sharing the work to be completed between multiple servers.
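To make the model concrete, here is a minimal sketch of the map and reduce phases using the classic word-count example. This is plain Python simulating the idea, not Hadoop's actual API (which is Java-based); the chunk data and function names are illustrative assumptions.

```python
from collections import defaultdict

def map_phase(chunk):
    # Mapper: emit a (key, 1) pair for each word in a chunk of text.
    return [(word.lower(), 1) for word in chunk.split()]

def reduce_phase(pairs):
    # Reducer: group the emitted pairs by key and sum the counts.
    counts = defaultdict(int)
    for word, n in pairs:
        counts[word] += n
    return dict(counts)

# In Hadoop, HDFS splits a large file into blocks stored on different
# servers; here two in-memory chunks stand in for those blocks.
chunks = ["supply chain data", "chain data data"]

# Mappers run in parallel over the chunks; their outputs are merged
# and handed to the reducer.
mapped = [pair for chunk in chunks for pair in map_phase(chunk)]
print(reduce_phase(mapped))  # {'supply': 1, 'chain': 2, 'data': 3}
```

The key point is that each mapper only needs its own chunk, so the map phase scales out across as many servers as hold the data.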

Hadoop became very popular, and interest in it keeps growing. The availability of computing power from services like Amazon EC2, combined with the power of Hadoop, has the potential to unlock many scenarios related to information analytics and data processing.

A few days ago, I had a chance to listen to an analyst roundtable – Top 5 Innovations for Hadoop in the Enterprise. The discussion was about the booming interest in Hadoop, who is using it, and the potential applications of Hadoop in enterprise organizations related to hidden data. I captured the following opportunities:

Supply Chain optimization

Product Quality investigation

Workflow /Business Rules optimization

Visualization / Hypotheses around data

Integration and automatic data discovery

High-Availability, data replication and global data access

It made me think about the potential of Hadoop in PLM implementations and how PLM vendors could leverage the power of Hadoop for product development and manufacturing.

1. Data high availability

The availability of data is becoming very critical nowadays. Globally distributed teams, disparate data sources, networks and mobile access – all these elements together raise the question of making data available to the right people in the organization. In my view, Hadoop's ability to process large volumes of data can solve many problems of data distribution and replication.

2. Product Quality Investigation

The demand for product quality is increasing significantly nowadays. However, very often, investigating a quality issue is not a simple data task. It requires processing large volumes of data coming from development, manufacturing, the supply chain, customer relationship management systems and other data sources. In my view, Hadoop can open new horizons in how product quality issues are analyzed.

3. Supply Chain optimization

The supply chain is one of the most critical factors allowing manufacturing companies to optimize cost and performance. The streams of data related to internal and external suppliers' performance are extremely complex. Capturing data about supply chain operations and using it for optimization is a very interesting opportunity.

What is my conclusion? The potential hidden in data is huge. Hadoop unlocks the potential we haven’t had a chance to use before. Efficient data processing algorithms and data intelligence will be driving manufacturing companies for the next decade. Just my thoughts…

Best, Oleg
