PLM big data projects can fail to deliver fast ROI

Data is the new oil. Businesses are learning this every day. For the last 5-6 years, many businesses have moved from ignoring their data assets to hysterically collecting every piece of information. A few months ago, I shared some of my thoughts in the article – Big data expectation and reality.

There are several reasons manufacturers can become primary beneficiaries of the big data boom. Manufacturers are uniquely positioned to benefit from big data – every industry and individual is touched by manufacturing. Manufacturers were among the first industries to make wide data collection a standard practice, with many examples from automotive, aerospace and other industries. Manufacturing companies typically don’t face data collection barriers. Whether they know it or not, many consumers readily provide valuable data to manufacturers on a daily basis. The opportunities driven by big data include improving product quality, discovering new designs for existing products and finding new product opportunities.

PLM vendors embarked on big data projects and some of them reported interesting results. One of the most notable I found is the Siemens PLM big data analytics project. Navigate here to read more. Here is an interesting passage from the Siemens PLM website.

Your customers demand more intelligent products and intelligent products demand product intelligence. Product data scattered throughout a complex, global value chain that spans from suppliers to customer experience prevents comprehensive data analysis. Big data analytics gathers the scattered data and absorbs the volume required for complete analysis. But you need trustworthy, big data analytics that cleanse, fuse, rapidly search and analyze your big data in a singular platform to provide the critical product intelligence. Omneo delivers this transformative big data analytics technology and generates the product intelligence that will revolutionize the way your business operates.

It is also related to the Omneo project I’ve been blogging about earlier – Siemens PLM cloud services and big data and Why Big data opportunity for product information is huge.

Unfortunately, for most manufacturing companies, big data is still a big dream and a research project. To turn a big data project into reality, companies should develop environments that support continuous analytics, not a one-time effort to collect data and make it usable. And this is not a simple task.

The TechCrunch article Why the promise of big data hasn’t delivered yet can give you an interesting perspective on the challenges of enterprise big data projects. Here is my favorite passage:

All that’s happened is that technological innovations in data handling capability (made by companies like Google to deal with the scale and complexity of Web 2.0) temporarily leapt ahead of our progress in learning how to apply them — progress we make through experimentation. In the interim, firms have defaulted to leveraging big data in exactly the same way they previously used small data: for reporting and business intelligence. Having invested in purpose-built tools to analyze data at scale, they’ve been rewarded with cool interactive dashboards visualizing it. These are basically auto-generated charts, conspicuously similar to the manually created Excel and PowerPoint reports executives were staring at back in 2005, but far prettier and costlier. It’s easy to see why this approach hasn’t quite delivered on the big data promise.

And one of the core reasons is related to… people.

Big businesses have absorbed Google-style tech, but are only just beginning to adopt Google-style thinking alongside it. Algorithms now detect when drilling equipment in oil fields is about to fail based on thousands of sensor data points, enabling “predictive maintenance.” Imagine if, instead of applying machine learning to the problem, analysts had compiled these complex data sets into summary reports and tried to divine “insights” about why the equipment breaks so they could attempt to stop it from happening.

The beauty of predictive algorithms is that they don’t need to understand the cause and effect behind statistical relationships in order to work incredibly well in practice. For an enterprise to glean the benefits of prediction, it must first give up trying to deduce why things are a certain way, and start trusting the lines of code which tell us that they are. This requires a cultural shift, and all new technologies encounter initial mistrust.
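The predictive-maintenance idea from the quote above can be sketched in a few lines, heavily simplified: the model flags abnormal sensor readings purely from their statistical deviation, with no attempt to explain why the equipment breaks. All names, numbers and the failure threshold here are hypothetical, not from any real system.

```python
import random
import statistics

# Hypothetical sketch: flag drilling equipment likely to fail based only
# on statistical deviation in a sensor reading. The model never
# "understands" the cause of failure -- it just trusts the numbers.

random.seed(42)

# Simulated history of readings from healthy equipment
# (e.g. vibration amplitude in arbitrary units).
healthy = [random.gauss(1.0, 0.1) for _ in range(1000)]

mean = statistics.mean(healthy)
stdev = statistics.stdev(healthy)

def needs_maintenance(reading, threshold=3.0):
    """Flag a reading whose z-score against healthy history exceeds the threshold."""
    return abs(reading - mean) / stdev > threshold

print(needs_maintenance(1.05))  # typical reading -> False
print(needs_maintenance(1.90))  # strong deviation -> True
```

A real deployment would use thousands of sensor channels and a learned model rather than a single z-score, but the cultural point is the same: the alert is actionable without a causal story behind it.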

What is my conclusion? To make big data projects work for enterprises, technology and human trust should come together to deliver the results. On the technological side, multiple environments need to be connected to existing data management and data processing infrastructure, creating an automated pipeline from data scientists to big data engineers and, finally, to product engineers working on specific big data solutions. But, at the same time, companies should change their processes of working with data and making decisions based on it. And this is a very hard thing to do. So, don’t count on fast ROI for PLM big data projects – they might require an even longer adoption cycle. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.
