PLM and industrial internet trend

by Oleg on January 10, 2014 · 4 comments


Management of product data has always been the first and most important imperative for PLM solutions. Depending on vendor strategy and various historical reasons, vendors focus on different dimensions of data – CAD design, bill of materials, manufacturing data, supply chain, etc. Regardless of priority and marketing differentiation, any PLM solution today is trying to cover all of the data dimensions I mentioned above.

The ease and flexibility of data management is what makes some PLM solutions shine brighter than others. At the end of the day, customers expect a PLM solution to provide an out-of-the-box yet flexible data model to support CAD, Bill of Materials, Part, ECO, Simulation and sometimes other data as well. The last one (Simulation) was actually a very challenging piece for PLM vendors. Managing a significant amount of simulation data together with CAD and BOM data is not a simple task. Some vendors built simulation process management solutions for that purpose.
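To make the "out-of-the-box yet flexible data model" idea concrete, here is a minimal sketch in Python. The class and field names are illustrative only, not taken from any particular PLM product:

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Part:
    number: str
    revision: str
    # "flexible" attributes live in an open key/value bag rather than fixed columns
    attributes: Dict[str, str] = field(default_factory=dict)

@dataclass
class BOMLine:
    parent: Part
    child: Part
    quantity: float
    find_number: int

@dataclass
class ECO:
    eco_id: str
    description: str
    affected_parts: List[Part] = field(default_factory=list)
    status: str = "Draft"

# A pump assembly with one BOM line and an open change order
pump = Part("P-1000", "A", {"material": "steel"})
impeller = Part("P-1001", "A")
bom = [BOMLine(parent=pump, child=impeller, quantity=1, find_number=10)]
eco = ECO("ECO-42", "Change impeller material", affected_parts=[impeller])
```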

However, PLM vendors might be caught off guard by something unknown and unexpected. A new tsunami of data is expected in the manufacturing world. Yesterday, I was talking about the IoT trend here. Earlier this morning I was drinking my coffee and skimming the Manufacturing Trends to Watch in 2014 article. One of them caught my attention – The ‘Industrial Internet’ Will Flourish. Here is the passage I especially liked:

If you think the data generated by today’s back office, MES, control, supply chain, and warehouse management systems is overwhelming, just wait. Increasingly, manufactured products from cars to airplane engines to medical devices are being outfitted with sensors and Internet connectivity that allow them to broadcast back to manufacturers information on things like how they’re being used and why they broke, and when they need to be serviced. In fact, it’s estimated that, by 2020, 40% of all data generated will come from such sensors. GE calls this trend the Industrial Internet and estimates that it will add between $10 trillion and $15 trillion to global GDP in coming years.

It made me think that PLM data architecture can be challenged by a wave of data comparable to Google and Facebook scale. To process, store, access and analyze this data will take time and resources. Traditional SQL databases will probably not be an ideal solution, which brings me back to my writeup about PLM and Data Management in 21st century.
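A back-of-the-envelope sketch shows why – every number below is an assumption for illustration, not a measured figure – even a modest fleet of connected products generates more data per month than many PLM databases hold in total:

```python
# Back-of-the-envelope sketch of the sensor-data wave; all inputs are assumed values.
devices = 100_000                        # connected products in the field (assumed)
readings_per_day = 24 * 60 * 60 // 10    # one reading every 10 seconds (assumed)
payload_bytes = 200                      # sensor values + identifiers per reading (assumed)

bytes_per_day = devices * readings_per_day * payload_bytes
gb_per_day = bytes_per_day / 1024**3
tb_per_month = bytes_per_day * 30 / 1024**4

print(f"~{gb_per_day:.0f} GB/day, ~{tb_per_month:.1f} TB/month")
# ~161 GB/day, ~4.7 TB/month -- compare with a typical PLM database of a few tens of GB in total
```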

What is my conclusion? The amount of data is growing exponentially. Soon we will be talking about yottabytes of data. The industrial internet should be a wake-up call for many PLM vendors to start thinking about future data architectures. Just my thoughts…

Best, Oleg

* image credit to trainordaviesdesign.com

  • pgarrish

    Being blunt, the volume of data in even a big PLM installation is pretty small. A few tens of GB in the database, maybe a few hundred GB of CAD models… When you start looking at sensor data collection and have to deal with maybe TBs a MONTH of new data, it’s clearly a very different environment. Personally, I don’t think PLM is the place to put this data or analyse it, BUT PLM is certainly the place to capture the results of the analysis and respond to the knowledge. If you consider a chemical plant with hundreds or thousands of sensored valves, pumps etc., there is a clear link between the actual performance of the components and the design performance. That may be a mismatch in either a negative (failing to meet requirements) or positive (over-performing) direction – both of which could result in design change, plus the actual performance levels should be in the ‘MRO’ system (which may be part of the PLM toolset). The number-crunching involved in collecting and analysing that volume of data is way beyond a PLM installation, but there is clearly a need to interface the systems at some level so as not to lose the knowledge the data collection and analysis can uncover.

  • beyondplm

    Paul, I agree – PLM databases are VERY small (especially if you take CAD data out of scope). The question I’m asking is not “what database” to put product activity tracking information in, but how to “connect it” to the right product requirements, BOMs and other places it might impact. Best, Oleg

  • pgarrish

    Cheers Oleg. I guess the point I was trying to get across is that because of the differences in scale and processing, we need to think about what level of connection makes sense – linking the raw data to the PLM record (e.g. holding sensor readings against an ‘as-maintained’ BoM) is not helpful, BUT… it’s this raw data that’s needed to (for example) calculate MTBFs and hence derive maintenance schedules (a minimal sketch of that calculation follows this comment). I think we PLM ‘evangelists’ have a tendency to want everything in the PLM system, but there is really no way to do this with genuine Big Data, so we need to think about what we need, where it comes from and, as you say, how best to link it. This tends to live very much in the MRO/As-Maintained world – not a traditional PLM strength – but very much where the requirements are borne out (or not!) and they are very definitely part of the PLM landscape.
    I think there are two levels of linkage to be considered – the obvious ‘actual’ performance of an asset in the field, e.g. actual failure stats for serialised equipment; but also the aggregated results for a CI – e.g. the overall performance of the model of pump in use in a specific design location – does it meet requirements? It starts to raise one very interesting question (of many) – do we need to serialise everything now we can collect data about it? I can just imagine the response to that suggestion in most engineering departments :-)
    I suppose a left field question here is whether you could run PLM on a Big Data technology – say Teradata, Netezza? After all, PLM doesn’t really require very fast response times… and if you CAN co-locate the systems, that opens up some very interesting reporting and analytic opportunities which current PLM systems cannot offer.
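    As an illustration of that MTBF point – all serial numbers, dates and the requirement value below are made up – the aggregated summary, not the raw sensor stream, is what would flow back into the PLM/MRO record:

```python
from datetime import datetime
from statistics import mean

# Failure events per serialised unit, as they might arrive from an Industrial
# Internet data pipeline. Serial numbers, dates and thresholds are made up.
failure_events = {
    "PUMP-SN-001": ["2013-02-01", "2013-06-15", "2013-11-03"],
    "PUMP-SN-002": ["2013-04-20", "2013-12-30"],
}
design_mtbf_hours = 2500.0   # requirement that would be held in PLM (assumed value)

def mtbf_hours(dates):
    """Mean time between consecutive failures, in hours."""
    ts = sorted(datetime.fromisoformat(d) for d in dates)
    gaps = [(b - a).total_seconds() / 3600 for a, b in zip(ts, ts[1:])]
    return mean(gaps) if gaps else None

# The small aggregated result, not the raw stream, is what gets written back
# against the configuration item in the PLM / MRO record.
observed = [m for m in (mtbf_hours(d) for d in failure_events.values()) if m is not None]
fleet_mtbf = mean(observed)
verdict = "meets" if fleet_mtbf >= design_mtbf_hours else "misses"
print(f"Observed fleet MTBF: {fleet_mtbf:.0f} h ({verdict} the {design_mtbf_hours:.0f} h requirement)")
```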

  • beyondplm

    Paul, you are right… historically, PLM vendors and evangelists have favored placing everything into a single PLM database. IMHO, this is still how everybody explains PLM concepts, but this approach is getting old. It is time to start looking at the data management tech stack as a toolset – new NoSQL and other (big data) technologies are available to manage and link massive amounts of data.
