A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

Will PLM vendors dig into Big Data?

Oleg
18 May, 2012 | 3 min read

Big data is a hyped trend these days. Many people use the term for different purposes and in different situations. Here is the problem of big data in a nutshell, as I see it. The data is growing. It is growing inside organizations and outside of organizational boundaries. It is growing because of application complexity and implementation complexity. My take is that each time we face “data problems” that cannot be solved in a traditional way, the “big data” discussion comes up. To confirm that, take a look at the definition of Big Data you can find in Wikipedia:

In information technology, big data consists of data sets that grow so large and complex that they become awkward to work with using on-hand database management tools. Difficulties include capture, storage, search, sharing, analytics, and visualizing.

So, I wanted to come up with some examples of situations where the “big data use case” is real and can bring significant value to manufacturing organizations. My attention was caught by a report made by SAS – Data Equity: Unlocking the Value of Big Data. You can grab a copy of the report by registering via this link. Download your copy. I’m sure you will find it interesting. Here is a very good explanation of why big data is becoming important.

Big data is becoming an increasingly important asset to draw upon: large volumes of highly detailed data from the various strands of a business provide the opportunity to deliver significant financial and economic benefits to firms and consumers. The advent of big data analytics in recent years has made it easier to capitalise on the wealth of historic and real-time data generated through supply chains, production processes and customer behaviours.

Big data can bring value. This is what you can learn from the SAS report. You can see it in the chart SAS presented showing the big data forecast to 2017 (see below).

Thinking about PLM and the impact on specific industry sectors, the example of a supply chain is very appealing. The data in a supply chain is getting really messy. Here is a very insightful take on supply chain and big data from the same SAS report.

Optimal inventory levels may be computed, through analytics accounting for product lifecycles, lead times, location attributes and forecasted demand levels. The sharing of big data with upstream and downstream units in the supply chain, or vertical data agglomeration, can guide enterprises seeking to avoid inefficiencies arising from incomplete information, helping to achieve demand-driven supply and just-in-time (JIT) delivery processes.
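
To make the kind of analytics the quote describes a bit more concrete, here is a minimal sketch, not taken from the SAS report, of a classic reorder-point calculation driven by a demand forecast, lead time and a target service level. The SKU demand figures, lead time and service level below are illustrative assumptions only.

```python
# A minimal sketch of reorder-point analytics for one SKU at one location.
# All figures are illustrative placeholders, not data from the SAS report.
from math import sqrt
from statistics import mean, stdev

# Hypothetical daily demand forecast (units/day)
forecast = [120, 135, 110, 140, 128, 133, 125, 118, 142, 130]

lead_time_days = 5   # assumed supplier lead time
service_z = 1.65     # z-score for an assumed ~95% service level target

avg_demand = mean(forecast)
demand_std = stdev(forecast)

# Classic model: cover expected demand over the lead time, plus safety
# stock to absorb demand variability during that window.
safety_stock = service_z * demand_std * sqrt(lead_time_days)
reorder_point = avg_demand * lead_time_days + safety_stock

print(f"Average daily demand: {avg_demand:.1f}")
print(f"Safety stock:         {safety_stock:.1f}")
print(f"Reorder point:        {reorder_point:.1f}")
```

In practice, the “big data” part is running this kind of calculation across millions of SKU/location combinations with real-time demand signals shared up and down the supply chain, which is exactly where traditional tools start to struggle.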

Why is big data complicated, and why should software vendors consider it? Here is an interesting quote from the report that explains it:

A major obstacle to undertaking big data analytics is the level of technical skill required to operate such systems effectively. Although software solutions for tackling big data continue to become more user-friendly, they have not yet reached the stage where no specialist knowledge is necessary. The requisite skills for big data analysis are above those required for traditional data mining, and the cost of hiring big data specialists can be prohibitive for many firms.

Big Data and PLM vendors

I haven’t seen PLM vendors providing examples or mentioning big data. I think the fundamental problem is technology. The majority of PLM software vendors are running PLM products based on platforms developed 10-15 years ago. All these solutions rely on RDBMS. As we learned, RDBMS doesn’t scale to the level of big data. An interesting exception is Dassault Systèmes, which decided to acquire Exalead back in 2010 to improve their semantic indexing and search. However, I haven’t seen any implementation of Exalead applied to the manufacturing and big data domain.

What is my conclusion? The value of big data is undoubted. To adopt “big data”, a PLM vendor needs to go to an “unknown” place characterized by a different technology stack. It is not clear how they will do so. Time is running out. The ability to dig into the big data problem will become an imperative very soon. Just my thoughts…

Best, Oleg
