The challenges of Product Lifecycle Monitoring



Collecting data is a pervasive trend these days. Twenty years ago, people smiled when the Google founders promised to index the internet. Google knew the internet was very big. How big? The first Google index in 1998 already had 26 million pages, and by 2000 the Google index reached the one billion mark.

Fast forward to 2016, and the idea of scanning, indexing, and monitoring large amounts of data feels completely different. Just look around and you will see how many businesses are creating value by collecting information about everything – people, maps, traffic, points of interest, purchases, products.

The last one is interesting. In the past, companies captured information only from complex products such as airplanes. A Boeing 787 creates half a terabyte of data during a flight. Manufacturing is investing in big data to improve services and product quality. These days, the opportunity to capture data from every single product is real.

My attention was caught by an HBR article – Using IoT Data to Understand How Your Products Perform. Sensors are getting cheaper and data technologies are getting better. The following data points are amazing if you think about the scale of data collection:

We’ve all seen some eye-bulging numbers in recent years about the internet of things (IoT). Since 2011, General Electric has publicly stated it would spend more than $1 billion on developing sensors, wireless devices, and related software to install on its aircraft engines, power turbines, locomotive trains and other machinery. Companies such as Ford, Toyota, and Caterpillar have invested heavily as well. And our own survey of 795 large companies (average revenue of $22 billion) in North America, Europe, Asia-Pacific, and Latin America found average per-company spending on IoT initiatives — $86 million in 2015 — was projected to grow to $103 million by 2018.

But the problem is that collecting data doesn't make your company smarter. Now we have data (because we can collect it), but how do we recognize patterns in product behavior and convert them into specific metrics and information that can be consumed by companies and their customers?

Even after you install IoT technologies, you still need to do several things: get customers to agree to have their products monitored, which in turn means giving them something of value in return; process and act upon product performance data quickly; build a culture that accepts the truth; and, finally, think about how to re-imagine the business using the data. IoT technologies don't bring an ultimate truth about products. Companies need to do something with that information.

So, the vision of some PLM companies is to transform from data management companies into "big data" companies. The following article by Siemens PLM – Use Omneo Big Data analytics to gain product performance intelligence – brings some information about the Omneo product line. I already had a chance to cover it in my blog earlier; check the following link – here. Here is an interesting passage:

Our vision for Big Data analytics as a service is truly different than traditional business intelligence (BI), and the PLM industry knows this. We are recognized for our Big Data thought leadership. We’ve had an exclusive cover story in Design News Data, which highlighted how Big Data is making its way into product design.

Michael Shepherd, senior strategist for product management at Dell, used Omneo to show how Big Data analytics is revolutionizing the way Dell analyzes data. Shepherd created a video comparing Dell’s capabilities using the Omneo platform and the next generation of data mining versus traditional BI running at high speeds. In his example, Dell needed to analyze 5.7 billion records, and there were 250 million potential dimensions. The major difference between traditional BI and data mining was the ability to put in all dimensions at once rather than going dimension by dimension, Shepherd said. Using this method, he had more than 250 million dimensions across 5.7 billion data sets within 45 seconds, saving a tremendous amount of time.
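To make Shepherd's point concrete, here is a minimal toy sketch of the difference between the two approaches (the data, dimension names, and functions are mine for illustration only, not Omneo's actual engine): traditional BI-style analysis scans the records once per dimension, while the "all dimensions at once" approach updates the counters for every dimension in a single pass over the data.

```python
from collections import Counter

# Toy product-performance records (illustrative only)
records = [
    {"region": "NA", "product": "X", "status": "ok"},
    {"region": "EU", "product": "Y", "status": "fail"},
    {"region": "NA", "product": "Y", "status": "ok"},
]
dims = ["region", "product", "status"]

def per_dimension(records, dims):
    # Dimension-by-dimension: one full scan of the data for each dimension
    return {d: Counter(r[d] for r in records) for d in dims}

def single_pass(records, dims):
    # All dimensions at once: a single scan updates every dimension's counter
    counts = {d: Counter() for d in dims}
    for r in records:
        for d in dims:
            counts[d][r[d]] += 1
    return counts

# Both yield identical aggregates; the single-pass version reads the
# (potentially billions of) records only once instead of once per dimension.
print(per_dimension(records, dims) == single_pass(records, dims))  # True
```

At 5.7 billion records and 250 million dimensions the per-dimension approach is clearly infeasible, which is why a one-pass (and, in practice, heavily parallelized) design is the interesting part of Shepherd's comparison.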

Such an amount of information can put stress on any PLM technology that was mostly designed to manage design data for a single company – which is what Teamcenter, Siemens PLM's flagship product, does. How much information will the new "monitoring" paradigm create, and will PLM platforms be able to manage it? This is an interesting question to ask.

What is my conclusion? Back in 1998, indexing 26 million pages was a crazy task. Today, PLM companies are looking at how to analyze 5.7B records and convert them into meaningful information. What does this mean for existing PLM technologies, and how will existing PLM platforms be expanded and transformed? This is a question to ask PLM vendors. Some of their products might be retired and replaced by new big data platforms. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.
