A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

How Edge computing can help to manage distributed PLM processes

Oleg
24 August, 2016 | 4 min for reading

[Image: edge computing]

“Single version of truth” is one of the most popular paradigms developed by PLM vendors over the last 10-15 years. The concept is loved by many companies and has many powerful points. At the same time, the devil is in the details, and a wrong implementation strategy can diminish the value of the PLM single version of truth. One of the biggest challenges a PLM implementation might face is related to the distribution of manufacturing companies across the globe and the distribution of product development and manufacturing processes. Replication of data is costly and not always possible. Most existing PLM architectures rely on a variety of data synchronization cycles. The question of how to manage distributed product development processes is on the table for many IT and PLM system managers.

The Forbes article Will analytics on the edge be the future of big data brings an interesting perspective on “edge computing”. If you haven’t heard about it, take a look at another article, What is Edge Computing by TechTarget, for additional explanation.

In a nutshell, the idea of edge computing is simple: process data near the data source and optimize traffic by splitting data transfer between the data that needs to be synchronized immediately and the information that can be transferred later for other purposes such as archiving and record keeping. The following passage gives some examples of how edge computing can be applied to the IoT domain.

Sometimes known as distributed analytics, it basically means designing systems where analytics is performed at the point where (or very close to where) the data is collected. Often, this is where action based on the insights provided by the data is most needed. Rather than designing centralized systems where all the data is sent back to your data warehouse in a raw state, where it has to be cleaned and analyzed before being of any value, why not do everything at the “edge” of the system?

A simple example would be a massive scale CCTV security system, with perhaps thousands or tens of thousands of cameras covering a large area. It’s likely that 99.9% of the footage captured by the cameras will be of no use for the job it’s supposed to be doing – e.g. detecting intruders. Hours and hours of still footage is likely to be captured for every second of useful video. So what’s the point of all of that data being streamed in real-time across your network, generating expense as well as possible compliance burdens?

Wouldn’t it be better if the images themselves could be analyzed within the cameras at the moment they are captured, and anything found to be useless either discarded or marked as low priority, freeing up centralized resources to work on data of actual value?
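To make the idea concrete, here is a minimal sketch (in Python) of the split described above; the `frame_score` function and the 0.8 threshold are hypothetical placeholders for an on-device motion or object detector:

```python
from queue import Queue

PRIORITY_THRESHOLD = 0.8   # hypothetical cut-off for "useful" footage

urgent = Queue()   # streamed to the central system immediately
archive = []       # shipped later in bulk, or discarded by policy

def frame_score(frame: dict) -> float:
    """Placeholder: a real edge device would run a motion or
    object detector here and return a 0..1 usefulness score."""
    return frame.get("score", 0.0)

def process_at_edge(frame: dict) -> None:
    """Decide at the camera which data crosses the network now."""
    if frame_score(frame) >= PRIORITY_THRESHOLD:
        urgent.put(frame)        # e.g. possible intruder: send now
    else:
        archive.append(frame)    # low value: defer or drop

# Example: only the second frame is transferred in real time.
for f in [{"id": 1, "score": 0.1}, {"id": 2, "score": 0.95}]:
    process_at_edge(f)
```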

It made me think about PLM and distributed process management. Modern process management for product development, manufacturing, and supply chain can apply similar principles to enable data proliferation and distributed data processing. Think about a heavy CAD file located on a desktop. Fast data processing can be applied to extract the valuable data needed for decision making, while the rest of the data is delivered later. Similarly, manufacturing process analytics and supply chain optimization can be done faster by recombining quickly processed data from across the globe. While full synchronization of data can take time, fast processing of data at the edges of a distributed system can improve the decision process and speed up processes that previously took a 24-hour cycle.
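A minimal sketch of that split for the CAD example might look like this (Python; `extract_metadata`, `sync_later`, and the fields returned are all hypothetical): the lightweight properties needed for a decision go out immediately, while the heavy file is queued for background synchronization.

```python
import json
from pathlib import Path
from queue import Queue

sync_later = Queue()   # heavy payloads, synchronized in the background

def extract_metadata(cad_file: Path) -> dict:
    """Placeholder: a real extractor would pull part numbers, BOM
    structure, mass properties, etc. out of the CAD file itself."""
    return {"name": cad_file.name, "size_bytes": cad_file.stat().st_size}

def share_for_decision(cad_file: Path) -> str:
    # Fast path: a few hundred bytes of metadata cross the network now...
    metadata = extract_metadata(cad_file)
    # ...while the multi-gigabyte file waits for the slow sync cycle.
    sync_later.put(cad_file)
    return json.dumps(metadata)
```

The point is not the specific queue, but the ordering: the decision-critical slice of the data never waits for the full synchronization cycle.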

What is my conclusion? We are coming to the era of a new version of the “single point of truth”. The truth is distributed and processed over time. Companies are going to discover new data processing paradigms that rely on the availability of powerful computing devices everywhere and on network computing. Existing PLM platforms are mostly database-driven and rely on centralized data processing. New concepts of parallel data processing can combine edge computing with asynchronous data processing and analysis to deliver better process and decision management to manufacturing companies. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.

Image credit: Forbes article
