Fog PLM?

For the last few years, we’ve said and heard a lot about cloud and PLM. Isn’t it time for a new buzzword? What about “Fog PLM”? Are you ready? The Forbes article What Is Fog Computing? can give you an idea. In a nutshell, fog computing (and I’m not sure if I like the term or not) is about using the power of distributed devices for data processing.

The following passages from the article can give you an idea:

Cloud computing refers to the ability to store data and retrieve it from off-site locations. It’s how your smart phone got so smart; phones don’t have enough built-in storage to maintain all the information they need for all their apps and functions, so they’re constantly transmitting and receiving data in order to provide you with the services you want.

Fog computing, also sometimes called edge computing, solves the problem by keeping data closer “to the ground,” so to speak, in local computers and devices, rather than routing everything through a central data center in the cloud.

In our modern, western world, we exist with a huge amount of computing power around us all the time. How many devices do you have at home at any given time? A phone for every member of the family, a tablet or two, and probably a laptop in addition?

At first, the idea can sound completely crazy in the context of the PLM world. The primary goal of PLM systems is to centralize and organize data. What would be the reason to pull multiple local devices into the story?

Here is the thing… One of the problems in the PLM world is the very slow speed of change. Companies are literally sitting on a large amount of devices, data, local servers, desktops and laptops. Centralizing all this data takes time and resources, and manufacturing organizations are not entirely ready for that. This is one of the reasons PLM development depends on a huge transformation effort changing systems, storage, communications and processes.

Now think about the opposite approach. Instead of planning how to transfer all engineering and manufacturing data into cloud servers, a company can develop strategies and technologies that use local systems and data storage to “collaborate” with centralized systems. Such an “edge” approach can potentially work with individual computers as well as with entire PLM systems. The key element of this approach is connectivity and data connection. Unlike big PLM transformation projects, it can provide a reliable way to combine existing PLM assets and new cloud systems in a single connected infrastructure delivering data to decision makers.
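To make the idea a bit more concrete, here is a minimal sketch of what such a connection could look like: a small agent running on an edge node (a desktop, a local server, a workgroup vault) that keeps the files where they are and only pushes metadata about changed items to a central service. The folder path, endpoint URL and payload shape below are hypothetical placeholders for illustration, not a real product API.

```python
# Minimal edge-to-cloud sync sketch. Assumptions (hypothetical, not a real API):
# - engineering data lives in a local folder (LOCAL_VAULT)
# - a central service accepts JSON metadata at CENTRAL_API
import hashlib
import json
import time
from pathlib import Path
from urllib import request

LOCAL_VAULT = Path("./engineering-data")            # local files stay on the edge node
CENTRAL_API = "https://plm.example.com/api/items"   # hypothetical central endpoint
STATE_FILE = Path("sync_state.json")                # remembers what was already sent


def file_fingerprint(path: Path) -> str:
    """Hash file contents so only changed files trigger an update."""
    return hashlib.sha256(path.read_bytes()).hexdigest()


def load_state() -> dict:
    return json.loads(STATE_FILE.read_text()) if STATE_FILE.exists() else {}


def push_metadata(path: Path, digest: str) -> None:
    """Send metadata (not the file itself) to the central system."""
    payload = json.dumps({
        "name": path.name,
        "sha256": digest,
        "modified": path.stat().st_mtime,
        "location": str(path),                      # the data itself stays local
    }).encode()
    req = request.Request(CENTRAL_API, data=payload,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req, timeout=10)


def sync_once() -> None:
    """Scan the local vault and push metadata for new or changed files."""
    state = load_state()
    for path in LOCAL_VAULT.rglob("*"):
        if not path.is_file():
            continue
        digest = file_fingerprint(path)
        if state.get(str(path)) != digest:          # new or changed locally
            push_metadata(path, digest)
            state[str(path)] = digest
    STATE_FILE.write_text(json.dumps(state))


if __name__ == "__main__":
    while True:                                     # simple polling loop
        sync_once()
        time.sleep(300)                             # check every 5 minutes
```

The point of the sketch is the division of labor, not the specific code: the edge node keeps the heavy data and only shares enough information for the central system to give decision makers a connected view.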

What is my conclusion? The original idea of PLM was to centralize data and use it for control and decision processes. The idea is nice, but the implementation is terrible. Companies that did it spent a lot of resources to achieve the desired level of centralization. Globalization and the cloud disrupted the idea of a central PLM for a single company. The goal by itself is not achievable and takes too many resources and too much time. Also, dependencies on many players in a global manufacturing network create additional barriers to establishing a single central PLM. An opposite approach, connecting resources and systems together, can be an alternative. Networks are much more powerful than centralized systems. The future belongs to networks. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.
