PLM and Disintegration Challenge in 2016

Almost a hundred years ago, Henry Ford started to build the River Rouge factory. It was a giant plant that literally took in iron ore at one end and sent automobiles out the other. At the time, it seemed like the future. The following passage can help you imagine the scale of that factory:

The Rouge had its own railroad with 100 miles of track and 16 locomotives. A scheduled bus network and 15 miles of paved roads kept everything and everyone on the move. It was a city without residents. At its peak in the 1930s, more than 100,000 people worked at the Rouge. To accommodate them required a multi-station fire department, a modern police force, a fully staffed hospital and a maintenance crew 5,000 strong. One new car rolled off the line every 49 seconds. Each day, workers smelted more than 1,500 tons of iron and made 500 tons of glass, and every month 3,500 mop heads had to be replaced to keep the complex clean.

Why did Henry Ford do it that way? In 1917, doing everything in a single integrated factory seemed to Ford the only possible way to get the scale needed for Ford Motors. Back in the early 20th century, big companies were synonymous with efficiency. Fast forward 100 years. In the early 21st century, big companies are synonymous with inefficiency.

Manufacturing companies no longer operate like the Ford Rouge factory. The supply chain is a given in manufacturing. Each company in the supply chain focuses on what it does best. If it doesn't, it can be swapped out for another supplier. But here is the problem: if you want to solve a problem using a network of cooperating companies, you have to be able to coordinate their efforts. And you can do that much more easily with computer systems that can optimize and coordinate a network of manufacturing companies. This is a fundamental change, and it has happened across many industries. It transformed existing industries and created new ones. Sometimes the incumbents weren't the ones who drove the change, and new companies replaced old ones.

A few months ago, I shared my thoughts about the future of manufacturing networks. The majority of manufacturing relationships today are point-to-point. We can define things, design and engineer them, plan the production, and make things at the end. This is how most manufacturing relationships work. What if we could connect them together into a digital network in which each element, virtual or physical, is able to define the network and empower it? Similar to the internet, it could grow into a network that doesn't require central control, a departure from the current point-to-point environment.
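To make the point-to-point problem concrete, here is a minimal sketch (in Python, with hypothetical participant counts chosen purely for illustration) comparing how many direct integrations a fully connected point-to-point environment requires versus a shared network, where each company connects once to a common backbone:

```python
# Illustrative sketch: integration effort in point-to-point vs. shared-network
# architectures. Participant counts below are hypothetical example values.

def point_to_point_links(n: int) -> int:
    """Every pair of companies needs its own direct integration: n*(n-1)/2."""
    return n * (n - 1) // 2

def shared_network_links(n: int) -> int:
    """Each company connects once to a common network or protocol."""
    return n

for n in (10, 100, 1000):
    print(f"{n} companies: point-to-point = {point_to_point_links(n)}, "
          f"shared network = {shared_network_links(n)}")
```

The quadratic growth of pairwise integrations is one simple way to see why point-to-point coordination stops scaling as more suppliers join the network.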

You might think this is what enterprise software in the 21st century is supposed to do. And if you think about manufacturing, you might expect product lifecycle management to be on the front line of this change. Let me come back to the reality of PLM architecture and systems today. The current PLM paradigm is built around a single-database architecture. The fundamental idea of product lifecycle data management was to build a vertical system that can organize product data and processes in a single company. Similar to the Ford Rouge factory, it seemed like the only way to scale.

The current PLM paradigm, which has served many large companies for the last 20 years, created three major problems: 1/ PLM implementations; 2/ data silos; 3/ central control.

PLM implementation is an extremely complex process. It depends heavily on company data, processes, and agreement between people. It takes time, effort, and money. PLM systems compete on the flexibility to adapt to any manufacturing environment and process. Once a system is implemented, upgrading to new software becomes yet another challenge. Not many systems can handle that. Altogether, this limits wider adoption of PLM systems.

The current PLM architecture creates data silos. Each company is a silo of information, data, and files. It is not unusual to see multiple PLM systems at the same large company for different reasons: historical M&A activity, complexity of implementation, and tool and version support.

PLM implementation is centrally controlled. It has a hierarchy of administration and management. Once it is done for a company, you have PLM administrators, data models, processes, etc. centralized in a single place. You cannot make two companies' PLM systems work in a coordinated manner.

Here is the next challenge I can see PLM companies facing in 2016: disintegration. Processes are becoming more disintegrated, more suppliers are involved, competition is increasing, and so is the demand to scale, mass-customize, manufacture globally, and manufacture faster. Basically, think about taking the 100-year-old Ford Rouge factory and expanding it to the size of the world.

What is my conclusion? Manufacturing singularity. Think about data on all products, the manufacturing companies, the customers buying and using these products, and the sensors connected to all these products. All this information is live and can impact what every single manufacturing company in the world can do. Now, think about a network that can combine and optimize it all together. It might be a New Year dream. But it can also be a new way to think about manufacturing in the 21st century. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia Network article and Martin GrandJean

  • Disintegration in PLM and manufacturing is a very interesting idea. I have imagined PLM as growing into a distributed but organized network of data and control, but your comparison of disintegrated manufacturing with the internet is worth exploring.

    Obviously, the internet is held together by standards (HTML, etc.) and protocols, but at a fairly low level of end-use specificity. Do you think product manufacturing data could be broken down into such a low, generic level of end-use specificity? Or do you think there is a better level of information that is or could become generic?

    If precise engineering data is not shared (accurately), then I think requirements data would need to be better shared (which is probably the best level at which to share product data to allow better specialist innovations). Do you think we have (or can develop) good generic ways to communicate precise, unambiguous requirements data? Or is it better to communicate precise design data (and keep the design process central)?

  • beyondplm


    Thanks for your comment! You are asking absolutely the right question: how to share data.

    In my view, it is possible to share data. The problem is that a mechanism for product data sharing cannot be developed in an abstract vacuum. Some standards initiatives reminded me of an abstract quest for a "data exchange and sharing standard." It doesn't work. Standards development is the result of a combination of technology, business models, and vendors. Some industries have developed better standards than others. Combined with internet data management principles, this has potential.

    Managing data centrally is not scalable. This is the approach CAD/PLM vendors have used until now. It is a walled garden with no business incentive to share data.

    Just my thoughts…

  • Pete Ianace

    Oleg, are you aware of any meaningful manufacturing domain ontologies that could help make this a reality? You understand the overall value ontologies have brought to the buying and selling of cars.

  • beyondplm

    Pete, I understand the value, but I haven't seen anything meaningful for manufacturing. Most ontology work is rooted in university research. I developed ontologies to represent CAD data in the past, but not manufacturing data.

  • Pete Ianace

    I understand your point, but I would suggest that CAD is a pretty critical element within the manufacturing engineering space. We are very involved in the MBSE space and work with all of the PLM vendors to provide a full 360 view. On another note, take a look at our Concept Modeller application.