What’s wrong with “analog PLM”?

In the engineering world, the digital vs. analog distinction can be stated precisely: analog systems work with continuous electrical signals, while digital systems encode information in discrete binary values. Things get much more complicated when the term “digital” enters the marketing realm.
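
To make that engineering definition concrete, here is a minimal Python sketch (my illustration, not from the article) of analog-to-digital conversion: a continuous signal is sampled at fixed intervals and quantized into binary codes.

```python
import math

def sample_and_quantize(signal, duration_s, sample_rate_hz, bits):
    """Turn a continuous ("analog") signal into discrete binary samples."""
    levels = 2 ** bits                 # number of discrete amplitude levels
    step = 1.0 / sample_rate_hz        # time between samples
    samples = []
    t = 0.0
    while t < duration_s:
        value = signal(t)                                  # continuous amplitude in [-1, 1]
        code = round((value + 1.0) / 2.0 * (levels - 1))   # map to an integer level
        samples.append(format(code, f"0{bits}b"))          # binary representation
        t += step
    return samples

# A 1 Hz sine wave sampled at 8 Hz with 4-bit resolution.
print(sample_and_quantize(lambda t: math.sin(2 * math.pi * t), 1.0, 8, 4))
```

The continuous wave is lost; only the quantized approximation travels onward. Keep that picture in mind for the data modeling discussion below.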

The Accenture article Faster, fitter, better: Why product innovation is going digital takes a marketing spin and coins the term “digital PLM” for a system or approach that can help companies improve their product development processes and innovate faster. I like the following passage explaining the difference between the digital and non-digital PLM worlds:

Such models, where the multiple processes and systems live in silos, inhibit the flow of relevant information needed to optimize product development. Engineers, for example, are often disconnected from the new-product introduction process. As a consequence, new-product launch teams don’t always hear about critical last-minute design changes. And because vital insights are not shared, the solutions that eventually emerge from this fragmented system just aren’t meeting customer expectations for innovation and relevance. Moreover, because of this linear approach, product launch is often delayed.

It made me think more about “digital vs. analog.” The obvious picture is companies using paper trays to share information and manage processes, and I can see that still happening. But the chances are most companies at least use email to share information, which can still be a very complicated way to collaborate. I guess most companies try to step up from this analog way of sharing information and managing changes over email into the realm of PLM systems. But in many situations it doesn’t go well, or it gets very expensive. So what is the source of inefficiency in “analog PLM” implementations?

One of the core elements of every PLM system is its data model. Jos Voskuil provided a good explanation of what a PLM data model is in his recent article – Importance of PLM data models. According to Jos, the success of a PLM implementation depends on an efficient organization of PLM data models. Here is the passage that explains it:

My conclusion for both situations is that it all leads to a correct (PLM) data model, allowing companies to store their data in an object-oriented manner. In this way reflecting the behavior the information objects have and the way they mature through their information lifecycle. If you making compromises here, it has an effect on your implementation, the way processes are supported out-of-the-box by a PLM system or how information can be shared with other enterprise systems, in particular, ERP. PLM is written between parenthesis as I believe in the future we do not talk PLM or ERP separate anymore – we will talk business.

Jos is absolutely right in his assumption. The traditional world of PLM implementations requires translating everything into PLM data models. Take any PLM system and you will see this step as an essential element of every implementation. If I refer to the engineering definition of an “analog” system, it means translating the real world (organization, data, processes) into the “analog world” of data models. With full respect to flexible PLM data models and architectures, this translation creates distortion, and the process itself is very complicated.
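
To ground what that translation step looks like, here is a minimal, hypothetical sketch of an “analog PLM” data model. The class names, attributes, and lifecycle states are my illustration, not taken from any particular PLM system; the point is that every real-world concept must first be mapped into a predefined object before the system can hold it.

```python
from enum import Enum

class LifecycleState(Enum):
    IN_WORK = "in_work"
    IN_REVIEW = "in_review"
    RELEASED = "released"

class PartRevision:
    """A real-world part, pre-translated into a fixed object definition."""

    def __init__(self, number: str, description: str):
        self.number = number
        self.description = description
        self.state = LifecycleState.IN_WORK

    def submit_for_review(self):
        self.state = LifecycleState.IN_REVIEW

    def release(self):
        # The model, not the organization, dictates the allowed maturity path.
        if self.state is not LifecycleState.IN_REVIEW:
            raise ValueError("only a reviewed revision can be released")
        self.state = LifecycleState.RELEASED
```

Anything the organization cares about that the model did not anticipate (a supplier note, a marketing attribute) forces a schema change, which is exactly the distortion and complexity described above.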

Is there a better way? I think digital PLM (or whatever other name our marketing geniuses bring) will have to abandon the old “analog PLM” practice of data modeling as a complex and outdated approach. Digital PLM will be able to use a native representation of product information and product development processes across silos, without translation into a variety of PLM data model abstractions. It doesn’t mean data models will disappear tomorrow; software will have to use data models anyway. But customers and implementers will be excluded from the loop.
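
By contrast, a “native representation” could look like the following sketch. Again, this is my illustration of the idea rather than any existing product; the part number and attributes are hypothetical. Information is stored as it is authored, and no implementer has to extend a schema before a new attribute can flow between silos.

```python
# Product information captured as authored, without a predefined class hierarchy.
bracket = {
    "type": "part",
    "number": "BRKT-100",            # hypothetical part number
    "description": "Mounting bracket",
    "cad_file": "brkt-100.step",
    "supplier_notes": "anodized finish per supplier spec",  # added later, no schema change
}

# Each silo reads what it needs; nothing had to be modeled up front.
print(bracket["number"], "-", bracket.get("supplier_notes", "n/a"))
```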

What is my conclusion? Think about the existing PLM data modeling approach as a way to translate native sound or video signals into electrical signals for transmission across an organization. Think about it as an “analog approach,” similar to what we had in the past in audio and video recording. In PLM it created a whole layer of implementation complexity: the need to define data models and map organizational reality onto these models. A future “digital PLM” will have to bring a better way and exclude people from the formal data modeling loop. It will make PLM systems simpler to understand and easier to implement. Just my thoughts…

Best, Oleg

Picture credit: Wikipedia article presenting an ancient analog computer.

  • Yoann

    Hi Oleg,

    To me that’s why standards have been defined. Let’s have more open standards. If they are not good enough to be configured for industries (configuration would just make some objects and relationships unused), then let’s define them, and I hope they will be managed by organizations like OASIS and not just ISO, which is just a lame organization to me (personal opinion).

  • beyondplm

Hi Yoann, thanks for your comment! The issue of standards is very controversial. It is hard, expensive, and time-consuming to agree on standards; something must be REALLY painful before people agree on one. Most enterprise vendors’ business models are driven by customer lock-in, so standards are not something vendors want to invest in. Just an opinion…

  • norsjo

    Hi Oleg,

    (quick disclosure: I work for IBM)
    I think you raise an important point, and it is a recurring background theme on your blog. On one side, products are incredibly complex, and data models are critical to keep control and build them efficiently; hence the PLM and ERP vendors make solutions to cope with that complexity. On the other side, design, development, manufacturing, etc. need to move faster and involve more “silos” of the company; hence the digital extension of PLM that can be enabled by other systems.

    IBM, Accenture, Gartner, etc. all have a marketing term for this front-office/back-office trend. Gartner calls it the Nexus of Forces, IBM lists it out as CAMSS (Cloud, Analytics, Mobile, Social and Security), and you link to Accenture’s viewpoint above.
    At IBM we see this new wave of IT being adopted earlier in industries that don’t have complex products, but now that wave is coming to industries where PLM is core. We will hear a lot more about this in the next year, I am sure.

  • beyondplm

Thanks for the comment! Indeed, the frameworks to manage complexity are important. In my view, the frameworks you mentioned provide very high-level definitions to support data and process abstraction. Maybe this is what you meant by industries without complex products?
