Digital transformation and PLM data quality

Manufacturing companies are preparing to move into a digital future. Digital transformation is one of the hottest trends. As I’ve been thinking about this transformation, one question came to my mind – what does digital transformation mean for data in manufacturing companies? The reality of manufacturing companies is a complex set of data and systems. Altogether, data is held by multiple systems historically created by departments and functions. But this is not all. Modern manufacturing is moving towards even greater disaggregation. To optimize cost, performance, and global access to resources and supplies, manufacturing companies are organized around a very specific set of functions. Tiers of contractors and suppliers are built to keep it all going. And that creates another level of data complexity.

Earlier today, I read an article about data quality – The price of poor data quality. My attention was caught by an interesting phrase: Bad data is not better than no data. My favorite part of the article is about fragmented data. Here is a passage:

The crux of the problem is that as businesses grow, their business-critical data becomes fragmented. There is no big picture because it’s scattered across applications, including on premise applications. As all this change occurs, business-critical data becomes inconsistent, and no one knows which application has the most up-to-date information. It saps productivity and forces people to do a lot of manual work. The New York Times called this being a janitor: too much handcrafted work — what data scientists call data wrangling, data munging and data janitor work — is still required. Data scientists, according to interviews and expert estimates, spend from 50 percent to 80 percent of their time mired in this more mundane labor of collecting and preparing unruly digital data, before it can be explored for useful nuggets.

It made me think about the evolution of data and processes in manufacturing companies. Some time ago, it was good enough to keep data in a department to serve internal processes such as design and engineering. Manufacturing planning was a separate function, run by a separate set of systems. Sales and services were separate too. Maintenance and support were usually handled by a completely different set of systems. The data connection and handover between systems was important, but not so critical. As the speed of manufacturing increased, the global market and the complexity of the supply chain created new realities for data. The fragmentation of data is now a real problem for business decisions made by companies.

And now, as the industry moves towards digital transformation, the question of data fragmentation and data quality is becoming critical. On one side, a manufacturing company cannot be reborn in one day with improved data quality. At the same time, one of the critical problems PLM implementations face is bad data quality. So the question I wanted to ask is: how do we measure data quality?
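
To make that question a bit more concrete, here is a minimal sketch (my own illustration, not an established standard) of a few metrics that are easy to compute over a set of exported records: completeness of required fields, the rate of duplicated identifiers, and the share of stale records. The field names are assumptions made up for the example.

```python
# A minimal, illustrative sketch of a few measurable data quality metrics over
# part records exported from a PLM/ERP system. Field names ("part_number",
# "description", "material", "last_updated") are assumptions for the example.
from datetime import datetime, timedelta

REQUIRED_FIELDS = ["part_number", "description", "material"]

def completeness(records: list[dict]) -> float:
    """Share of required fields that are actually filled in across all records."""
    total = len(records) * len(REQUIRED_FIELDS)
    filled = sum(1 for r in records for f in REQUIRED_FIELDS if r.get(f))
    return filled / total if total else 1.0

def duplicate_rate(records: list[dict]) -> float:
    """Share of records whose part number also appears on another record."""
    ids = [r.get("part_number") for r in records]
    return sum(1 for i in ids if ids.count(i) > 1) / len(ids) if ids else 0.0

def stale_rate(records: list[dict], max_age_days: int = 365) -> float:
    """Share of records with no timestamp or not updated within the window."""
    cutoff = datetime.now() - timedelta(days=max_age_days)
    stale = sum(1 for r in records
                if r.get("last_updated") is None or r["last_updated"] < cutoff)
    return stale / len(records) if records else 0.0

if __name__ == "__main__":
    sample = [
        {"part_number": "AB-100", "description": "Bracket", "material": "Steel",
         "last_updated": datetime(2015, 3, 1)},
        {"part_number": "AB-100", "description": "", "material": None,
         "last_updated": None},
    ]
    print(f"completeness:   {completeness(sample):.0%}")
    print(f"duplicate rate: {duplicate_rate(sample):.0%}")
    print(f"stale rate:     {stale_rate(sample):.0%}")
```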

In my view, one of the biggest things that affects data quality is duplication of data across multiple data sets. When it happens, the opportunity is ripe for errors and inconsistencies. The first step toward successful integration is seeing where the data is and then combining that data in a way that’s consistent. Here it can be extremely worthwhile to invest in proven data quality and accuracy tools that help coordinate and sync information across databases. As part of the digital transformation process, creating a consistent layer of data that represents data connections and updates can be extremely helpful.
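
As an illustration only (no specific tool implied), here is a rough sketch of the kind of cross-system check I have in mind: find where the same part lives in two systems and flag the attributes where the two copies disagree. The record layout and field names are assumptions made up for the example.

```python
# A rough sketch (assumed record layout) of finding where the same part exists
# in two systems and flagging fields where the two copies disagree.

def normalize(part_number: str) -> str:
    """Normalize part numbers so formatting differences don't hide duplicates."""
    return part_number.strip().upper().replace("-", "").replace(" ", "")

def overlaps(pdm_records: list[dict], erp_records: list[dict]) -> list[tuple[dict, dict]]:
    """Pairs of records that appear to describe the same part in both systems."""
    erp_index = {normalize(r["part_number"]): r for r in erp_records}
    return [(r, erp_index[normalize(r["part_number"])])
            for r in pdm_records
            if normalize(r["part_number"]) in erp_index]

def conflicts(pairs: list[tuple[dict, dict]], field: str) -> list[tuple[dict, dict]]:
    """Overlapping pairs where a given attribute does not match across systems."""
    return [(a, b) for a, b in pairs if a.get(field) != b.get(field)]

if __name__ == "__main__":
    pdm = [{"part_number": "AB-100", "description": "Bracket, steel"}]
    erp = [{"part_number": "ab 100", "description": "Bracket, stainless"}]
    for a, b in conflicts(overlaps(pdm, erp), "description"):
        print(f"Inconsistent description: PDM={a['description']!r} ERP={b['description']!r}")
```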

One of the recent trends is to present PLM as a layer on top of existing data management systems. While this is a good strategy to prevent a big-bang replacement, the question of data quality is one not to miss. The way data is connected and synchronized will define the quality of the overall data management system.

What is my conclusion? Digital transformation will require a significant reassessment of data quality metrics. Old ways of taking data ‘under control’ might not be enough. The data is becoming more complex and interconnected. Assessing data quality should be top of mind for PLM data analysts in the coming years. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of OpenBOM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.

  • Simon Hartl

    Hi Oleg,

    You are absolutely right. In my PLM projects for production companies, too, the problem of bad data quality was always a key finding that most participants neglected until a certain project phase – but then it was escalated to a top-priority issue.

    If we had better data quality metrics, it would be much easier to set priorities and include the necessary data quality improvement work in the project plans (and budget as well).

  • beyondplm

    Simon, thanks for sharing your experience. Data quality improvement is absolutely a must as companies move towards a “digital” future.

    In your experience, what metrics do you use to assess data quality?

  • Simon Hartl

    Until now we did a requirement-based assessment: as an example, we knew the requirements from a migration perspective and evaluated how those requirements could be met. Some of the required pieces of information can be calculated/reconstructed out of existing fields or relations, etc., and we examined how the required information could be collected and calculated.

    In my opinion it would make sense to have a well-defined assessment template in order to make a base classification. I would propose a number of questions (e.g. how structured is the existing data, is there any data governance in place, the number of systems where data is scattered, whether a master data workflow is in place, etc.) with a 1-5 scale for the rating. This would give a first overview (spider diagram) of the current state.

  • beyondplm

    Simon, thanks for sharing! Do you think it is possible to have some automated assessment of data quality?

  • Simon Hartl

    In general I would say no. Data quality can only be measured against a given goal, which is specific to the analyzed data set. What do you think? Do you see this differently?

  • beyondplm

    There is so much data today. Making a manual assessment of data quality sounds very much like 1995… So I’m thinking about a better way to do it (a rough sketch of what an automated, rule-based check could look like appears at the end of this thread).

  • Loïc M.

    Hi Oleg,

    I’m a PLM consultant, and most of my customers believe they have quite good data quality in their PLM system. It is certainly true at 99% or more, but specific cases emerge during migrations or the development of interfaces to other systems.

    I think the PLM vendors are not so enthusiastic about providing tools that could detect inconsistency or “repair” bad data, just because it would point out the threat that the software can generate or store bad data.

    As a tool, I can think of Q-Checker for 3D geometry (Catia / JT?), which is quite widely used in the automotive and aerospace industries. It is based on a set of rules to be checked. You define the rules. The same company (Transcat) made a tool based on the same concept applied to the metadata for the Enovia platform, but I could not find out if the solution is still provided.

  • beyondplm

    Loic, thanks for your comment! My hunch is that most PLM systems are still used by engineering departments. While the data is pretty accurate inside the system, the challenge is the data that traditionally is not included in the PLM environment. Can you assess, based on your knowledge of your customers, how much data lives outside of PLM?

  • norsjo

    Just to add to the discussion points of Loïc and Simon, I would disagree that data in PLM systems is 99% correct if you include the lack of a disposal plan for redundant data, and as Simon points out, it is often ignored until a major project highlights it as an issue. Legacy PDM, CAD, etc. were designed 10-20 years ago for on-premise use, and storage costs were often high, but over the last 10 years the price of servers and storage has dropped so much that the volume of data is not a problem (economically), so clean-up became neglected. Now when you come to “Digital Transformation” and take one of Oleg’s other themes on BPLM, Cloud PLM, you pay for what you use and have less flexibility on the data model. At this point, old and poor-quality data becomes a real issue, both in getting it out of legacy systems and in getting use out of new systems at the prices SaaS PLM should deliver.
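
Picking up on the automated assessment question and the rule-based approach Loïc mentions, below is a rough sketch of what such a check could look like. It is only an illustration – not Q-Checker or any vendor API – with made-up rules and assumed field names, and it shows how pass rates per rule could feed the kind of 1-5 overview Simon proposes.

```python
# A rough, illustrative sketch of a rule-based metadata check (not Q-Checker,
# not any vendor tool). Rules, field names, and thresholds are made-up examples.
from typing import Callable

Rule = tuple[str, Callable[[dict], bool]]

RULES: list[Rule] = [
    ("part number is present", lambda r: bool(r.get("part_number"))),
    ("description is present", lambda r: bool(r.get("description"))),
    ("weight is a positive number",
     lambda r: isinstance(r.get("weight"), (int, float)) and r["weight"] > 0),
]

def pass_rates(records: list[dict]) -> dict[str, float]:
    """Share of records passing each rule; could feed a spider-diagram overview."""
    if not records:
        return {}
    return {name: sum(1 for r in records if check(r)) / len(records)
            for name, check in RULES}

def to_scale(rate: float) -> int:
    """Map a pass rate onto the 1-5 rating scale discussed above."""
    return 1 + round(rate * 4)

if __name__ == "__main__":
    sample = [
        {"part_number": "AB-100", "description": "Bracket", "weight": 1.2},
        {"part_number": "AB-100", "description": "", "weight": -3},
    ]
    for rule, rate in pass_rates(sample).items():
        print(f"{rule}: {rate:.0%} -> {to_scale(rate)}/5")
```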