PLM and manual tearing down data silos

Data is the new oil. You hear it more often these days. In one of my earlier blogs, I shared my thoughts about using data as a platform. Read more here. And I have some very good news for you – unlike oil fields, there is an abundance of data everywhere. Even more… the amount of data is growing enormously. This is true for data online as well as for data inside organizations.

Although the majority of enterprise systems in manufacturing companies operate behind the firewall, the boundary of data is getting blurred. According to a CIMdata publication, 79% of companies are using cloud technologies today, which means these companies are using some cloud software and corporate data is transferred outside the organization in a variety of forms.

The PLM idea is built around a variety of data – requirements, design, engineering, manufacturing, service. But the core element of PLM information is product data.

I came across an interesting PLM-related site – Share PLM. According to the information on the site, Helena Gutiérrez first discovered the power of Product Lifecycle Management during her experiments with CATIA and SmarTeam.

I remember when I first started working in Digital Product Development and Product Data Management. My father had convinced me to help him administer his PLM tools during my university studies. I remember long nights fighting with CATIA and Smarteam. Attempting different workflows, wondering “What’s an instance?” and “Why can I not delete this part?” I was wasting too much time on basic things and repetitive tasks. I felt like I was in a science lab, trying to discover alchemy.

Share PLM is passionate about PLM, and you can find many articles about data, information, intelligent products, and business opportunities. Share PLM sells education and consulting services, but I found the blog and other materials interesting and worth reading. My favorite article – Making sense of your product data – speaks about a topic near and dear to my heart – tearing down data silos.

Let’s use the knowledge buried within our data to make better decisions and improve performance. That’s the universal goal – but the reality is far different. The fact is, the Information Revolution hasn’t lived up to its promise for most companies. Isolated islands of unconnected data and a lack of skills and talent are among the major challenges. What can you do to unlock your information? The recipe for success begins with a well-considered architectural plan. Information must be consolidated in a meaningful way, and architecture is the glue that holds it all together.

I found a 7-step checklist to help you get your data out of the silos and put your information in action:

1. Set clear goals for your data initiative.
2. Break down your data into key dimensions.
3. Identify and classify what data is being used, where, and why.
4. Start creating a corporate attribute dictionary.
5. Map existing attributes to the corporate attribute dictionary.
6. Catalogue attribute sets and define their ownership.
7. Plan the implementation, estimate the costs, and define the business case.
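Steps 4–5 of the checklist – building a corporate attribute dictionary and mapping silo attributes onto it – can be sketched in a few lines of code. This is a minimal, hypothetical example; the system names, attributes, and mappings are assumptions, not taken from the article:

```python
# Hypothetical sketch: mapping per-system attribute names onto a
# corporate attribute dictionary (checklist steps 4-5).

# Corporate attribute dictionary: canonical name -> definition/ownership
corporate_dict = {
    "part_number": {"type": "str", "owner": "Engineering"},
    "weight_kg":   {"type": "float", "owner": "Engineering"},
    "unit_cost":   {"type": "float", "owner": "Procurement"},
}

# How each silo's local attribute names map to the corporate names
# (assumed example systems and attributes)
silo_mappings = {
    "PLM": {"PartNo": "part_number", "Mass": "weight_kg"},
    "ERP": {"ITEM_ID": "part_number", "STD_COST": "unit_cost"},
}

def normalize(system: str, record: dict) -> dict:
    """Translate a record from one silo into corporate attribute names,
    dropping attributes that have no corporate mapping."""
    mapping = silo_mappings[system]
    return {mapping[k]: v for k, v in record.items() if k in mapping}

erp_record = {"ITEM_ID": "A-100", "STD_COST": 12.5, "LEGACY_FLAG": "X"}
print(normalize("ERP", erp_record))
# {'part_number': 'A-100', 'unit_cost': 12.5}
```

Even this toy version shows where the manual legwork lives: someone has to author and maintain `silo_mappings` for every system, which is exactly the consulting effort discussed below.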

While the overall idea is absolutely right, it made me think about the complexity of PLM paradigms and implementations, which require a lot of manual legwork to organize data. When I combine it with the graph of data growth at the beginning of the article, it becomes clear that traditional "analog" PLM cannot catch up with the explosive growth of data. It is kind of a Yahoo age if you compare it to the technologies used by global web companies. There must be a better way than manually mapping data islands and information and creating data in PLM systems.

This is why I think manual PLM data modeling should be a thing of the past. For years, PLM vendors put a significant focus on building flexible tools that help implementers create data and process models. Flexibility and dynamic data models are highly demanded by all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technology can come and automate this process, generating data models automatically by capturing what a company does and what processes run in the company.
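To make the idea of automatic data model generation concrete, here is a minimal sketch of inferring a data model from sample records instead of defining it by hand in a PLM tool. The function and field names are illustrative assumptions; real systems would use far more sophisticated schema inference:

```python
# Hypothetical sketch: inferring a data model (attributes, observed
# types, required/optional) from records captured in existing systems.
from collections import defaultdict

def infer_model(records: list[dict]) -> dict:
    """Derive attribute names, observed value types, and whether each
    attribute appears in every record (i.e. looks required)."""
    seen = defaultdict(set)   # attribute -> set of observed type names
    counts = defaultdict(int) # attribute -> number of records it appears in
    for rec in records:
        for key, value in rec.items():
            seen[key].add(type(value).__name__)
            counts[key] += 1
    return {
        key: {"types": sorted(seen[key]),
              "required": counts[key] == len(records)}
        for key in seen
    }

samples = [
    {"part_number": "A-100", "weight_kg": 1.2},
    {"part_number": "A-101", "weight_kg": 0.8, "color": "red"},
]
print(infer_model(samples))
```

The point is not the algorithm itself but the direction: the model is derived from data the company already produces, rather than configured manually by implementers up front.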

What is my conclusion? Manual tearing down of data silos should be a thing of the past. Classification of data, mapping of data elements, creation of models… These are elements of analog data management promoted and supported by PLM specialists for many years. Unfortunately, this is how today's PLM systems operate, and therefore a six-figure consulting budget is required to make such a system work for a modestly sized large company. Smaller companies are not doing PLM – it is too complex and too expensive. Future PLM paradigms should bring new ways to capture data and turn it into information. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I'm co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.

  • Pingback: GHZ Partners | IT Services, Software

  • Dick Bourke

    Yes, “Making sense of your product data” is an informative introduction to the subject. Yet, the unspoken assumption seems to be this cleansing effort can be done manually. Hmm. I wonder.

    I’d suggest looking at electronic aids before charging ahead manually. Alternatives are available. A sampling includes:
    * Convergence Data Services, which I have validated
    * Verdantis offers a cleansing service, though the company seems to focus on cleansing ERP and MRO product data.
    * Capgemini offers a service described by CIMdata as “Product Information Access Framework.”

    Some would call the Capgemini service “Complexity Management,” a concept that goes far beyond just classifying parts – that is, simplifying your product line. In that vein, I highly recommend the book “Smart Simple Design – Variety Effectiveness and the Cost of Complexity” by Gwendolyn D. Galsworth.

  • beyondplm

    Dick, thank you for the comments as well as for sharing links to materials and books! I think people use “cleansing” in different contexts – one is related to MDM and search/index. Another is a synonym for data rationalization. The last is how it is often used during PLM implementations. Best, Oleg

  • Pingback: Can we use analytics to gain insight and achieve more value from PLM? | Nick Leeder

  • Pingback: Can we use analytics to gain insight and achieve more value from PLM? – PLMPulse Survey