A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM: From Vaults to Values

Oleg
11 May, 2025 | 14 min for reading

In my article The Top 5 PLM Commandments (2025 Edition): How to Actually Win at Product Lifecycle Management, I shared my thoughts about how to transform PLM systems and technologies. Yesterday’s article about modern API layer transformation (from REST to MCP) took a very technological perspective, so today I want to talk about the “human connection”. This article is also part of my preparation for the Share PLM Summit 2025, which will take place later in May – From Sherry to Share PLM Summit: Heading to Jerez de la Frontera.

For decades, PLM and product data management software was structured to control documents, enforce formal engineering processes, and automate release handoffs. Although this is still true and important, as companies strive to be more agile and collaborative, this “over-the-wall” approach is no longer enough. At the same time, the demand for human-oriented systems has been growing, which made me think that it’s time to stop treating PLM as a mechanism for process enforcement and start building systems that connect people to data in meaningful ways.

Modern product development demands real-time context, shared understanding, and continuous feedback across disciplines. Business processes, product development processes, supply chain collaboration, and process management in general are becoming more connected. Engineers, planners, buyers, and shop-floor teams all need to work from a common truth – not in silos. Success lies not just in managing data, but in designing systems that foster trust, shared ownership, and transparency. At the same time, the demand for quality data sharing, machine learning, and other data management capabilities in PLM technology is growing.

What business strategy can align such software and PLM processes with supply chain agility and product quality to build a new type of PLM solution?

Why PLM Feels Broken to Many Users Outside of Engineering

At its core, traditional PLM was designed with an emphasis on control, file management, and engineering productivity rather than broad user access and ease of use. What might have worked 20 years ago now feels outdated. Common frustrations include:

  • File Control Over Collaboration: Traditional PLM solutions were built to manage CAD files, enforce revision control, and lock down processes – essentially acting as digital vaults. While this ensured data integrity, it often came at the expense of flexibility. PLM “ends up being the place where engineering files go to rest”, rather than a dynamic system that connects people and decisions across the company. In short, a system built purely for control fails to help users collaborate or innovate in real time.
  • Complex, Unintuitive Tools: Decades of feature-driven development have turned many PLM interfaces into cumbersome mazes. The software was “a synonym for complexity” and notoriously difficult to navigate. Engineers and shop-floor teams are often greeted with clunky forms and hierarchies that only a PLM specialist could love. This unintuitive experience means casual users (and even seasoned engineers) avoid using the system whenever possible, leading to workarounds and lost information.
  • Siloed Data and Poor Flow: Instead of information flowing freely across departments, traditional PLM tends to trap data within engineering. Other teams frequently complain that getting data out of PLM is slow or impossible. In practice, we see engineers emailing spreadsheets because it’s “faster than uploading to PLM,” procurement teams manually re-building Bills of Materials, and executives struggling to get a single product overview. In other words, the very system meant to be the “single source of truth” often ends up bypassed, undermining its purpose.

All of this leaves users feeling that PLM adds overhead without corresponding value. As one industry analysis noted, traditional PLM remained “trapped in engineering” – great at vaulting CAD files and revisions, but offering little to the broader business. It’s a complex beast that hasn’t evolved to support how modern organizations share data and make decisions. To break out of this rut, PLM needs a fundamental shift in philosophy. The path forward starts with reimagining PLM through a more human and data-centric lens.

Let me walk through three themes that, from my perspective, can chart a path for the future of product lifecycle management (PLM) and product data management development. They can be a way for PLM software to support a human-oriented product lifecycle and a more collaborative product development process.

PLM - from vaults to value

Theme 1 – Built for Humans

One glaring issue with traditional PLM is the absence of a truly user-friendly layer. Most PLM tools were never designed with ordinary humans in mind; they were designed to check boxes for IT and compliance. It’s time to flip that script and build for humans first. Modern software – even in the consumer world – proves that a great user experience drives adoption. Think of how effortlessly Spotify recommends music or how Airbnb’s app lets you intuitively find a place to stay. Those products succeed because they put user needs and decision-making first. PLM should do the same. In fact, the same is true for most consumer-oriented services (as opposed to most business-process-oriented systems).

A human-centric PLM means designing interfaces and workflows that empower the user instead of frustrating them. Rather than burying information in forms, folders, and checkboxes, the system should present contextual data in an intuitive way.

In fact, the ideal is for PLM to behave “more like Spotify or Airbnb — [a system] designed for how people make decisions, not just how data is stored”. This might include dashboards that surface the latest design changes relevant to each role, search functions that actually understand engineering context, and mobile-friendly tools for on-the-go updates. The goal is to enable informed decisions at every level, from engineers on the shop floor to managers in the boardroom.

Crucially, built for humans means reducing complexity. Just as popular consumer apps obsess over simplifying the user journey, next-generation PLM must trim the bloat. Every extra click or cryptic form field is a hurdle to productivity. By rethinking the user experience from the ground up – focusing on clarity, responsiveness, and even aesthetic appeal – PLM can transform from something people grudgingly tolerate into something they actually want to use. A humanized PLM would not only make engineers happier, it would also increase data integrity (since more people would input data properly) and overall ROI for the business. In summary, PLM needs to stop being a system that users work around and start being one that works for the users. It’s about turning a control-centric vault into a friendlier, decision-driving companion for everyone involved in product development.

Theme 2 – Data as a Product

For years, browsing “folders” and “documents” was the main user experience paradigm for data. It has strong roots in the way data sharing and product lifecycle management are organized around “vaults” of CAD files and Excel spreadsheets.

The key to moving from vaults to values is treating the data inside PLM not as a byproduct of doing business, but as a core product itself.

What does this mean? It means managing your internal engineering and manufacturing information with the same care and intentional design that you would a customer-facing product. The data should be high-quality, easy to find, and ready to serve its consumers (engineers, procurement, quality teams, etc.) when and where they need it.

In practice, treating data as a product comes down to a few critical attributes:

  • Discoverability: Team members should be able to easily find the information they need – whether it’s a CAD model, a material specification, or a test result – without digging through vaults or chasing emails. If data is treated like a product, it will be organized and indexed for quick discovery, much like a well-designed app helps users find features.
  • Usability: Just having data isn’t enough; it must be clean, contextual, and ready for use. This could mean standardized formats, up-to-date revisions, and rich context (like knowing a part’s status or history at a glance). Data presented through the PLM should answer questions, not raise more.
  • Trustability: Users across the organization must trust that the data is accurate and current. This requires governance and quality controls so that, for example, a BOM or a compliance document in the PLM is known to be the single reliable version. When data is treated as a first-class product, maintaining its integrity is a top priority, ensuring everyone from design engineers to supply chain managers can rely on it (see the sketch after this list).
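
To make the trustability point a bit more concrete, here is a minimal sketch in Python of what an automated quality check on part records could look like. The record fields, lifecycle states, and thresholds are illustrative assumptions, not taken from any particular PLM system.

```python
from dataclasses import dataclass
from datetime import date

# Illustrative lifecycle states; real PLM systems define their own.
KNOWN_STATES = {"In Work", "In Review", "Released", "Obsolete"}

@dataclass
class PartRecord:
    part_number: str
    revision: str
    lifecycle_state: str
    last_reviewed: date
    owner: str

def validate_part(record: PartRecord) -> list[str]:
    """Return a list of data-quality issues; an empty list means the record can be trusted."""
    issues = []
    if not record.part_number:
        issues.append("missing part number")
    if record.lifecycle_state not in KNOWN_STATES:
        issues.append(f"unknown lifecycle state: {record.lifecycle_state!r}")
    if not record.owner:
        issues.append("no responsible owner assigned")
    if (date.today() - record.last_reviewed).days > 365:
        issues.append("record not reviewed in over a year")
    return issues

part = PartRecord("PN-1042", "B", "Released", date(2024, 11, 3), "j.smith")
print(validate_part(part))  # [] if the record passes, otherwise a list of issues
```

Running checks like this routinely is one way to treat data quality as a feature of the data product rather than an afterthought.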

We should think of our internal product data like an API: it must be “discoverable, trustworthy, and responsive” to its users. In a practical sense, this might involve building search and analytics capabilities on top of PLM so that an engineer can query, “What other products use this part?” and instantly get an answer. Or enabling a manufacturing planner to quickly pull up all design changes that occurred after a certain prototype run. When data is packaged and served in a user-centric way, every department can make better, faster decisions.
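
As a rough illustration of the “data as an API” idea, here is a minimal Python sketch of the where-used question above. The endpoint, authentication scheme, and response shape are hypothetical assumptions made for this example; they are not the API of any specific PLM product.

```python
import requests

BASE_URL = "https://plm.example.com/api/v1"  # hypothetical PLM data service

def where_used(part_number: str, token: str) -> list[str]:
    """Ask the PLM data service which products use a given part.

    The endpoint and response shape are assumptions for illustration only.
    """
    response = requests.get(
        f"{BASE_URL}/parts/{part_number}/where-used",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    response.raise_for_status()
    return [item["product_name"] for item in response.json()["results"]]

# An engineer's question, answered with one call instead of a manual search:
# print(where_used("PN-1042", token="..."))
```

The point is not the specific call, but that a question an engineer actually asks maps to a single, discoverable request instead of a manual hunt through file vaults.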

For example, consider an electronics company launching a new device. The design team, manufacturing team, and quality team all need access to the latest component specifications and test data. In a data-as-product scenario, the PLM would function as an internal data portal: the design engineer can easily search and find a component’s spec sheet and see if it’s approved for use; the manufacturing engineer can quickly retrieve the current assembly process and tooling info; the quality manager can pull up test reports – all in a few clicks, without parsing through cryptic file trees.

Another example: in a machinery company, if a field issue arises with a particular part, the support and engineering teams should be able to trust the PLM to immediately provide the part’s full history, drawing revisions, supplier information, and any related change orders. This kind of discoverable, trustworthy data environment turns PLM into a source of insight rather than a black hole.

It starts with decoupling the data itself from the traditional “PLM data explorer” user experience. By treating data as a product, organizations foster a culture where information is accessible and actionable. It breaks down silos between engineering, manufacturing, and business teams. The PLM’s value then comes not from hoarding files, but from enabling knowledge sharing and collaboration. In the end, when everyone can easily find and trust the information they need, the whole product organization becomes more agile and informed – a true competitive advantage in today’s market.

Theme 3 – Build the Manufacturing Graph

The third pillar of PLM’s evolution is a shift in the underlying data model – moving from the search for a “single database” to hold everything toward better data modeling. To me, the best model to support product lifecycle management complexity is a flexible graph data model. Traditional PLM databases often resemble spreadsheets or relational tables: they store a bill of materials here, a list of suppliers there, and maybe link them with predefined relationships. This approach is rigid – it struggles to represent the nuanced, many-to-many relationships that exist in real life. The object modelers of traditional product lifecycle management architectures are good for change management, but they hit silos and complex data constraints when they need to address processes in another domain, such as service lifecycle management, supplier collaboration, and others.

In contrast, a graph-based data model can capture the richness of how products are actually developed, produced, and maintained.

Imagine the entire universe of your product data as a web of interconnected nodes: a manufacturing graph that links every part, subassembly, material, supplier, requirement, test result, machine, and even people. In such a graph, querying relationships becomes much more powerful. For instance, you could instantly traverse the network to see all products affected by a change in one component, or identify the chain of suppliers connected to a specific raw material and which processes each supplier’s part goes through. Rather than looking at isolated siloed data models and file vaults, you’re looking at a holistic network of information. The goal is to have a “connected data model that links products, engineers, suppliers, manufacturers, machines, revisions, costs, compliance… everything. A network, not a silo. A system of relationships, not a vault of files.”

Why is this graph approach so valuable? Because it mirrors the real world. In reality, manufacturing is “a web of relationships: between parts, suppliers, regulations, design changes, engineering data, customer data”. A change in one place inevitably impacts many other elements. Graph-based PLM embraces that complexity rather than forcing it into rows and columns. It also enables far more flexibility. Need to add a new relationship type (say, linking a software update to a hardware component)? In a graph, that’s just another connection – no massive schema overhaul needed. The data model grows and evolves as your product ecosystem does.
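
As a sketch of the idea, here is how such a manufacturing graph could be modeled and queried with the open-source networkx library in Python. The node names and relationship types are invented for illustration; a production system would likely use a dedicated graph database, but the traversal logic is the same.

```python
import networkx as nx

# A toy manufacturing graph: products, assemblies, parts, and suppliers as nodes,
# with a typed relationship on every edge. All names are made up for illustration.
g = nx.DiGraph()
g.add_edge("Product-A", "Assembly-1", rel="contains")
g.add_edge("Product-B", "Assembly-1", rel="contains")
g.add_edge("Assembly-1", "Part-42", rel="contains")
g.add_edge("Supplier-X", "Part-42", rel="supplies")

# Impact query: which products are affected if Part-42 changes?
# Walk the "contains" relationships upstream from the part to its root products.
contains = nx.DiGraph(
    [(u, v) for u, v, d in g.edges(data=True) if d["rel"] == "contains"]
)
affected = {n for n in nx.ancestors(contains, "Part-42") if contains.in_degree(n) == 0}
print(affected)  # {'Product-A', 'Product-B'}

# Adding a new relationship type (say, firmware that runs on a part) is just
# another edge -- no schema migration required.
g.add_edge("Firmware-2.1", "Part-42", rel="runs-on")
```

The impact query is a short traversal, and extending the model is as simple as adding an edge – exactly the flexibility described above.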

For distributed and fast-moving teams, a manufacturing graph provides a kind of living map of the product. Team members in different locations or departments can all navigate this shared map to understand context and dependencies. An engineer can see not just the BOM of a product, but also related quality reports and supplier info in one view. A supply chain manager can trace the entire lineage of a part through multiple assemblies and vendors to assess risk. This holistic visibility is essential for quick decision-making in today’s environment. When information is connected in graph form, it also becomes possible to run smarter analytics – identifying patterns like which types of parts often cause delays, or predicting the ripple effects of a design change through the network.

Embracing a manufacturing graph means moving beyond the mindset of PLM as a static repository. Instead, PLM becomes a dynamic knowledge graph or digital thread that truly connects across the product lifecycle. It shifts the focus from data storage to data relationships. In doing so, it lays the foundation for more automated insights (think AI traversing the graph to flag potential issues) and more seamless integration with other enterprise systems. In short, building the manufacturing graph turns PLM into the connective tissue of the product organization – exactly what it needs to be to deliver value, not just hold data.

Another reason to develop a manufacturing graph is the performance and semantics of queries. Traditional PLM systems, even when they present data in a graph-like visualization, suffer from performance issues and limitations.

What is my conclusion?

I think the old paradigm of PLM – the vault of files accessible only to a few – is not sufficient for the modern product enterprise. The future lies in a new generation of PLM built around human-centered interfaces, flexible data modeling, and a connected manufacturing graph that spans the entire product lifecycle. By prioritizing usability and human workflows, treating data as a carefully managed product, and leveraging graph-based relationships, PLM can transform from a stagnant archive into a vibrant decision-support system.

For both engineers and IT leaders, this shift offers tangible benefits. Engineers get tools that mirror their needs: intuitive systems that help them do their jobs instead of adding friction. Managers and IT leaders, in turn, see higher user adoption and better quality data feeding into analytics and business decisions. The entire organization gains agility – changes and insights propagate through the “graph” quickly, and everyone can access the information they need in a form they can use.

In calling for PLM to move from vaults to values, we’re really calling for PLM to fulfill its original promise: to be the backbone of product information for the whole company. Achieving that means shedding the old skin of overly complex, siloed systems and embracing designs that put people and data connectivity first. The reward is immense. Companies that adopt human-centric, data-rich PLM will find that their product data is no longer a headache to manage, but a strategic asset to leverage. As one PLM visionary succinctly said, we must build “PLM systems that don’t just manage data — they empower people.” With a renewed focus on user experience, data-as-product thinking, and the manufacturing graph, PLM can finally live up to its potential as an enabler of innovation rather than a bottleneck. It’s time to unlock the vaults and start delivering true value.

Just my thoughts…

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native collaborative and integration services between engineering tools, including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
