I often talk with engineering teams and manufacturing companies about PLM implementation. A common approach I hear about is called “Big Future PLM Planning.”
While it is exciting to think about the "big future," the problem with this approach is that many companies get stuck between their grand plans for a perfect PLM system and the reality of working with outdated tools such as Excel spreadsheets and legacy databases.
This gap slows companies down – they have a perfect plan and, at the same time, live in a miserable reality. Here are some of my thoughts about how to change that. In my view, a new way of thinking is needed, built on two aspects: flexibility and putting data at the center.
Traditional PLM plans often aim to replace entire systems in one big project. That looks fine on paper, but usually fails in real life. Modern products are more complex than ever, requiring close collaboration between engineering, manufacturing, supply chain, and support teams. Every company already has its "PLM," even if it never implemented any PLM software. This status quo of tools, emails, spreadsheets, and processes is a reality that is hard to change overnight – even when everyone agrees it is wrong.
Therefore, these "big" plans are high-risk. Their "all-or-nothing" nature often leads to stalled projects, wasted money, and unmet expectations. The sad result: companies stuck with inefficient processes and outdated tools, while competitors move ahead with faster, more flexible solutions.
I think companies need to rethink PLM implementation. The solution is a step-by-step approach that focuses on data. This method builds systems that can adapt and grow with the company's needs.
A successful PLM system starts with good data planning. A central data system makes it simple to add new tools without disrupting workflows. This ensures everyone works with reliable, consistent data, which improves accuracy across the board. Clean, organized data helps teams make better decisions, leading to more innovation and improved efficiency.
I recommend breaking the PLM project into smaller steps; it reduces risk and ensures steady progress. Which obviously brings the question: how do you identify these "steps"? My recommendation is to use lifecycle stages as the foundation of these steps. For example: "design phase," "engineering BOM," etc. If those lifecycle steps feel too big, make them smaller and focus on a specific discipline or process (e.g., the design-to-manufacturing process).
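To make the idea concrete, here is a minimal sketch of modeling a step-by-step rollout as a list of lifecycle-stage phases. All names, scopes, and durations below are illustrative assumptions, not part of any specific PLM product or methodology:

```python
from dataclasses import dataclass

# Hypothetical sketch: each rollout step maps to a lifecycle stage,
# optionally narrowed to one discipline or process when the stage is too big.

@dataclass
class RolloutPhase:
    name: str            # lifecycle stage the phase covers
    scope: str           # narrower discipline/process, if needed
    duration_weeks: int  # each step should stay short

phases = [
    RolloutPhase("design phase", "CAD data management", 6),
    RolloutPhase("engineering BOM", "BOM structure and revisions", 8),
    RolloutPhase("design to manufacturing", "EBOM-to-MBOM handover", 8),
]

# A simple guardrail: no single step should run longer than ~2 months.
for phase in phases:
    assert phase.duration_weeks <= 9, f"{phase.name} is too big - split it"
```

The point of writing the plan down as data is that the "too big, split it" check becomes explicit rather than something discovered six months into the project.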
Focusing on high-impact areas first allows the business to maximize output and achieve significant value early in the process.
Finally, my recommendation is always to pick a specific project, department, or division to start with.
Your organization needs systems that can grow and adapt. The time of bulky tools that require six months of installation and configuration is over. There is really no good reason why any step of the implementation should take longer than two months.
Modern, flexible SaaS platforms allow companies to build systems from components that can be configured and customized for specific needs.
What is important is to think about the separation of three layers in your implementation strategy. This separation makes it easier to update or modify the system as requirements evolve.
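The post does not spell out the three layers here; a common split is data, business logic, and presentation (tools). The following is a minimal sketch under that assumption, purely for illustration, showing why the separation makes tools swappable without touching the data:

```python
class DataLayer:
    """Single source of truth: items, BOMs, revisions (illustrative)."""
    def __init__(self):
        self._items = {}

    def put_item(self, item_id, attributes):
        self._items[item_id] = dict(attributes)

    def get_item(self, item_id):
        return self._items.get(item_id)


class LogicLayer:
    """Business rules live here, independent of any specific tool."""
    def __init__(self, data: DataLayer):
        self._data = data

    def release_item(self, item_id):
        item = self._data.get_item(item_id)
        if item is None:
            raise KeyError(item_id)
        item["status"] = "released"
        self._data.put_item(item_id, item)


class ToolLayer:
    """A tool (UI, integration, report) talks only to the logic layer,
    so tools can be replaced without disrupting the data underneath."""
    def __init__(self, logic: LogicLayer):
        self._logic = logic

    def release_clicked(self, item_id):
        self._logic.release_item(item_id)


data = DataLayer()
data.put_item("PN-001", {"description": "bracket", "status": "draft"})
ToolLayer(LogicLayer(data)).release_clicked("PN-001")
```

Because the tool never reaches into the data directly, adding or replacing a tool is a change to the top layer only – which is exactly what makes incremental implementation possible.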
A great question to start with :). Here are three reasons why it makes sense to me and why it has helped the companies I've been working with:
The time of monolithic, all-in-one systems is over. We need to rethink PLM implementations and how tools can be used around the data. The future of PLM implementations isn't about creating one perfect system all at once. It's about starting small, focusing on good data, and building a system that can grow and adapt over time.
Instead of aiming for a "perfect" PLM system from day one, focus on creating something flexible and scalable. This way, you can stay competitive, handle challenges, and prepare your company for the future. The key element of this process is to stop being application-centric and, instead, become data-centric.
Focusing on what data needs to be managed first can help build a more resilient data management and collaboration environment.
Just my thoughts…
Best, Oleg
Disclaimer: I'm the co-founder and CEO of OpenBOM, a digital-thread platform providing cloud-native PDM, PLM, and ERP capabilities. With extensive experience in federated CAD-PDM and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.