Manufacturing has been around for centuries and is still relied on today to produce products. Despite many advances in technology, the manufacturing process has remained largely unchanged. To keep up with the latest trends and technologies, some companies have turned to digitally transforming their operations. However, even in a digitized world, there can be many different systems operating within a single organization, each with its own data and its own version of the truth. How can companies ensure they have a single source of truth? Is there a common strategy across disparate systems that can be used to form this single source? Or is it even possible to achieve such a goal?
Coming to PI DX USA 2022
In just a little more than a week, I'll be coming to Atlanta, GA to attend the PI DX USA 2022 conference. It is a very exciting moment because PLM conferences are coming back, and it will be my first conference after a two-year pandemic break.
I'm super excited to be invited to speak about data strategy and modern trends in data management. I'd be happy to share the experience I've gathered in this field and what we have learned so far building OpenBOM, a network-based PLM platform.
The rise of data and the new economy has led to a paradigm shift that is redefining our world. In today's digital age, information is the currency of businesses looking to accelerate productivity with advanced technologies, making them more competitive by boosting efficiency across all departments.
During this session, I will talk in detail about the most common data challenges businesses face. You will learn how to overcome them and get practical tips that can help your company succeed.
Here are some of the insights from my session:
- Avoiding breakdowns in information flows throughout the organization
- Optimizing processes and connecting data silos
- Making the technology work for your data flow
- Giving people the right tools to communicate and collaborate
- Designing effective education of your users to support data sharing across the business
PLM Single System Dream
The idea of PLM software started in a very idealistic manner as an attempt to cover more and more lifecycle stages in product development. The roots of PLM software go deep into PDM systems and CAD file management processes. As PDM provided a place to store files and to manage and access them in a controlled way, the question arose of how to extend information management upstream and downstream to cover more processes.
However, PLM software expansion didn't go very well, for multiple reasons. Among them are the complexity of PLM interfaces, the absence of standards that can be easily adopted, and poor licensing models. Moreover, systems designed for a single company performed poorly when asked to share data across multiple companies or business processes. Upgrades and keeping systems up to date were another big factor that makes most PLM systems challenging. Large OEMs that spent significant resources on implementations experienced version lock and an inability to upgrade.
It Is Not About Systems, It Is About Data
The importance of data became obvious in the last decade, when companies focusing on product lifecycle management understood that instead of focusing on how to provide a single system, it would be much better to focus on how to make data available across multiple systems and companies. Product lifecycle management turned into a business strategy (instead of a single system), and companies started to spend more resources on establishing the right product data management strategy and technologies to make data federated, shared, and transferred to other systems.
Unfortunately, it didn't fundamentally change the nature of old and large PLM software architectures. Companies agreed that a "single system" is a dream and that they need to find a new paradigm for product lifecycle management capabilities to work across multiple company silos.
Breaking Company Boundaries
For many years, product lifecycle management (PLM) software focused on covering multiple lifecycle stages, providing process management support from early design to engineering, manufacturing, and, later, maintenance, sales, and other product development stages. PLM vendors achieved great success, especially in large and traditional industries.
At the same time, it became clear to PLM solution developers that processes span much wider and must cross company boundaries. The only solution available in the PLM software toolbox was to expand the same "big PLM" to other tiers of the supply chain. The approach worked to some degree, but it is far from ideal and doesn't provide answers to many use cases.
The modern multi-tenant system architecture provides a new type of system capable of serving multiple companies at the same time. And I'm not talking about multi-tenant application servers, but about a true multi-tenant data architecture capable of providing a logical breakdown between various organizational structures and roles, which helps to connect data and people working on a process spanning multiple companies.
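To make the idea more concrete, here is a minimal Python sketch of what a multi-tenant data model can look like. All names (`Record`, `MultiTenantStore`, the tenant IDs) are hypothetical and only illustrate the principle: every record is owned by one tenant (company), and cross-company collaboration happens through explicit sharing grants instead of copying data between systems.

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    tenant_id: str                                  # owning company
    record_id: str
    data: dict
    shared_with: set = field(default_factory=set)   # tenants granted access

class MultiTenantStore:
    """A toy store where access is always evaluated per tenant."""

    def __init__(self):
        self._records = {}

    def put(self, record: Record):
        self._records[(record.tenant_id, record.record_id)] = record

    def get(self, requester: str, owner: str, record_id: str):
        rec = self._records.get((owner, record_id))
        if rec is None:
            return None
        # A tenant sees its own data plus data explicitly shared with it.
        if requester == rec.tenant_id or requester in rec.shared_with:
            return rec
        return None

# An OEM shares a BOM item with one supplier without duplicating the data:
store = MultiTenantStore()
item = Record("oem-co", "bom-100", {"part": "bracket"}, shared_with={"supplier-co"})
store.put(item)
assert store.get("supplier-co", "oem-co", "bom-100") is not None  # shared
assert store.get("other-co", "oem-co", "bom-100") is None         # isolated
```

The point of the sketch is the logical breakdown: one database serves many companies, yet each company's view is scoped, and the same record can sit inside a process that spans several of them.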
Check my last year’s virtual presentation for VPE PLM Swiss Symposium – From Product Lifecycle To Network Platforms.
The growing development of cloud technology and web architecture brought a new approach to system delivery – Cloud Services. The term “cloud services” refers to a wide range of software services delivered on-demand to companies and customers over the internet. One aspect of this approach was to provide easy, affordable access to applications and resources, without the need for internal infrastructure or hardware. Another aspect of the service approach was their ability to integrate with a wide range of other applications. Among them are PLM solutions, Enterprise Resource Planning, Quality management software, project management, systems to support production processes, computer-aided design, and many others. Services are also a great source of reliable data that can be used in data analysis.
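As a small illustration of that integration aspect, the sketch below joins data that two hypothetical cloud services might return: BOM items from a PLM service and unit costs from an ERP service. The endpoint names in the comments and the payloads are invented stand-ins for real REST responses; the point is only that service-delivered data is easy to combine into one view for analysis.

```python
# Stand-in for a response from a PLM service (e.g. GET /api/bom/items):
plm_items = [
    {"part_number": "P-100", "description": "Bracket", "revision": "B"},
    {"part_number": "P-200", "description": "Housing", "revision": "A"},
]

# Stand-in for a response from an ERP service (e.g. GET /api/costs):
erp_costs = {"P-100": 4.50, "P-200": 12.00}

def enrich_bom(items, costs):
    """Join engineering data with cost data by part number into one view."""
    return [{**item, "unit_cost": costs.get(item["part_number"])} for item in items]

combined = enrich_bom(plm_items, erp_costs)
assert combined[0]["unit_cost"] == 4.50
```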
PLM software is evolving from being an ugly enhancement of a SQL database and a document management system for storing files into integrated PLM software with flexible data management capabilities that supports a distributed product development process and integrates data and business processes across multiple companies and business systems. However, the transformation is not simple. It will require retooling product lifecycle management with new technology and architecture for product data management to create a network of data and processes across multiple organizations and supply chain management systems, covering an entire product lifecycle and capable of analyzing data and finding relevant data. All together, this will make what I like to call modern PLM software, with up-to-date information available across disparate systems. Just my thoughts…
PS. Stay tuned for more blogs in the next few days before PI DX USA 2022 as I will be sharing more information.
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital cloud-native PLM platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.