Security is important. Period. However, the topic of data security and data governance has been a source of controversy in the PLM world for the last 10-15 years. Keeping data “close to your chest” and not letting it go anywhere sounded to many organizations like a fine strategy. They invented many security mechanisms in company data centers, ensuring databases were secured and no one could get access to servers, but later discovered that half of their workstations were not locked with passwords, that engineers were regularly offloading data to Excel and USB drives because of a lack of PLM licenses, and that printed drawings ended up in trash bins on the shop floor after being used for assemblies.
Cloud product lifecycle management (PLM) was another “chapter” in the debates about data and security. It all started with the claim that manufacturing companies would never trust their “core IP” to be stored in the cloud because it would create a huge company risk. All debates comparing bank cloud infrastructure and PLM clouds were pointless; money is not the same as IP. You can insure your money, but if your IP is stolen, there is nothing you can do. While this sounded very reasonable and logical, it was crushed very quickly after IT managers discovered the amount of data uploaded from their corporate IT to cloud file storage such as Dropbox and Google Drive, and saw how many files were sent as email attachments. So you can refuse to use a cloud PLM system, but the company data is already in the cloud anyway.
Moving toward 2023, the question of security is still very important, and companies should look at where data management and data architecture are heading and what that trajectory means for data security. Before jumping ahead, check my earlier articles where I speak about PLM system architecture evolution and PLM data management evolution.
When thinking about the security of your PLM data, where do you start? Do you consider the data models and how they are implemented? The way users access and use the data? Or perhaps the infrastructure on which your PLM system resides? In this blog post, we will explore some aspects of PLM data architecture evolution and security.
Documents and Other Files
Files and documents are the oldest and most widely used data management paradigm. The majority of CAD design is still done in file-based CAD systems, with data managed in a variety of file storages – from local drives to decades-old PDM/PLM solutions. Most of these systems are limited to the LAN, and companies have no choice but to export data to files and share them. Even using a VPN won’t reduce the risk of files being exposed, because these systems keep files in so-called “local workspaces” where data is left unencrypted so CAD systems can access it. These files can be copied without any need to break the protection of PDM servers and vaults. I’m not even speaking about the CAD-to-BOM export to Excel so loved by all engineers. The data is local and transmitted via email and a variety of data-sharing mechanisms that expose it in a form that makes it easy to consume.
For the last two decades, product lifecycle management has used industry standards such as SQL databases to manage information. There is nothing wrong with PLM here; the same practices were adopted by all enterprise systems. The famous PLM “single source of truth” is little more than placing all the data in a SQL database and watching the borders. This makes all such PLM systems vulnerable for two reasons. First, the data is located in a single place (a SQL database) and can easily be dumped into an archive. Second, because of the limited ability of old PLM architectures to edit and share data, mechanisms for easy export were developed. It is common practice to export data to Excel, edit it, and sync it back. You can make an effort to protect PLM servers, but once data is exported to Excel it is easily available. “Packaging” data and files into Zip archives is the most popular way for legacy product lifecycle management architectures to share data with contractors and suppliers.
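To make the first point concrete, here is a minimal sketch (using Python’s standard sqlite3 module and an invented `items` table, not any real PLM schema) of why a single SQL database as the “single source of truth” is easy to exfiltrate wholesale once someone has database access:

```python
import sqlite3

# Hypothetical single-database PLM store: one SQL database holds everything.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, part_number TEXT, description TEXT)"
)
conn.execute("INSERT INTO items (part_number, description) VALUES ('PN-100', 'Bracket')")
conn.execute("INSERT INTO items (part_number, description) VALUES ('PN-200', 'Housing')")
conn.commit()

# Anyone with database-level access can dump the entire "single source of truth"
# in a few lines -- there is no per-record protection at this layer.
dump = "\n".join(conn.iterdump())
print("PN-100" in dump, "PN-200" in dump)
```

The point is not that SQL is insecure, but that perimeter-only protection fails the moment the perimeter is crossed: the whole dataset leaves in one pass.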
SaaS, Cloud, and Data Granularity
Cloud product lifecycle management architecture makes data management more granular and less exposed. Modern SaaS PLM systems use polyglot persistence and multiple databases to manage data, so there is no single place where all the data is located. A DBaaS architecture makes data even more protected, with industry-standard encryption mechanisms locking PLM data behind layers of physical and logical security. Virtual file systems with files located in cloud storage, fully encrypted, together with multi-tenant SaaS PLM architectures, allow data to be shared instantly and seamlessly without actually exposing information as readily available files. Data access is more granular (compared to an Excel dump), and files, even when they are available from virtual file storage, can be protected by not allowing users to copy them to external file management systems. Altogether, this makes modern PLM systems more secure than legacy data management approaches. Virtual file disks and SaaS data sharing allow the use of old file-based desktop computer-aided design (CAD) systems while reducing the risk of data exposure.
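The granularity argument can be sketched in a few lines. This is a toy illustration, not any vendor’s API: the names (`Item`, `GRANTS`, `get_item`) and the tenant/user identifiers are invented. The idea is that in a multi-tenant SaaS store, access is checked per item and per user on every request, so there is no whole-database export path to protect:

```python
from dataclasses import dataclass

@dataclass
class Item:
    tenant: str
    item_id: str
    payload: dict

# Hypothetical multi-tenant item store, keyed by (tenant, item_id).
STORE = {
    ("acme", "PN-100"): Item("acme", "PN-100", {"description": "Bracket"}),
    ("acme", "PN-200"): Item("acme", "PN-200", {"description": "Housing"}),
}

# Per-tenant, per-user grants: a supplier sees only the items shared with it.
GRANTS = {
    ("acme", "supplier-1"): {"PN-100"},
    ("acme", "engineer-1"): {"PN-100", "PN-200"},
}

def get_item(tenant: str, user: str, item_id: str):
    # Deny by default: every item access is checked against the grant list,
    # instead of handing out a database dump or a Zip of files.
    if item_id not in GRANTS.get((tenant, user), set()):
        return None
    return STORE[(tenant, item_id)].payload

print(get_item("acme", "supplier-1", "PN-100"))  # shared item is visible
print(get_item("acme", "supplier-1", "PN-200"))  # not granted
```

Compare this with the Excel/Zip handover pattern: once a grant is revoked, the supplier’s access ends immediately, whereas an exported file can never be recalled.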
What is my conclusion?
The industry has moved from managing a file as a single source of truth, to a database (e.g., SQL), to a modern data management approach where the entire data set is distributed across multiple data sources and hidden behind multiple security levels of cloud architecture, both physical and logical. Product data management and the way companies gather data across the product value chain are changing. Data architecture has a major impact on the PLM system and supply chain management. It helps to streamline business processes, which is especially important for global manufacturers with complex data management needs. Modern data management architecture is where the digital thread begins and how real-time data access can be achieved to develop sustainable manufacturing practices. Companies are looking for up-to-date information, but it is impossible to achieve with legacy PLM solutions when information is hidden in documents, Excel files, and legacy databases. However, security is a killer factor that many companies are missing. By leaving their data in legacy product lifecycle management platforms, they are exposed to additional security risks. When the entire BOM is in Excel and all the files are sent via email, the data is very easy to get. Old PLM systems were architected 25 years ago. The combination of file-based and legacy data management architecture makes the entire data management and data handover process less secure and more vulnerable. Just my thoughts…
Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital cloud-native PDM & PLM platform that manages product data and connects manufacturers, construction companies, and their supply chain networks. My opinion can be unintentionally biased.