PLM and Data Modeling Flux

The fundamental part of every PDM/PLM application is the database and the related data model. The history of data modeling goes all the way back to applications with proprietary data models. The cornerstone moment was the introduction of the RDBMS. Not many of us remember, but the original assumption of the RDBMS inventors was to make the data model transparent and accessible to programmers. Codd’s vision was to have programmers code against a fixed database schema. Take a look at Codd’s original paper from 1970 – A Relational Model of Data for Large Shared Data Banks. Here is my favorite passage:

Future users of large data banks must be protected from having to know how the data is organized in the machine (the internal representation). A prompting service which supplies such information is not a satisfactory solution. Activities of users at terminals and most application programs should remain unaffected when the internal representation of data is changed and even when some aspects of the external representation are changed. Changes in data representation will often be needed as a result of changes in query, update, and report traffic and natural growth in the types of stored information.

What Codd said about internal data representation back in 1970 is still true for many product development solutions. The end of the quote emphasizes the reality of many data-driven solutions these days. The complexity of solutions, diversity of requirements, mergers and acquisitions, application upgrades – this is a short list of situations in which your data model and underlying code are going to change.

RDBMS and Dynamic Data Models

The original introduction of the RDBMS assumed a static data model. The reality of solution development introduced a new concept – dynamic data models. Dynamic data modeling is what most PDM/PLM solutions have today. I can hardly name any PDM/PLM system that does not apply at least some elements of dynamic data models. The ability to customize data is on the short list of every prospect and advanced customer.
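To make the idea concrete, here is a minimal sketch of one common way such customization is implemented on top of a fixed relational schema: an entity-attribute-value (EAV) side table. This is an illustration, not any specific vendor's actual schema; all table and attribute names are hypothetical.

```python
import sqlite3

# Hypothetical sketch: a fixed "items" table extended with an
# entity-attribute-value (EAV) table, a common technique that lets
# customers add attributes without running ALTER TABLE on the schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE items (id INTEGER PRIMARY KEY, part_number TEXT);
    CREATE TABLE item_attributes (
        item_id INTEGER REFERENCES items(id),
        attr_name TEXT,
        attr_value TEXT
    );
""")

conn.execute("INSERT INTO items (id, part_number) VALUES (1, 'PN-1001')")
# A customer-defined attribute is just a new row -- no schema change needed.
conn.execute(
    "INSERT INTO item_attributes VALUES (1, 'paint_spec', 'MIL-PRF-85285')"
)

rows = conn.execute(
    "SELECT attr_name, attr_value FROM item_attributes WHERE item_id = 1"
).fetchall()
print(rows)  # [('paint_spec', 'MIL-PRF-85285')]
```

The price of this flexibility is that queries and reports must pivot the attribute rows back into columns, which is part of why these customization layers become complex over time.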


Dynamic data modeling is a technique that many developers of PDM and PLM solutions have applied over the last 15-20 years. Over the last decade, new data management solutions grew out of massive web and open source development. Branded these days under the broad name of ‘noSQL’, these solutions provide an alternative way to manage data. In many cases noSQL data models are more flexible and allow data model changes in a much easier way. Programmers are not required to define a data model schema before they start coding.
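A minimal sketch of the document-style approach, using plain Python dictionaries to stand in for documents in a noSQL store (the field names are hypothetical): each record is self-describing, so two items in the same collection can carry different attributes without any schema migration.

```python
# Hypothetical sketch of schema-less, document-style data modeling:
# each item is a self-describing record, so new fields can be added
# as requirements evolve, with no up-front schema definition.
items = [
    {"part_number": "PN-1001", "description": "Bracket"},
    # A later record simply includes extra fields -- no migration step.
    {"part_number": "PN-2002", "description": "Fastener",
     "supplier": "Acme", "rohs_compliant": True},
]

# Queries tolerate missing fields instead of failing on a schema mismatch.
suppliers = [item.get("supplier", "unknown") for item in items]
print(suppliers)  # ['unknown', 'Acme']
```

The trade-off is that the schema does not disappear; it moves into application code, which must handle records with and without each field.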

Data Driven Business Processes

Business processes in an organization are driven by data. Organizational changes, M&A, application upgrades, diversification of the supply chain and third-party data, social data, internet logs, the internet of things (machine-produced data) – this is only a very short list of data sources modern PLM systems need to adapt to. It further increases the need for flexible PLM solutions.

What is my conclusion? Flexibility is one of the fundamental requirements for any PLM system. A growing number of data-related business processes will push your data models into perpetual flux. The majority of PLM providers built their flexibility around customization of the RDBMS schema. Most of these technologies are 15-20 years old. New data modeling approaches will come from open source and the web to solve the needs of future data modeling and data-related processes. Just my thoughts…

Best, Oleg

