Why Do We Need a PLM Data Model?

I'd like to raise some questions about PLM and data modeling. The idea for this discussion came out of comments and conversations on PLM Think Tank. Since data modeling was presented as a significant differentiator in the capability of PLM systems to do their job, I decided it was important enough to discuss.

Image by Mediawiki_dbschema

Fundamentally, the PLM data model is the heart and core of any PDM/PLM system and implementation. The ability to model design, engineering and manufacturing data, as well as the processes around them, is obviously very important. However, since modeling deals with a company's products and processes, it always ends up being something unique to the organization. In the early days of PDM, systems were not flexible and required a physical change (re-build) to handle specific product and process data. Nowadays, PDM/PLM systems claim flexible data modeling capabilities that make it possible to apply them to any customer situation. At the same time, the cost of this "application" is sometimes very high.

PLM Data Model Uniqueness
What makes PLM data modeling so unique? Why do we need it? Maybe we can avoid this process by supplying something generic that doesn't require change for every customer? There are two extreme examples I want to bring up in the context of these questions. The first is Excel (or spreadsheets in general). Basically, we can model almost everything in a spreadsheet these days. That is absolutely fine, since Excel is extremely flexible and runs out-of-the-box. However, to understand these models, you need to keep a Chief Excel Officer in your organization as a full-time job. The opposite extreme is the question: why can't we build "the universal PLM data model"? Since this is all about engineering and manufacturing, surely we can finally identify what to put there. It may work, but every time, your customer will ask you for "small changes" to be made in order to support their requirements.
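To make the trade-off concrete, here is a minimal sketch (in Python, purely illustrative; the class and attribute names are my own, not any vendor's) of the "flexible" end of the spectrum, where the schema itself is data and a customer's "small change" becomes a configuration edit rather than a system re-build:

```python
# Minimal sketch of a "flexible" PLM data model: the schema is data, so a
# customer-specific change is a configuration edit, not a system rebuild.
# All type and attribute names are illustrative, not from any real PLM system.

class ItemType:
    """A customer-configurable item type: a name plus a set of attribute names."""
    def __init__(self, name, attributes):
        self.name = name
        self.attributes = set(attributes)

    def extend(self, *new_attributes):
        """The 'small change' every customer asks for: add attributes at runtime."""
        self.attributes.update(new_attributes)

class Item:
    """An instance validated against its (runtime-defined) type."""
    def __init__(self, item_type, **values):
        unknown = set(values) - item_type.attributes
        if unknown:
            raise ValueError(f"attributes not in type '{item_type.name}': {unknown}")
        self.item_type = item_type
        self.values = values

# A generic starting model...
part_type = ItemType("Part", ["number", "title", "material"])
# ...extended for one customer without touching the system itself.
part_type.extend("export_control_code")

bracket = Item(part_type, number="P-100", title="Bracket",
               export_control_code="EAR99")
print(bracket.values["export_control_code"])  # EAR99
```

The "universal model" extreme would instead hard-code the attribute list for everyone. The flexibility above is exactly what makes every deployment unique, and also what makes implementations expensive to keep consistent.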

Standards and Best Practices
I see these two options as the industry's attempts to deliver a compromise between Excel and a One-PLM-Model. Standards activities were very popular (and maybe still are) in the engineering and manufacturing world: standards for product data exchange, supply chain standards, industry standards, etc. In parallel, big software and service vendors tried to come up with so-called "best practices" - a simplified way to deliver a data model for a specific segment of customers or industries. The fundamental difference between standards and best practices, in my view, lies in the level of "agreement" achieved between the parties involved in the activity.

Where do I want this discussion to go? I think PLM (or engineering and manufacturing) data models are an interesting topic and a real problem. In many cases, the data model defines the success of an implementation, or of the PLM software in general. It is a technical and a marketing issue at the same time: to the same degree that data modeling influences implementation and product architecture, it is always used as part of the marketing story. Do you think the PLM data model is a key to the future success of PLM implementations? Or, conversely, do you think it is a technical term that should dissolve into "a real conversation about the functions and value of PLM systems"?

What is my conclusion today? I want to hear your opinions. From my side, I've seen this topic touch the hearts of many people involved in these discussions. In my view, the PLM data model defines the level of flexibility (or, to use a newer word, "granularity").

Just my thoughts…
Best, Oleg




  • Charlie Stirk

    There are standard PLM data models. They are based on the STEP PDM Schema that was developed several years ago to create common PDM models for STEP Application Protocols. For instance, the STEP AP239 Product Life Cycle Support information model contains the PDM Schema, but is also extensible using subtyping with OWL to create an OASIS PLCS DEX for specific data exchanges. PLCS is densely relational and can be used as a framework for developing PLM data models with different levels of granularity and data aggregation. PLCS has a messaging model and can be used to develop standard web service interfaces. The scope of PLCS is quite broad, and is as close to a universal PLM model as you can find.

  • Charlie, thank you for the comment, and welcome to the PLM Think Tank discussion! You are right: of all the possible PDM/PLM-related standards, STEP PDM and PLCS are probably the most widely adopted. However, when it comes to a specific implementation, they are rarely used as a blueprint for the customer's implementation. The usage of such modeling schemas tends toward product data exchange rather than product deployment. It would be interesting to hear if you have a different perspective on that… Best, Oleg

  • Charlie Stirk

    PLCS can be used for product data exchange, and as the data model for a deployed PLM application. Together, these capabilities (along with a few others like adaptors and data consolidation rules) provide a robust, standards-based method to integrate with legacy PLM systems. For example, take a look at the solutions that Eurostep has developed. At CostVision we have used the PLCS technology to build standards-based PLM applications and integrations with legacy PLM (and other) applications.

  • Charlie,
    There are basically two questions here: the model itself, and the PLM system's modeling capabilities.

    1. Do you know of PLM systems that adopted PLCS as a blueprint data model? It would be interesting to analyze how these initial models changed and evolved inside organizations over time. I had a chance to see a Eurostep implementation. Their fundamental belief is to use STEP as the basic model. My concern is that it takes a lot of time to make everybody agree on the same data model. What is your view on that?

    2. On the modeling side, it is interesting to see what level of flexibility and granularity a PLM system provides for changing existing and future models. The latter is important when you think about the cost of change in the organization.

    Thank you for this great discussion!
    Thanks, Oleg

  • Charlie Stirk

    Hi Oleg,
    Some answers to your questions.

    1. Eurostep and Jotne EPM natively support PLCS, as can other EXPRESS tool vendors like LKSoft, probably to some extent. Other PLM vendors have supported parts of the PLCS model. For instance, TeamCenter Systems Engineering used it to migrate requirements data from its earlier version, Slate. The TeamCenter Open PLM XML Schemas bear a strong resemblance to parts of PLCS, but are not as relational. Everybody has already agreed on the basic PLCS data model through the ISO STEP standards development process. That is what it is for: reaching consensus. At PDES Inc., we are starting up a PLCS Implementers Forum to test data exchanges, similar to the successful CAX-IF for STEP CAD. Many vendors have expressed interest in participating.

    2. Based on feedback from early PLCS deployments, and to bring it back into alignment with the STEP CAD and other modular APs, PLCS is being updated to edition 2 and is currently working its way through the ISO standards process. It includes some new capabilities from AP203ed2 for CAD and AP233 for Systems Engineering, as well as other improvements. Upward compatibility is a requirement in the STEP standards development process. PLCS edition 2 is stable and provides more flexibility by adding more SELECT extensions to allow additional assignments. Further flexibility and granularity come through the templates and reference data in the OASIS DEX architecture. The scope depth and granularity are potentially unlimited in the DEX architecture. PLCS is an extremely flexible and adaptable information model.

    The scope breadth of PLCS is larger than that of any PLM application I am familiar with. As the Chair of the PDES Technical Advisory Committee, I will be giving a presentation on PLCS at the 3D Interop Conference in Estes Park in early May.

    Regards, Charlie

  • Charlie, thank you for your answers. I will take my time to review the links and references. I had a chance to see some of them before, but it is always good to refresh one's knowledge. Unfortunately, I'm not coming to 3D Interop. However, I'm making my trip to COFES next week. Do you plan to be there, by chance? Best, Oleg

  • Charlie Stirk

    Never been to COFES. 3D Interop is local for me 🙂

    Forgot to say, but there are other standard data models for PLM. STEP AP214 also has CAD and PLM information such as configuration variants, manufacturing process planning, and the PDM Schema. It also has web services standardized for many of the non-geometric parts of the AP214 model. The standards community is figuring out how to create a superset of AP214 and AP203ed2, which will drive compatibility with PLCS. PLCS has also been used for integration between manufacturing process planning tools like Delmia and TeamCenter Manufacturing. OAGIS has also been used as an interface to Delmia and some PDM systems, and it has many implementations with ERP and MES. We are working on a project with NIST and others to harmonize STEP and OAGIS.

  • Oleg,

    You bring up a very important point in PLM, I think. Here are my thoughts about it. In my view, the One-PLM-Model that can be used in every firm is neither possible nor desirable. The main reason is not technical or marketing-related, but social. When an organization acquires a PLM system, a clash between traditional, established user capabilities (what is in the heads of users) and the out-of-the-box PLM capabilities is inevitable. This contradiction must be resolved before the PLM system can become a resource in the organization. Just think about all the times you have stared at the user interface of a new tool and understood nothing… The resolution requires that those involved arrive at some level of agreement about what to implement in the system. This is very arduous to work out, and the problems surge with the increasing detail of the implementation. In order to speed up this process, the implementation in the PLM system must be very easy to change. To mention one example: at Ericsson, we estimated that it took us more than a year and about 500 changes before a truly working PLM system was in place to support the coordination of the projects developing 3G systems around the year 2000. Some 600 items had to be modified (relations, attributes, cardinalities, access rights, states, etc.). Standards like PLCS do not help in this respect, since users must make common sense of the constructs in PLCS as well. This is equally arduous, which you also touched on in one of the comments.


  • Colin Clifford

    A perspective on "Why do we need a PLM data model?" from someone who has spent many years on practical PLM deployments across a range of industries.

    Since the term PLM covers such a broad scope, it seems unlikely that a one-size-fits-all approach will work. Indeed, what we see is that in some areas - CAD data management and engineering process management - commonalities across companies mean a standardized approach is appropriate, and here we see a set of products aimed at midsized businesses. These include some form of data and process model that addresses these common needs.

    However, as the size and scope increase, we need more flexibility, because the PLM journey and the business touch points are different in every case.

    Although it is unlikely that Excel would be practical in anything other than the simplest cases, a solution built on a database with a smart client (and a less structured data model) can address the smaller implementations. But in trying to scale these, the complexity again becomes unmanageable. We need a data model that is both flexible and scalable, and this leads to a different approach to the problem, in which the data and process model is separated from the underlying technology. This allows the solution architect to think in terms of how to solve the business problem rather than how to implement some technology.

    In these cases the PLM data model is all-important. It must also be straightforward to maintain and extend in order to address changes in the business (it is common for significant business changes to occur even during the initial deployment).

    While flexibility is vital, we do see (in the same way that the mid-range systems can standardize) that an OOTB data model addressing up to 80% of the need is possible, and this implies industry-specific variations of the model built on a common core. Such an approach helps with the business imperative of reducing the long-term cost of ownership.

    While PLM data model standards are useful for some data exchange situations, I feel that an SOA approach, perhaps with BPEL, will be more common in the future. To me, it seems to address more business requirements and provide more flexibility, although I have no real experience with it in practice.

    So, to summarize: the PLM data model is critical, and it becomes more so as the scale of the deployment increases.
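Colin's idea of separating the data and process model from the underlying technology can be sketched in miniature: the model lives in a declarative definition that a generic engine interprets, so a business change is an edit to the definition, not to the software. This is a purely illustrative sketch; the structure and names are my own, not taken from any PLM product:

```python
# Sketch of a model-driven approach: the data model is a declarative
# definition (here a plain dict; in a real system it might be XML or
# database tables), and a small generic engine interprets it. Names
# and structure are illustrative only.

MODEL = {
    "Part": {"attributes": ["number", "title"], "relations": ["assembly"]},
    "Document": {"attributes": ["number", "revision"], "relations": ["describes"]},
}

def create(model, type_name, **attrs):
    """Generic engine: validate an instance against the declared model."""
    spec = model[type_name]
    bad = set(attrs) - set(spec["attributes"])
    if bad:
        raise ValueError(f"{type_name} does not declare: {bad}")
    return {"type": type_name, **attrs}

# A business change arriving during deployment: one edit to the model
# definition, no change to the engine underneath.
MODEL["Part"]["attributes"].append("supplier")

p = create(MODEL, "Part", number="P-7", supplier="Acme")
print(p["supplier"])  # Acme
```

The point of the sketch is the last few lines: extending "Part" with a "supplier" attribute touches only the model definition, which is what makes maintenance and extension straightforward while the underlying technology stays fixed.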

  • (I started to write this message before the last two posts were sent, but I guess I'm going pretty much the same way as Colin.)

    Coming back to your post: when you said "Fundamentally, PLM Data Model is the heart and the core of any PDM/PLM implementation and system," I went back to the first few messages I left on your blog. I was really focused on the data model, and not really happy with all your articles on Google Wave and social networking tools. I recently wrote an article (in French) about the turn the PLM market is taking. My aim was to distribute roles between editors (software vendors), integrators, IT system admins and work process managers, with each role having a main focus: editors should work on the technology, integrators on features and pre-customized items, and companies on the data model. Then, with standards like STEP, you free the customers (maybe not completely, but you save time) from defining data models. So, as an integrator, I'm really concerned about how the technology will meet the standard data model's functional requirements, and that's where talking about technologies like Google Wave or BPM tools was not useless.
    My thought is that, until now, the editor has been trying to do almost everything, and I think the split between technology and data model has to be made clearly: editors delivering highly flexible technologies (it could be something like a "Visual PLM Studio" with pre-made project templates based on standards), and, on the other hand, integrators customizing the standards-based data model and features with the client.
    I have not been in the PLM market for years, so don't hesitate to correct me!

    A quick tip for editors: please add transaction code input! Everybody says SAP is ugly and not user-friendly, but as soon as you know your transaction codes, it becomes a piece of cake to use. (I was an SAP CO consultant at Accenture.)

    (This is the article I was talking about: http://prodeos.fr/?p=419 - not sure the English translation will work well!)


  • Lars, thank you for sharing your insight. I agree with and support your points about the process a PLM system/implementation needs to go through in order to become a truly working system in the organization. These are the same points I had in mind when I wrote about the complexity of applying "best practices" in PLM and product development in general. So, the key capabilities of a PLM data model, and of the system supporting that model, are: 1) flexibility; 2) granularity; 3) a low cost of change. I still want to understand your opinion about the PLCS model better. Can you please explain what you mean by the following: "users must make common sense of the constructs in PLCS"? Thanks, Oleg

  • Colin, thank you for your comments and for sharing your experience. It seems to me that PLM data models have come to the point where the next level of flexibility and expressiveness needs to be achieved. I think all the traditional approaches are either rigid or complex. On the smaller scale, OOTB models are the ultimate offering made by vendors. However, customers often abandon these models and either use free-form Excel or customize everything using their own resources or subcontracted service organizations. On the large scale, the cost of PLM data modeling is growing enormously, and in most cases it becomes one of the top factors defining the overall cost of a PLM implementation. Just my thoughts… Best, Oleg

  • Yoann, thank you for your message! With regard to your Google Wave/social comment: I'm trying to diversify the blog posts. There is growing interest, in my view, in social collaboration and networking, so these are topics very much in demand. I think the person (or role) you called the editor is fundamental in every PLM implementation. The level of complexity in implementations is growing, as is the amount of time spent on services and customization. It shows a low level of maturity in existing solutions. I think, in the coming years, we will need to find "a simpler way" to do PLM (or whatever other name we put in its place).
    Thank you for the link to your article and for linking to PLM Think Tank. I agree with your definition of the trends and with the opportunity to develop an open-source PLM option. However, the most important thing in this process will be to understand how a community around this type of development can grow and become mature. Today, that is almost not happening. Best, Oleg

  • You're right about the community.
    I spent some time on Google Trends this weekend. I love this tool. Watch these two searches:

    – comparing cloud and open source (OK, "cloud" can also be a real cloud)

    – comparing PDM and PLM

    Best regards,

  • Yoann, I love Google Trends too! However, you need to be very careful when you pull "buzzwords" and TLAs (three-letter acronyms) into this game. While it is OK for your first example, cloud vs. open source, I cannot accept the second one. I monitor the PDM and PLM keywords online, and the amount of garbage is more than 50%. Try searching for "PLM Blog" in Google: you'll be surprised to learn it is also about "Psychotic Leisure Music" :). More relevant, but still unrelated, is Product Launch Marketing (PLM). So GIGO (garbage in, garbage out) applies… BTW, the cloud has created huge hype during the last months or year, so I'm not surprised it outperformed "open source". Best, Oleg

  • Shaoping Zhou

    Hi Oleg

    This is a topic that can always be refreshed. Good post.

    I would like to share my thoughts on this:

    1) At the industry level, there is a growing consensus, or convergence of opinions, about which aspects of product and process should be modelled within the scope of PLM.

    2) At the individual PLM vendor level, a data model caters to two imperatives: a) the need to put a flexible and robust structure underneath all the features, which is what practically all IT development houses attempt to do as a matter of business; and b) the need to capture the best practices of their most important customer bases.

    3) At the individual customer level, the data model is also an essential aid for the customer in assessing and evaluating what is out-of-the-box, what needs an extension of the data model, and what needs completely new development.

  • Shaoping, Thank you for your comment!
    1. I agree, there is an attempt to converge on industry-related aspects of models within the PLM scope. However, I see a huge barrier between "what" and "how" in this discussion. Every customer wants "a standard system" that provides specific support for the specific product development practices they have in the company. So, even once "the industry" agrees on common terms, those terms cannot be applied "as is" in an implementation.
    2. The PLM vendors' trend in data modeling is to move toward "best practices" or "industry practices". It was a reaction to the "flexible toolkit development" of 10-15 years ago. However, the "best practice" trend is destroying the "flexibility" and "agility" of the systems. What could bring balance between "flexibility" and "best practice" is building systems capable of acquiring a customer's best practices without involving people in the process. But I haven't had a chance to see such systems in place.
    3. At the individual level, best practices turn into a huge cost of adjustments and changes (http://plmtwine.com/2010/03/10/plm-best-practice-torpedo/). This is what makes PLM implementations take so long and brings significant service revenues to vendors and their business partners. The problem with this model is that it is not scalable. While it works reasonably well for a large OEM, it gets stuck when applied to smaller customers - Tier 1-n suppliers and mid-size OEMs. If a customer feels the cost of PLM is beyond their budget, the engineering system wizards and IT turn their plans toward MS Excel, SharePoint and other alternatives.
    Just my thoughts… best, Oleg


  • Is the problem the data model, or is the problem how the user engages with the data model? Even though Excel is extremely flexible and therefore allows everyone to do something totally different (which makes users feel comfortable), if you look at the Excel files used across different teams, you find they are essentially the same. They all have the concept of a part, which in turn is placed in either a parts list or a structure. And in each, a part has a set of metadata the team is tracking and a set of issues (things that need to be resolved/completed to release the part).

    You can conclude that the data model used in Excel to define a product and its parts is completely common, and therefore the use of Excel has more to do with how the data model is engaged by the user.
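The common structure Chris describes - parts, a parts list or product structure, per-part metadata, and open issues blocking release - can be written down in a few lines. This is a hypothetical sketch to make that shared model explicit; the field names are illustrative, not taken from any team's spreadsheet:

```python
# The common model observed across team spreadsheets, made explicit.
# Field names are illustrative; the point is that the structure is shared
# even when every team's Excel file looks different on the surface.
from dataclasses import dataclass, field

@dataclass
class Part:
    number: str
    metadata: dict = field(default_factory=dict)   # whatever the team tracks
    issues: list = field(default_factory=list)     # blockers before release
    children: list = field(default_factory=list)   # product structure

    def ready_to_release(self):
        """A part is releasable when it and all its children have no open issues."""
        return not self.issues and all(c.ready_to_release() for c in self.children)

bolt = Part("B-1", metadata={"material": "steel"})
bracket = Part("A-9", issues=["FEA review pending"], children=[bolt])
print(bracket.ready_to_release())  # False
```

Every team's spreadsheet is some rendering of this shape; the variation is in how it is presented and edited, which supports the point that the issue is engagement with the model rather than the model itself.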

  • Chris, I see your point. The problem I'm talking about is in the data model. What we have today was good for the requirements of 10-15 years ago. Systems have become more complicated since then, and the next modeling shift is required. That doesn't mean there are no other problems, and it doesn't mean you cannot cross-compensate. I like the "Excel" example. In every company, there is a person - a "Chief Excel Officer" - responsible for organizing parts, structures and the rest of the stuff into an appropriate Excel form that can be consumed by users. The data model you mentioned is similar, but every organization makes tweaks to implement some specific requirements. Certain requirements will drive you to something more powerful than Excel. Best, Oleg



  • Java

    I think it all depends on what we plan to accomplish.
    Implementation difficulty is directly related to how well the needs of customers were anticipated and how smartly the system was designed.
    As for the data model: sure, every vendor already has their own, and some have products that can be implemented, configured and maintained at less cost than others.
    So the data model is the smarts behind a product and a company's solution.
    I do not think we can have one unique model, but we could think about having a neutral format for transfers.
    Sounds like I have heard about this before! IGES, STEP, etc., etc. The story will just repeat itself at the PLM level.

  • Java, thanks for your comments. As with many other enterprise software products, at the heart of PDM/PLM products is the ability to manage data. The capability of the data model is the smarts behind the product. It also defines the uniqueness and competitiveness of the product. So I can see a contradiction between the need for standards and competitiveness. This is a challenge PLM vendors live with all the time. Just my opinion… Best, Oleg

