A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Model: Granularity, Bottom-Up and Change

olegshilovitsky
3 May, 2010 | 2 min read

A few weeks ago, I had a chance to post about the PLM Data Model. I think the PLM space has a real lack of discussion about data modeling. It seems to me that PLM vendors and developers are too focused on process management, user experience and other catchy trends. At the same time, everybody forgets that the data model is the bread and butter of every PDM/PLM implementation. I want to open a debate about what I see missing in current PLM data models.

Granularity
I’m very happy this word has started to catch people’s attention. It came up in several recent discussions I had with colleagues in the CAD/PDM/PLM software domain. Chris mentioned it in his Vuuch blog (www.blog.vuuch.com). Al Dean also had a chance to talk about it on Develop3D (www.develop3d.com). One of the problems in PLM is the diversity of implementations and needs. PLM tools have added lots of functional goodies over the past decade. However, customization has become a mess. It looks to me that the data model organization in most PLM systems is outdated these days. The last revolution PDM/PLM made was about 15 years ago, when the notion of “a flexible data model” was introduced. Today, the next step needs to be taken.

Bottom-up
How do you build an efficient data model for a PLM implementation? How do you build a model that answers specific customer needs? The current vendor proposal is to make a selection from a list of all possible “modules”. It comes in the form of “best practices”. In my view, these are really “bad” practices. Selecting big data model chunks puts too many constraints on the model and creates compatibility problems. The idea of bottom-up data modeling relies on the capability to define very granular pieces of data and grow bottom up, building a model that reflects customer needs.
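To make the idea a bit more concrete, here is a minimal sketch (in Python, not any vendor’s API) of what bottom-up modeling could look like: granular attributes are the smallest pieces, and a customer-specific type is grown by composing only the attributes that are actually needed. The names used here (Attribute, ItemType, part_number and so on) are illustrative assumptions, not a real PLM schema.

```python
# A minimal sketch of bottom-up data modeling, not any vendor's data model.
# Start from very granular pieces (attributes), compose them into small types,
# and grow the model upward to match a specific customer need.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Attribute:
    """The smallest granular piece: a named, typed property."""
    name: str
    data_type: type

@dataclass
class ItemType:
    """A customer-defined type assembled from granular attributes."""
    name: str
    attributes: list[Attribute] = field(default_factory=list)

    def extend(self, *attrs: Attribute) -> "ItemType":
        # Grow the model bottom-up: add only the attributes this customer
        # needs, instead of adopting a large pre-built "best practice" module.
        return ItemType(self.name, self.attributes + list(attrs))

# Granular building blocks
part_number = Attribute("part_number", str)
revision    = Attribute("revision", str)
mass        = Attribute("mass_kg", float)

# Compose upward into a model that reflects this customer's needs
base_part = ItemType("Part", [part_number, revision])
mechanical_part = base_part.extend(mass)

print([a.name for a in mechanical_part.attributes])
# ['part_number', 'revision', 'mass_kg']
```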

Cost of Change
What is the most damaging factor in today’s PDM/PLM software? In my view, it is the cost of change. PLM models become inflexible and carry lots of dependencies on the PLM system implementation. The future, in my view, is building very granular functional services alongside a bottom-up data model schema. This will reduce dependencies between components and, in the end, lower the cost of change.
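As a hedged sketch of how granular services could pair with a bottom-up schema, the example below defines a small service that depends only on the narrow slice of data it needs, so changes elsewhere in the model do not force it to change. Again, the names (Revisable, next_revision, Part) are hypothetical and not taken from any PLM product.

```python
# A sketch of a granular functional service, assuming a bottom-up model like
# the one above. The service depends only on the narrow piece of data it
# needs (a Protocol), not on the whole PLM schema, so a change elsewhere in
# the model does not ripple into this component.
from typing import Protocol

class Revisable(Protocol):
    revision: str

def next_revision(item: Revisable) -> str:
    """Granular service: compute the next revision letter for any item that
    exposes a 'revision' field, regardless of the rest of its schema."""
    return chr(ord(item.revision) + 1)

class Part:
    def __init__(self, part_number: str, revision: str):
        self.part_number = part_number
        self.revision = revision

print(next_revision(Part("P-100", "A")))  # 'B'
```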

What is my conclusion? I think technology matters. Without thinking about technology, PLM won’t be able to make the next leapfrog. It is becoming urgent. The PLM data model is a natural starting point for improving PLM implementations.

Just my thoughts…
Best, Oleg
