A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Network Effect and Single Point of Truths

Oleg
22 October, 2010 | 3 min for reading

A few weeks ago, I had a chance to attend a webinar – Learn How PLM Propels Innovation at Mercury Marine. The webinar is now available on demand. Navigate your browser to the following link to see a recorded version of this webinar (requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending the time to watch it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation – Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement and Back to basics: PLM and Single Point of Truth and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives on it in the context of some industry and technological trends that became clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts the PLM industry developed during the last 10-15 years: a flexible data repository, a common data model, data integration and federation. The fundamental belief behind these concepts is the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and all people in the company will have access to this data. For data located in other systems, data federation can be used to connect external data sources to this repository, which still assumes a central data model consolidating these parameters. All major PLM vendors use these concepts. Implementations vary, and this is what differentiates one PLM system from another.
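To make the idea a bit more concrete, here is a minimal sketch in Python – my own illustration, not any vendor's API. The names (Item, CentralRepository, register_source, get_attribute) are hypothetical; the point is that a single repository owns the common data model and resolves federated attributes from external systems on demand.

```python
# A minimal sketch of "single source of truth" with federation (assumption, not a vendor API):
# the central repository owns the common data model and resolves external attributes on demand.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class Item:
    item_id: str
    attributes: Dict[str, str] = field(default_factory=dict)

class CentralRepository:
    """One place that answers every question about an item."""
    def __init__(self) -> None:
        self.items: Dict[str, Item] = {}
        # Federated sources map an attribute name to a lookup in an external system (ERP, CRM, ...).
        self.federated_sources: Dict[str, Callable[[str], str]] = {}

    def register_source(self, attribute: str, lookup: Callable[[str], str]) -> None:
        self.federated_sources[attribute] = lookup

    def get_attribute(self, item_id: str, attribute: str) -> str:
        item = self.items[item_id]
        if attribute in item.attributes:            # data owned by the central model
            return item.attributes[attribute]
        if attribute in self.federated_sources:     # data owned elsewhere, resolved via federation
            return self.federated_sources[attribute](item_id)
        raise KeyError(f"{attribute} is not part of the common data model")

# Usage: the repository answers for "cost" even though a (stand-in) ERP owns it.
repo = CentralRepository()
repo.items["P-100"] = Item("P-100", {"description": "Gear housing", "revision": "B"})
repo.register_source("cost", lambda item_id: "42.50")  # stand-in for an ERP call
print(repo.get_attribute("P-100", "cost"))
```

The design choice this sketch highlights is exactly the downside discussed below: everything depends on one agreed data model and on integrations feeding it.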

In the case of Mercury Marine, the picture presented in the webinar gave a clear view of how the product lifecycle is integrated around a central data model and data representation in TeamCenter.

This model represents a stable way to implement the data lifecycle with the ability to control global changes, access and process orchestration. However, this architecture has a few potential downsides: 1) it requires the definition of a single model and agreement about this model across the organization; 2) the additional cost of integration; and 3) high sensitivity to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. This boost of networks was caused by the development of Internet technologies, the wide adoption of web resources and mobile expansion. If you walk into any organizational IT department today, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… Today, the same is happening with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, complex network data architectures. In my view, network organization will become dominant in the next 10 years. The significant growth in data and the high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following “dream architecture”:

Yes, it lacks details for the moment. However, the network effect can move the current product data lifecycle towards new ways to keep data consistent and will provide an alternative approach to data organization.
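To contrast with the central-database sketch above, here is an equally minimal sketch of the "network" alternative, under my own assumptions rather than any existing product: each system keeps and publishes its own data (think RSS feeds or REST endpoints), and a consumer aggregates those feeds on demand instead of consolidating them into one repository. The Node and aggregate names are hypothetical.

```python
# A minimal sketch of a network-style alternative (assumption, not an existing product):
# no central database; each node owns its records and a consumer assembles a view on demand.
from typing import Dict, List

class Node:
    """A system (CAD, ERP, supplier portal) that owns and publishes its own records."""
    def __init__(self, name: str) -> None:
        self.name = name
        self.records: Dict[str, Dict[str, str]] = {}

    def publish(self, item_id: str) -> Dict[str, str]:
        # In practice this would be an RSS/Atom feed or a REST endpoint.
        return self.records.get(item_id, {})

def aggregate(item_id: str, nodes: List[Node]) -> Dict[str, str]:
    """Assemble a product view by walking the network, not by querying one central table."""
    view: Dict[str, str] = {}
    for node in nodes:
        for key, value in node.publish(item_id).items():
            view.setdefault(f"{node.name}.{key}", value)  # keep provenance, no forced merge
    return view

# Usage: three independent nodes, one on-demand view of part P-100.
cad, erp, supplier = Node("cad"), Node("erp"), Node("supplier")
cad.records["P-100"] = {"revision": "B"}
erp.records["P-100"] = {"cost": "42.50"}
supplier.records["P-100"] = {"lead_time_days": "14"}
print(aggregate("P-100", [cad, erp, supplier]))
```

The trade-off is visible even in this toy: consistency is no longer enforced by a single model, so keeping the aggregated view coherent becomes the new problem to solve.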

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT services. Data management and the data lifecycle are among the heaviest consumers of IT resources in an organization. Central PLM databases and consolidated data storage can be too expensive for the 2010s. Organizations need to learn from successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg
