A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Network Effect and Single Point of Truths

Oleg
22 October, 2010 | 3 min read

A few weeks ago, I had a chance to attend a webinar – Learn How PLM Propels Innovation at Mercury Marine. The webinar is now available on demand. Navigate your browser to the following link to see a recorded version of this webinar (it requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending the time to watch it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation – Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement, Back to basics: PLM and Single Point of Truth, and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives on it in the context of some industry and technological trends that have become clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts developed by the PLM industry during the last 10-15 years: a flexible data repository, a common data model, and data integration and federation. The fundamental belief behind these concepts is the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and that everybody in the company will have access to this data. For data located in other systems, data federation can be used to connect external data sources to this repository, which assumes a central data model consolidating their parameters. All major PLM vendors use these concepts; implementations vary, and that is what differentiates one PLM system from another.
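To make the pattern concrete, here is a minimal sketch in Python. It is my own illustration, not any vendor's API; the names (Item, SingleSourceRepository, erp_fetch) are hypothetical. It shows one common data model, a central repository, and federated adapters that map external records onto that model on demand.

```python
# Sketch of the "single source of truth" pattern: one common data model,
# with external systems federated in through adapters.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Optional


@dataclass
class Item:
    """One record in the common data model every system must agree on."""
    item_id: str
    revision: str
    attributes: Dict[str, str] = field(default_factory=dict)


class SingleSourceRepository:
    """Central repository: product data lives here or is federated in."""

    def __init__(self) -> None:
        self._items: Dict[str, Item] = {}
        # Federated sources are callables that fetch an Item from an
        # external system (ERP, CAD vault, ...) on demand.
        self._federated: List[Callable[[str], Optional[Item]]] = []

    def store(self, item: Item) -> None:
        self._items[item.item_id] = item

    def add_federated_source(self, fetch: Callable[[str], Optional[Item]]) -> None:
        self._federated.append(fetch)

    def get(self, item_id: str) -> Optional[Item]:
        # Local data wins; otherwise ask each federated source in turn.
        if item_id in self._items:
            return self._items[item_id]
        for fetch in self._federated:
            item = fetch(item_id)
            if item is not None:
                return item
        return None


# Usage: a hypothetical ERP adapter that maps its rows onto the common model.
def erp_fetch(item_id: str) -> Optional[Item]:
    erp_rows = {"M-100": {"rev": "B", "cost": "12.40"}}
    row = erp_rows.get(item_id)
    if row is None:
        return None
    return Item(item_id, row["rev"], {"cost": row["cost"]})


repo = SingleSourceRepository()
repo.store(Item("A-200", "A", {"description": "Propeller shaft"}))
repo.add_federated_source(erp_fetch)
print(repo.get("M-100"))  # served by the federated ERP source
```

The sketch also hints at the costs discussed below: every adapter has to agree on the common model up front, and every change to that model ripples through all of them.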

In the case of Mercury Marine, the following picture presents a clear view of how the product lifecycle is integrated around a central data model and data representation in TeamCenter.

This model represents a stable way to implement the data lifecycle, with the ability to control global changes, access, and process orchestration. However, this architecture has a few potential downsides: (1) it requires the definition of a single model and organizational agreement about that model; (2) it adds integration cost; and (3) it is highly sensitive to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. The boost in networks was driven by the development of Internet technologies, the wide adoption of web resources, and mobile expansion. If you walk into any organizational IT department today, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… The same is happening today with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, and complex network data architectures. In my view, network organization will become dominant in the next 10 years. Significant growth in data and high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following “dream architecture”:

Yes, it lacks details for the moment. However, the network effect can move the current product data lifecycle towards new ways of keeping data consistent, and it will provide an alternative approach to data organization.
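To contrast with the central repository above, here is an equally minimal Python sketch, again my own illustration with hypothetical node and record names, of product data organized as a network of linked peers: each node owns some records and links to other nodes, and a lookup traverses the network instead of querying one central database.

```python
# Sketch of the "network" alternative: data lives on many peer nodes,
# each owning some records and linking to others.

from typing import Dict, List, Optional, Set


class DataNode:
    def __init__(self, name: str) -> None:
        self.name = name
        self.records: Dict[str, dict] = {}   # data this node owns
        self.peers: List["DataNode"] = []    # links into the network

    def find(self, key: str, visited: Optional[Set[str]] = None) -> Optional[dict]:
        """Depth-first lookup across the peer network, avoiding cycles."""
        visited = visited or set()
        if self.name in visited:
            return None
        visited.add(self.name)
        if key in self.records:
            return self.records[key]
        for peer in self.peers:
            found = peer.find(key, visited)
            if found is not None:
                return found
        return None


# Three departments keep their own data and simply link to each other.
design = DataNode("design")
manufacturing = DataNode("manufacturing")
service = DataNode("service")
design.peers = [manufacturing]
manufacturing.peers = [design, service]
service.peers = [manufacturing]

design.records["A-200"] = {"rev": "A", "owner": "design"}
service.records["A-200-manual"] = {"rev": "1", "owner": "service"}

# Any node can resolve data owned elsewhere by following the links.
print(manufacturing.find("A-200-manual"))
```

In this organization there is no single schema everybody must agree on up front; each node owns its data, and consistency becomes a matter of links and replication between peers, which is what makes it a plausible answer to the downsides listed above.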

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT servers. Data management and the data lifecycle are among the strongest resource consumers in the organization. Central PLM databases and consolidated data storage can be too expensive for the 2010s. Organizations need to learn from successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg
