
PLM Network Effect and Single Point of Truths

Oleg
22 October, 2010 | 3 min for reading

A few weeks ago, I had a chance to attend a webinar, Learn How PLM Propels Innovation at Mercury Marine. The webinar is now available on demand. Navigate your browser to the following link and you can see a recorded version of this webinar (it requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending the time to watch it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation: the Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement, Back to basics: PLM and Single Point of Truth, and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives in the context of some industry and technological trends that became clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts developed by the PLM industry during the last 10-15 years: a flexible data repository, a common data model, data integration, and federation. The fundamental belief behind these concepts relies on the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and that all people in the company will have access to it. In the case of data located in other systems, data federation can be used to connect external data sources to this repository, which assumes a central data model consolidating these parameters. All major PLM vendors use these concepts. Implementations vary, and this is what differentiates one PLM system from another.
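To make the pattern concrete, here is a minimal sketch in Python of a central repository with a common data model and a federation adapter pulling in an external source. All class and field names are my own illustration, not the API of TeamCenter or any other PLM system.

```python
# A sketch of the "single source of truth" pattern: one central repository
# with a common data model, plus federation adapters that map records from
# external systems into it. Hypothetical names, for illustration only.
from dataclasses import dataclass, field


@dataclass
class Item:
    """Common data model: every federated source maps into this shape."""
    item_id: str
    revision: str
    source: str                    # which system the record came from
    attributes: dict = field(default_factory=dict)


class CentralRepository:
    """The single, flexible data store that everyone reads from."""

    def __init__(self):
        self._items: dict[str, Item] = {}
        self._sources = []

    def register_source(self, source) -> None:
        # A federated source only has to yield records in the common model.
        self._sources.append(source)

    def sync(self) -> None:
        # Federation: pull external records into the central model.
        for source in self._sources:
            for item in source.fetch_items():
                self._items[item.item_id] = item

    def get(self, item_id: str) -> Item:
        return self._items[item_id]


class ErpConnector:
    """Hypothetical adapter mapping ERP rows into the common Item model."""

    def fetch_items(self):
        rows = [{"part": "P-100", "rev": "B", "cost": 42.0}]  # stand-in data
        for row in rows:
            yield Item(item_id=row["part"], revision=row["rev"],
                       source="ERP", attributes={"cost": row["cost"]})


repo = CentralRepository()
repo.register_source(ErpConnector())
repo.sync()
print(repo.get("P-100"))   # one place to ask, whatever system owns the data
```

The point of the sketch is the design choice: every consumer queries one place, and external systems participate only through adapters that map their records into the common model.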

In the case of Mercury Marine, the following picture presented a clear view of how the product lifecycle is integrated around a central data model and data representation in TeamCenter.

This model represents a stable way to implement the data lifecycle with the ability to control global changes, access, and process orchestration. However, this architecture has a few potential downsides: (1) it requires the definition of a single model and agreement about this model across the organization; (2) it adds the cost of integration; and (3) it is highly sensitive to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. The boost in networks was caused by the development of Internet technologies, the wide adoption of web resources, and mobile expansion. If you come to any organization's IT department today, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… Today, the same is happening with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, and complex network data architectures. In my view, network-based organization will become dominant in the next 10 years. Significant growth in data and high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following "dream architecture":

Yes, it lacks details for the moment. However, the network effect can change the current product data lifecycle toward new ways of keeping data consistent and will provide an alternative approach to data organization.
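For contrast, here is a hedged sketch of what such a network-based alternative might look like: data stays in the system that produced it, every record is addressable by a URI, and related records are connected by links that can be traversed on demand. All names and URIs below are hypothetical; this is a thought experiment, not any vendor's architecture.

```python
# A sketch of "networks vs. databases": instead of one central schema,
# each system keeps its own records and publishes links to related records
# elsewhere, forming a graph that is traversed on demand.
class DataNode:
    """A record that stays in the system that produced it."""

    def __init__(self, uri: str, payload: dict):
        self.uri = uri                # globally addressable, like a web resource
        self.payload = payload
        self.links: list[str] = []    # edges to related records elsewhere


class DataNetwork:
    """A graph of data nodes; no consolidated central database required."""

    def __init__(self):
        self.nodes: dict[str, DataNode] = {}

    def add(self, node: DataNode) -> None:
        self.nodes[node.uri] = node

    def link(self, src: str, dst: str) -> None:
        self.nodes[src].links.append(dst)

    def traverse(self, start: str, depth: int = 2):
        """Follow links on demand instead of consolidating data up front."""
        seen, frontier = set(), [start]
        for _ in range(depth):
            next_frontier = []
            for uri in frontier:
                if uri in seen or uri not in self.nodes:
                    continue
                seen.add(uri)
                yield self.nodes[uri]
                next_frontier.extend(self.nodes[uri].links)
            frontier = next_frontier


net = DataNetwork()
net.add(DataNode("plm://part/P-100", {"rev": "B"}))
net.add(DataNode("erp://cost/P-100", {"cost": 42.0}))
net.link("plm://part/P-100", "erp://cost/P-100")

for node in net.traverse("plm://part/P-100"):
    print(node.uri, node.payload)
```

In such a network, consistency would come from traversing or syncing links on demand rather than from consolidating everything into one schema up front; that is the trade-off the "dream architecture" hints at.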

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT servers. Data management and the data lifecycle are among the strongest data consumers in the organization. Central PLM databases and consolidated data storage can be too expensive for the 2010s. Organizations need to learn about successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg
