
PLM Network Effect and Single Point of Truths

Oleg
22 October, 2010 | 3 min for reading

A few weeks ago, I had a chance to attend a webinar – Learn How PLM Propels Innovation at Mercury Marine. The webinar is now available on demand. Navigate your browser to the following link and you can see a recorded version of this webinar (it requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending the time to watch it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation – Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement, Back to basics: PLM and Single Point of Truth, and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives on it in the context of some industry and technological trends that became clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts developed by the PLM industry over the last 10-15 years: a flexible data repository, a common data model, data integration, and data federation. The fundamental belief behind these concepts relies on the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and that all people in the company will have access to this data. When data is located in other systems, data federation can be used to connect those external data sources to the repository, which assumes a central data model consolidating their parameters. All major PLM vendors use these concepts. Implementations vary, and that is what differentiates one PLM system from another.
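To make the pattern concrete, here is a minimal sketch of a central repository combined with data federation. It illustrates the concepts above, not any particular vendor's implementation; the class, the system names (erp, cad), and all attributes are hypothetical.

```python
# A minimal sketch of the "single source of truth" pattern: a central
# repository holds the master record, and federation connectors map
# attributes from external systems into one consolidated view.
# All names here are hypothetical illustrations, not a real PLM API.

from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class CentralRepository:
    """Central PLM database: one master record per part number."""
    records: Dict[str, dict] = field(default_factory=dict)
    # Federation: external source name -> function fetching extra attributes
    connectors: Dict[str, Callable[[str], dict]] = field(default_factory=dict)

    def register_connector(self, source: str, fetch: Callable[[str], dict]) -> None:
        self.connectors[source] = fetch

    def single_point_of_truth(self, part_number: str) -> dict:
        """Consolidate the master record with federated attributes."""
        view = dict(self.records.get(part_number, {}))
        for source, fetch in self.connectors.items():
            view.update({f"{source}.{k}": v for k, v in fetch(part_number).items()})
        return view

repo = CentralRepository(records={"P-100": {"description": "Gearcase", "rev": "B"}})
# Hypothetical external systems, federated rather than copied into the repository
repo.register_connector("erp", lambda pn: {"cost": 412.50, "supplier": "ACME"})
repo.register_connector("cad", lambda pn: {"file": f"{pn}.asm", "mass_kg": 18.2})

print(repo.single_point_of_truth("P-100"))
```

Note the downside baked into the sketch: every connector has to agree with the central model's naming and semantics, which is exactly the single-model agreement and integration cost discussed below.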

In the case of Mercury Marine, the following picture presented a clear view of how the product lifecycle is integrated around a central data model and data representation in Teamcenter.

This model represents a stable way to implement the data lifecycle, with the ability to control global changes, access, and process orchestration. However, this architecture has a few potential downsides: 1) it requires the definition of a single model and agreement about this model across the organization; 2) it carries the additional cost of integration; and 3) it is highly sensitive to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. The boost of networks was caused by the development of Internet technologies, the wide adoption of web resources, and mobile expansion. If you come today to any organization's IT department, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… Today, the same is happening with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, complex network data architectures. In my view, network organization will become dominant in the next 10 years. Significant growth in data and high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following “dream architecture”:

Yes, it lacks details for the moment. However, the network effect can change the current product data lifecycle toward new ways of keeping data consistent and will provide an alternative approach to data organization.
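As a rough sketch of what such a network organization might look like (my illustration, not the “dream architecture” itself; node addresses and fields are hypothetical), product data can be represented as linked records owned by different systems, with consistency checked by traversing links rather than enforced by one central database:

```python
# A rough sketch of a network organization of product data: each system
# keeps its own records and publishes links to related records in other
# systems, instead of consolidating everything into one central database.
# Node names and fields are hypothetical.

from typing import Dict, Iterator, Set, Tuple

# Each node is addressed as "system/record-id" and holds local data plus links.
network: Dict[str, dict] = {
    "cad/P-100": {"rev": "B", "links": ["plm/P-100"]},
    "plm/P-100": {"rev": "B", "links": ["cad/P-100", "erp/P-100"]},
    "erp/P-100": {"rev": "A", "links": ["plm/P-100"]},  # stale revision
}

def traverse(start: str) -> Iterator[Tuple[str, dict]]:
    """Walk the link graph from a starting node (breadth-first)."""
    seen: Set[str] = set()
    queue = [start]
    while queue:
        node_id = queue.pop(0)
        if node_id in seen:
            continue
        seen.add(node_id)
        node = network[node_id]
        yield node_id, node
        queue.extend(node["links"])

def compare_across_network(start: str, attribute: str) -> Dict[str, object]:
    """Consistency is checked across the network, not enforced by one database."""
    return {node_id: node[attribute] for node_id, node in traverse(start)}

print(compare_across_network("plm/P-100", "rev"))
# {'plm/P-100': 'B', 'cad/P-100': 'B', 'erp/P-100': 'A'}  -> ERP is out of date
```

The design trade-off is visible even in this toy: no single model has to be agreed upon upfront, but keeping the network consistent becomes an explicit, ongoing activity.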

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT servers. Data management and the data lifecycle are among the strongest data consumers in the organization. Central PLM databases and consolidated data storage can be too expensive for the 2010s. Organizations need to learn from successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg

