A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM Network Effect and Single Point of Truths

Oleg
22 October, 2010 | 3 min for reading

A few weeks ago, I had a chance to attend a webinar – Learn How PLM Propels Innovation at Mercury Marine. The webinar is now available on demand. Navigate your browser to the following link to see a recorded version of this webinar (it requires an additional registration). This webinar provides a comprehensive study of the Siemens PLM implementation at Mercury Marine. I recommend spending the time to watch it. If you are running a PLM implementation, you can find some interesting hints about the decisions made by the Mercury Marine people and the implementation team. However, I want to focus on a specific issue covered by this implementation – Single Source of Truth. The notion of a single source of truth is a popular one in the PLM world. In the past, I wrote about it in a few of my posts: PLM and A Single Point of Disagreement, Back to basics: PLM and Single Point of Truth, and My slice of PLM Single Version of Truth. So, why did I decide to come back to this story again? I want to add a few perspectives in the context of some industry and technological trends that have become clearer over the past year.

PLM and Single Model

The idea of a single point of truth is based on a few concepts developed by the PLM industry during the last 10-15 years: a flexible data repository, a common data model, data integration, and federation. The fundamental belief behind these concepts relies on the ability to manage a central database that provides universal and scalable data storage for the product lifecycle. In addition, it assumes that company data will be integrated into this repository and that everybody in the company will have access to it. For data located in other systems, data federation can be used to connect external data sources to this repository, which assumes a central data model consolidating these parameters. All major PLM vendors are using these concepts. Implementations vary, and this is what differentiates one PLM system from another.
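To make the federation idea more concrete, here is a minimal sketch in Python. The class names, source names, and part numbers are hypothetical and do not represent any specific PLM vendor's data model or API; the point is only to show a central item consolidating attributes from federated external sources into one common view.

```python
# Hypothetical sketch of "single source of truth" via federation.
# Names are illustrative only, not a real PLM API.

class FederatedSource:
    """An external system (ERP, CAD vault, etc.) exposed through an adapter."""
    def __init__(self, name, records):
        self.name = name
        self._records = records  # keyed by part number

    def fetch(self, part_number):
        return self._records.get(part_number, {})


class CentralItem:
    """An item in the central repository; merges local and federated data."""
    def __init__(self, part_number, local_attributes, sources):
        self.part_number = part_number
        self.local = local_attributes
        self.sources = sources

    def consolidated_view(self):
        view = dict(self.local)
        for source in self.sources:
            # Attributes stay in the external system; only the mapping lives here.
            view.update(source.fetch(self.part_number))
        return view


erp = FederatedSource("ERP", {"P-100": {"cost": 42.5, "supplier": "ACME"}})
cad = FederatedSource("CAD", {"P-100": {"revision": "B", "mass_kg": 1.2}})

item = CentralItem("P-100", {"description": "Gear housing"}, [erp, cad])
print(item.consolidated_view())
```

The cost noted later in the post shows up exactly here: every external source needs an adapter and a mapping into the central model, and every change request touches that shared model.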

In the case of Mercury Marine, the following picture presented a clear view of how the product lifecycle is integrated around a central data model and data representation in TeamCenter.

This model represents a stable way to implement the data lifecycle with the ability to control global changes, access, and process orchestration. However, this architecture has a few potential downsides: (1) it requires the definition of a single model and organizational agreement about this model; (2) it adds the cost of integration; and (3) it is highly sensitive to change requests.

Networks vs. Databases

In my view, we have seen significant growth in network-based architectures over the last decade. The boost in networks was caused by the development of Internet technologies, the wide adoption of web resources, and mobile expansion. If you walk into any organizational IT department today, you can easily find people who understand RDBMS. However, try to find people who understand networks. Not so easy… Remember client-server technologies 10-15 years ago? Everybody understood mainframes… Today, the same thing is happening with networks. Try to find people who understand networks of data, RSS, mobile data connectors, on-demand replication, and complex network data architectures. In my view, network organization will become dominant in the next 10 years. Significant growth in data and the high demand to lower the cost of data management solutions will require finding a new, reliable way to manage the product data lifecycle. Think about the following “dream architecture”:

Yes, it lacks details for the moment. However, the network effect can change the current product data lifecycle, introducing new ways to keep data consistent and providing an alternative approach to data organization.
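As a purely illustrative sketch of such a network-oriented organization (again in Python, with hypothetical node and link names, not a reference to any existing product), each system could keep its own data and expose it as a linked node, with the product view assembled by traversing links instead of consolidating everything into one central database:

```python
# Hypothetical sketch of a "network of data" organization.
# Each node owns its data and links to related nodes by identifier;
# a product view is assembled by traversing links, not by copying
# everything into one central database. Names are illustrative only.

class DataNode:
    def __init__(self, uri, data):
        self.uri = uri
        self.data = data
        self.links = []  # URIs of related nodes

    def link_to(self, other_uri):
        self.links.append(other_uri)


def assemble_view(start_uri, network, visited=None):
    """Walk the network of linked nodes and collect their data."""
    visited = visited if visited is not None else set()
    if start_uri in visited or start_uri not in network:
        return {}
    visited.add(start_uri)
    node = network[start_uri]
    view = {start_uri: node.data}
    for uri in node.links:
        view.update(assemble_view(uri, network, visited))
    return view


design = DataNode("cad/p-100", {"revision": "B"})
sourcing = DataNode("erp/p-100", {"cost": 42.5})
design.link_to("erp/p-100")

network = {n.uri: n for n in (design, sourcing)}
print(assemble_view("cad/p-100", network))
```

The trade-off moves from agreeing on one shared model up front to keeping the links and the traversal consistent, which is exactly the kind of skill I argued above is still rare in organizational IT.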

What is my conclusion? The biggest question enterprise IT will need to answer is the cost of IT servers. Data management and the data lifecycle are among the biggest consumers of IT resources in an organization. Central PLM databases and consolidated data storage may prove too expensive for the 2010s. Organizations need to learn from successful PLM implementations and understand how to make them more efficient in the future. Just my thoughts…

Best, Oleg
