Process vs. Data. I don't think this topic requires a special introduction. In my view, every PLM implementation faces this discussion and requires a decision about how to proceed. A few conversations with customers during DSCC 2011 last week, and some articles I read on the long flight from Boston to Europe over the weekend, made me think again about this process vs. data controversy, and I wanted to share my thoughts with you.
I was reading the Capgemini blog post Business process management and mastering data in enterprise by Nicholas Kitson. Nicholas discusses interesting aspects of failed Business Process Management (BPM) implementations he experienced with customers. At the beginning of the article, Nicholas quotes Gartner analyst Michael Blechar: “A failure to address service-oriented data redesign at the same time as process redesign is a recipe for disaster.”
I find this notion of a “recipe for disaster” very important. In many people’s minds, the PLM system itself was a recipe for disaster. Even today, after the value of PLM has been confirmed by many organizations and implementations, many people still question how to approach PLM in the right way. To continue with the Capgemini article, I found the following passage very interesting:
While BPM tools have the infrastructure to hold a data model and integrate with multiple core systems, the process of mastering the data can become complex and, as the program expands across ever more systems, the challenges can become unmanageable. In my view, BPMS solutions, with a few exceptions, are not the right place to be managing core data. At the enterprise level, MDM solutions are far more elegant solutions designed specifically for this purpose.
I found an interesting connection between this statement and the presentation made by Bell Helicopter during the Dassault Customer Conference last week in Las Vegas. Bell Helicopter embarked on the journey of implementing Dassault’s newest V6 platform, and I was impressed by the presentation they made. The following slide shows that one of the biggest problems in Bell’s organization back in 2005 was a significant need to modernize processes. They found that processes were too fragmented and that 467 legacy systems created significant data and enterprise complexity.
The critical strategic decision Bell made was to implement PLM first. Part of this strategy was the so-called “get the core [product] data right first” approach.
PLM – Focus on Process
As the industry focus moved from PDM to PLM over the past 5-10 years, the question of what the focus of a PLM implementation should be emerged as something important. Until that time, most companies understood the value of PDM. Despite the complexity of PDM implementations, the value of being able to vault CAD data and manage changes was mostly not disputable.
At the same time, I cannot say the same about the management of product development processes. Take Item/BOM and Change Management as an example. Many PLM systems were “pushed” to manage BOMs and changes. In practice, however, this created many problems. Bill of Material data (especially if you think beyond the BOM from your CAD drawing) is normally spread across multiple systems: PDM/PLM, ERP, Supply Chain Management. An ECO is a process that clearly crosses multiple departments and data islands in an organization.
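To make the problem concrete, here is a minimal sketch of what an ECO runs into when the same part lives in several systems. All system names, field names, and values are hypothetical, invented purely for illustration; real PDM/ERP schemas are far richer.

```python
# Hypothetical illustration: the same part record as it might look when
# extracted from a PDM system and from ERP. All fields and values are made up.
pdm_record = {"part_number": "A-1001", "revision": "C", "description": "Bracket, steel"}
erp_record = {"part_number": "A-1001", "revision": "B", "description": "Bracket, steel", "cost": 4.25}

def find_conflicts(a: dict, b: dict) -> dict:
    """Return fields present in both records whose values disagree."""
    return {k: (a[k], b[k]) for k in a.keys() & b.keys() if a[k] != b[k]}

# The ECO has to reconcile exactly these discrepancies before it can proceed.
print(find_conflicts(pdm_record, erp_record))  # {'revision': ('C', 'B')}
```

Every field returned by a comparison like this is a decision an ECO process must make across departmental boundaries, which is why a process-first implementation stalls when the underlying data is not mastered.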
So, PLM systems were pushed to “focus on processes”, and this push was very problematic. Sales and marketing focused on promoting the value of PLM to companies. In practice, many organizations faced a significant level of complexity when trying to implement, for example, a change management process across the entire organization. Why so?
PLM: How to streamline data access
In my view, every manufacturing organization experiences data complexity. Organizations are overwhelmed with data. According to some industry research, data volumes in organizations will grow 44x over the next 10 years. The question of managing data has long been in the spotlight of all PLM implementations. Very often, this question is presented as “who owns the Part, BOM, etc.?”. The same question, asked in a more intelligent way, sounds like “who masters the Part, BOM, etc. information?”. The hidden question I hear is the need to streamline data access related to these processes. This is a vital part of every PLM implementation.
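The “who masters what” question can be sketched as a simple registry: before any cross-departmental process is implemented, each entity type gets exactly one system of record. The system names and entity types below are illustrative assumptions, not a real PLM API or any specific vendor’s data model.

```python
# Hypothetical "system of record" registry: one mastering system per entity
# type. Names are illustrative only - each organization fills in its own.
SYSTEM_OF_RECORD = {
    "CAD document": "PDM",
    "Part": "PLM",
    "Engineering BOM": "PLM",
    "Manufacturing BOM": "ERP",
    "Supplier": "ERP",
}

def master_of(entity: str) -> str:
    """Answer 'who masters this information?' for a given entity type."""
    try:
        return SYSTEM_OF_RECORD[entity]
    except KeyError:
        # An entity with no defined master is exactly the gap that makes
        # cross-departmental process implementations stall.
        raise KeyError(f"No system of record defined for {entity!r}")

print(master_of("Engineering BOM"))  # PLM
```

The point of the sketch is not the code but the decision it encodes: a process can only be streamlined once every piece of data it touches has an unambiguous owner.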
The latest trend in this space is “unification”. PLM vendors are trying to push everybody toward a so-called “unified PLM platform” that consolidates all data in a single place. For PLM vendors like Dassault, PTC, and Siemens, it is “everything except ERP”. For ERP-based PLM providers, it gives an even stronger argument for why a PLM-ERP bundle may have an advantage.
The question of how to streamline access to data in the organization, before embarking on the journey of process improvement, is the key question every manufacturing company needs to ask. Without that, most “process improvements” and implementations will be stuck forever or will turn into a nightmare.
PLM and the promise of cloud applications
Cloud is hyped these days. It is not unusual to hear that the cloud will solve the problem of complexity in existing enterprise software. Here are a few examples:
Dassault is talking about its V6 platform as a unique cloud platform (last week Bernard Charles, DS CEO, mentioned a $2B investment made in re-architecting the Dassault platform).
Another large company in the engineering domain, Autodesk, is just a week away from making a significant announcement (see more details here). I found this quote interesting: Autodesk will forever improve the way you manage your business processes and workflows when we unveil a modern, zero deployment solution that makes collaboration, data, and lifecycle management accessible to anyone, anytime, anywhere.
Another newcomer in this market, Kenesto (according to the COFES 2012 registration, Mike Payne is CEO of Kenesto), is promising to “revolutionize process automation“.
What is my conclusion? I think the failure to design data access in organizations was a recipe for disaster for many PLM implementations. PLM programs focused on “how to improve processes” and forgot to put in place a solid data foundation to support cross-departmental process implementations. So, I’d like to offer a quote from Bell Helicopter’s presentation during DSCC 2011 as something PLM vendors and customers need to remember: “get the core data right first”. Just my opinion, of course. YMMV.