Posts tagged as: Future

building-the-grid-of-data

I had a chance to visit The Art of the Brick exhibition at Boston’s Faneuil Hall a few days ago. If you follow me on social media, there is a chance you noticed a few pictures. Afterwards, I read more about LEGO artist Nathan Sawaya. What impressed me is the power of the “simple LEGO brick”. A simple plastic brick combined with a huge amount of imagination allows the creation of such incredible models.


You can ask me – how is that connected to engineering, manufacturing and product lifecycle management? Here is the thing… It made me think about the way PLM systems are implemented these days. I’m sure you are familiar with the “best practices” approach. The topic isn’t new. I found my old post – PLM best practices torpedo. After five years, I still like my conclusion – PLM best practices are good for showing what PLM technology and software are capable of doing. However, for a real implementation, they are not very useful. You have to come back to the “simple bricks” of PLM technology – data models, documents, lifecycle statuses, bills of materials, processes.

I recently came across a somewhat different perspective on PLM best practices. Navigate to the PLM cultural change blog – PLM design patterns. It took me back to thinking about best practices: how to define implementation patterns and make a real PLM implementation much easier? The article speaks about the general way a PLM implementation can be done, organizational transformation and change. Read the article. I found the following passages interesting:

In general you can setup all required supporting procedures in using the PLM design patterns. Even for specific supporting procedures of a business process pattern like Engineer to Order (ETO) you can derive patterns, which consists of a framework of general PLM design patterns and are adapted to the specific business needs. There is enough freedom to derive based on these patterns supporting procedures to fulfill specific business needs.

If some organizations would have implemented supporting procedures based on patterns already, then consultants in introducing PLM to an organization could refer to “state of the art” implementation examples of these organizations. The target is to convince an organization, that the decision for a new practice requesting organizational change is required and works. Only then the organization can enable the full potential of the PLM methodology without remaining stuck in the current practice.

Instead of inventing a Ping-Pong table “from scratch” with a cabinetmaker we can make a clear decision based on all the options available, fulfilling and probably exceeding our originally perceived needs (with a safe and easy-to-use folding mechanism). And we can afford it, because a stock table is cheaper than a custom built one.

The time saved in avoiding the endless discussions and continual redesign of processes because of paradigm paralysis, based on current methods, could be better used in a well-planned, strategic deployment of the new processes leading to an improved business solution.


The idea and vision of configurable patterns and best practices is interesting. In my view, it was invented earlier in the form of PLM toolkits, flexible data models and process templates. The key problem here is not related to technology – software does what it does. The problem is related to people and organizations. Remember, technology is simple, but people are really hard. What is called “convincing people” is actually a process that takes an organization and its people toward understanding their business and product development patterns. Without that understanding, the chances of a successful PLM implementation are very low and the probability of PLM project failure is high.

So, what could be 21st century solution for that problem?

My attention today was caught by a new startup – The Grid. The tagline states – AI websites that design themselves. The vision of The Grid is to change the paradigm of website building. The idea of self-building websites driven by artificial intelligence and data analysis is something worth thinking about. Watch the video.

Now let me get back to manufacturing companies and PLM implementations. All manufacturing organizations are different. The approach most PLM vendors take these days is to classify companies by size (small, medium, large), industry (aero, auto, industrial equipment, etc.), manufacturing model (mass production, configure to order, engineer to order, etc.) and many other attributes such as locations, supply chain and existing enterprise systems (ERP, SCM, etc.). The decision matrix is huge. Analyzing an existing manufacturing company, its processes, existing systems and requirements is what takes time and money during a PLM implementation.
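To make the size of that decision matrix concrete, here is a minimal sketch that simply enumerates the combinations of classification dimensions; the dimension values are illustrative examples of mine, not a complete taxonomy used by any vendor.

```python
# A rough illustration of how quickly the PLM "decision matrix" grows when
# companies are classified along several independent dimensions.
# The dimension values below are examples only, not a complete taxonomy.
from itertools import product

dimensions = {
    "size": ["small", "medium", "large"],
    "industry": ["aerospace", "automotive", "industrial equipment", "consumer"],
    "manufacturing_model": ["mass production", "configure to order", "engineer to order"],
    "existing_systems": ["ERP only", "ERP + SCM", "none"],
}

combinations = list(product(*dimensions.values()))
print(f"{len(combinations)} distinct company profiles")  # 3 * 4 * 3 * 3 = 108

# Each profile would, in theory, need its own implementation template.
example = dict(zip(dimensions.keys(), combinations[0]))
print(example)
```

Even with only four coarse dimensions, more than a hundred distinct company profiles appear, which is why per-company analysis dominates implementation cost.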

What is my conclusion? The opportunity we have today comes from a new way to process data. Call it cloud computing, big data, whatever. Facebook reports the capability to index a trillion posts. Would it be possible to capture data from an existing manufacturing company and ask a PLM system to build itself? Is it a dream or the future of PLM? Just my thoughts…

Best, Oleg

Picture credits: The Grid website and PLM cultural change blog


The foundation for next PLM platforms

by Oleg on August 29, 2014


Platform. This is a sweet word in the lexicon of every developer. The desire of software vendors is to become a platform that fuels the development of other products and serves the needs of customers. In my debates with Chad Jackson about granularity and integration earlier this month, I outlined what, in my view, differentiates tools, bundles and platforms. That discussion made me think even more about what PLM platforms are made of today. In my view, there are two major foundations for most PLM systems and tools developed today: 1) the 2D/3D design platform and 2) the object database modeling abstraction. Let me speak in more detail about each of these foundations.

2D/3D design platform

The geometric paradigm has provided a strong foundation for design and engineering since the early days of CAD/PLM. CAD systems are therefore deep in the roots of today’s PLM vendors. Historically, all major PLM vendors developed their software and businesses from CAD and related engineering applications. As a result, 2D/3D geometry, design, modeling and related information is the foundation of their products. Geometry modeling combined with PDM (product data management) created the core foundation of these platforms.

Object Database Modeling

The object data modeling paradigm is used by many CAD-agnostic PLM vendors. Many of these vendors started as PDM companies and expanded to support product development processes. Therefore, a flexible data management approach became the main foundation layer for these products. Most of these systems were developed on top of relational databases (RDBMS). The flexibility of these platforms to manage any product information and related processes is a key strength.
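As an illustration of that flexibility, here is a minimal sketch of the generic object/attribute/relationship style of schema that such systems are often built on top of an RDBMS; the table, item and attribute names are hypothetical and not taken from any particular PLM product.

```python
# A minimal sketch of the "object database modeling" approach: object types,
# attributes and relationships are stored as generic rows (an
# entity-attribute-value style schema), so new item types (Part, Document,
# ECO, ...) can be defined without schema changes.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE item (id INTEGER PRIMARY KEY, item_type TEXT, name TEXT);
CREATE TABLE item_attribute (item_id INTEGER, attr_name TEXT, attr_value TEXT);
CREATE TABLE item_link (parent_id INTEGER, child_id INTEGER, link_type TEXT);
""")

def create_item(item_type, name, **attrs):
    """Create a generic item of any type with arbitrary attributes."""
    cur = conn.execute("INSERT INTO item (item_type, name) VALUES (?, ?)",
                       (item_type, name))
    item_id = cur.lastrowid
    conn.executemany(
        "INSERT INTO item_attribute (item_id, attr_name, attr_value) VALUES (?, ?, ?)",
        [(item_id, k, str(v)) for k, v in attrs.items()])
    return item_id

# Two different item types managed by the same generic schema.
part = create_item("Part", "Bracket-100", material="Aluminum", revision="A")
doc = create_item("Document", "Bracket-100 Drawing", format="PDF")
conn.execute("INSERT INTO item_link VALUES (?, ?, ?)", (part, doc, "described_by"))

for row in conn.execute(
        "SELECT attr_name, attr_value FROM item_attribute WHERE item_id = ?", (part,)):
    print(row)
```

The strength of this style is that new item types and attributes are ordinary data rather than schema changes; the usual trade-off is that meaning and integrity rules move into the application layer.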

Next PLM platform

What do you think will happen in the future of PLM platforms? Are we going to see new elements and technologies fueling future PLM development? In my view, the last decade of innovation in open source, data management, web and cloud technologies created a new foundation for future PLM platforms. At the same time, the maturity of product lifecycle management implementations can provide a better understanding of the functional architecture of PLM products. It made me think about what can become the foundation of future PLM platform development. Below, I list my four candidates to play the role of the next PLM platform foundation.

1. MBSE (Model-Based Systems Engineering)

As products are getting more and more complex, an approach that helps us support product development becomes more visible and important. A product goes far beyond 3D mechanical design and contains information about system architecture, requirements, and the functional decomposition of mechanical, electronic and software elements. From that standpoint, MBSE is a good foundation on which to create a platform, and I hear many voices these days about the future of MBSE approaches.
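To make the idea concrete, here is a small sketch of a product represented as a system model rather than geometry alone, with requirements traced to functions and to implementing components; the class and field names are my own illustration, not the notation of any MBSE standard.

```python
# A minimal sketch of the model-based idea: the product is a system model
# linking requirements to functions and to implementing components
# (mechanical, electronic, software), rather than 3D geometry alone.
from dataclasses import dataclass, field

@dataclass
class Requirement:
    rid: str
    text: str

@dataclass
class Component:
    name: str
    domain: str  # "mechanical", "electronic" or "software"

@dataclass
class Function:
    name: str
    satisfies: list = field(default_factory=list)      # list of Requirement
    allocated_to: list = field(default_factory=list)   # list of Component

req = Requirement("R-001", "The door shall unlock within 200 ms of a valid command")
unlock = Function("Unlock door",
                  satisfies=[req],
                  allocated_to=[Component("Lock actuator", "mechanical"),
                                Component("Door controller firmware", "software")])

# Simple traceability query: which domains are involved in satisfying R-001?
print({c.domain for c in unlock.allocated_to})
```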

2. Unbundled 3D service

3D was born as part of CAD design. Engineers need to use a 3D CAD system to create the actual product. However, there are many people in the manufacturing ecosystem who just need to consume 3D data, or information in the context of 3D data. Think about a 3D service unbundled from the CAD system, providing the ability to visualize and re-use 3D information and combine it with other non-3D information. In my view, such an approach can create a good foundation for future PLM platforms. I can see PLM vendors adopting some elements of this approach today.
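Here is a rough sketch of what consuming such an unbundled 3D service could look like for a non-CAD user; the services, their methods and the data are invented for illustration only.

```python
# A hypothetical sketch of an "unbundled" 3D service: a small interface that
# lets non-CAD users fetch lightweight viewing geometry and combine it with
# non-3D data (status, cost), without opening the authoring CAD system.
class Viewing3DService:
    def __init__(self):
        # In a real system this would call a server that stores tessellated
        # (lightweight) representations exported from CAD.
        self._meshes = {"PRT-100": {"triangles": 12500, "format": "mesh"}}

    def get_viewable(self, part_number):
        """Return lightweight geometry for viewing, not the native CAD model."""
        return self._meshes[part_number]

class PartInfoService:
    def get_attributes(self, part_number):
        return {"status": "Released", "cost": 42.50}

# A downstream consumer (e.g. a procurement planner) mixes 3D and non-3D data
# without ever launching a CAD application.
viewer, info = Viewing3DService(), PartInfoService()
card = {"part": "PRT-100",
        "geometry": viewer.get_viewable("PRT-100"),
        **info.get_attributes("PRT-100")}
print(card)
```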

3. Product Development Standards

The level of dependencies in a modern manufacturing ecosystem is huge. You can hardly find a manufacturing company solely responsible for the development of its products. Companies rely on development partners and hundreds of suppliers. Therefore, standards are getting more and more important. Some product development and vertical industry standards can provide a functional foundation for future PLM platforms too.

4. Database technologies, big data and web infrastructure

Data technology is a key element of any PLM system. We need to be able to manage a diverse set of information about a product – visual, structured and unstructured. Functional requirements range from the ability to create and maintain the information to the ability to analyze and re-use it in a very scalable way. The modern data management software stack can become a foundation for future PLM platforms.

What is my conclusion? Product and technological development go together. New platforms can arise as a result of the maturity of products and of technological innovation. I see these four sources as core elements for platform innovation. This is, of course, not an exhaustive list, and I can see a potential mix of these approaches as well. These are just my thoughts and I’m looking forward to your comments.

Best, Oleg


plm-componentizing

Product Lifecycle Management is not software. It is a business strategy and approach. One of my blog readers mentioned that in a discussion a few days ago. Nevertheless, manufacturing companies usually talk about PLM systems and platforms as something solid and unbreakable. You can see the same picture when looking at PLM online marketing materials and brochures. Despite recent changes in broad PLM acceptance and value proposition, companies still see PLM as software mostly for the engineering domain, driven by engineering IT. One of the dreams many PLM vendors have developed over the last decade is how to reach C-level management such as the CIO and engineering executives. In other words, how to reach the ERP level of acceptance and awareness.

Earlier today, my attention was caught by a Toolbox.com article about modern ERP trends. Navigate to read ERP Trends: Shifting from Big ERP Systems to Componentized ERP Environments. Cloud is changing the face of ERP. The technology is breaking ERP into pieces. One of the results is the two-tier ERP configuration. Here is the explanation I captured from the article:

Because of the coinciding innovations in cloud technology, instead of deploying and implementing traditional ERP infrastructure, organizations started adopting a two-tier, or hybrid, ERP model. Two-tier ERP is a method of integrating multiple ERP systems simultaneously. For instance, an organization may run a legacy ERP system at the corporate level while running a separate ERP system or systems, such as cloud ERP, at a subsidiary or division level for back-office processes that have different requirements. To facilitate the adoption of the two-tier methodology, vendors increasingly opened core databases and application programming interfaces and provided customization tools, thus spurring the advent of self-contained, functional ERP components or modules.

So, what does it mean for existing and future PLM strategies and products? More specifically, it made me think about the possibility of breaking large and heavy PLM platforms into sets of re-usable components. The ERP componentizing example speaks about splitting an ERP system into modules such as supply chain, financials, management and human resources. So what could a potential PLM split look like? I can see two possible ways here – business process and lifecycle. The first one is something we can probably see a lot in existing PLM platforms: requirements management, design collaboration, change management, NPI, etc. I’ve been thinking about lifecycle as an alternative to the traditional business-process-oriented approach. The lifecycle approach means developing applications that serve people in their everyday tasks based on the maturity of the product in development or in service. Think about a manufacturing assembly line. Different tools and operations are applied to the manufactured product to bring it to life. Now think about PLM and software tools. PLM components would be used to create the product (actually, product data and related information).
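Here is a minimal sketch of how such lifecycle-oriented componentization could look, with small components registered against product maturity stages; the stage and component names are purely illustrative, not a proposal for any specific product.

```python
# A minimal sketch of the "lifecycle" way to componentize PLM: instead of one
# monolithic suite, small components are registered against the maturity
# stages of the product, and each stage only loads the tools it needs.
LIFECYCLE_COMPONENTS = {
    "concept":    ["requirements capture", "early BOM"],
    "design":     ["CAD data management", "design collaboration"],
    "prototype":  ["change management", "supplier exchange"],
    "production": ["manufacturing BOM", "quality tracking"],
    "service":    ["maintenance records", "spare parts catalog"],
}

def components_for(stage):
    """Return the PLM components a team needs at a given product maturity stage."""
    return LIFECYCLE_COMPONENTS.get(stage, [])

# A product moving along its lifecycle picks up different tools, the way a
# part moving along an assembly line meets different stations.
for stage in ["concept", "design", "production"]:
    print(stage, "->", components_for(stage))
```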

The Toolbox article also speaks about the difficulties of the componentized approach. The main one is a potential growth in TCO because of the need to integrate data coming from different modules. Here is the passage I especially liked:

The data from the second-tier cloud ERP or modules typically require normalization to integrate with the legacy ERP system at the corporate level. Although direct cost is associated with master data management to ensure consistency and no redundancy, by extending the life of the legacy system, the intention is to reduce the total cost of ownership (TCO) while meeting additional needs for flexibility and functionality. However, the shorter duration of implementing and deploying a two-tier ERP model can actually lead to increased TCO if the indirect costs, such as training, hiring staff, and vendor support, are not taken into to account, as well.

The same problem will arise if we try to break PLM into components. Without a solid data foundation, the ability to bring together and integrate various PLM components will be questionable. The integration cost will skyrocket. Compatibility between versions of PLM components will make it even harder. Nevertheless, I can see growing business requirements, customer demand and shorter software product lifecycles as forces that will drive future PLM technological changes. Componentizing will be one of them.

What is my conclusion? Breaking large and heavy PLM suites into configurable and flexible components is an interesting opportunity to satisfy today’s dynamic business reality. However, two fundamental technologies are required to make it happen – a scalable open data platform and reliable integration technology. Just my thoughts…

Best, Oleg


How not to miss the PLM future?

March 23, 2014

The world around us is very disruptive these days. Nothing stands still. You cannot stop innovation and progress. Engineering and manufacturing software is not among the fastest-changing domains. This is explained by slow-changing processes, the high level of complexity in product development and the significant capital investment manufacturing companies have made in existing PLM and other enterprise software. Nevertheless, […]


PLM Software and Open Source Contribution

February 11, 2014

Open source is a topic that has raised much controversy in the last decade, especially if you speak about enterprise software. The trajectory of open source software has moved from absolute prohibition to a high level of popularization. In my view, the situation is interesting in the context of PLM software. The specific characteristic of PLM is related […]


The future of PLM Glassware?

April 19, 2013

Technological predictions are tough and nobody wants to make them. Back in 2010, I came up with the following post – Who Can Generate 3D/PLM Content For Apple iPad? Back then, the value of the iPad was questioned by many people. Speaking about manufacturing companies, people were very skeptical about the ability of the iPad to bring […]


Interoperability will play a key role in the success of future CAD/PLM

January 26, 2013

Data. Conversion. Interoperability. Translation. The discussion about these topics is endless in the CAD/PLM world. Customers are looking for interoperability between different product versions, competing products, data models, data formats, databases and geometric kernels. Customers have always been the first to be impacted by problems of interoperability. The lifecycle of engineering and manufacturing work is longer than the typical lifecycle of […]


PLM: Ugly vs. Cool

April 8, 2012

Do you think PLM software must be cool? More than two years ago, I posted FREE and COOL trends in CAD/PLM. I have observed an increasing amount of discussion about “PLM coolness” in the past few weeks. The release of PLM 360 by Autodesk just amplified the interest in the “cool” side of PLM. In my […]


How to re-invent “PLM collaboration” world?

March 23, 2012

What do you think about “PLM Collaboration”?… Yes, I can hear you – boring. However, what if I told you that collaboration can be cool again? Over the past year, I have been tracking a few vendors investing in and playing with the collaboration topic. Today I decided to give you my perspective on what I believe will re-invent […]


Dassault V6, 3D Experience and “After PLM” Party

February 28, 2012

You are probably familiar with the saying “beavers do what beavers do”. I got confirmation of that last month when I visited PTC HQ in Waltham, MA. It was amazing to see how PTC is focused on moving the PLM story to a higher level of maturity. At the same time, one of the PTC […]
