Posts tagged as:

PLM


The debate about small vs. large PLM implementations is probably as old as PLM software itself. Joe Barkai recently came out with a rather controversial blog series – Is PLM Software Only for Big Guys? One of these posts – Do PLM Vendors Think SMBs are Just Like Large Enterprises, Only Smaller? – contains the following passage:

In my market research in PLM, PDM and related fields, and consulting work with engineering organizations, I often find that SMBs don’t think of themselves as being just like the “big guys”, only smaller. They believe they possess different culture, work habits and operational models, and perceive PLM as a tool ideally suited for large organizations with sizable engineering teams designing complex highly engineered products.

Another of Joe's posts asks – Can PLM software benefit a small company?

Looking at the profile and size of engineering companies using PDM software, especially those showcased by mainstream PDM and PLM vendors, one might easily reach the conclusion that these systems are, indeed, designed with the “big guys” in mind. This perception may be reinforced by PLM and ERP vendors that have announced products designed for the SMB market and abandoned them a few years later, when rosy revenue expectations weren’t achieved. Remember, for example, PTC’s ProductPoint and SAP’s Business By Design? Small engineering teams have come to think of PLM software as unnecessarily complex and limiting operational flexibility, not to mention the high cost of the software, IT overhead, and the pain of keeping the software up to date.

It is true that, historically, CAD and PDM systems came from the large defense and aerospace industries. Since then, much of the innovation in PDM, and later in PLM, has been about simplifying complex and expensive solutions to make them simpler, more usable and more affordable. 80% of the functionality for 20% of the price… It worked for some CAD vendors in the past. Is it possible in PLM? PLM systems have fallen into the simplification trap many times. As soon as a new, affordable solution came out for SME companies, large enterprises demanded it as well. You can hear the opinion that price was the key factor and that PLM vendors never found a way to sell both enterprise and SME solutions with the right packaging and price differentiation. I'm not sure that is true, but shutting down an SME-focused PLM solution is not uncommon in the PLM industry.

I have shared some of my thoughts about why PLM vendors have failed to provide solutions for SMEs. One of my conclusions was that cost and efficiency are the key elements that can help PLM vendors develop a solution for this challenging market segment.

However, Joe's posts made me think once again about the "small vs. large" PLM challenge. I want to share with you my three hypotheses for why size won't matter for future PLM solutions.

1. Horizontal integration

Large monolithic businesses with strong vertical integration are being displaced by granular and sometimes independent business units with diverse sets of horizontal relationships. Businesses are looking for ways to optimize cost in everything – development, supply chain, manufacturing, operations. I imagine these businesses will demand a new type of PLM solution that can be used by a network of suppliers and business partners rather than by a single vertically integrated organization.

2. Technological transformation

In the past, many PDM and PLM vendors treated an SME solution as something that didn't need to scale much and could run on cheaper hardware, low-cost technology and modest IT infrastructure. Cloud, web and open source technology trends have changed the landscape completely. While most existing PLM solutions are still running on infrastructure developed 10-15 years ago, I can see vendors looking for new architectures and technologies that can unquestionably scale to cover a diverse set of customers – small and large.

3. Business dynamics

The business environment is changing. Businesses are more dynamic. New requirements come in frequently, and the expected time to deliver a new solution or change has dropped from years to months. In such an environment, I can hardly imagine a monolithic PLM deployment sustaining itself for a decade, as it did before. I would expect PLM vendors to think about a new type of platform and a set of agile applications serving a variety of business needs.

What is my conclusion? Business, technological and organizational changes will shape the future landscape of PLM platforms and applications. Small is the new big. New technological platforms will be able to scale to support a diverse set of customers. Vendors will move from shipping CDs to providing services out of public and private clouds. As a result, the difference between PLM for SMEs and enterprise PLM will disappear. Future PLM solutions will come as platforms with a diverse set of agile applications. Just my thoughts…

Best, Oleg



Do you know what legacy software is? Earlier today, Marc Lind of Aras Corp. challenged me with his Twitter status about companies complaining about legacy PLM systems and upgrades. Here is the original passage from Twitter, here and here.

“a lot of people complains about legacy PLM and a lot of companies that have legacy PLM are throwing in the towel and switching these days”.


The part of the statement about "legacy software" is really interesting. Last week, I wasn't able to update a game on my son's iPad. After a few minutes, I discovered that Apple no longer supports the original iPad hardware manufactured 4 years ago. Does that mean the iOS software running on that iPad is legacy? Good question. At the same time, what about properly functioning ERP software that a company has been running for the last 10 years without any plans to upgrade? Is that legacy software?

Wikipedia gives me the following definition of a legacy system:

In computing, a legacy system is an old method, technology, computer system, or application program, "of, relating to, or being a previous or outdated computer system."[1] A more recent definition says that "a legacy system is any corporate computer system that isn’t Internet-dependent."[2]… The first use of the term legacy to describe computer systems probably occurred in the 1970s. By the 1980s it was commonly used to refer to existing computer systems to distinguish them from the design and implementation of new systems. Legacy was often heard during a conversion process, for example, when moving data from the legacy system to a new database.

Software upgrades are an important topic in engineering and manufacturing. Very often, systems remain in use for a very long time because of product lifecycles and the need to maintain existing data. It happens a lot in defense, aerospace and other "regulated" industries. Also, because of the significant investment involved, the ROI of an upgrade can be questionable, which leads companies to keep existing outdated systems in operation. I've posted about the problems of PLM customization and upgrades before – How to eliminate PLM customization problems and Cloud PLM and future of upgrades.

PLM vendors are aware of the issue of upgrades and the difficulties of software migrations. For a long time, the industry accepted it as something unavoidable. However, in today's dynamic business environment, the issue of software upgrades cannot be ignored. Customers demand flexible and agile software that can be deployed and updated fast. At the same time, the shift of business models towards services and subscriptions has pushed the problem of upgrades back to vendors.

Earlier this year, my attention was caught by a CIMdata publication – Aras Innovator: Redefining Customization & Upgrades. The Aras enterprise open source model is predominantly subscription oriented, which gives Aras engineers a strong incentive to solve the problem of upgrades and new version deployment. Here is a passage from the article confirming that:

For several years, the Aras Corporation (Aras) has included no-cost version-to-version upgrades in their enterprise subscriptions, independent of how the solution has been customized and implemented. This is a rather bold guarantee given the historic challenges the industry has experienced with upgrading highly customized PLM deployments. With more than 300 upgrades behind it, CIMdata felt it appropriate to find out how Aras’ guarantee was playing out, and discovered that there was much more to the story than just a contractual guarantee. Fundamentally, Aras Innovator is engineered to be highly configurable—even customizable—without resulting in expensive and complex version-to-version upgrades and re-implementations.

One of the PLM software leaders, Siemens PLM, is also thinking about What is the best release cycle. The article discusses the Solid Edge release cycle.

A few years ago we moved from an irregular release cycle for Solid Edge, maybe 9 months in one cycle to 15 months in the next, to a regular cycle of annual releases (of course there are also maintenance packs delivered in the interim). I believe our customers much prefer this, they can plan ahead knowing that there will be a significant Solid Edge release available to them in August each year.

At the same time, the article confirms that CAD/PLM vendors are looking for ways to solve the problem of upgrades. As I mentioned earlier, the cloud software model is one of the most promising technical ways to solve the upgrade issue. That is true, but it can be tricky when both desktop and cloud software are involved. Here is a passage from the same Siemens PLM blog:

Working in the PLM area we try really hard to provide our customers with a good upgrade experience. PLM software is itself dependent on both the operating system and database software, and it has to work with specific releases of CAD software  (sometimes with more than one CAD solution for our multi-CAD customers) and with office software as well! Moving PLM software to the cloud could potentially take some of the upgrade issues away from the end user, but PLM software does not work in isolation from your data files, or your other software and systems so I believe there is much work still to be done before the cloud really impacts the upgrade situation for real-world customers.

What is my conclusion? From a customer perspective, the best option is to make the release cycle completely transparent. In my view, this is a really high bar for PLM vendors. Customer data migration, customization and the occasional absence of backward compatibility make release transparency questionable. However, as the industry moves towards cloud software and service business models, the demand for agile release management and the elimination of upgrades will grow. So, my hunch is that in the future we will not see "legacy software" anymore. A new type of enterprise software will manage upgrades and migrations without customers even noticing. Sounds like a dream? I don't think so. For most web and consumer software it is already a reality today. Just my thoughts…

Best, Oleg



Managing parts and bills of materials is not a simple task. I shared some aspects of the complexity of part numbering last week in my post – Existing data prevents companies to improve Part Numbers. The discussion in the comments took me towards the complexity of part numbers in the supply chain. Here is a passage from the comments made by Joe Barkai:

…multiple BOMs with inconsistent numbering schema often hide a bigger problem: inconsistent attributes and metadata. I [Joe Barkai] worked with a global automotive OEM on issues surrounding architectural complexity reduction and global quality management. I discovered that each product line was using different part numbers. This was obviously difficult to manage from a supply chain perspective. But, not less importantly, other metadata and data attributes such as failure modes, labor operation codes and other important information were codified differently, rendering cross product line reporting and analysis difficult and potentially lacking, if not erroneous

Product lines and multiple configurations are a reality of modern manufacturing. The level of customization is growing. On the other side, managing parts and BOMs globally becomes one of the most important and challenging tasks. I found another example of that in today's news. This is an example of the potential impact on Apple of managing bills of materials across multiple product lines and the supply chain. Navigate to the Seeking Alpha post – Apple iPhone 6 Will Pick Up iPad Sales Slack. Here is the passage I captured:

Apple still generates the majority of profits in mobile, despite the slight declines in market share. Last November, research firm IHS estimated  $274 in bill of materials and manufacturing costs for the 16GB iPad Air with Wi-Fi connectivity that retails for $499. Going forward, Tim Cook, operations man, will likely leverage Apple’s immense buying power to further drive down costs for component parts shared between the iPhone 6 and eventual iPad upgrade.

I have no information about the PLM system Apple uses to manage bills of materials across product lines. However, I would guess that re-use of components across different product lines is a very typical approach taken by many manufacturing companies.
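Just to illustrate the idea – and this is a hypothetical sketch, not a description of Apple's setup or of any specific PLM system – one common way to support component re-use and cross-product-line reporting is to keep a neutral part "master" record and map each product line's local part numbers and attributes onto it:

```python
# Hypothetical sketch only: a neutral part "master" record with per-product-line
# part number aliases, so the same physical component can be re-used across
# product lines and still be reported on consistently.
from dataclasses import dataclass, field


@dataclass
class PartMaster:
    master_id: str                                   # neutral, never-reused identifier
    description: str
    aliases: dict = field(default_factory=dict)      # product line -> local part number
    attributes: dict = field(default_factory=dict)   # normalized metadata (failure modes, etc.)


# The same camera module is known under different numbers in two product lines.
camera = PartMaster(
    master_id="PM-000123",
    description="Rear camera module, 8 MP",
    aliases={"PHONE-6": "CAM-8817-A", "TABLET-AIR": "C-0042"},
    attributes={"failure_modes": ["lens misalignment", "connector fatigue"]},
)


def local_part_number(part: PartMaster, product_line: str) -> str:
    """Resolve the part number a given product line uses for this component."""
    return part.aliases[product_line]


print(local_part_number(camera, "PHONE-6"))     # CAM-8817-A
print(local_part_number(camera, "TABLET-AIR"))  # C-0042
```

With such a mapping in place, the inconsistent numbering and metadata Joe describes stop blocking cross-line cost roll-ups and quality analysis, because every local number resolves to the same shared component.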

What is my conclusion? The complexity of bill of materials management across product lines and the supply chain is skyrocketing these days. Managing part numbers, bills of materials, cost and multiple product lines can become a critical part of a PLM solution supporting manufacturing profitability. Just my thoughts…

Best, Oleg


How long will take GrabCAD to develop full-blown PLM solution?

August 18, 2014

Time is running fast. It has been two years since I posted GrabCAD: from Facebook for engineers to PLM. If you are in the engineering community, the chances you will come to PLM are very high. Like in the past all roads lead to Rome, I guess all future development roads for PDM solution lead […]


Existing data prevents companies to improve Part Numbers?

August 15, 2014

Part numbers are a fascinating topic. I'm coming back to blogging about the best approach to managing part numbers. My last post about it – Part Numbers are hard. How to think about data first? – was just a few weeks ago. In that article, I outlined a few principles for how to keep PNs separate from […]


How to visualize future PLM data?

August 12, 2014

I have a special passion for data and data visualization. We do it every day in our lives. Simple data, complex data, fast data, contextual data… These days, we are surrounded by data as never before. Think about a typical engineer 50-60 years ago. Blueprints, some physical models… Not much information. Nowadays the situation is completely […]


PLM: Tools, Bundles and Platforms

August 11, 2014

I like online debates. The opportunity to have good online debates is rare in our space. Therefore, I want to thank Chad Jackson for his openness to have one. I don’t think Chad Jackson needs any introduction – I’m sure you had a chance to watch one of his Tech4PD video debates with Jim Brown of […]


PLM workflow dream

August 8, 2014

Process management is a very important part of any PLM software. You can find it in every PLM system. There are so many ways to define and manage processes. A few years ago I captured some of them here – PLM Processes: Flowchart vs. Rule-based? While, I believe, we can agree on the importance of process management, I […]


The end of single PLM database architecture is coming

August 5, 2014

The complexity of PLM implementations is growing. We have more data to manage. We need to process information faster. In addition to that, cloud solutions are changing the underlying technological landscape. PLM vendors are no longer building software to be distributed on CD-ROMs and installed by IT on corporate servers. Vendors are moving towards different […]


Importance of data curation for PLM implementations

August 4, 2014

The speed of data creation is amazing these days. According to recent IBM research, 90% of the data in the world today has been created in the last two years alone. I'm not sure whether IBM is counting all enterprise data, but it doesn't change much – we have lots of data. In a manufacturing company, data […]
