Posts tagged as: PLM

How is PTC delivering PLM in the cloud?

by Oleg on January 28, 2015 · 0 comments


Cloud is trending, and it is hard to find a company that is not thinking about how to leverage new cloud technologies and business models. However, just saying “cloud” these days probably means nothing. The right question is how to implement cloud. I guess many companies are coming to that question these days. It goes in parallel with the discussion about what is “cloud” and what is “not cloud”, which has both technical and marketing aspects.

A long time ago, PTC introduced PLM “On Demand“. You may remember this marketing name, which was later replaced by SaaS and cloud. According to the PTC website, the solution is available and hosted by IBM. I noticed some indication of PTC’s move to the cloud back in 2013, after the acquisition of NetIDEAS. My writeup about that is here. According to the PTC press release, NetIDEAS allowed PTC to develop a better foundation for offering multiple deployment options.

Earlier today, my attention was caught by a PTC announcement – PTC Introduces PTC PLM Cloud: New PTC Windchill SaaS offerings for small and midsized companies. The following passage explains why PTC is coming out with a cloud product offering:

Recognizing that many SMB organizations may lack a dedicated IT staff but still want to adopt a proven PLM environment, PTC designed PTC PLM Cloud specifically to enable team collaboration and data management in the cloud. This flexible offering eliminates the typical, but risky, SMB practice of shared folders and file naming conventions which hamper product development. With more effective and reliable data sharing in the cloud, customers are able to improve product development across teams in different locations, teams working with varying CAD applications, and with external teams such as partners and suppliers who are a growing part of the product development process.

I tried to dig into the materials available online to see how PTC will provide cloud PLM and what options are available. Navigate here to learn more. It is available in 3 options – standard, premium and enterprise. While the names mean little by themselves, the following distinction caught my attention – “instant access” for standard vs. “dedicated database” for the others. In addition, the options differ in what you can change – “workflow customization” for premium and “new business objects and UI customization” for enterprise. It looks like PTC recognized the importance of MCAD data management – all versions come with an integrated viewing solution and support for Creo, AutoCAD, Inventor and SolidWorks.


The questions that remain open for me at this moment are price and cloud (hosting) architecture. This is especially important for customers today, as I mentioned earlier in my post – Why you should ask your cloud PLM vendor about DevOps and Kubernetes.

What is my conclusion? Manufacturing companies are not implementing PLM because of the high cost and the lack of available IT resources. To many customers and vendors today, cloud seems like the right path to remove IT cost and make implementations less painful. From that standpoint, PTC is taking the right trajectory by delivering a Windchill-based PLM solution using the cloud. However, the devil is in the details. I’m looking forward to learning more about “how” PTC PLM on the cloud will be delivered and how it will be different from other PLM clouds from Autodesk, Aras, Dassault Systemes and Siemens PLM. Just my thoughts…

Best, Oleg


The anatomy of PLM upgrades

by Oleg on January 26, 2015 · 1 comment


Software upgrades are a fascinating topic. They have been with us since the very beginning of software. Seriously, we hate upgrades. On the other hand, very often this is the only way to make progress. The main problem with upgrades is existing dependencies – data migration, file format and data incompatibilities, hardware incompatibilities, etc.

As software gets more complex, the complexity of upgrades increases. Enterprise software is a very good example. Talk to people about ERP, PLM and other enterprise software upgrades and you can learn a lot about the effort and cost of upgrades for an organization.

For a long time, enterprise software upgrades were considered inevitable, which led to many problems for customers. One extreme situation is when a specific configuration of a system becomes non-upgradable, known as “version lock-in”. The most typical reason is incompatibility of features and customizations between the new software version and the one the customer is still running. As customers discover the complexity of upgrades, we can see software vendors trying to leverage it to demonstrate their differentiation.

For the last few years, I have seen an increased focus by PLM vendors on “upgrade and migration”. My hunch is that too many customers are stuck on previous versions of PLM software or on outdated PLM systems. A Random PLM (future) thoughts article by Jos Voskuil speaks about the complexity of PLM system upgrades. Read the following passage:

Not every upgrade is the same! Where consumer software will be used by millions and tested through long Alfa and beta cycles, PLM software often comes to the market in what you could consider a beta stage with limited testing. Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customer demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework. In addition, when a solution is not 100 % complete there will be an adaptation from the customer, making upgrades later, not 100 percent guaranteed or compatible. More details on PLM Upgrades after the conference, let’s look into the near future.

I think the overall trend in the quality of enterprise software is positive. Consumer software, mentioned by Jos, is only one factor pushing enterprise software vendors to invest more in quality. Jos’ article made me think more about how customers should approach the topic of PLM migrations and upgrades. In general, I think it is applicable not only to PLM systems. PLM vendors are trying to make migrations easier from both economic and technological standpoints. Here are some of my thoughts about the anatomy of PLM software migration.

Migration technologies

While some technologies can give you an advantage during migrations and upgrades, from a technical standpoint you cannot avoid upgrades. Very simply – from time to time you need to restructure the database to bring in new features or optimize for performance. Since PLM relies on OS and database technologies, you need upgrades to bring the PLM system into a compatible state with a new OS/RDBMS. If your PDM/PLM system is integrated with CAD or other systems, that is another aspect of migrations.

From a technological perspective, migration is always a sort of extract, transform, load (ETL) operation. It can be minor or major. It can happen in a single database or may require a separate set of application or database servers. A PLM system architecture designed with “upgrade in mind” can make it easier, but won’t eliminate it completely.
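To make the ETL framing concrete, here is a minimal sketch, assuming a made-up legacy schema, of what one upgrade step could look like in Python with SQLite. The legacy_items table and the combined part-number/revision field are invented for illustration; no specific PLM product stores data this way.

```python
import sqlite3

# Hypothetical upgrade step: the legacy schema stored part number and
# revision in one field ("P-100/A"); the new schema keeps them apart.

def migrate(old_db_path: str, new_db_path: str) -> None:
    old = sqlite3.connect(old_db_path)
    new = sqlite3.connect(new_db_path)
    new.execute(
        "CREATE TABLE IF NOT EXISTS items "
        "(part_number TEXT, revision TEXT, title TEXT)"
    )

    # Extract: read rows in the shape the old version used.
    rows = old.execute("SELECT item_id, title FROM legacy_items")
    for item_id, title in rows:
        # Transform: split the combined identifier into its parts.
        part_number, _, revision = item_id.partition("/")
        # Load: write the row in the shape the new version expects.
        new.execute(
            "INSERT INTO items VALUES (?, ?, ?)",
            (part_number, revision or "A", title),
        )

    new.commit()
    old.close()
    new.close()
```

Real PLM migrations make the same three moves at much larger scale, across many tables, files and relationships at once.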

PLM vendors and the economics of migration

PLM vendors are starting to pay attention to migration and upgrades. While the state of PLM systems is far from ideal when it comes to migration, some vendors are proposing to cover upgrades and migrations as part of their PLM service and licensing offerings.

SaaS (cloud) provides another way to hide migrations and upgrades. Since the customer is not buying software to install in their own data center, the problem of migrations and upgrades eventually becomes part of the PLM vendor’s responsibility.

Technical elements of migration

There are 3 main elements that can increase a PLM system’s vulnerability to upgrades and migrations – 1/ custom data model; 2/ code customization and scripting; 3/ integration with other systems. The amount of specialization in each of them can increase the potential cost and complexity of a migration.
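As a toy illustration of the first element, here is a sketch, with invented attribute names, of the kind of diff between a customer’s customized data model and the attribute set shipped with a new version – exactly the gap that turns into migration mapping work.

```python
# Hypothetical baseline attributes shipped with the new PLM version.
NEW_VERSION_SCHEMA = {
    "Part": {"number", "revision", "title", "material", "weight"},
}

# The customer's customized data model, extended during implementation.
CUSTOMER_SCHEMA = {
    "Part": {"number", "revision", "title", "material",
             "paint_code", "supplier_rating"},
}

def upgrade_impact(entity: str) -> None:
    baseline = NEW_VERSION_SCHEMA[entity]
    custom = CUSTOMER_SCHEMA[entity]
    # Custom attributes absent from the new baseline need an explicit
    # migration rule; each one adds cost and risk to the upgrade.
    for attr in sorted(custom - baseline):
        print(f"{entity}.{attr}: custom attribute, needs a migration rule")
    # Attributes new in this version need default values for old records.
    for attr in sorted(baseline - custom):
        print(f"{entity}.{attr}: new in this version, needs a default value")

upgrade_impact("Part")
```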

What is my conclusion? You cannot avoid migrations and upgrades, so planning ahead is a good idea. You should evaluate the vendor and product for “upgradability”. It is not simple, especially when it comes to on-premise software. Product architecture evaluation should be an important element of your system selection process. If you think about SaaS/cloud as a universal solution for upgrades and migration, I recommend you treat it carefully as well. It certainly removes the pain from the customer. However, take into account that it won’t eliminate upgrades from a technological standpoint. Upgrades are an essential part of SaaS product development. Depending on the SaaS architecture and development methodology, the system can be in upgrade mode all the time – which is a good thing, because upgrades become part of product delivery. Just my thoughts…

Best, Oleg



One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product and technology, the process of data modeling may be called different things, but fundamentally you can see it in any PLM implementation. It is a process that creates an information model of the products and processes of a specific company. Getting it done is not simple, and it requires a lot of preparation work, which is usually part of implementation services. Even more, once created, the data model needs to be extended with new data elements and features over time.
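To give a feel for what is being built during that process, here is a minimal sketch of the flexible “type plus attributes” pattern most PLM tools expose to implementers. The type and attribute names are invented for the example and are not taken from any particular product.

```python
from dataclasses import dataclass, field

# A minimal "type + attributes" data model: the pattern PLM tools use
# so implementers can model a specific company without code changes.

@dataclass
class ItemType:
    name: str
    attributes: dict[str, type]          # attribute name -> expected type
    parent: "ItemType | None" = None     # types can specialize other types

    def all_attributes(self) -> dict[str, type]:
        base = self.parent.all_attributes() if self.parent else {}
        return {**base, **self.attributes}

@dataclass
class Item:
    item_type: ItemType
    values: dict[str, object] = field(default_factory=dict)

    def set(self, name: str, value: object) -> None:
        expected = self.item_type.all_attributes().get(name)
        if expected is None:
            raise KeyError(f"{name} is not defined on {self.item_type.name}")
        if not isinstance(value, expected):
            raise TypeError(f"{name} expects {expected.__name__}")
        self.values[name] = value

# Implementation-time modeling: a company-specific specialization of Part.
part = ItemType("Part", {"number": str, "title": str})
machined_part = ItemType("MachinedPart", {"material": str}, parent=part)

item = Item(machined_part)
item.set("number", "P-100")
item.set("material", "6061-T6 aluminum")
```

The point of such a structure is that a company-specific model can be declared at implementation time rather than coded into the product.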

Is there a better way? How are other industries and products solving similar problems of data modeling and data curation? It made me think about the web and the internet as a huge social and information system. How are data models managed on the web? How are large web companies solving these problems?

One example of creating a model for data on the web was Freebase. Google acquired Freebase and used it as one of the data sources for the Google Knowledge Graph. You can catch up on my post about why PLM vendors should learn about the Google Knowledge Graph. Another attempt to create a model for web data is Schema.org, which is very promising in my view. Here is my earlier post about Schema.org – The future of Part Numbers and Unique Identification. Both are examples of curated data models for web data. The interesting part of Schema.org is that several web search vendors have agreed on some elements of the data model, as well as on how to curate and manage Schema.org definitions.
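For readers who have not looked at Schema.org, here is roughly what that agreed vocabulary looks like when applied to product identity, generated with a few lines of Python. The Product type and the mpn (manufacturer part number) property come from Schema.org itself; the values are made up.

```python
import json

# A minimal Schema.org "Product" description - the kind of agreed,
# curated vocabulary search vendors embed in web pages as JSON-LD.
product = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Ball bearing 6204-2RS",   # made-up example values
    "mpn": "6204-2RS",                 # manufacturer part number
    "manufacturer": {
        "@type": "Organization",
        "name": "Example Bearings Inc.",
    },
}

print(json.dumps(product, indent=2))
```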

However, it looks like the manual curation of the Google Knowledge Graph and Schema.org is not an approach that makes web companies happy or lets them leapfrog into the future. Manual work is expensive and time consuming. At least some people are thinking about that. The Dataversity article “Opinion: Nova Spivack on a New Era in Semantic Web History” speaks about some interesting opportunities that can open a new page in the way data is captured and modeled. He speaks about possible future trajectories of deep learning, data models and relationship detection. It could extend Schema.org, especially in the part related to automatically generated data models and classifications. Here is my favorite passage:

At some point in the future, when Deep Learning not only matures but the cost of computing is far cheaper than it is today, it might make sense to apply Deep Learning to build classifiers that recognize all of the core concepts that make up human consensus reality. But discovering and classifying how these concepts relate will still be difficult, unless systems that can learn about relationships with the subtly of humans become possible.

Is it possible to apply Deep Learning to relationship detection and classification? Probably yes, but this will likely be a second phase after Deep Learning is first broadly applied to entity classification. But ultimately I don’t see any technical reason why a combination of the Knowledge Graph, Knowledge Vault, and new Deep Learning capabilities, couldn’t be applied to automatically generating and curating the world’s knowledge graph to a level of richness that will resemble the original vision of the Semantic Web. But this will probably take two or three decades.

This article made me think about the fact that manual data curation for Freebase and Schema.org is a very similar process to what many PLM implementers do when applying specific data and process models using PLM tools. Yes, PLM data modeling usually happens for a specific manufacturing company. At the same time, PLM service providers re-use elements of these models. Also, companies are interconnected and work together. The problem of communication between companies is painful and still requires some level of agreement between manufacturing companies and their suppliers.

What is my conclusion? Data modeling is an interesting problem. For years, PLM vendors put significant focus on how to make flexible tools that can help implementers create data and process models. Flexibility and dynamic data models are in high demand from all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technology can come along and automate this process. It can help generate data models automatically by capturing data about what a company does and about the processes inside the company – a toy illustration of the idea is sketched below. Sounds like a dream? Maybe… But manual curation is not an efficient way to do data modeling. The last 30 years of PDM/PLM experience are a good confirmation of that. Finding a better way to apply automatic data capture and configuration for PLM could be an interesting opportunity. Just my thoughts…
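As that toy illustration, and nothing more, the sketch below infers attribute names and types from sample records – the kind of data a system could capture as a company works. It is a simplification of the concept, not a description of any shipping product.

```python
from collections import defaultdict

# Toy schema inference: look at records captured from day-to-day work
# and derive the attribute set and types a data model would need.

def infer_model(records: list[dict]) -> dict[str, set[str]]:
    model: dict[str, set[str]] = defaultdict(set)
    for record in records:
        for attr, value in record.items():
            model[attr].add(type(value).__name__)
    return dict(model)

captured = [
    {"number": "P-100", "title": "Bracket", "weight": 0.4},
    {"number": "P-101", "title": "Housing", "material": "ABS"},
]

# -> {'number': {'str'}, 'title': {'str'}, 'weight': {'float'},
#     'material': {'str'}}
print(infer_model(captured))
```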

Best, Oleg

photo credit: tec_estromberg via photopin cc


Can BOX become a platform for PLM?

January 19, 2015

Platform is a topic that comes up quite often in discussions about the future of PLM. CIMdata recently raised the topic of “platformization” in PLM. You can catch up on the discussion – A CIMdata dossier: PLM platformization. I can probably divide all existing PLM platforms into two groups – 2D/3D design platforms and Object […]


Will search replace the engineer’s brain in the future?

January 16, 2015

Computers are changing the way we work. That is probably too broad a statement. But given that today is Friday afternoon, it should be fine. I want to take a bit of a futuristic perspective today. Google, the internet and computing are good reasons why our everyday habits today are different from what we […]


Top 5 PLM trends to watch in 2015

January 15, 2015

Holidays are over, and it was a good time to think about what you can expect in engineering and manufacturing software related to PLM in the coming year. You probably had a chance to listen to my 2015 PLM predictions podcast a few months ago. If you missed it, here is the link. Today I want to […]


How many enterprise PLM systems will survive cloud migration?

January 14, 2015

Cloud adoption is growing. For most existing PLM vendors, it means thinking about how to migrate existing platforms and applications to the cloud. I covered related activities of PLM vendors in my previous articles. Take a look here – PLM cloud options and 2014 SaaS survey. It can give you an entry point […]


Utility and future PLM licensing models

January 13, 2015

The Razorleaf article More PLM Licensing models made me think about the business model and licensing transformation happening these days in the engineering and manufacturing industry. I guess we knew changes were coming… Back in 2012 I shared some of my thoughts about PLM Cloud and Software Licensing Transformation. From a bit different perspective, I have discussed the future […]


Why won’t today’s CAD & PLM tools become future platforms?

January 12, 2015

The PLM business and software vendors are transforming. Manufacturing companies are looking for new types of solutions that can deliver faster ROI as well as become a better place for engineering and manufacturing innovation. Customer dissatisfaction with slow ROI and low value propositions is growing. Back in 2012 I was listening to Boeing […]


3 PLM deployment and adoption challenges

January 9, 2015

Yesterday’s discussion about the demand for PLM services made me think more about the challenges in PLM deployments. Implementation is one of the most time-consuming components of enterprise software. The engineering, manufacturing and product lifecycle domain presents significant difficulties for software vendors and service providers. It is very rare to see fast implementations. In today’s […]
