From the category archives: Technologies


We live in the era of smart products. Modern smartphones are a good confirmation of that. The average person today carries in their pocket a computer with computational capability equal to or greater than the computers the aerospace and defense industry once used for navigation. In addition, your smartphone has communication capabilities (Wi-Fi and Bluetooth) that make it even more powerful. If you think about the cost and availability of boards like Raspberry Pi and Arduino, you can understand why and how they are revolutionizing many products these days. However, the wide spread of these devices has drawbacks.

Smart products bring a new level of complexity everywhere. It starts in engineering and manufacturing, where you need to deal with complex multidisciplinary issues related to the combination of mechanical, electronic and software pieces. The last one is a critical addition to product information. The bill of materials has to cover not only mechanical and electronic parts, but also software elements.
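To make the idea concrete, here is a minimal sketch of a BOM item structure that holds mechanical, electronic and software components in a single structure. This is my own illustration, not any specific PLM vendor's data model; the class and field names are assumptions chosen for the example.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List


class Discipline(Enum):
    MECHANICAL = "mechanical"
    ELECTRONIC = "electronic"
    SOFTWARE = "software"


@dataclass
class BomItem:
    part_number: str
    description: str
    discipline: Discipline
    quantity: int = 1
    version: str = ""          # firmware/software items need an explicit version
    children: List["BomItem"] = field(default_factory=list)


# A simplified smart-product BOM: enclosure, controller board and its firmware
product = BomItem("SP-100", "Smart sensor", Discipline.MECHANICAL, children=[
    BomItem("ENC-01", "Plastic enclosure", Discipline.MECHANICAL),
    BomItem("PCB-02", "Controller board", Discipline.ELECTRONIC, children=[
        BomItem("FW-02", "Controller firmware", Discipline.SOFTWARE, version="1.4.2"),
    ]),
])
```

The point of the sketch is that a software item sits in the same structure as the mechanical and electronic parts, instead of living in a separate system.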

Another aspect is related to the operation of smart products. Because products are connected, operating them means dealing with software, data and other elements that can easily turn your manufacturing company into a web operations facility with servers, databases, etc.

As soon as devices run software, the problem of software component traceability becomes critical. Configuration management and updates are a starting point, but it quickly comes down to security, which is very critical today.

The GCN article – How secure are your open-source based systems? – speaks about the problem of security in open source software. Here is my favorite passage:

According to Gartner, 95 percent of all mainstream IT organizations will leverage some element of open source software – directly or indirectly – within their mission-critical IT systems in 2015. And in an analysis of more than 5,300 enterprise applications uploaded to its platform in the fall of 2014, Veracode, a security firm that runs a cloud-based vulnerability scanning service, found that third-party components introduce an average of 24 known vulnerabilities into each web application.

To address this escalating risk in the software supply chain, industry groups such as The Open Web Application Security Project, PCI Security Standards Council and Financial Services Information Sharing and Analysis Center now require explicit policies and controls to govern the use of components.

Smart products also leverage open source software. The security of connected devices and smart products is a serious problem to handle, which makes me think about how hardware manufacturing companies can trace software elements and protect their products from potential vulnerabilities.
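As a rough illustration of what such traceability could look like, here is a minimal sketch that cross-references the software items in a BOM against a list of known-vulnerable component versions. Both the data and the helper function are hypothetical examples made up for this post, not a real vulnerability feed or scanner API; in practice the list would come from a database such as the NVD.

```python
# Hypothetical software BOM: (component, version) pairs extracted from the product BOM
software_bom = [
    ("openssl", "1.0.1f"),
    ("busybox", "1.21.0"),
    ("zlib", "1.2.8"),
]

# Hypothetical list of known-vulnerable versions; hard-coded only for the example
known_vulnerabilities = {
    ("openssl", "1.0.1f"): ["CVE-2014-0160"],   # Heartbleed
}


def check_software_bom(bom, vulnerabilities):
    """Return the BOM entries that match a known-vulnerable version."""
    findings = []
    for component, version in bom:
        cves = vulnerabilities.get((component, version))
        if cves:
            findings.append((component, version, cves))
    return findings


for component, version, cves in check_software_bom(software_bom, known_vulnerabilities):
    print(f"{component} {version}: {', '.join(cves)}")
```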

What is my conclusion? Covering all aspects of product information, including software, is becoming absolutely important. For many manufacturing companies, the information about mechanical, electronic and software components is siloed in different data management systems. In my 2015 PLM trends article, I mentioned the importance of new tools capable of managing multidisciplinary product information. Software BOM security is just one example of the trend. The demand for systems able to handle all aspects of the product BOM is increasing. Just my thoughts…

Best, Oleg

photo credit: JulianBleecker via photopin cc


The anatomy of PLM upgrades

by Oleg on January 26, 2015


Software upgrades are a fascinating topic. They have been with us from the very beginning of software. Seriously, we hate upgrades. On the other hand, very often, this is the only way to make progress. The main problem of upgrades is related to existing dependencies – data migration, file format and data incompatibilities, hardware incompatibilities, etc.

As software gets more complex, the complexity of upgrades increases. Enterprise software is a very good example. Talk to people about ERP, PLM and other enterprise software upgrades and you can learn a lot about the effort and cost of upgrades for an organization.

For a long time, enterprise software upgrades were considered something inevitable, which led to many problems for customers. One of the extreme situations is when a specific configuration of a system becomes non-upgradable. It is known as “version lock-in”. The most typical reasons are feature and customization incompatibilities between the new software version and the one the customer is still running. As customers discover the complexity of upgrades, we can see software vendors trying to leverage it to demonstrate their differentiation.

For the last few years, I have seen an increased focus of PLM vendors on “upgrade and migration”. My hunch is that too many customers are stuck on previous versions of PLM software or outdated PLM systems. The Random PLM (future) thoughts article by Jos Voskuil speaks about the complexity of PLM system upgrades. Read the following passage:

Not every upgrade is the same! Where consumer software will be used by millions and tested through long Alfa and beta cycles, PLM software often comes to the market in what you could consider a beta stage with limited testing. Most PLM vendors invest a lot of their revenue in providing new functionality and technology based on their high-end customer demands. They do not have the time and budget to invest in the details of the solution; for this reason PLM solutions will remain a kind of framework. In addition, when a solution is not 100 % complete there will be an adaptation from the customer, making upgrades later, not 100 percent guaranteed or compatible. More details on PLM Upgrades after the conference, let’s look into the near future.

I think the overall trend in quality of enterprise software is positive. The consumer software mentioned by Jos is only one factor driving enterprise software vendors to invest more in quality. Jos’ article made me think more about how customers should approach the topic of PLM migrations and upgrades. In general, I think it is applicable not only to PLM systems. PLM vendors are trying to make migrations easier from both economic and technological standpoints. Here are some of my thoughts about the anatomy of PLM software migration.

Migration technologies

While some technologies can give you an advantage during migration and upgrades, from a technical standpoint you cannot avoid upgrades. Very simply – from time to time you need to restructure the database to bring in new features or optimize for performance. Since PLM relies on OS and database technologies, you need upgrades to bring the PLM system into a compatible state with a new OS/RDBMS. If your PDM/PLM system is integrated with CAD systems, that is another aspect of migrations.

From a technological perspective, migration is always some sort of extract, transform, load process. It can be minor or major. It can happen in a single database or may require a separate set of application or database servers. A PLM system architecture designed with “upgrade in mind” can make it easier, but won’t eliminate it completely.
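To illustrate the extract, transform, load pattern, here is a minimal sketch of moving item records from an old schema to a new one. The schemas, field names and default values are assumptions made up for the example, not any vendor's migration tooling.

```python
# Records as they exist in the old PLM schema (result of the extract step)
old_records = [
    {"item_id": "100-001", "desc": "Bracket", "rev": "A"},
    {"item_id": "100-002", "desc": "Housing", "rev": "C"},
]


def transform(record):
    """Map an old-schema record onto the new schema (transform step)."""
    return {
        "part_number": record["item_id"],
        "description": record["desc"],
        "revision": record["rev"],
        "lifecycle_state": "released",   # new mandatory field gets a default value
    }


def load(records, target):
    """Write transformed records into the new system (load step)."""
    target.extend(records)


new_database = []
load([transform(r) for r in old_records], new_database)
print(new_database)
```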

PLM vendors and the economics of migration

PLM vendors are starting to pay attention to migration and upgrades. While the state of PLM systems is far from ideal when it comes to migration, some vendors are proposing to cover upgrades and migrations as part of their PLM service and licensing offerings.

SaaS (cloud) provides another way to hide migration and upgrades. Since the customer is not buying software to install in their own data center, the problem of migrations and upgrades ultimately becomes the PLM vendor's responsibility.

Technical elements of migration

There are three main elements that can increase a PLM system's vulnerability to upgrades and migrations – 1/ custom data model; 2/ code customization and scripting; 3/ integration with other systems. The amount of specialization in each of them can increase the potential cost and complexity of migration.
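As a rough back-of-the-envelope illustration of how these three elements compound, here is a minimal sketch that weighs them into a single migration-complexity score. The weights and the 0–5 scale are arbitrary assumptions for the example, not a real assessment method.

```python
def migration_complexity(custom_data_model, customization_code, integrations):
    """Combine the three risk factors (each rated 0-5) into a rough score.

    The weights are arbitrary: custom code and integrations often hurt more
    than data model changes, so they are weighted higher in this sketch.
    """
    weights = {"data_model": 1.0, "code": 1.5, "integrations": 1.5}
    return (weights["data_model"] * custom_data_model
            + weights["code"] * customization_code
            + weights["integrations"] * integrations)


# Example: moderate data model changes, heavy scripting, a few integrated systems
print(migration_complexity(custom_data_model=2, customization_code=4, integrations=3))
```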

What is my conclusion? You cannot avoid migrations and upgrades, so planning ahead is a good idea. You should evaluate the vendor and product for “updatability”. It is not simple, especially when it comes to on-premise software. Product architecture evaluation should be an important element of your system selection process. If you think about SaaS/cloud as a universal solution for upgrades and migration, I recommend you treat it carefully as well. It certainly removes pain from the customer. However, take into account that it won’t eliminate upgrades from a technological standpoint. Upgrades are an essential part of SaaS product development. Depending on the SaaS architecture and development methodology, the system can be in upgrade mode all the time, which is a good thing because upgrades become part of product delivery. Just my thoughts…

Best, Oleg



One of the most complicated parts of any PLM implementation is data modeling. Depending on the PLM vendor, product and technology, the process of data modeling can be called different things, but fundamentally you can see it in any PLM implementation. It is a process that creates an information model of the products and processes in a specific company. Getting it done is not simple and requires a lot of preparation work, which is usually part of implementation services. Even more, once created, the data model needs to be extended with new data elements and features.
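To give a feel for what such a data model configuration looks like, here is a minimal sketch of defining item types and their attributes the way a PLM implementer might. The type and attribute names are assumptions for the example, not any particular vendor's modeling language.

```python
# A tiny, vendor-neutral way to describe item types and their attributes
data_model = {
    "Part": {
        "attributes": {"part_number": str, "description": str, "material": str},
        "relationships": ["BOM", "Documents"],
    },
    "Document": {
        "attributes": {"doc_number": str, "title": str, "file_format": str},
        "relationships": [],
    },
    "ChangeOrder": {
        "attributes": {"eco_number": str, "reason": str, "status": str},
        "relationships": ["AffectedItems"],
    },
}


def validate(item_type, values, model=data_model):
    """Check that an item instance matches the configured type definition."""
    schema = model[item_type]["attributes"]
    missing = [name for name in schema if name not in values]
    wrong_type = [name for name, value in values.items()
                  if name in schema and not isinstance(value, schema[name])]
    return missing, wrong_type


print(validate("Part", {"part_number": "100-001", "description": "Bracket"}))
# -> (['material'], [])
```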

Is there a better way? How are other industries and products solving similar problems of data modeling and data curation? It made me think about the web and the internet as a huge social and information system. How are data models managed on the web? How are large web companies solving these problems?

One example of creating a model for data on the web was Freebase. Google acquired Freebase and used it as one of the data sources for the Google Knowledge Graph. You can catch up on my post about why PLM vendors should learn about the Google Knowledge Graph. Another attempt to create a model for web data is Schema.org, which is very promising in my view. Here is my earlier post about Schema.org – The future of Part Numbers and Unique Identification. Both are examples of curating data models for web data. The interesting part of Schema.org is that several web search vendors have agreed on some elements of the data model as well as on how to curate and manage Schema.org definitions.
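For readers unfamiliar with Schema.org, here is a minimal sketch of what its agreed-upon vocabulary looks like for a product, expressed as a Python dictionary serialized to JSON-LD. The product values are made up for the example; the property names (name, sku, mpn, manufacturer) come from the Schema.org Product type.

```python
import json

# A product description using the Schema.org vocabulary, serialized as JSON-LD
product_markup = {
    "@context": "https://schema.org",
    "@type": "Product",
    "name": "Controller board",          # illustrative values only
    "sku": "PCB-02",
    "mpn": "100-4567-REV-C",
    "manufacturer": {
        "@type": "Organization",
        "name": "Example Manufacturing Inc.",
    },
}

print(json.dumps(product_markup, indent=2))
```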

However, it looks like manual curation of the Google Knowledge Graph and Schema.org is not an approach web companies are happy with or can use to leapfrog ahead in the future. Manual work is expensive and time consuming. At least some people are thinking about that. The Dataversity article “Opinion: Nova Spivack on a New Era in Semantic Web History” speaks about some interesting opportunities that can open a new page in the way data is captured and modeled. He speaks about possible future trajectories of deep learning, data models and relationship detection. It could extend Schema.org, especially in the part related to automatically generated data models and classifications. Here is my favorite passage:

At some point in the future, when Deep Learning not only matures but the cost of computing is far cheaper than it is today, it might make sense to apply Deep Learning to build classifiers that recognize all of the core concepts that make up human consensus reality. But discovering and classifying how these concepts relate will still be difficult, unless systems that can learn about relationships with the subtly of humans become possible.

Is it possible to apply Deep Learning to relationship detection and classification? Probably yes, but this will likely be a second phase after Deep Learning is first broadly applied to entity classification. But ultimately I don’t see any technical reason why a combination of the Knowledge Graph, Knowledge Vault, and new Deep Learning capabilities, couldn’t be applied to automatically generating and curating the world’s knowledge graph to a level of richness that will resemble the original vision of the Semantic Web. But this will probably take two or three decades.

This article made me think about the fact that manual data curation for Freebase and Schema.org is a very similar process to what many PLM implementers do when applying specific data and process models using PLM tools. Yes, PLM data modeling usually happens for a specific manufacturing company. At the same time, PLM service providers re-use elements of these models. Also, companies are interconnected and work together. The problem of communication between companies is painful and still requires some level of agreement between manufacturers and suppliers.

What is my conclusion? Data modeling is an interesting problem. For years PLM vendors have put a significant focus on making flexible tools that help implementers create data and process models. Flexibility and dynamic data models are highly demanded by all customers, and this is one of the most important technological elements of every PLM platform today. New forms of computing and technology could come and automate this process. They could help to generate data models automatically by capturing data about what a company does and the processes within it. Sounds like a dream? Maybe… But manual curation is not an efficient way to do data modeling. The last 30 years of PDM/PLM experience are a good confirmation of that. Finding a better way to apply automatic data capturing and configuration for PLM could be an interesting opportunity. Just my thoughts…

Best, Oleg

photo credit: tec_estromberg via photopin cc


Cloud PDM: stop controlling data and check shadow IT practices

January 20, 2015

Customer interest in cloud PDM solutions is growing. I guess there are multiple factors here – awareness of cloud efficiency and transparency, fewer concerns about cloud security, and improved speed and stability of internet connections. If you are not following my blog, you can catch up on my older blog articles about cloud PDM […]


3 PLM deployment and adoption challenges

January 9, 2015

Yesterday's discussion about the demand for PLM services made me think more about challenges in PLM deployments. Implementation is one of the most time-consuming components of enterprise software. The engineering, manufacturing and product lifecycle domain presents significant difficulties for software vendors and service providers. It is very rare to see fast implementations. In today’s […]


The demand for PLM services

January 8, 2015

Services are an important part of every PLM implementation. The news article – Kalypso and GoEngineer form strategic partnership – caught my attention. I found it interesting, especially the following passage: “The Kalypso-GoEngineer partnership enables both firms to scale our businesses to better serve the growing demand for PLM services and software,” said George Young, CEO of […]


What stops manufacturing from entering into DaaS bright future?

January 7, 2015

There are a lot of changes in the manufacturing ecosystem these days. You have probably heard about many of them. Changes are coming as a result of many factors – physical production environment, IP ownership, cloud IT infrastructure, connected products, changes in the demand model and mass customization. The last one is interesting. The time when manufacturing was presented […]


PLM and Entire System Common Data Model

January 5, 2015

Products are getting more complex these days. There are multiple reasons for that. The addition of connectivity and software is one of the most visible reasons why it happens. Think about any simple gadget these days that costs less than $99 in the US. It includes mechanical components, electrical parts and software. In addition to that, products […]


How PLM can use multiple mobile apps in a single screen

December 30, 2014

One size doesn’t fit all. We know that. Engineers use multiple tools. Shifting context is complex. For years, CAD and PLM companies have been trying to create a single UI, system or application. Despite all these efforts, integrating multiple applications or tasks remains a very challenging requirement. Going back 15-20 years, […]


When BOM is not BOM

December 17, 2014

The Bill of Materials (BOM) is a central part of everything in product development. Sometimes, people call it the product structure. Manufacturers use a BOM to define the list of raw materials, parts and sub-assemblies, with corresponding quantities, needed to manufacture a product. This is an oversimplified definition. As usual, the devil is in the details and the BOM story gets […]
