Posts tagged as:

PLM

PLM: from sync to link

by Oleg on October 17, 2014


Data has an important place in our life. Shopping lists, calendars, emails, websites, family photos, trip videos, documents, etc. We want our data to be well organized and easy to find. Marketing folks like to use the term "data at your fingertips." However, the reality is just the opposite. Data is messy. We store it in multiple places, we forget the names of documents and we can hardly control it.

Everything I said above applies to manufacturing companies too. But there it gets even more complicated. Departments, contractors, suppliers, multiple locations and multiple systems. So, data lives in silos – databases, network drives, multiple enterprise systems. In my article PLM One Big Silo, I talked about organizational and application silos. The data landscape in every manufacturing company is very complex. Software vendors are trying to crush silos by introducing large platforms that can help integrate and connect information. It takes time and a huge cost to implement such a system in a real-world organization, which makes it almost a dream for many companies.

In my view, openness will play a key role in the ability of systems to integrate and interconnect. It will help to get access to information across the silos, and it leads to one of the key problems – data sharing and identity. Managing data in silos is a complex task. It takes time to organize data, figure out how to interconnect it, organize data reporting and support data consistency. I covered it in more detail in my PLM implementations: nuts and bolts of data silos article.

Joe Barkai's article Design Reuse: Reusing vs. Cloning and Owning speaks about the problem of data reuse. In my view, the data reuse problem is real and connected directly to the issue of data silos. I liked the following passage from Joe's article:

If commonly used and shared parts and subsystems carry separate identities, then the ability to share lifecycle information across products and with suppliers is highly diminished, especially when products are in different phases of their lifecycle. In fact, the value of knowledge sharing can be greater when it's done out of sync with lifecycle phase. Imagine, for example, the value of knowing the manufacturing ramp up experience of a subsystem and the engineering change orders (ECOs) that have been implemented to correct them before a new design is frozen. In an organization that practices "cloning and owning", it's highly unlikely that this kind of knowledge is common knowledge and available outside that product line.

An effective design reuse strategy must be built upon a centralized repository of reusable objects. Each object—a part, a design, a best practice—should be associated with its lifecycle experience: quality reports, ECOs, supplier incoming inspections, reliability, warranty claims, and all other representations of organizational knowledge that is conducive and critical to making better design, manufacturing and service related decisions.

Unfortunately, the way most companies and software vendors solve this problem today is just data sync. Yes, data is synced between multiple systems. Brutally. Without thinking twice. In the race to control information, software vendors and implementing companies are batch-syncing data between multiple databases and applications. Parts, bills of materials, documents, specifications, etc. Data moves from engineering applications to manufacturing databases and back. Specifications and design information are synced between OEM-controlled databases and suppliers' systems. This data synchronization leads to a lot of inefficiency and complexity.

There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together rather than synchronize it between applications and databases. This is not a simple task. An industry that for years took "sync" as the universal way to solve the problem of data integration cannot shift overnight and work differently. But here is the good news. Over the last two decades, web companies have accumulated a lot of experience managing huge volumes of interconnected data. The move towards cloud services is creating an opportunity to work with data differently. It will provide new technologies for data integration and data management. It can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help to reuse data between applications.
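The difference between "sync" and "link" can be sketched in a few lines of code. This is a hypothetical illustration, not any vendor's API – the names (Part record, plm_db, erp_db, the plm:// URI scheme) are all mine. The "sync" approach copies a part record into each consuming system, where it goes stale; the "link" approach stores only a stable identifier that resolves to the single source record at read time.

```python
# Hypothetical sketch: "sync" vs "link" for a single part record.
# All names and the plm:// URI scheme are illustrative assumptions.

plm_db = {"PRT-100": {"rev": "B", "desc": "Bracket, steel"}}

# Sync: copy the record into the consuming system.
# The copy goes stale as soon as the source changes.
erp_db_synced = {"PRT-100": dict(plm_db["PRT-100"])}

# Link: store only a reference; resolve it at read time,
# so the consumer always sees the current source record.
erp_db_linked = {"PRT-100": "plm://parts/PRT-100"}

def resolve(link: str) -> dict:
    """Resolve a part link back to the source record in the PLM system."""
    part_id = link.rsplit("/", 1)[-1]
    return plm_db[part_id]

# An engineering change bumps the source revision...
plm_db["PRT-100"]["rev"] = "C"

print(erp_db_synced["PRT-100"]["rev"])           # → B (stale copy)
print(resolve(erp_db_linked["PRT-100"])["rev"])  # → C (current)
```

The toy example shows the core trade-off: a link gives every consumer the current record and a single identity, at the cost of requiring the source system to be reachable when the link is resolved.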

What is my conclusion? There is an opportunity to move from syncing data to linking data. It will simplify data management and help with data reuse. It requires a conceptual rethink of how the problems of data integration are solved between vendors. By providing a "link to data" instead of actually "syncing data," we can help companies streamline processes and improve product quality. Just my thoughts…

Best, Oleg



It has been more than two years since I reviewed Kenesto – an outfit founded by Mike Payne with a strong vision to simplify process management. Navigate to the following article, PLM, Kenesto and process experience, to refresh your memory.

Steve Bodnar of Kenesto commented on my blog about Google Drive and 3rd party apps, with hints about some Kenesto functionality around file synchronization and cloud data management. It was a good alert that Kenesto was preparing a refresh. The following Kenesto press release caught my attention yesterday – Kenesto Extends Engineering Collaboration with New Vaulting and State-of-the-art Desktop File Synchronization. I found it interesting, since it moves Kenesto from a process management cloud tool into something bigger – data management and vaulting. Back in 2012, I thought that the ability to handle engineering data was a big differentiation between a traditional PLM system and a cloud process management tool like Kenesto. The following passage from the Kenesto press release gives a short description of the shift Kenesto made – it moved into the data and file management space.

Kenesto today announced the full availability of its latest innovations – file vaulting and a pioneering file synchronization service – to enable mainstream design and engineering firms to more easily and effectively collaborate and manage their data. Kenesto’s latest capabilities also work well in conjunction with such design tools as Creo®, SolidEdge®, SolidWorks®, and Spaceclaim® for manufacturing customers and also Revit® for AEC customers, to enable file management and sharing across design workflows. This is all done while also ensuring proper handling of updates to component and assembly models connected to items and bills-of-material, for example.

I took a trip to the Kenesto website. It presents a broad range of solutions – engineering design management, change management, procurement and supplier collaboration, program and project management. These are the traditional PLM suspects. However, some of the solutions are clearly outside of the typical PLM domain – marketing program management, PR and advertising, idea management.

Kenesto's features cover a wide range of capabilities – projects, dashboards, reporting, document management, vaulting, web viewing, workflow and task management. Enterprise-class File Synchronization caught my special attention. This is an interesting feature, and it made me think about cloud PDM functionality and cloud file sharing. My blog – Cloud PDM ban lifted. What next? – speaks about the growing interest of PLM and other vendors in applying cloud technologies to PDM, a space that traditionally tried to avoid the cloud touch. So, Kenesto just joined the crowd of cloud PDM vendors, and I need to add Kenesto to the list of companies open for cloud PDM competition.


What is my conclusion? It looks like Kenesto decided to change the trajectory of its technologies and moved from process and workflow management to a full scope of product data management and lifecycle solutions. I guess Kenesto prefers not to use the traditional PDM and PLM buzzwords. However, the Engineering Data Management (EDM) acronym made me feel a bit nostalgic… At the same time, cloud sync and in-browser office file editing tools can provide an interesting differentiation in the cloud era. Just my thoughts…

Best, Oleg

Disclaimer: Kenesto didn’t sponsor and didn’t influence content of this blog post.



Companies are moving to the cloud these days. The question vendors and customers are asking today is how to move to the cloud. I asked this question in my post a few months ago – PLM / PDM: Why the cloud? Wrong question… I discovered multiple options for customers to start their move to the cloud – adopting mainstream cloud productivity tools to share data and collaborate, migrating existing PLM platforms to the cloud using IaaS strategies, and building new types of platforms and tools on new cloud platforms and infrastructure.

Today, I want to show the perspective on public cloud from both sides – a large provider of public cloud infrastructure (Google) and a large manufacturing company (GE) – and to see where their strategies intersect.

Google – example of public cloud platform

The Google presentation The next generation of Cloud caught my attention. Navigate your browser to the following link to watch it. Besides the fact that it was inspired by the exact same question – "How do you move to the cloud" – it provided very interesting insight into the Google public cloud platform.


Hardware costs are declining, and Google is adjusting its public cloud to match economic realities. Together with economies of scale and utilization, I can see a trajectory towards public cloud costs decreasing even more in the future.

Large manufacturers move to the cloud

So, what are customers thinking about public cloud? InfoWorld just published an article presenting GE's strategy to go all-in with public cloud. Presented as an interview with GE COO Chris Drumgoole, the article outlines his aggressive plans to migrate to public cloud services — and how they support GE's organizational goals. Read the article and draw your own conclusions. Here is my favorite passage:

Drumgoole won’t talk specific numbers, but he claims that “north of 90 percent” of the apps deployed by GE this year have been in a public cloud environment. We’re big fans of the idea that everything ends up in the public cloud utility model eventually. “Eventually” is the big caveat, because some people within GE would argue that should be tomorrow, while others would tell you it’s 15 years from now. It’s a subject of good debate. But either way, the regulatory environment we live in right now prohibits it. In a lot of spaces, when we say technically that we think something should be public, and we’re comfortable with it being public, the regulatory environment and the regulators aren’t quite there yet and we end up having to do some sort of private or hybrid cloud. That’s probably one of the biggest barriers to us moving more public.

Drumgoole speaks about connected devices, big data and analytics as significant drivers to move data to the cloud. It reminded me of one of my previous posts – IoT data will blow up traditional PLM databases (http://beyondplm.com/2014/09/23/iot-data-will-blow-up-traditional-plm-databases/). The amount of data is huge, and it will certainly require a new approach to data management. Here is an example of how much data a jet engine produces these days:

Take one of the jet engines we make, and if it’s fully instrumented. On a typical flight, it’s going to generate about two terabytes of data. Not everybody fully instruments them, but if you instrument it the way people would like in order to get predictive data, you’re talking about 500GB per engine per flight. A flight with a GE engine takes off or lands every three seconds. All of a sudden, the data gets very, very large very, very fast.
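The quoted numbers get very large very fast indeed. A back-of-envelope calculation, using my own simplifying assumptions (every flight has exactly one takeoff and one landing, and every flight is instrumented at the 500 GB level mentioned in the quote):

```python
# Rough estimate of daily data volume from the GE quote.
# Assumptions are mine, not GE's: one takeoff + one landing per flight,
# and 500 GB of instrumentation data per flight.

SECONDS_PER_DAY = 24 * 60 * 60   # 86,400 seconds
EVENT_INTERVAL_S = 3             # a takeoff or landing every 3 seconds
GB_PER_FLIGHT = 500

events_per_day = SECONDS_PER_DAY // EVENT_INTERVAL_S  # takeoffs + landings
flights_per_day = events_per_day // 2                 # two events per flight
petabytes_per_day = flights_per_day * GB_PER_FLIGHT / 1_000_000

print(f"{flights_per_day} flights/day, about {petabytes_per_day:.1f} PB/day")
# → 14400 flights/day, about 7.2 PB/day
```

Even under these conservative assumptions, the fleet would generate on the order of petabytes per day – well beyond what a traditional PLM database is designed to ingest.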

PLM vendors and public cloud

As of today, I'm not aware of any PDM/PLM software using Google Cloud as a platform. The majority of cloud PLM software is built on top of infrastructure provided by collocated hosting services and a variety of Amazon cloud infrastructure. Dassault Systèmes and Siemens PLM have made a few public statements about supporting a diverse set of cloud options and IaaS infrastructure. It would be interesting to see the future evolution of PLM cloud platforms.

What is my conclusion? The technology and economics of cloud are changing these days. My hunch is that this will pull more vendors and companies to the public cloud in the next few years. Software companies will try to balance leveraging technological platforms against cost. At the same time, customers will try to balance regulatory requirements against the opportunity to make data accessible and scale without limits. An interesting time and a significant opportunity. Just my thoughts…

Best, Oleg


How to build online community around CAD/PLM software?

October 13, 2014

There is one thing that seems to make everyone interested and listen carefully these days – online communities. To build a successful community is a tricky thing. To make money out of a community is huge. Successful online communities can provide a lot of insight about how people are communicating, what is the value of […]


Importance of PLM and PIM bridge

October 11, 2014

PIM. Product Information Management. Sorry for bringing yet another three-letter acronym into the discussion today. PIM stands for a discipline to manage data about products available outside of the company. Here is the Wikipedia description: Product information management or PIM refers to processes and technologies focused on centrally managing information about products, with a focus on […]


MBOM collaboration and cost of change

October 9, 2014

The only thing that is constant is change. This very much applies to everything we do around BOM. The engineering and manufacturing ecosystem is full of jokes about engineering changes. You may have heard about renaming "engineering change order" into "engineering mistake order" as well as the correlation between the number of engineers and the number of ECOs […]


Manufacturing BOM dilemma

October 8, 2014

Manufacturing process optimization is one of the biggest challenges in product development these days. Companies are looking at how to lower cost, optimize the manufacturing process for speed and deliver a large variety of product configurations. The demand for these improvements is very high. The time when engineering threw designs "over the wall of engineering" is over. […]


The future role of voice in PLM processes

October 7, 2014

Processes and workflows are a big topic in PLM. If you think about PLM as a way to manage the full scope of product development processes in an organization, workflow is a foundational part of the technologies and tools to implement that. The definition of a PLM process is usually complex and can come as workflow or rule-based. You […]


PLM and growing community of fabless manufacturers

October 6, 2014

There are so many interesting trends to watch these days in manufacturing. I've been blogging about Kickstarter projects and manufacturing startups. Another interesting topic to speak about is so-called "fabless manufacturing". A bit of history: fabless manufacturing is not a new thing. Navigate to the following Wikipedia article about Fabless Manufacturing to refresh your knowledge. Fabless manufacturing's roots are […]


How cloud pricing war will affect PLM?

October 3, 2014

Large infrastructure cloud providers are slashing prices. The TechCrunch article Nobody Can Win The Cloud Pricing Wars provides some additional details about the situation. The same article speaks about the moment when CIOs won't be able to ignore the pricing advantage: Earlier this week, Google lowered prices 10 percent across the board on their Google Compute […]
