PLM and Microsoft Azure Cloud In A Box

by Oleg on October 22, 2014 · 0 comments

How do you move to the cloud? This is one of the topics I've been discussing on my blog over the last year. Last time, I took a swing at the public cloud. Navigate to my PLM vendors, large manufacturers and public cloud article for more information. However, not everybody will move to the public cloud. At least not very soon.

For those who are looking for alternatives, especially in the private cloud zone, the latest update from Microsoft can be very good news. Navigate to the Business Insider blog – Microsoft's Satya Nadella Just Fired A Shot At HP And IBM. Microsoft turned to Dell to create a new computer server. Here is the passage that provides more info:

The new computer is called the “Microsoft Cloud Platform System” and it will be a mini-version of Microsoft’s cloud, Azure, that enterprises can install in their own data centers. By using this server, enterprises can easily move applications from their own private data center to Microsoft’s cloud and back again. (In geek speak, this is called “hybrid computing”.)

Some more details came from the CMSWire blog earlier today – Take a Seat Google, Amazon: Microsoft's Cloud Wins the Day. So what is the Microsoft Azure cloud in a box? Here is the definition of the "Box":

...new Azure-like appliance that Enterprises can deploy in their own data centers. It has been designed specifically to handle big data workloads (32 cores, 450 gigabytes of RAM and 6.5 terabytes of local solid-state drive storage). Officially named the Microsoft Cloud Platform System (CPS), powered by Dell it is, in essence, an “Azure consistent cloud in a box” with pre-integrated hardware from Dell and software from Microsoft.

I captured the following architecture shot from the WinITPro article:

It made me think about the potential impact and opportunity for PLM vendors. For most of them, alignment with Microsoft can be very beneficial. If Microsoft does the hard work and promotes its Cloud Platform System to CIOs of large enterprise companies, PLM can be the icing on the cake. So, on the surface it all looks good, especially for PLM vendors fully aligned with the Microsoft software stack. I guess Microsoft partnership programs can provide some additional benefits too.

The issue I'd like to question is related to the data layer. Most large PLM deployments today run on top of an Oracle database. Oracle has its own cloud plans – Oracle Cloud PaaS will provide a magic button for PLM. The availability of Oracle DB as part of the Azure Cloud Platform System is questionable and can become an obstacle to moving PLM systems to Azure.

What is my conclusion? The devil is in the details. This is the best way to describe the status of cloud PLM software architecture today. PLM vendors are developing their own cloud strategies. Manufacturing companies are looking for the easiest path to the cloud. We will see some interesting moves from both sides. A good time for PLM architects and tech advisers. Just my thoughts…

Best, Oleg


PLM Files Detox

by Oleg on October 21, 2014 · 0 comments

The digital life around us is changing. There was a time when everything we did revolved around the desktop computer. You do your job, Save As… and, yes(!), put it in a file that gives you control over the result of your work. That's the reason engineers are in love with CAD files and Excel spreadsheets – they give them full control over what they do. Excel files get messy over time, but we can always start a new file or open a new spreadsheet.

Rob Cohee of Autodesk reminded me how much engineers are in love with files in his LinkedIn article – My Name is Rob, and I'm Addicted to Files. I captured a few passages from Rob's article below. He brilliantly explains the full engineering enjoyment of control over design and related information.

It started out small with a .DWG here, a .DOC, there with a sprinkle of .XLS files in between.

I had the freedom to create all this data, and the power is nothing short of addicting. Critical design requirements, tolerance, specification, and performance requirements, assembly instructions, a digital folder of file after file containing all of this critical information. I was the Michelangelo of AutoCAD R13 C4, the DWG was my canvas, safety was my muse.

The drawing file became everything. It was my design, requirements document, revision control, my parts list, my BOM, my supplier and procurement instructions, my cut list, my everything. All that data, all in one place locked away in my CAD file that only I had access to make modifications. The control was dizzying, euphoric at times. Any change to the drawing file had to go through me and me alone.

Rob's article reminded me of some of my old posts – The future of CAD without files. I still very much like a diagram I placed there from the O'Reilly Radar article – Why files need to die. Here is my conclusion back in 2011:

The fundamentals of CAD and design systems are files. We use them to store assemblies, parts and drawings. In addition to that, we use them as references in many places. Do you think the "file" paradigm will live with CAD and other design systems forever? The movement of CAD vendors seems to me the obvious application of modern web principles to the world of design and engineering. The initial signals are here. CATIA V6 pushed the limits and eliminated files by connecting the CATIA system directly to the Enovia back-end. Autodesk cloud experiments with systems like AutoCAD WS made the existence of files on the disk obsolete. PTC introduced Creo Apps. It will be interesting to see if PTC will come up with the future idea of eliminating files. I think the computing and information paradigms are shifting from file-oriented to data (and web) oriented. The initial signs are here. The speed of this movement is questionable. Manufacturing is a slow-changing environment and engineers are very reluctant to change.

PDM (Product Data Management) was a solution to end the CAD file mess. PDM systems came to hunt CAD and other files. The intent was to bring files into order, manage revisions, share data and… after some time, eliminate files. We can see this starting to happen now in some high-end systems such as CATIA V6. So why did PDM fail to detox engineers from files? Here is the thing… PDM was invented to help engineers manage and control data. It sounds like engineers should like PDM, since it helps them control files. But it didn't go according to plan. PDM added "friction" to engineers' freedom to create data the way they want: name control, check-in/check-out, approvals, etc. As a result, PDM failed to become a friend and turned into an engineer's nightmare. Engineers don't like PDM, and in many situations they were forced to use it.

The working environment is changing fast. We are getting disconnected from files in our digital life. Our everyday workflows are getting distributed, mobile, disconnected from desktops and… files. We want access to data, not to files. To make this process successful, we need to think about how to remove friction. When you go to engineering school, you learn about the importance of friction. But software is different, especially these days. Friction can slow down the process of software adoption.

What is my conclusion? Engineering and manufacturing is a slow-changing environment. Engineers are conservative and design-minded. Therefore, many PLM tools failed to become favorite engineering data management and collaboration tools. Large teams accepted PDM tools because they had no choice. I believe the future won't belong to files. We are going to see a more data-driven environment around us. Establishing such an environment is one of the main challenges for PLM companies today. To make it happen, PLM vendors must think about how to remove the friction between users and PLM tools. Just my thoughts…

Best, Oleg


PLM: from sync to link

by Oleg on October 17, 2014 · 6 comments

Data has an important place in our life. Shopping lists, calendars, emails, websites, family photos, trip videos, documents, etc. We want our data to be well organized and easy to find. Marketing folks like to use the term "data at your fingertips". However, the reality is just the opposite. Data is messy. We store it in multiple places, we forget the names of documents and we can hardly control it.

Everything I said above applies to manufacturing companies too. But there it gets even more complicated. Departments, contractors, suppliers, multiple locations and multiple systems. So data lives in silos – databases, network drives, multiple enterprise systems. In my article – PLM One Big Silo, I've been talking about organizational and application silos. The data landscape in every manufacturing company is very complex. Software vendors are trying to crush silos by introducing large platforms that can help integrate and connect information. It takes time and huge cost to implement such a system in a real-world organization, which makes it almost a dream for many companies.

In my view, openness will play a key role in a system's ability to integrate and interconnect. It will help to get access to information across the silos, and it leads to one of the key problems of data sharing and identity. Managing data in silos is a complex task. It takes time to organize data, to figure out how to interconnect it, to organize data reporting and to support data consistency. I covered this in more detail in my PLM implementations: nuts and bolts of data silos article.

Joe Barkai's article Design Reuse: Reusing vs. Cloning and Owning speaks about the problem of data reuse. In my view, the data reuse problem is real and connected directly to the issue of data silos. I liked the following passage from Joe's article:

If commonly used and shared parts and subsystems carry separate identities, then the ability to share lifecycle information across products and with suppliers is highly diminished, especially when products are in different phases of their lifecycle. In fact, the value of knowledge sharing can be greater when it’s done out of sync with lifecycle phase. Imagine, for example, the value of knowing the manufacturing ramp up experience of a subsystem and the engineering change orders (ECOs) that have been implemented to correct them before a new design is frozen. In an organization that practices “cloning and owning”, it’s highly unlikely that this kind of knowledge is common knowledge and is available outside that product line.

An effective design reuse strategy must be built upon a centralized repository of reusable objects. Each object—a part, a design, a best practice—should be associated with its lifecycle experience: quality reports, ECOs, supplier incoming inspections, reliability, warranty claims, and all other representations of organizational knowledge that is conducive and critical to making better design, manufacturing and service related decisions.

Unfortunately, the way most companies and software vendors solve this problem today is just data sync. Yes, data is synced between multiple systems. Brutally. Without thinking twice. In the race to control information, software vendors and implementing companies are batch-syncing data between multiple databases and applications. Parts, bills of materials, documents, specifications, etc. Data moves from engineering applications to manufacturing databases and back again. Specifications and design information are synced between OEM-controlled databases and suppliers' systems. This data synchronization leads to a lot of inefficiency and complexity.

There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together rather than synchronize it between applications and databases. This is not a simple task. An industry that for years took "sync" as the universal way to solve the problem of data integration cannot shift overnight and work differently. But here is the good news. Over the last two decades, web companies have accumulated a lot of experience managing huge volumes of interconnected data. The move towards cloud services is creating an opportunity to work with data differently. It will provide new technologies for data integration and data management. It can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help reuse data between applications.
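To make the contrast concrete, here is a minimal sketch of the two integration styles. It is purely illustrative – the class names, the `plm://` link scheme and the part records are made up for this example, not taken from any real PLM API.

```python
# Illustrative sketch: "sync" copies a record into every system,
# "link" keeps one authoritative record behind a stable identifier.
import copy

class SyncedSystem:
    """Each system keeps its own copy of a part record (the "sync" style)."""
    def __init__(self):
        self.parts = {}

    def sync_from(self, source, part_id):
        # Batch sync duplicates the record; the copies start to drift
        # as soon as the source changes and nobody re-runs the sync.
        self.parts[part_id] = copy.deepcopy(source.parts[part_id])

class PartRegistry:
    """A shared registry owns the single authoritative record (the "link" style)."""
    def __init__(self):
        self._parts = {}

    def register(self, part_id, record):
        self._parts[part_id] = record
        return f"plm://parts/{part_id}"  # a stable link, not a copy

    def resolve(self, link):
        return self._parts[link.rsplit("/", 1)[-1]]

# Sync style: engineering and manufacturing each hold a copy.
engineering = SyncedSystem()
engineering.parts["bolt-01"] = {"rev": "A", "material": "steel"}
manufacturing = SyncedSystem()
manufacturing.sync_from(engineering, "bolt-01")
engineering.parts["bolt-01"]["rev"] = "B"
# The copies are now inconsistent: manufacturing still sees revision "A".
assert manufacturing.parts["bolt-01"]["rev"] == "A"

# Link style: both sides resolve the same link, so there is nothing to re-sync.
registry = PartRegistry()
link = registry.register("bolt-01", {"rev": "A", "material": "steel"})
registry.resolve(link)["rev"] = "B"
# Every consumer of the link sees the change immediately.
assert registry.resolve(link)["rev"] == "B"
```

The point of the sketch is the failure mode: the synced copy silently goes stale, while the link always resolves to the current record. Real implementations would of course add access control, caching and availability concerns that this toy version ignores.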

What is my conclusion? There is an opportunity to move from syncing data to linking data. It will simplify data management and help with data reuse. It requires a conceptual rethink of how data integration problems are solved between vendors. By providing a "link to data" instead of actually "syncing data", we can help companies streamline processes and improve the quality of products. Just my thoughts…

Best, Oleg


Kenesto revamp: does it change cloud PLM game?

October 16, 2014

It has been more than two years since I reviewed Kenesto – an outfit founded by Mike Payne with a strong vision to simplify process management. Navigate to the following article PLM, Kenesto and process experience to refresh your memory. Steve Bodnar of Kenesto commented on my blog about Google Drive and 3rd […]


PLM vendors, large manufacturers and public cloud

October 14, 2014

Companies are moving to the cloud these days. The question vendors and customers are asking today is how do we move to the cloud. I asked this question in my post a few months ago – PLM / PDM: Why the cloud? Wrong question… I discovered multiple options for customers to start their move to the cloud […]


PLM and growing community of fabless manufacturers

October 6, 2014

There are so many interesting trends to watch these days in manufacturing. I've been blogging about Kickstarter projects and manufacturing startups. Another interesting topic to speak about is so-called "fabless manufacturing". A bit of history. Fabless manufacturing is not a new thing. Navigate to the Wikipedia article about Fabless Manufacturing to refresh your knowledge. Fabless manufacturing roots are […]


How cloud pricing war will affect PLM?

October 3, 2014

Large infrastructure cloud providers are slashing prices. TechCrunch article Nobody Can Win The Cloud Pricing Wars is providing some additional details about the situation. The same article speaks about the moment when CIOs won’t be able to ignore the pricing advantage: Earlier this week, Google lowered prices 10 percent across the board on their Google Compute […]


Google Drive third-party apps and cloud PDM foundation

October 1, 2014

Designers and engineers working in manufacturing, architecture and construction firms are familiar with the idea of the Z-drive. Usually, this is the name of a drive accessible on your local network (LAN). Often the same drive is also available via WAN, but it is not always reliable because of latency, which can make your CAD […]


PLM, IoT platforms and extended lifecycle

September 30, 2014

IoT is a growing buzz these days. Analysts are projecting billions of devices online very soon. The number is impressive, and the IoT eco-system is fueled by newcomers developing a variety of connected devices and related software. Earlier this year, PTC surprised the PLM community with its IoT strategy and the acquisition of ThingWorx. However, ThingWorx […]


Oracle Cloud PaaS will provide a magic button for PLM

September 29, 2014

Cloud PLM architecture and implementations is one of the topics I've been following for the last few years. It is interesting to watch the dynamics of this space, from initial ignorance to careful recognition and marketing buzz. I can see differences in how PLM vendors are approaching the cloud. In my view, nobody is asking the question "why […]
