Posts tagged as:

Interoperability

Data Interoperability, IoT and PLM

One of the heavily debated topics in the CAD/PLM industry is data interoperability. I remember discussions about data interoperability and standards 10-15 years ago. Even though vendors made some progress in establishing independent data formats, the problem is still here. At the same time, I’m convinced that successful interoperability will play one of the key roles in the future of CAD/PLM. Navigate your browser to my earlier article for a few examples showing how important data interoperability is for building the granular architecture of future applications and collaboration.

IoT (Internet of Things) is a relatively new trend that we only started to discuss recently. Applications of IoT bring lots of interesting opportunities in many domains – smart houses, connected devices, infrastructure operations and many others. However, here is the thing – we can see many companies looking for ways to get into the IoT field. By nature, this field is very diverse. I can hardly imagine a single manufacturer supplying everything you need for your “smart house”. So, we are getting (again) into the problem of interoperability between devices, services and processes.

The Forbes article Digital Business Technologies Dominate Gartner 2014 Emerging Technologies Hype Cycle speaks about several business and technology trends, and IoT is one of them. The article points to the problem of data interoperability as the one that will most impact future progress in IoT. Here is the passage I captured:

What will slow rapid adoption of IoT? Standardization, including data standards, wireless protocols and technologies. A wide number of consortiums, standards bodies, associations and government/region policies around the globe are tackling the standards issues. Ironically, with so many entities each working on their own interests, we expect the lack of standards to remain a problem over the next three to five years. In contrast, dropping costs of technology, a larger selection of IoT-capable technology vendors and the ease of experimenting continue to push trials, business cases and implementations of IoT forward.

It made me think about two issues. The first is that the problem of standardization and data interoperability can only be solved by the business interests of vendors. In the absence of mutual business interests we will see dumb devices that cannot interconnect and exchange data, and the value of IoT solutions will suffer. The second issue is related to PLM vendors consuming data from multiple devices and services to improve decision making. Standardization in that field can provide an advantage and represents a solid business interest for vendors.
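To make the interoperability gap between devices more concrete, here is a minimal, hypothetical sketch in Python (vendor names, payload fields and units are made up) of what a consumer of “smart house” data has to do when two sensors report the same measurement in different schemas:

# Hypothetical illustration of the IoT data interoperability problem:
# two sensors report the same reading with different field names and
# units, so the consumer has to normalize them by hand.

RAW_READINGS = [
    {"vendor": "A", "payload": {"temp_c": 21.5, "room": "kitchen"}},
    {"vendor": "B", "payload": {"temperatureF": 70.7, "location": "kitchen"}},
]

def normalize(reading: dict) -> dict:
    """Map vendor-specific payloads to one common schema (made-up mapping)."""
    payload = reading["payload"]
    if reading["vendor"] == "A":
        return {"room": payload["room"], "celsius": payload["temp_c"]}
    if reading["vendor"] == "B":
        return {"room": payload["location"],
                "celsius": (payload["temperatureF"] - 32) * 5 / 9}
    raise ValueError(f"Unknown vendor: {reading['vendor']}")

for reading in RAW_READINGS:
    print(normalize(reading))

Without an agreed standard, every new device type forces this kind of vendor-specific mapping code, which is exactly the cost the Gartner passage above describes.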

What is my conclusion? An entire new IoT industry is under development these days. Data interoperability is a problem that needs to be resolved sooner rather than later. The roots of data interoperability problems are usually related to the hidden business interests of vendors. Learning from the previous mistakes of the CAD/PLM industry can help. CAD/PLM vendors can provide tools that help manufacturing companies build better connected devices. Just my thoughts…

Best, Oleg


Data. Conversion. Interoperability. Translation. The discussion about these topics is endless in the CAD/PLM world. Customers are looking for interoperability between different product versions, competitive products, data models, data formats, databases and geometrical kernels. Customers have always been the first to be impacted by interoperability problems. The lifecycle of engineering and manufacturing work is longer than the typical lifecycle of a product version or even an engineering IT solution. Technically, data interoperability is a complex problem. It is not easy to solve, even if you want to do so. Evan Yares recently posted an interesting article about interoperability – CAD Interoperability today. Interoperability plays an important role in product lifecycle applications in large OEMs and supply chains.

Until now, the perception was that customers are the ones most impacted by data interoperability problems. That was true until very recently. However, I can see some new trends and changes in this space. Consumerization, BYOD and cloud trends are introducing new elements and aspects into product development roadmaps. CAD/PLM vendors are forced to think about cloud and mobile development as well as potential disruptive competition coming from newcomers and other vendors. New design applications are becoming more granular, focusing on specific functionality or target customers. Two examples of recent announcements are Autodesk Fusion 360 and SolidWorks Mechanical Conceptual. These applications were born to co-exist with older products. Existing products won’t retire tomorrow. The ability to re-use data with existing product lines such as Inventor (for Autodesk), SolidWorks (for Dassault) and other CAD packages will be vital for the success of the new products.

I’ve been reading the GraphicSpeak article SolidWorks Mechanical Conceptual introduced but not delivered earlier today. Randall Newton is talking about the product SolidWorks Mechanical Conceptual (SWMC), announced by SolidWorks during SolidWorks World 2013 in Orlando last week. SWMC is built on top of the Dassault 3DEXPERIENCE platform. I found the following passage interesting:

Reading between the lines, so to speak, of what was said at SolidWorks World, it seems two critical challenges remain before SWMC will be a selling product. It must prove to be fully and seamlessly interoperable with SolidWorks, and it must be more cloud-based. Interoperability has always been a significant challenge in the 3D CAD industry. 3D kernels are complicated. Dassault’s 3D Experience platform uses the CGM 3D kernel; SolidWorks uses the Parasolid 3D kernel from Dassault’s rival Siemens PLM. Completely accurate automated moving of files from Catia V5 and V6 is not commonly possible, and they share the same 3D kernel. Most of us can only imagine the complexity of moving between CGM and Parasolid.

Granularity is one of the most trending topics these days. Everybody is thinking about apps. Companies are moving away from developing heavy and complex product suites towards granular applications. Al Dean of Develop3D wrote an interesting article about granularity a few years ago – Why granularity is going to rock your future… This is my favorite passage:

There are two things that might influence this and push us into further levels of explicit detail and granularity. The first is the ‘cloud’ (yes, I broke my own rules). When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of. Consider what would happen when you start to work on today’s products, in a highly collaborative environment, where data is being passed globally, between teams, between languages, between professional disciplines. And you still need to track data down to this type of level. And when you’re working on a product that looks like an X-Ray image.
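To illustrate the idea of finite “packets of change”, here is a minimal, hypothetical sketch in Python (the names and fields are made up and do not represent any vendor’s actual data model) of a granular change record: instead of exchanging whole files, each edit to a feature or sketch entity travels as its own small, addressable record that can be tracked, synced and merged.

from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any
import uuid

@dataclass
class ChangePacket:
    entity_id: str       # the feature or sketch entity being modified
    property_name: str   # which property of that entity changed
    old_value: Any
    new_value: Any
    author: str
    packet_id: str = field(default_factory=lambda: str(uuid.uuid4()))
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

# Example: a single fillet-radius edit travels as one small packet,
# not as a re-uploaded assembly file.
packet = ChangePacket(
    entity_id="fillet-42",
    property_name="radius_mm",
    old_value=2.0,
    new_value=3.5,
    author="oleg",
)
print(packet.packet_id, packet.entity_id, packet.new_value)

Tracking data at this level is what makes cloud collaboration between teams and disciplines possible, and it is also where interoperability between granular apps becomes hard.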

What is my conclusion? I agree with Al Dean. We will see more granularity in data and new applications. Interoperability becomes a very important factor in the future success of new apps. A new level of data compatibility is required. Vendors will be forced to improve the interoperability of their existing products as well as of new apps. Interesting times and changes are happening these days. Vendors need to take note; this is important. Just my thoughts…

Best, Oleg


PLM, Interoperability and Cloud

by Oleg on July 28, 2010

I read Microsoft Talked Open Data, Open Cloud and watched Microsoft’s Jean Paoli video. It seems to me that Microsoft is taking seriously the topic of heterogeneous development environments, IT, cloud and data together.

Some of the ways in which Microsoft demonstrates its commitment to these four elements include:

  • Support for and participation in standards organizations such as OASIS, W3C, ECMA, ISO and OpenID.
  • Support for data portability using OData.
  • Support for multiple languages including .NET, Java, PHP and Ruby when developing cloud applications.
  • The ability to move data in and out of SQL Azure, Windows Azure Storage and Live Contacts using an API.
  • The ability to manage identity across on-premise and cloud based applications.
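As a small illustration of the data portability point above, here is a hedged sketch in Python of pulling records from an OData service over plain HTTP. The endpoint URL and entity fields are hypothetical, it relies on the third-party requests library, and the exact JSON envelope depends on the OData version (older services wrap results in a “d” object rather than “value”):

import requests  # third-party HTTP client (pip install requests)

# Hypothetical OData endpoint and entity set -- not a real Microsoft URL.
SERVICE_URL = "https://example.com/odata/Parts"

# OData exposes data over plain HTTP; $filter, $select and $format are
# standard OData query options, so any client can page through the data.
response = requests.get(
    SERVICE_URL,
    params={
        "$filter": "Status eq 'Released'",
        "$select": "PartNumber,Revision,Description",
        "$format": "json",
    },
    timeout=30,
)
response.raise_for_status()

# OData v4 returns results under "value"; older services nest them
# under a "d" object instead.
for part in response.json().get("value", []):
    print(part["PartNumber"], part["Revision"])

The point is not the specific protocol, but that data stored in a cloud service stays reachable through an open, documented interface rather than a proprietary file format.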

Paoli’s talk and the Microsoft presentation made me think again about PLM interoperability. Interoperability is a long-standing and very painful issue in the CAD and PLM industry. I think that with the introduction of cloud platforms we may see a renaissance of the interoperability discussions.

What is my take? One important statement made by Microsoft is “Data belongs to users”. You can take it to the cloud and retrieve it back. The company owns the data. I think this is an important statement, and the PLM industry needs to think about it. Just my thoughts…

Best, Oleg


Product Data Formats for the 21st Century

July 21, 2010

Data formats are an interesting topic in the context of engineering and manufacturing. A manufacturing company relies on a significant amount of information that resides across the organization. I had a chance to write about this in the past – 3D CAD Future: How To Liberate Data? I think the topic is actually much wider than 3D […]


Open vs. Closed PLM Debates

June 24, 2010

I read the Fortune CNN Money blog article by Jon Fortt – Chrysler’s Engineering Software Shift. In the competitive world of PLM software it again raises the question of what is the better choice – open or closed? The context of this article is leaked information about Chrysler’s move from CATIA to NX or, maybe more […]
