Interoperability

Data. Conversion. Interoperability. Translation. The discussion about these topics is endless in the CAD/PLM world. Customers are looking for interoperability between different product versions, competitive products, data models, data formats, databases and geometric kernels. Customers have always been the first to be impacted by interoperability problems. The lifecycle of engineering and manufacturing work is longer than the typical lifecycle of a product version or even an engineering IT solution. Technically, data interoperability is a complex problem. It is not easy to solve, even if you want to do so. Evan Yares recently posted an interesting article about interoperability – CAD Interoperability today. Interoperability plays an important role in product lifecycle applications in large OEMs and the supply chain.

Until now, the perception was that customers are the ones most impacted by data interoperability problems. That was true until very recently. However, I can see some new trends and changes in this space. Consumerization, BYOD and cloud trends are introducing new elements and aspects into product development roadmaps. CAD/PLM vendors are forced to think about cloud and mobile development as well as potential disruptive competition coming from newcomers and other vendors. New design applications are becoming more granular, focusing on specific functionality or target customers. Two examples from recent announcements are Autodesk Fusion 360 and SolidWorks Mechanical Conceptual. These applications were born to co-exist with old products. Existing products won’t retire tomorrow. The ability to re-use data with existing product lines such as Inventor (for Autodesk) and SolidWorks (for Dassault) and other CAD packages will be vital for the success of the new products. I was reading the GraphicSpeak article – SolidWorks Mechanical Conceptual introduced but not delivered – earlier today. Randall Newton is talking about the product SolidWorks Mechanical Conceptual (SWMC) announced by SolidWorks during SolidWorks World 2013 in Orlando last week. SWMC is built on top of the Dassault 3DEXPERIENCE platform. I found the following passage interesting:

Reading between the lines, so to speak, of what was said at SolidWorks World, it seems two critical challenges remain before SWMC will be a selling product. It must prove to be fully and seamlessly interoperable with SolidWorks, and it must be more cloud-based. Interoperability has always been a significant challenge in the 3D CAD industry. 3D kernels are complicated. Dassault’s 3D Experience platform uses the CGM 3D kernel; SolidWorks uses the Parasolid 3D kernel from Dassault’s rival Siemens PLM. Completely accurate automated moving of files from Catia V5 and V6 is not commonly possible, and they share the same 3D kernel. Most of us can only imagine the complexity of moving between CGM and Parasolid.

Granularity is one of the most trending topics these days. Everybody is thinking about apps. Companies are moving away from developing heavy and complex product suites towards granular applications. Al Dean of Develop3D wrote an interesting article about granularity a few years ago – Why granularity is going to rock your future… This is my favorite passage:

There are two things that might influence this and push us into further levels of explicit detail and granularity. The first is the ‘cloud’ (yes, I broke my own rules). When you’re working on a system that’s remotely located on a server, whether that’s over your internal network or across the wider web, you’ll need to manage and exchange finite packets of information, features, sketch entities and such. To do that, you need to manage and keep track of those individual parcels of data and packets of change. That’s going to require a level of granularity that’s way beyond what most data management systems are currently capable of. Consider what would happen when you start to work on today’s products, in a highly collaborative environment, where data is being passed globally, between teams, between languages, between professional disciplines. And you still need to track data down to this type of level. And when you’re working on a product that looks like an X-Ray image.
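
To make the idea concrete, here is a minimal sketch in Python of what one of those granular “packets of change” might look like – a single edit to a single sketch entity, rather than a whole-file save. All field names here are hypothetical, not taken from any real CAD system:

```python
import json
import uuid
from datetime import datetime, timezone

# Hypothetical granular change packet: one edit to one sketch entity.
# Only the delta travels over the network, not the whole model file.
change_packet = {
    "packet_id": str(uuid.uuid4()),                    # unique id for this change
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "author": "engineer@example.com",
    "entity": {                                        # the individual parcel of data
        "type": "sketch_line",
        "id": "line-042",
        "parent": "sketch-007",
    },
    "operation": "modify",
    "delta": {"end_point": {"x": 120.0, "y": 45.5}},   # only what changed
}

print(json.dumps(change_packet, indent=2))
```

Tracking change at this level is exactly what a file-oriented data management system struggles with.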

What is my conclusion? I agree with Al Dean. We will see more granularity in data and in new applications. Interoperability becomes a very important factor in the future success of new apps. A new level of data compatibility is required. Vendors will be forced to improve the level of interoperability of their existing products as well as of new apps. Interesting times and changes are happening these days. Vendors need to take note. Just my thoughts…

Best, Oleg


PLM, Interoperability and Cloud

by Oleg on July 28, 2010

I read Microsoft Talked Open Data, Open Cloud and watched Microsoft’s Jean Paoli video. It seems to me that Microsoft is taking the combined topic of heterogeneous development environments, IT, cloud and data seriously.

Some of the ways in which Microsoft demonstrates their commitment to these four elements include:

  • Support for and participation in standards organizations such as OASIS, W3C, ECMA, ISO and OpenID.
  • Support for data portability using OData (see the sketch after this list).
  • Support for multiple languages including .NET, Java, PHP and Ruby when developing cloud applications.
  • The ability to move data in and out of SQL Azure, Windows Azure Storage and Live Contacts using an API.
  • The ability to manage identity across on-premise and cloud based applications.
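
As a small illustration of what OData-style data portability looks like in practice, here is a sketch in Python that queries the public OData reference service (services.odata.org, a demo service – not one of the Microsoft products above). $top and $select are standard OData system query options:

```python
import requests  # pip install requests

# Query the public OData reference service: fetch three products,
# selecting only two fields. Any OData-compliant client can do this
# without knowing anything about the server's internal data model.
url = "https://services.odata.org/V4/OData/OData.svc/Products"
params = {"$top": "3", "$select": "Name,Price"}

resp = requests.get(url, params=params, headers={"Accept": "application/json"})
resp.raise_for_status()
for product in resp.json()["value"]:
    print(product["Name"], product["Price"])
```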

Paoli’s talk and the Microsoft presentation made me think again about PLM interoperability. Interoperability is a long-standing and very painful issue in the CAD and PLM industry. I think, with the introduction of cloud platforms, we may see a renaissance of the interoperability discussion.

What is my take? One important statement was made by Microsoft: “Data belongs to users”. You can take it to the cloud and retrieve it back. The company owns the data. I think this is an important statement. The PLM industry needs to think about it. Just my thoughts…

Best, Oleg


Data formats are an interesting topic in the context of engineering and manufacturing. A manufacturing company relies on a significant amount of information that resides across the organization. I had a chance to write about this in the past – 3D CAD Future: How To Liberate Data? I think the topic is actually much wider than 3D and CAD. Engineers are using multiple applications – CAD, CAE, Office, various database-driven applications. In addition to 3D CAD formats, organizations deal with multiple public and proprietary formats. Some of them are specific to particular applications. However, formats like CSV are generic and used by multiple applications. I read the article Is JSON the CSV of the 21st century by Martin David on the Line.ar th.inking blog. Martin is discussing the prospects for JSON-like formats to expand in the coming decades and replace CSV and similar formats.
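
To illustrate the difference Martin is writing about, here is a small Python sketch showing the same (made-up) part record twice: as flat, positional CSV and as nested, self-describing JSON:

```python
import csv
import io
import json

# The same part record in CSV (flat, columns only) and in JSON
# (nested, self-describing). Field names are illustrative only.
csv_text = "part_number,description,mass_kg\nP-1001,Bracket,0.35\n"
parts = list(csv.DictReader(io.StringIO(csv_text)))

json_text = json.dumps(
    {"parts": [{"part_number": "P-1001",
                "description": "Bracket",
                "mass_kg": 0.35,
                "material": {"name": "Aluminum", "grade": "6061"}}]},
    indent=2,
)
print(parts)
print(json_text)
```

The JSON version can carry structure (the nested material record) that CSV simply has no place for.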

I decided to put down some of my thoughts about the history of product data formats and share some ideas about what may happen in the 21st century.

Technology and Interoperability
The stories of data formats and interoperability very often go together. Some time ago, all application developers saved data into proprietary file formats. Interoperability was very poor. Then the idea of relational databases, invented by Edgar Codd, came along to solve the interoperability problem. Everything was supposed to live in relational tables, and all applications were supposed to use SQL. Nevertheless, multiple proprietary data models caused an interoperability problem again. Later, in the 1990s, XML was introduced as the next magic thing that would solve the problem of interoperability. Since its first introduction in 1996, lots of different XML formats have been developed. Some of them were developed in the CAD, PDM and PLM space. However, the problem of interoperability is still with us.
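
For completeness, here is what that 1990s answer looks like – the same kind of part record expressed as XML (element names are illustrative only) and parsed with Python’s standard library:

```python
import xml.etree.ElementTree as ET

# A part record as XML: self-describing like JSON, but with the
# angle-bracket syntax and attribute/element split of the 1990s.
xml_text = """<part number="P-1001">
  <description>Bracket</description>
  <mass unit="kg">0.35</mass>
</part>"""

root = ET.fromstring(xml_text)
print(root.get("number"), root.find("description").text)
```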

Applications
Product data formats have their origins in the hundreds and thousands of applications developed for the engineering and manufacturing space. From a technological standpoint, I can classify them into three groups: design related, database oriented and office applications.

Design related applications (CAD, CAM and CAE) are shaped by the development of major CAD systems. CAD applications continue to be very protective with regards to their formats. However, the adoption of geometric kernels (Parasolid, ACIS and others) maintains today’s status quo in this space. Many “integration service” companies deal with translations between all possible and impossible CAD data formats.

Office tools became part of engineering applications and continue to exert a significant influence on product data because of the wide adoption of MS Excel. Excel files are everywhere, and you can find complete data management systems developed on top of Excel and built around Excel data formats. Whatever happens in the future, the Excel legacy will continue to dominate everything related to product data formats for a very long period of time.
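
A tiny Python sketch (using the openpyxl library; the BOM columns are made up) shows how easily the .xlsx file itself becomes the de-facto exchange format between applications:

```python
from openpyxl import Workbook, load_workbook  # pip install openpyxl

# One "application" writes a tiny bill of materials to Excel...
wb = Workbook()
ws = wb.active
ws.append(["Part Number", "Description", "Qty"])
ws.append(["P-1001", "Bracket", 4])
wb.save("bom.xlsx")

# ...and a completely different one reads it back. The spreadsheet
# is now the data management system.
for row in load_workbook("bom.xlsx").active.iter_rows(min_row=2, values_only=True):
    print(row)
```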

The majority of engineering and product data related applications use database technology. This is what we have today in the industry. Relational databases and SQL-driven data development continue to dominate in this space. These applications have created a huge amount of legacy data in engineering and manufacturing organizations. In most situations, companies continue to use the data in relational databases even after the applications themselves become useless.
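
A short sketch with Python’s built-in sqlite3 module (the schema is hypothetical) illustrates that last point: plain SQL can still read the data long after the application that created the table is gone.

```python
import sqlite3

# Product data in a relational table outlives the application that
# created it; any SQL client can still query it directly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE items (part_number TEXT, revision TEXT)")
conn.execute("INSERT INTO items VALUES ('P-1001', 'B')")
for row in conn.execute("SELECT part_number, revision FROM items"):
    print(row)
```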

Standards
Product data formats in CAD and other applications are tightly related to the issue of standards. My favorite association for standards is the toothbrush. We obviously need them; however, everybody wants to use their own. During the last 25 years, there have been multiple attempts to create standards for CAD, PDM, PLM and other engineering applications. Some of them, such as STEP and IGES, were more successful and widely adopted. Some vendor-originated standards reached a level of visibility that puts them very close to becoming de-facto industry standards.

Product Data Ownership
The question of data ownership is an important one. Many organizations use software to create various types of product data. It resides in application files and databases. Who owns the data? The reasonable answer: data belongs to the people and organizations that created these files. However, the absence of agreed and open standards has created a situation where organizations depend on applications to access their own data.

Product Data Formats in the 21st Century
So, what will happen to product data formats in the foreseeable future? I think the industry will need to find an answer to this question. The situation we have today was created by the business strategies of software vendors, existing technologies and the dominance of specific applications. In my view, there is no “silver bullet” solution that can solve the problem of product data formats in the short term. However, the introduction of new web technologies, data standards and product data ownership can create demand for future innovation in this space.

These are just my thoughts… I’m interested to know your take on product data formats, and I’m looking forward to having a discussion around this topic.

Best, Oleg


Open vs. Closed PLM Debates

June 24, 2010

I read the Fortune CNN Money Blog article by Jon Fortt – Chrysler’s Engineering Software Shift. In the competitive world of PLM software, it raises again the question of which is the better choice – Open or Closed? The context of this article is leaked information about Chrysler’s move from CATIA to NX or, maybe more […]
