Changing engineering software paradigms with data links

Existing data management paradigms were formed over the last 20-30 years of software development in CAD, CAM, CAE, PDM, PLM and related products. Many great products were created during this time. However, one of the problems with the current paradigm and software is silos of information, and we can see them everywhere. There are small silos, such as files stored in multiple folders by the same engineer, and there are bigger silos, such as two databases in an organization serving the engineering and manufacturing departments. Engineering and manufacturing workflows demand coordinated data flow between people and applications, and the current paradigm is very inefficient at connecting product development and manufacturing processes. Information is copied between multiple places, which can create problems with data accuracy and also a very poor user and application experience. Sending files over email or synchronizing databases every 24 hours are just two of the simplest examples of such problems.

Check my earlier blog – PLM: From Sync to Link. The idea I introduced there is powerful and simple at the same time. There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together rather than synchronize it between file folders, applications and databases. An industry that for years took “sync” as a universal way to solve the problem of data integration cannot shift overnight and work differently. But here is the good news. For the last two decades, web companies have accumulated a lot of experience managing huge volumes of interconnected data. The move towards cloud services is creating an opportunity to work with data differently. It will provide new technologies for data integration and data management, and it can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help reuse data between applications.
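The difference between the two approaches can be shown in a few lines of code. This is a minimal sketch, and all class and variable names are hypothetical: a master record holds the current revision of a drawing, a “synced” consumer keeps its own copy the way emailed files or nightly database syncs do, and a “linked” consumer stores only a reference and resolves it at read time.

```python
# Hypothetical sketch of "sync" vs. "link" data access.

class MasterRecord:
    """The engineering system's single source of truth."""
    def __init__(self, drawing):
        self.drawing = drawing  # current released revision

class SyncedConsumer:
    """Copies the data once, like an emailed file or a nightly sync job."""
    def __init__(self, master):
        self.copy = master.drawing
    def read(self):
        return self.copy  # may be stale

class LinkedConsumer:
    """Stores only a reference and resolves it on every access."""
    def __init__(self, master):
        self.link = master
    def read(self):
        return self.link.drawing  # always the current revision

master = MasterRecord("bracket rev A")
synced = SyncedConsumer(master)
linked = LinkedConsumer(master)

master.drawing = "bracket rev B"  # engineering releases a new revision

print(synced.read())  # -> bracket rev A (stale copy)
print(linked.read())  # -> bracket rev B (current)
```

The synced consumer keeps serving the old revision until someone remembers to re-sync; the linked consumer never had a copy to go stale in the first place.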

I came across an Onshape blog article – Why Onshape Documents Make You More Productive on the Manufacturing Floor by John Desilets. The article speaks about new concepts, practices and applications used by Onshape to help engineers on the manufacturing shop floor access correct and up-to-date engineering information. The technique and applications rely on Onshape’s paradigm of a fully cloud-based CAD system, which eliminates files and data synchronization between desktop and servers, and allows users and applications to access data via stable URLs pointing to CAD assemblies, parts, drawings and other elements of information stored in an Onshape document. The latter isn’t a traditional document with a file on disk, but a virtual container that can hold other elements of related information. Here is a passage that can give you an idea of how it works.

In manufacturing, you may have several operations and processes needed to make a single part. Managing all this data is tedious and involves many copies (OP 1, OP 2, etc.). All of these operations are stored on the network or a local drive as a copy. In most cases, the setup person will need to access this information to be as efficient as possible when machining the part.

Whether they are navigating through the Onshape Document with QR codes or links, the shop floor setup person can now locate updated critical information quickly, efficiently, and securely. That last part can’t be understated. To better protect your intellectual property, only users who you have shared the Document with will have access.

Why Onshape Documents Make You More Productive on the Manufacturing Floor
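The idea behind such links (whether typed in or scanned from a QR code) can be sketched as a tiny resolver. Everything here is invented for illustration – the URL scheme, the `stable_url` and `resolve` functions, and the in-memory version store are assumptions, not Onshape’s actual API. The point is that the link encodes the identity of the element, not a specific copy, so resolving it always returns the latest revision.

```python
# Hypothetical stable-link resolver; URL scheme and data are invented.

# In-memory stand-in for a document store: element id -> revision history.
versions = {
    "doc-123/part-7": ["drawing v1", "drawing v2", "drawing v3"],
}

def stable_url(doc_id, element_id):
    # The link encodes identity only -- no version number baked in.
    return f"https://cad.example.com/documents/{doc_id}/elements/{element_id}"

def resolve(url):
    # Strip the (hypothetical) prefix and return the latest revision.
    key = url.removeprefix("https://cad.example.com/documents/")
    key = key.replace("/elements/", "/")
    return versions[key][-1]

url = stable_url("doc-123", "part-7")
print(resolve(url))  # -> drawing v3

versions["doc-123/part-7"].append("drawing v4")
print(resolve(url))  # same link, now -> drawing v4
```

The URL printed on a traveler or QR code never changes, yet the setup person scanning it always lands on the latest released drawing.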

What is my conclusion? The Onshape QR code viewer example is a great demonstration of how powerful the idea of information linking is as a paradigm replacing the existing file, save, copy, email workflow. By linking to the right information source, we can ensure the user always accesses the right information. The application can control the flow of information, and by attaching the right data to the link we can ensure the user gets the latest version of a drawing or model. One of the main challenges I see today is helping people move from an environment in which things are organized in files and separate isolated applications to new tools and applications that link and intertwine data. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.

  • Hi Oleg,

    Linking is fine, as long as the data is managed correctly on each side of the link.

    If you are linking to a version of an artifact, then that version needs to be maintained. If someone else wants to change it you need to be notified. So just pointing to other places is not sufficient. There needs to be some connection between the IP management regimes in each system to ensure that the right things are linked over time, and that those dependencies are also maintained.

    Stan Przybylinski
    VP of Research
    CIMdata, Inc.
    http://www.CIMdata.com

  • beyondplm

    Stan, thanks for the comment! I agree – you need to know what you link to. I believe data management regimes can be different. The data you link to can be managed or not; what matters is that the linking application is aware of what the data is, how it is managed, how it can be changed, and the level of trust. I will give you a few examples – some companies use McMaster part numbers internally, and some websites use Amazon URLs to identify books. In both cases, there is no connection between the data management regimes, but you can trust the consistency of Amazon links just as you can trust McMaster PNs. On the opposite side, a PLM system can sync data from an ERP system, and the data can become obsolete because the sync application gets shut down – yet PLM users will keep using the “synchronized” data. So, the devil is in the details, but if data is linked and the links are maintained, the chances of using wrong data are minimized. Probably all of this together is what you called a “data management regime”. Best, Oleg