A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM and a future of deep linking

Oleg
19 June, 2015 | 4 min for reading


I like links. Links are fascinating: they are about connections between products, people, companies and things. The nature of our life today is to stay connected, and that is why links are important. It is natural to see links appearing everywhere in engineering and manufacturing too. Think about a product and its dependencies. Information is naturally linked across assemblies, parts, documents, bills of materials, materials, suppliers, manufacturing, the shop floor, orders and many other things.

The nature of things in PLM is to be connected. At the same time, the reality of engineering software is in fact highly fragmented. The time when vendors and customers believed in a single system (or database) that could contain and manage all information is over. Customers are using multiple applications, and it is not unusual to see two or more PLM systems in the same company. When it comes to the supply chain, the situation is even more complex.

Application integration remains one of the most painful aspects of enterprise software, and PLM can clearly lead the wave of complexity involved in implementations. My article yesterday – How PLM can avoid cloud integration spaghetti – was a warning to all the folks who imagine that the cloud will be a silver bullet to kill integration pain. It won't. Cloud integration can sometimes be even more complex than traditional integration hacks using SQL and ETL tools.

I want to continue discussing the topic of cloud integration. The topic I'm bringing up today is related to so-called "deep linking". If you're not familiar with it, navigate to the following Wikipedia article – Deep Linking – to get some background information.

In the context of the World Wide Web, deep linking consists of using a hyperlink that links to a specific, generally searchable or indexed, piece of web content on a website (e.g. http://example.com/path/page), rather than the home page (e.g. http://example.com/). The technology behind the World Wide Web, the Hypertext Transfer Protocol (HTTP), does not actually make any distinction between "deep" links and any other links—all links are functionally equal. This is intentional; one of the design purposes of the Web is to allow authors to link to any published document on another site. The possibility of so-called "deep" linking is therefore built into the Web technology of HTTP and URLs by default—while a site can attempt to restrict deep links, to do so requires extra effort. According to the World Wide Web Consortium Technical Architecture Group, "any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole".[1]

The TechCrunch article – A brief history of deep linking – brings an interesting perspective on the trajectory of deep linking development on the web and in the app world. Below is my favorite passage. It is important because it gives a notion of how to treat standards in the internet and application world.

In order for me to write this article, and for you to be able to read it, we have to share a common language: modern English. The same holds true for directing users through deep links — in order to construct a deep link that an application will understand, we need to have some shared way of expressing information or addresses. In software engineering, a well-defined shared vernacular is defined by a “standard.”

The problem with standards, though, is that many of them do not actually become standard practice, and introduce as much fragmentation as they resolve. I could define the word “basilafacitarian” as “a person who likes basil a lot,” but unless it enters the common vernacular, it’s useless as a means of communication for me to tell you that I like basil.

The same is true for an app speaking to another app; unless the URL “myapp://show-item/id123” is mutually agreed upon, there’s no guarantee what the receiving app will do with it.
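To make the point concrete, here is a minimal sketch of what "mutually agreed upon" means in practice. It parses the hypothetical deep link from the quoted passage, `myapp://show-item/id123`, against one assumed convention (scheme `myapp`, an action segment, then an item id); the scheme, action name and id are illustrative, not part of any real application:

```python
from urllib.parse import urlparse

def parse_deep_link(uri):
    """Parse a custom-scheme deep link like myapp://show-item/id123
    into (scheme, action, item_id), assuming the sending and receiving
    apps have agreed on this exact URL shape. Returns None otherwise."""
    parts = urlparse(uri)
    if parts.scheme != "myapp":
        return None  # not our agreed scheme; another app's link
    # urlparse puts "show-item" in netloc and "/id123" in path
    segments = [s for s in parts.path.split("/") if s]
    if parts.netloc and len(segments) == 1:
        return (parts.scheme, parts.netloc, segments[0])
    return None  # shape not agreed upon; receiver can't interpret it

print(parse_deep_link("myapp://show-item/id123"))
# a link in any other shape yields None, i.e. "no guarantee"
print(parse_deep_link("otherapp://show-item/id123"))
```

The receiving app's `None` branch is exactly the fragmentation problem: without a shared standard, every unrecognized link shape is dead weight.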

Coming back to the PLM world, we can see customers and vendors struggling with data and application fragmentation. It is not getting any better in the cloud PLM world – we just move to another set of APIs and technologies.

What is my conclusion? The idea of setting standards for deep linking is interesting. It could provide some level of solution to stop proprietary silos, data islands and the challenge of pumping data back and forth between applications. It is not simple and requires some level of coordination between vendors and customers. We already have enough history on the web and in the app world to learn from it and correct the course to control data and make it transparent at the same time. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

 
