A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

PLM and a future of deep linking

Oleg
19 June, 2015 | 4 min for reading

I like links. Links are fascinating: they are about connections between products, people, companies and things. The nature of our life today is to stay connected, which is why links matter. It is natural to see links appearing everywhere in engineering and manufacturing too. Think about a product and its dependencies: the information is naturally linked across assemblies, parts, documents, bills of materials, materials, suppliers, manufacturing, the shop floor, orders and many other things.

The nature of things in PLM is to be connected. At the same time, the reality of engineering software is highly fragmented. The time when vendors and customers believed in a single system (or database) that could contain and manage all information is over. Customers use multiple applications, and it is not unusual to see two or more PLM systems in the same company. When it comes to the supply chain, the situation is even more complex.

Application integration remains one of the most painful aspects of enterprise software, and PLM can clearly lead the wave of complexity involved in implementations. My article yesterday, How PLM can avoid cloud integration spaghetti, was a warning to everyone who imagines that the cloud will be a silver bullet that kills application integration pain. It won't. Cloud integration can sometimes be even more complex than traditional integration hacks using SQL and ETL tools.

I want to continue the discussion about cloud integration. The topic I'm bringing up today is so-called "deep linking". If you're not familiar with it, the Wikipedia article on Deep Linking gives some background:

In the context of the World Wide Web, deep linking consists of using a hyperlink that links to a specific, generally searchable or indexed, piece of web content on a website (e.g. http://example.com/path/page), rather than the home page (e.g. http://example.com/). The technology behind the World Wide Web, the Hypertext Transfer Protocol (HTTP), does not actually make any distinction between "deep" links and any other links—all links are functionally equal. This is intentional; one of the design purposes of the Web is to allow authors to link to any published document on another site. The possibility of so-called "deep" linking is therefore built into the Web technology of HTTP and URLs by default—while a site can attempt to restrict deep links, to do so requires extra effort. According to the World Wide Web Consortium Technical Architecture Group, "any attempt to forbid the practice of deep linking is based on a misunderstanding of the technology, and threatens to undermine the functioning of the Web as a whole".[1]

The TechCrunch article A brief history of deep linking brings an interesting perspective on the trajectory of deep linking development on the web and in the app world. Below is my favorite passage. It is important because it gives a notion of how to treat standards in the internet and application world.

In order for me to write this article, and for you to be able to read it, we have to share a common language: modern English. The same holds true for directing users through deep links — in order to construct a deep link that an application will understand, we need to have some shared way of expressing information or addresses. In software engineering, a well-defined shared vernacular is defined by a “standard.”

The problem with standards, though, is that many of them do not actually become standard practice, and introduce as much fragmentation as they resolve. I could define the word “basilafacitarian” as “a person who likes basil a lot,” but unless it enters the common vernacular, it’s useless as a means of communication for me to tell you that I like basil.

The same is true for an app speaking to another app; unless the URL “myapp://show-item/id123” is mutually agreed upon, there’s no guarantee what the receiving app will do with it.
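To make the passage above a bit more concrete, here is a minimal sketch in Python of how a receiving application might parse a deep link such as myapp://show-item/id123. The scheme, the action names and the handler are my own hypothetical examples, not any real app's API; the only point is that sender and receiver must agree on them in advance, otherwise the link means nothing to the receiver.

```python
from urllib.parse import urlparse

# Hypothetical set of actions both applications have agreed to support.
KNOWN_ACTIONS = {"show-item"}

def handle_deep_link(link: str) -> str:
    parsed = urlparse(link)              # e.g. myapp://show-item/id123
    if parsed.scheme != "myapp":
        return "unknown scheme - link ignored"
    action = parsed.netloc               # "show-item"
    item_id = parsed.path.lstrip("/")    # "id123"
    if action not in KNOWN_ACTIONS:
        return f"no shared agreement on action '{action}' - nothing to do"
    return f"opening item '{item_id}'"

print(handle_deep_link("myapp://show-item/id123"))   # opening item 'id123'
```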

Coming back to the PLM world, we can see customers and vendors struggling with data and application fragmentation. It is not getting any better in the cloud PLM world; we just move to another set of APIs and technologies.
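To illustrate what a shared deep-linking convention could give us in PLM, here is a minimal sketch, purely my own assumption rather than an existing standard or any vendor's API, of resolving a PLM deep link to the system that actually owns the item instead of copying the data around. The plm:// scheme, the site names and the endpoints are all hypothetical.

```python
from urllib.parse import urlparse

# Hypothetical registry mapping a logical site name to the system that owns
# the data. In practice this mapping itself would have to be an agreed
# convention between vendors and customers.
SITE_REGISTRY = {
    "engineering": "https://plm-a.example.com/api/items",
    "suppliers":   "https://erp-b.example.com/api/parts",
}

def resolve_plm_link(link: str) -> str:
    """Resolve a link like plm://engineering/part/P-12345/B to a concrete URL."""
    parsed = urlparse(link)
    base = SITE_REGISTRY.get(parsed.netloc)
    if base is None:
        raise ValueError(f"no system registered for site '{parsed.netloc}'")
    return base + parsed.path            # only the owning system serves the item

print(resolve_plm_link("plm://engineering/part/P-12345/B"))
# https://plm-a.example.com/api/items/part/P-12345/B
```

The design choice here is the same one the TechCrunch passage makes about language: the link is only useful if everyone agrees on what the scheme and the path segments mean.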

What is my conclusion? The idea of setting standards for deep linking is interesting. It can provide some level of solution to proprietary silos, data islands and the data-pumping challenges of sending data back and forth between applications. It is not simple, and it requires some level of synchronization between vendors and customers. We already have enough history on the web and in the app world to learn from and correct the course, so that we can control data and make it transparent at the same time. Just my thoughts…

Best, Oleg

Image courtesy of Stuart Miles at FreeDigitalPhotos.net

 
