Many years ago, I did some work implementing PLM-ERP integration using Microsoft BizTalk Server. It was a fascinating experience. BizTalk was expanding its integration capabilities and tools, and it worked very well in many cases. However, after many years, I have to admit that for some complex PLM-ERP use cases it was a stretch. A BizTalk blog article reminded me of the elements of integration complexity – schema mapping and scripting tools.
In a practical sense, many integration technologies end up as complex scripting and coding projects. Integration is still one of the most challenging parts of PLM implementations.
Fast forward to 2015. With recent developments in web and cloud technologies, there is a real opportunity to rethink the way PLM integrations are done – PLM needs to learn Web APIs.
Two weeks ago at the Autodesk PLM360 conference in Boston, I started a conversation about PLM integrations, inspired by Autodesk's presentation of the “evented web” integration approach. Thanks to Monica Schnitger – she gave it an excellent name: the IFTTT-ization of PLM integration technologies. It took me some time to think about the IFTTT approach. Autodesk introduced its integration technology using Jitterbit software – a very easy integration tool that allows you to hook processes and events between PLM360 and other applications. I left the event with the following question – how different is PLM360/Jitterbit from native IFTTT tools?
Over the weekend, I got a good laugh reading Ed Lopategui's article – Accelerating the PLM Cloud Integration Bus. I think Ed is spot on in the following passage:
The 64,000 dollar question is if Autodesk is implementing Jitterbit like it was IFTTT for PLM360 why not open the door to IFTTT itself? Referring back to the SOA/ESB discussion at the beginning of this post (it was in there for a reason, you know), it’s important to note that point to point solutions just don’t scale. Once integration gets complicated enough IFTTT is just not going to cut it. And while things might be simple enough at the outset, it gets messy quick. You don’t want the wheels to come off and the bomb to explode at the first bump in the road. Best to found the system on what you’re really going to need in the end: a lightweight, business-facing ESB. But there’s no need to open that door completely at the outset, because at a very early stage a point-to-point event driven system might be just right.
It made me think again about the IFTTT approach. I went back to my earlier post – PLM Workflow “California Roll” Style – which looked at IFTTT technologies as a driver for web-based workflows. In a nutshell, IFTTT is an easy tool to hook simple events between web services. But what is the limit of such logic?
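To make the idea concrete, here is a minimal sketch of IFTTT-style “if this, then that” logic. Everything in it – the event names, the payload fields, the actions – is a hypothetical assumption for illustration, not any real PLM360, Jitterbit or IFTTT API:

```python
# A minimal "if this, then that" sketch: each recipe pairs a trigger
# event with a single action. All names here are hypothetical.

def notify_erp(event):
    # Placeholder action: a real recipe would call an ERP web API here.
    return f"ERP notified: item {event['item']} released"

def email_buyer(event):
    return f"Email sent for ECO {event['eco']}"

# Each "recipe" maps a trigger (this) to an action (that).
RECIPES = {
    "item.released": notify_erp,
    "eco.approved": email_buyer,
}

def handle(event_name, payload):
    # Point-to-point dispatch: one trigger, one action, no routing logic.
    action = RECIPES.get(event_name)
    return action(payload) if action else None

print(handle("item.released", {"item": "PN-1001"}))
# -> ERP notified: item PN-1001 released
```

The simplicity is the point – and also the limit: every new trigger-action pair is another entry in the table, with no shared mapping or transformation logic between recipes.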
If you have ever studied programming, you are probably familiar with the debate about the “if” statement. In many situations, “if” is considered an evil keyword in programming languages, because conditional clauses sometimes result in code that is hard to maintain. In my view, “if” is only as bad as a hammer you misuse – in some situations a hammer can be very useful. But remember: if you write “if” or “switch” statements around a type code, it can smell bad. By branching on a type code, you create the possibility of ending up with numerous checks scattered all over your code, making maintenance more complex. A more powerful technique moves that branching decision closer to the root of your program. It is called polymorphism.
The essence of polymorphism is that it allows you to avoid writing an explicit conditional when you have objects whose behavior varies depending on their types. As a result, you find that switch statements that switch on type codes or if-then-else statements that switch on type strings are much less common in an object-oriented program.
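A small contrived example shows the contrast. The first function branches on a type code; the classes below carry the same behavior polymorphically, so the conditional disappears from the calling code:

```python
# Branching on a type code: every new component type means editing
# this function (and every other function that switches the same way).
def describe(component):
    if component["type"] == "electrical":
        return "route to EE library"
    elif component["type"] == "mechanical":
        return "route to CAD vault"
    else:
        raise ValueError("unknown component type")

# Polymorphism: each class owns its behavior, so callers never branch.
class Component:
    def describe(self):
        raise NotImplementedError

class Electrical(Component):
    def describe(self):
        return "route to EE library"

class Mechanical(Component):
    def describe(self):
        return "route to CAD vault"

# The caller just asks; no if/switch on a type code in sight.
for part in (Electrical(), Mechanical()):
    print(part.describe())
```

Adding a new component type now means adding one class, not hunting down every conditional that inspects the type code.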
This approach works in many programming languages, but I don't think a similar paradigm is supported in integration middleware – I have never heard of polymorphism being applied in PLM integration scenarios. I think it could apply well, especially if both PLM and ERP systems can rely on a consistent classification infrastructure. If the code of an Item Master integration can be inherited to implement the specific aspects of integration for electrical, mechanical and other components, it might scale better than simple “if-then” clauses.
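What could that look like? Here is a hedged sketch, assuming a hypothetical classification and made-up field names – no real PLM or ERP schema is implied. A base Item Master mapping carries the common fields, and subclasses extend it per component class:

```python
# Hypothetical "polymorphic" PLM-ERP mapping: the base class maps the
# fields every item shares; subclasses inherit it and add the fields
# specific to their component class. All names are illustrative.

class ItemMasterIntegration:
    def map_item(self, item):
        # Common Item Master fields shared by every component class.
        return {"number": item["number"], "description": item["desc"]}

class ElectricalIntegration(ItemMasterIntegration):
    def map_item(self, item):
        record = super().map_item(item)          # reuse the common mapping
        record["manufacturer_pn"] = item.get("mpn")  # EE-specific field
        return record

class MechanicalIntegration(ItemMasterIntegration):
    def map_item(self, item):
        record = super().map_item(item)
        record["material"] = item.get("material")    # MCAD-specific field
        return record

# The item's classification selects the mapper once, at the root --
# no if/else on type codes scattered through the integration code.
MAPPERS = {
    "electrical": ElectricalIntegration(),
    "mechanical": MechanicalIntegration(),
}

def integrate(item):
    return MAPPERS[item["class"]].map_item(item)

print(integrate({"class": "electrical", "number": "PN-1",
                 "desc": "resistor", "mpn": "R-100"}))
```

Supporting a new component class means adding one subclass and one registry entry, which is the scaling argument the paragraph above makes against piling up “if-then” clauses.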
What is my conclusion? The devil of integration techniques is in the details. This is where it becomes interesting and where you can see how a specific integration technology scales. The cost of integration maintenance is the second most important criterion after the scaling factor. What is the limit of IFTTT-like integration, and how does it apply to PLM? Maybe cloud-based PLM can simplify the overall integration flow with IFTTT, and the PLM360/Jitterbit combination has some interesting technologies to make it happen? Maybe Autodesk implemented some “polymorphism” into the PLM360/Jitterbit bundle to make it scale? That could be an interesting turn, in my view. Just my thoughts (and speculation) for the moment…