What is LCAP and why PLM IT and admins should care?

The era of four-letter acronyms is coming – check out my FLaaS article. If you have never heard of LCAP, it is time to meet the new name – Low-Code Application Platforms. You might remember my articles about low-code development earlier this year. Check them out:

Will low-code save PLM from customization mess and make it more open?
Low code – a sexy version of messy PLM implementations
Why PLM low code initiative can be DOA?

My attention was caught by the Mendix announcement sharing the Gartner Magic Quadrant (MQ) for LCAP. You can grab it here.

Gartner predicts that 75% of large enterprises will move to LCAP by the end of 2024. This means LCAP will become a leading way to deliver business applications. My favorite passage is about the market and goals.

IT leaders are facing mounting challenges around application delivery. Developer shortages and skill-set challenges are impacting their ability to deliver increasing levels of business automation in a rapid and reliable fashion. In response, the vendors of low-code application platforms (LCAPs) have been improving the ease at which business applications can be delivered, providing broader capabilities requiring smaller and less specialized teams of developers.

An LCAP is characterized by its use of model-driven or visual development paradigms supported by expression languages and possibly scripting to address use cases such as citizen development, business unit IT, enterprise business processes, composable applications and even SaaS applications. These platforms are offered by vendors that may be better known for their SaaS offerings, or their business process management (BPM) capabilities, as well as specialist vendors for rapid application development. The primary goal is increased application development productivity with reduced skill-set requirements for developers.

So, LCAP is aimed at application development where the primary user is large enterprise IT. The tools are probably out of reach for medium-sized manufacturing enterprises, which don't have the resources to run LCAP projects and will rely instead on more packaged tools and service providers. The goal of addressing IT projects in large enterprises is logical, but it raises the question of how these projects will boost or co-exist with current PLM systems and architectures. The majority of large PLM platforms are still old: they rely on an aging SQL database foundation and have a hard time scaling beyond a single organization and supporting multi-tenant and multi-cloud deployments. Bringing in a set of new tools (LCAP) will eventually change the balance of PLM implementation projects. Most of them were providing customization around the data held by the PLM database. It looks like the trajectory of LCAP will be to take the data out and provide it outside or, at least, to make PLM data intertwined with the model-driven backbones of LCAP platforms.

The latter can pose an interesting dilemma for all PLM and IT architects. Historically, PLM is a backbone for all product information (CAD files) and additional data. Expansion beyond CAD and engineering was always a problem for PLM vendors, who remain stuck in engineering projects. Recent waves of digital transformation have made companies care more about how to organize a seamless data handover between applications and support processes with the right data. LCAP can be a tool to do that job, which will intensify the decision process of how to develop applications using PLM data. The problem is that not all PLM systems will make it. Existing PLM architectures will slow this development down by locking data in, making API access difficult, and not providing modern web development tools. The business models are also in conflict. The ugly truth is that PLM still very much relies on data locking. How will LCAP platforms interplay with PLM backbones? This is the question to ask.
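To make the integration point concrete, here is a minimal sketch of the kind of glue an LCAP application would need on top of a PLM backbone: pulling item data out of a (hypothetical) PLM REST payload and flattening it into rows a low-code data grid or form could bind to. The endpoint, field names, and payload shape below are all illustrative assumptions, not the API of any real PLM system.

```python
# A minimal sketch of PLM-to-LCAP data handover, under assumed field names.
# The payload shape mimics what a hypothetical GET /api/v1/items call on a
# PLM backbone might return; no real vendor API is implied.

def flatten_plm_items(plm_response):
    """Turn a nested PLM item payload into flat rows for an LCAP data model."""
    rows = []
    for item in plm_response.get("items", []):
        rows.append({
            "part_number": item.get("id"),
            "description": item.get("attrs", {}).get("description", ""),
            # Default the revision when the PLM record omits it.
            "revision": item.get("rev", "A"),
        })
    return rows

# Illustrative payload, as it might come back from the hypothetical endpoint.
sample = {
    "items": [
        {"id": "PN-100", "rev": "B", "attrs": {"description": "Bracket"}},
        {"id": "PN-200", "attrs": {"description": "Housing"}},
    ]
}

rows = flatten_plm_items(sample)
# rows[0] -> {"part_number": "PN-100", "description": "Bracket", "revision": "B"}
```

The point of the sketch is that this flattening layer only works if the PLM system exposes its data through an open, well-documented API in the first place – which is exactly where older architectures fall short.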

What is my conclusion?

LCAP tools are an evolution of BPM, and they provide a large amount of development infrastructure. The customer of LCAP is enterprise IT, which develops solutions for large enterprises. It looks like PLM systems will be invited to fill the role of data and, to some extent, process providers. The efficiency of data handover and integration techniques will be the key element of success. Otherwise, enterprise IT will leave old PLM architectures behind. Here is the opportunity for SaaS PLM vendors. By using modern web and cloud technologies, SaaS PLM can be a natural fit for LCAP, supporting better data integration and development tools. This is an interesting opportunity enterprise IT architects can explore to replace archaic, decades-old PLM SQL client-server backbones. Just my thoughts…

Best, Oleg

Disclaimer: I’m co-founder and CEO of OpenBOM developing a digital network platform that manages product data and connects manufacturers and their supply chain networks.
