
The Future of PLM Openness: From REST APIs to Agentic Workflows with MCP

Oleg
10 May, 2025 | 9 min for reading

In recent weeks, I’ve shared a few thoughts about the architectural shifts in PLM—specifically, the slow but inevitable transition from monolithic legacy systems to modular, composable platforms. Check my article – Rethinking Monolithic PLM Architecture – Exploring What Comes Next.

While discussing the future of PLM architectures and the opportunity for modern PLM systems to build a Manufacturing Graph using new data modeling technologies, the question of integration complexity was practically inevitable.

One of the main counter-arguments I often hear in favor of unified (read: monolithic) PLM architectures is the perceived complexity of integrations. And yes, integrations are hard—but the tools, protocols, and platforms supporting them are rapidly evolving.

Which is true: integrations are complex, and if you build them using the technologies that were available in product lifecycle management (PLM) and product data management (PDM) systems 20 years ago, they can be very costly and fragile. While many of my readers agreed that it is practically impossible to put the entire scope of product development processes, product data, and PLM solutions into a single database (aka centralized data management), the question of integrations still needs to be addressed to explore what is possible in PLM software architecture, especially When You Stop Building PLM Like It’s 2005.

So today, I want to take a closer look at the trajectory of PLM openness, which has long been an important topic in the PLM community. There is broad agreement about the need for PLM openness across product lifecycle, supply chain management, service lifecycle management, organizational data integrity, and product quality. You absolutely need it to build a collaborative environment and to support supplier collaboration.

The elements of PLM openness are core components of digital transformation. Connecting engineering data with production planning and supply chain collaboration is essential for robust process management across design and engineering teams and supply chain partners. Together, they streamline product development and change management. Integrated systems deliver significant cost savings and optimization for engineering teams and manufacturing companies.

But what PLM technology and product data management tools will support openness and integrations? How will it be done? What are the development costs, and how do you organize workflow automation? How do you support such software for PLM data quality and supply chain agility? What are the key components of systems that support data sharing, make CAD data available across the product process, and deliver on customer expectations to accelerate the development cycle? And what technologies will allow software providers to support global manufacturers with integrated systems?

In my article today, I want to share how integration approaches are shifting and what we need to understand about REST APIs, GraphQL, and the emerging world of MCP (Model Context Protocol), along with other recently introduced technologies such as the Google Agent2Agent protocol. If the last decade was about REST APIs, the next one will be about enabling LLM-driven, agentic PLM workflows.

Let’s unpack how we got here—and where we’re going.

A Brief History of PLM Integration and APIs

The history of APIs in PLM mirrors the broader evolution of enterprise software—and the long-standing tension between control and openness. Early business systems, including PLM, were designed to lock in users, trap data in silos, and make integrations an afterthought. APIs, if they existed at all, were either nonexistent or extremely limited, reinforcing vendor dependency and preventing true data portability.

Over time, as business demands evolved and the market began to prioritize flexibility and ecosystem connectivity, the need for openness became impossible to ignore. This shift toward openness is reflected in the gradual development of integration technologies, which we can think of in three generational waves:

1990s – Early 2000s: Proprietary APIs. Legacy PLM systems offered integrations through complex, vendor-specific APIs. These required specialized knowledge, deep customization, and tight coupling to internal data models. Most of these systems were designed around closed data architectures and resisted change.

2010s – Early 2020s: RESTful APIs & GraphQL. A significant improvement arrived with RESTful APIs and, more recently, GraphQL interfaces. These standardized access to resources and enabled broader adoption through web-based integrations and mobile applications. However, they still demanded developer-level understanding and manual implementation, and each system remained its own silo behind slightly more accessible doors.

2024+: MCP and Agentic Integration. As AI and LLMs enter the mainstream, a new integration paradigm is emerging—one that is not built for humans to code against directly, but rather for intelligent agents to dynamically discover, interpret, and use. MCP (Model Context Protocol) represents this next phase of openness, where integration becomes adaptive, semantically rich, and context-aware.

PLM API and Openness

Let’s dive deeper into this next chapter.

Future of AI Integration: Understanding MCP vs REST API

As large language models (LLMs) become part of everyday workflows, their power depends on being able to connect to other systems—just like traditional software. Historically, this meant calling APIs. But in late 2024, Anthropic introduced a new open protocol—MCP (Model Context Protocol)—specifically designed for LLM agents.

If you’re working on the future of PLM or any intelligent enterprise software, it’s time to understand what this means.

What Is MCP? Think of It as USB-C for AI Agents

If REST APIs are like proprietary power adapters—each with a different shape and spec—then MCP is the USB-C of AI integrations.

MCP provides a standard interface for AI agents to discover and interact with tools, data, and services dynamically, without having to hardcode specific APIs into the model’s prompt or logic. Its architecture is built around:

  • MCP Host – Like a laptop running your workflows
  • MCP Clients – Open sessions for AI agents to use tools and request data
  • MCP Servers – External services that offer tools, data, or prompts through standardized primitives

These primitives fall into three categories:

  • Tools (functions the AI can call)
  • Resources (read-only contextual data and PLM data products)
  • Prompt Templates (guidance for interaction)

This architecture allows AI agents to discover and use capabilities at runtime—no manual API stitching required.
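
To make this concrete, here is a minimal sketch of an MCP server built with the official Python MCP SDK (pip install "mcp[cli]"). The PLM tool, resource, and prompt it exposes (find_part, a BOM resource, a change request prompt) are hypothetical illustrations with stubbed data, not a real vendor API.

```python
# Minimal MCP server sketch using the official Python MCP SDK.
# The PLM data and endpoints here are hypothetical placeholders.
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("plm-demo")

# Tool: a function the AI agent can discover and call at runtime
@mcp.tool()
def find_part(part_number: str) -> dict:
    """Look up a part by its number (stubbed with static data)."""
    return {
        "part_number": part_number,
        "description": "M3 x 8 socket head cap screw",
        "status": "Released",
    }

# Resource: read-only contextual data the agent can fetch
@mcp.resource("plm://bom/{assembly_id}")
def get_bom(assembly_id: str) -> str:
    """Return a flattened BOM for an assembly (stubbed)."""
    return f"Assembly {assembly_id}: 12 line items (stub data)"

# Prompt template: reusable guidance for the interaction
@mcp.prompt()
def change_request(part_number: str) -> str:
    return f"Draft an engineering change request for part {part_number}, listing affected assemblies."

if __name__ == "__main__":
    mcp.run()  # serves over stdio by default
```

Once a server like this is running, any MCP-capable host (Claude Desktop, for example) can discover its tools, resources, and prompts and use them without custom integration code.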

So What’s the Difference Between MCP and REST APIs?

At a glance, both APIs and MCP allow systems to communicate—but they were designed for very different worlds.

MCP implementations in their current form are typically built as wrappers around existing REST APIs, starting with a layer that categorizes traditional API calls into standardized “Tools” primitives. These tools are then exposed through a consistent interface that AI agents can use without knowing the underlying service logic. As this foundation matures, more advanced agentic workflows are introduced—enabling autonomous interaction across services in a modular and adaptive way.

An MCP server can sit on top of a PLM system and expose part search, BOM queries, and revision updates in a way that an LLM can understand and navigate on its own. This unlocks not just data access but also the ability to control PLM-specific actions—triggering workflows, initiating change requests, or generating reports—through AI agents. Therefore, the most critical element of this transformation will be the ability of a PLM system to adapt, support robust data modeling, and scale for seamless deployment.

Think about BOM agents importing data directly from suppliers, searching for the right components online, and fetching this data later to be used for costing models or compliance validation, all without manual involvement.
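
In practice, the wrapper pattern is simple. Below is a hedged sketch of MCP tools that forward to an existing PLM REST API; the base URL, endpoints, and JSON shapes are assumptions for illustration, and a real integration would add authentication and error handling.

```python
# Sketch: exposing an existing PLM REST API as MCP tools.
# The base URL, endpoints, and response shapes are hypothetical.
import httpx
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("plm-rest-wrapper")
PLM_BASE_URL = "https://plm.example.com/api/v1"  # placeholder

@mcp.tool()
def search_parts(query: str, limit: int = 10) -> list[dict]:
    """Search parts via the legacy REST API and return matching items."""
    resp = httpx.get(f"{PLM_BASE_URL}/parts", params={"q": query, "limit": limit})
    resp.raise_for_status()
    return resp.json()["items"]

@mcp.tool()
def get_latest_revision(part_number: str) -> dict:
    """Fetch the latest released revision of a part from the REST API."""
    resp = httpx.get(f"{PLM_BASE_URL}/parts/{part_number}/revisions/latest")
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    mcp.run()
```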

Why This Matters for PLM Software and the Future of Integrations

Modern PLM platforms—especially modular and composable ones—will evolve beyond REST APIs. REST is necessary but not sufficient. A new generation of cloud-native PLM services (different from the hosted PDM/PLM tools of 20 years ago) will operate in agentic environments where autonomous agents assist with design reviews, BOM cost analysis, procurement tasks, change impact assessments, and more.

These AI-driven workflows require MCP-style interfaces that allow agents to:

  • Dynamically discover services across multiple systems (PDM, CAD, ERP)
  • Request structured data or run actions like creating a change order
  • Adapt to new tools as they become available—without manual retraining
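
From the agent side, runtime discovery might look like the following sketch, again using the Python MCP SDK; the server script and tool name are the hypothetical ones from the sketches above.

```python
# Sketch: an agent host discovering and calling tools at runtime
# via the Python MCP SDK client API. Server path and tool names are hypothetical.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

server = StdioServerParameters(command="python", args=["plm_server.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            # Discover what this server offers -- no hardcoded API knowledge needed
            tools = await session.list_tools()
            print([tool.name for tool in tools.tools])
            # Call a discovered tool with structured arguments
            result = await session.call_tool("find_part", {"part_number": "PN-1001"})
            print(result.content)

asyncio.run(main())
```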

In my recent blog, Rethinking Monolithic PLM Architecture: Exploring What Comes Next, I described how modern PLM systems must become composable and interoperable by default. MCP is a concrete technical enabler of that vision.

Here is my take on the future PLM integration stack: digital thread platforms will play a key role in creating the data modeling foundation, and new MCP integration interfaces will be instrumental in developing intelligent integrations.

The Agentic Future of PLM Integration

We’re entering an era where PLM will no longer be a monolith guarded by a handful of system integrators. Instead, a distributed network of tools, microservices, and AI agents will collaborate through dynamic, semantically rich interfaces.

Imagine this:
A procurement assistant AI agent is asked to “find approved substitutes for a discontinued part and estimate cost and lead time impact.” Today, that task would require a human jumping across multiple systems and APIs.

With MCP-based integrations, that agent could query a PLM service, navigate product structures, fetch ERP data, and present a synthesized response—without custom glue code or static APIs.
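
As a thought experiment, here is a rough sketch of what that orchestration could look like across two MCP servers; the server scripts, tool names, and response fields are entirely hypothetical assumptions.

```python
# Sketch: one agent task spanning two hypothetical MCP servers (PLM and ERP).
# All server scripts, tool names, and fields are illustrative assumptions.
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

PLM = StdioServerParameters(command="python", args=["plm_server.py"])
ERP = StdioServerParameters(command="python", args=["erp_server.py"])

async def substitute_report(discontinued_part: str) -> None:
    async with stdio_client(PLM) as (plm_r, plm_w), stdio_client(ERP) as (erp_r, erp_w):
        async with ClientSession(plm_r, plm_w) as plm, ClientSession(erp_r, erp_w) as erp:
            await plm.initialize()
            await erp.initialize()
            # 1. Ask the PLM server for approved substitutes of the discontinued part
            subs = await plm.call_tool("find_substitutes", {"part_number": discontinued_part})
            # 2. For each substitute, pull cost and lead time from the ERP server
            for sub in subs.content:
                quote = await erp.call_tool("get_cost_and_lead_time", {"part_number": sub.text})
                print(sub.text, quote.content)

asyncio.run(substitute_report("PN-0042"))
```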

This is the vision: Composable PLM + Agentic Integration = Intelligent, Connected Product Development.

What is my conclusion?

The world of PLM is changing. In my earlier articles, I wrote about switching from monolithic PLM to composable and modular PLM systems. While many vendors are still catching up with unified PLM models, trying to consolidate the multiple elements of their grandiose “PLM platform” development and calling REST APIs “PLM openness,” the next wave of openness will come from standardized, dynamic, agent-friendly interfaces like MCP. Modern PLM services will deliver mechanisms to discover what services are available and to connect to them using protocols like MCP, Google Agent2Agent, or similar. This will create a new level of openness and communication between online PLM services.

If you’re building or evaluating PLM systems in 2025 and beyond, don’t just ask about their REST endpoints and how they built integrations — ask how they’ll support MCP-style discovery, tooling, and agent workflows. Because that’s where the real transformation begins.

Just my thoughts…

PS. What do you think—can PLM vendors and integrators adapt to this next generation of openness? Let’s continue the conversation.

Best, Oleg

Disclaimer: I’m the co-founder and CEO of OpenBOM, a digital thread platform providing cloud-native collaborative and integration services between engineering tools, including PDM, PLM, and ERP capabilities. With extensive experience in federated CAD, PDM, and PLM architecture, I advocate for agile, open product models and cloud technologies in manufacturing. My opinion can be unintentionally biased.
