A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

The end of single PLM database architecture is coming

Oleg
5 August, 2014 | 3 min for reading


The complexity of PLM implementations is growing. We have more data to manage, and we need to process information faster. In addition, cloud solutions are changing the underlying technological landscape. PLM vendors are no longer building software to be distributed on CD-ROMs and installed by IT on corporate servers. Vendors are moving towards different types of cloud (private and public) and selling subscriptions rather than perpetual licenses. For vendors, this means operating data centers and optimizing data flow, cost and maintenance.

How do we implement future cloud architecture? This question is coming into focus and, obviously, raising lots of debate. The Infoworld cloud computing article The right cloud for the job: multi-cloud database processing speaks about how cloud computing is influencing what sits at the core of every PDM and PLM system – database technology. The main message is to move towards a distributed database architecture. What does it mean? You are probably familiar with the MapReduce approach. Simply put, the ability of cloud infrastructure to bring up multiple servers and run parallel queries is real these days. The following passage speaks about the idea of optimizing data-processing workloads by leveraging cloud infrastructure:

In the emerging multicloud approach, the data-processing workloads run on the cloud services that best match the needs of the workload. That current push toward multicloud architectures provides the ability to place workloads on the public or private cloud services that best fit the needs of the workloads. This also provides the ability to run the workload on the cloud service that is most cost-efficient.

For example, when processing a query, the client that launches the database query may reside on a managed service provider. However, it may make the request to many server instances on the Amazon Web Services public cloud service. It could also manage a transactional database on the Microsoft Azure cloud. Moreover, it could store the results of the database request on a local OpenStack private cloud. You get the idea.
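To make the MapReduce idea above concrete, here is a minimal sketch of a map/reduce-style parallel query over sharded part data. The shard contents, part names and worker pool are all hypothetical – a real PLM system would be querying database shards, not in-memory lists – but the map-then-reduce shape is the point.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical part records, split across "shards" (in a real system,
# each shard would live on a separate database server or cloud instance).
SHARDS = [
    [{"part": "bolt", "qty": 4}, {"part": "nut", "qty": 4}],
    [{"part": "bolt", "qty": 2}, {"part": "panel", "qty": 1}],
    [{"part": "nut", "qty": 6}],
]

def map_shard(shard):
    """Map step: each worker aggregates quantities within its own shard."""
    counts = {}
    for rec in shard:
        counts[rec["part"]] = counts.get(rec["part"], 0) + rec["qty"]
    return counts

def reduce_counts(partials):
    """Reduce step: merge the per-shard partial results into one answer."""
    total = {}
    for partial in partials:
        for part, qty in partial.items():
            total[part] = total.get(part, 0) + qty
    return total

# Run the map step in parallel, then reduce the partial results.
with ThreadPoolExecutor() as pool:
    partials = list(pool.map(map_shard, SHARDS))

print(reduce_counts(partials))  # {'bolt': 6, 'nut': 10, 'panel': 1}
```

Because each map call touches only its own shard, the shards could just as well run on different cloud services – which is exactly the multicloud workload placement the quoted passage describes.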

However, it is not so fast and not so simple. What works for web giants might not work for enterprise data management solutions. The absolute majority of PLM systems leverage a single-RDBMS architecture. This is the fundamental underlying architectural approach. Most of these solutions use a "scale up" architecture to achieve data capacity and performance. Horizontal scaling of PLM solutions today is mostly limited to leveraging database replication technology. PLM implementations are mission critical for many companies, so changing this would not be simple.
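The replication-based horizontal scaling mentioned above usually amounts to routing reads and writes differently. A minimal sketch, with hypothetical server names: writes go to the single primary (the "single database" bottleneck stays), while read queries are spread over replicas.

```python
import random

# Hypothetical server names for a primary/replica PLM database setup.
PRIMARY = "plm-db-primary"
REPLICAS = ["plm-db-replica-1", "plm-db-replica-2"]

def route(statement: str) -> str:
    """Send writes to the primary; spread reads across read replicas."""
    verb = statement.strip().split()[0].upper()
    is_write = verb in {"INSERT", "UPDATE", "DELETE"}
    return PRIMARY if is_write else random.choice(REPLICAS)

print(route("UPDATE items SET rev = 'B' WHERE id = 42"))  # plm-db-primary
print(route("SELECT * FROM items"))  # one of the replicas
```

The sketch makes the limitation visible: replication scales reads, but every write still funnels through one RDBMS instance – which is why this approach hits a ceiling as data volume and distribution grow.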

So, why might PLM vendors consider making a change and thinking about new database architectures? I can see a few reasons – the amount of data is growing; companies are getting even more distributed; the design-anywhere, build-anywhere philosophy is becoming reality. The cost of infrastructure and data services is becoming very important. At the same time, for all companies, performance is an absolute imperative – slow enterprise data management solutions are a thing of the past. Optimizing workload and data processing is an opportunity for large PLM vendors as well as small startups.

What is my conclusion? Today, large PLM implementations are signaling that they are reaching technological and product limits. It means existing platforms are approaching a peak of complexity, scale and cost. To make the next leap, PLM vendors will have to re-think the underlying architecture, manage data differently and optimize the cost of infrastructure. Data management architecture is the first thing to reconsider, which means the end of existing "single database" architectures. Just my thoughts…

Best, Oleg
