A blog by Oleg Shilovitsky
Information & Comments about Engineering and Manufacturing Software

Monolithic Systems and Future of PLM Scalability

Oleg
6 April, 2017 | 3 min read


In my earlier blog I demystified the notion of "monolithic" PLM marketing and shared some technological aspects of PLM system architectures. The topic is not simple from any standpoint: technological, conceptual and even emotional. It is no surprise that PLM vendors have their own definitions of monolithic applications. Manufacturing companies demand that PLM technologies and products grow into innovation platforms, which will put many new topics on the table for PLM IT architects, analysts and vendors. It also made me think more about PLM and future scalability. Below I outline three dimensions along which PLM systems can scale. These dimensions are not specific to PLM systems, but I added some examples and context so that manufacturing companies and PLM vendors can apply them to their work.

Horizontal scaling

Scaling is probably the oldest problem PLM vendors face, and they are very familiar with the need to add more computing resources (CPUs), memory and storage to PLM servers. Adding resources is not simple and has technical and financial limits. For a monolithic application, horizontal scale can be a big problem. To solve it, you can consider running multiple instances of the application on different servers behind a load balancer. Not all applications are ready to scale in this way. Another potential drawback is the need to run all instances against the same data. The data model is usually the most complex element of a PLM system, and regardless of the product, these data models are typically not designed to be used by multiple instances of the application.
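To make the idea concrete, here is a minimal Python sketch of horizontal scaling under my own assumptions (the names SharedDatabase, AppInstance and LoadBalancer are hypothetical and not taken from any PLM product): identical, stateless application instances sit behind a round-robin load balancer and all work against one shared database, which is exactly the bottleneck described above.

```python
import itertools

# Hypothetical, simplified model of horizontal scaling: several identical,
# stateless application instances sit behind a load balancer and share
# one database, which remains the single source of truth.

class SharedDatabase:
    """Single data backend that every application instance must use."""
    def __init__(self):
        self.items = {}

    def save(self, item_id, data):
        self.items[item_id] = data

    def load(self, item_id):
        return self.items.get(item_id)


class AppInstance:
    """A stateless copy of the application; safe to replicate on more servers."""
    def __init__(self, name, db):
        self.name = name
        self.db = db

    def handle(self, request):
        if request["op"] == "save":
            self.db.save(request["id"], request["data"])
            return f"{self.name}: saved {request['id']}"
        return f"{self.name}: read {self.db.load(request['id'])}"


class LoadBalancer:
    """Round-robin dispatch of incoming requests across the instances."""
    def __init__(self, instances):
        self._cycle = itertools.cycle(instances)

    def route(self, request):
        return next(self._cycle).handle(request)


if __name__ == "__main__":
    db = SharedDatabase()
    lb = LoadBalancer([AppInstance(f"node-{i}", db) for i in range(3)])
    print(lb.route({"op": "save", "id": "PART-001", "data": "bracket"}))
    print(lb.route({"op": "read", "id": "PART-001"}))
```

The sketch works only because the instances keep no state of their own; a monolithic PLM application that caches data or assumes exclusive ownership of the database cannot be replicated this easily.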

Vertical scaling

Typically, vertical scaling means adding resources (CPUs, memory) to servers, but this approach has limits. If a monolithic system is too big to scale, you can consider splitting it into components. In this way the application becomes a set of services, and each service (mini-application) runs independently and is responsible for a specific function. There are multiple ways to decompose an application, and for new development it can be done without significant problems. However, doing it for existing monolithic software can be a big deal. PLM systems developed 15-25 years ago can be very difficult to separate into services.
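As an illustration only (the service names ItemService, BomService and ChangeService are invented for this sketch, not any vendor's design), the following Python code shows the idea of decomposing one application into a set of mini-services, each owning a single function and able to be deployed and scaled independently.

```python
# A minimal sketch of functional decomposition: each service owns exactly
# one function and its own data, instead of one monolith owning everything.

class ItemService:
    """Owns item master data only."""
    def __init__(self):
        self._items = {}

    def create_item(self, item_id, description):
        self._items[item_id] = description
        return item_id


class BomService:
    """Owns bill of materials structures only."""
    def __init__(self):
        self._boms = {}

    def add_line(self, parent_id, child_id, qty):
        self._boms.setdefault(parent_id, []).append((child_id, qty))

    def get_bom(self, parent_id):
        return self._boms.get(parent_id, [])


class ChangeService:
    """Owns engineering change requests only."""
    def __init__(self):
        self._changes = []

    def open_change(self, item_id, reason):
        self._changes.append({"item": item_id, "reason": reason})
        return len(self._changes)


# Each service could run as its own process and be scaled on its own;
# the caller simply talks to the service that owns the function.
if __name__ == "__main__":
    items, boms, changes = ItemService(), BomService(), ChangeService()
    items.create_item("ASM-100", "Pump assembly")
    items.create_item("PRT-200", "Impeller")
    boms.add_line("ASM-100", "PRT-200", qty=1)
    changes.open_change("PRT-200", "Material substitution")
    print(boms.get_bom("ASM-100"))
```

The hard part for a 15-25 year old monolith is not writing services like these, but untangling the shared code and shared database that currently connect all of these functions.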

Data scaling

One of the most interesting aspects of scale is data scale, related to the development of granular data sets. In this approach, each service is responsible for only a subset of data, and specific components of the system orchestrate requests to a specific server or data element. Most PLM systems are designed with a "database = organization" state of mind, and in such an architecture splitting data into functional segments can be a hard task. Cloud-based systems usually have more options to scale the data backend than on-premise systems managed by company IT.
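Here is a minimal, hypothetical Python sketch of that idea (the DataRouter and DataStore names and the item/document/supplier segments are assumptions, not any product's design): each functional segment of data lives in its own store, and a small router orchestrates each request to the store that owns that record type.

```python
# A sketch of data scaling by functional partitioning: data is split into
# segments, each held in its own store, with a router in front.

class DataStore:
    """One backend holding a single functional segment of data."""
    def __init__(self, name):
        self.name = name
        self._rows = {}

    def put(self, key, value):
        self._rows[key] = value

    def get(self, key):
        return self._rows.get(key)


class DataRouter:
    """Orchestrates each request to the store that owns that record type."""
    def __init__(self):
        self._stores = {
            "item": DataStore("items-db"),
            "document": DataStore("documents-db"),
            "supplier": DataStore("suppliers-db"),
        }

    def put(self, record_type, key, value):
        self._stores[record_type].put(key, value)

    def get(self, record_type, key):
        return self._stores[record_type].get(key)


if __name__ == "__main__":
    router = DataRouter()
    router.put("item", "PART-001", {"description": "bracket"})
    router.put("document", "DOC-042", {"title": "Bracket drawing"})
    print(router.get("item", "PART-001"))
```

A system built with a single "database = organization" schema has no natural seams like the ones above, which is why splitting its data after the fact is so hard.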

What is my conclusion? The demand to scale PLM systems is huge. A modern manufacturing company is a global organization with a high level of complexity in data, deployment, relationships and processes. The latest developments in IoT and related technologies add a special level to the scale problem: the significant amount of data processed by these systems. Scaling existing PLM systems will be a high-priority task for PLM vendors. Manufacturing companies have to check system architectures before planning to deploy and scale existing PLM systems. Existing monolithic PLM systems have limits that cannot always be resolved without significant architectural changes. It will be an interesting and busy time for PLM architects and technologists. Just my thoughts…

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups and supply chains. My opinion can be unintentionally biased.
