From the category archives:

Trends


One of the most popular topics in the engineering software ecosystem (and not only there) is open vs. closed. I’ve discussed it many times – Open vs. Closed PLM Debates, PLM and New Openness, Closed Thoughts About PLM openness and a few more. There is a clear trend towards openness these days and, in my view, it is hard to find a PDM/PLM company that will defend a closed approach over openness.

However, the definition of openness can be quite different. What is more, the implementation of openness can be different too. Speaking from an engineering standpoint, the devil is in the details. So I want to talk about some aspects of “openness” and how it might be implemented in the PDM/PLM world. For a very long period of time, data in the PDM/PLM world was completely dependent on Relational Database Management Systems (RDBMS). The time of proprietary databases and data files is finally over. So you may think data is peacefully located in an RDBMS where it can be easily accessed and exchanged. Not so fast… There are two main constraints preventing data openness in an RDBMS: the data access technology and the data schema. You need to know both in order to access the data. An alternative is to use published APIs, which provide an access layer. In most cases, APIs eliminate the need to know the data model, but in a nutshell they are not very different from the data access technology.

For many years, ODBC has remained one of the most widely adopted database access technologies. I’m using the name ODBC, but the same applies to a variety of similar data access technologies – JDBC, OLE DB, ADO.NET, JDO, etc. This is where things went wrong with data access and openness. The power and success of ODBC came from the use of DSNs (Data Source Names) as the identification of a data source. All ODBC-compliant applications leveraged the fact that other developers had implemented RDBMS-specific libraries – ODBC drivers. So users don’t need to think about Oracle, SQL Server, MySQL, etc. A user just needs to connect to a DSN.
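To make the DSN idea concrete, here is a minimal Python sketch. The DSN name and driver registry are made up, and sqlite3 stands in for a vendor-specific ODBC driver; a real driver manager does the same mapping from a DSN string to a vendor library.

```python
import sqlite3

# Hypothetical DSN registry: application code sees only logical names,
# while the RDBMS-specific driver is hidden behind them.
DRIVERS = {
    "engineering_db": lambda: sqlite3.connect(":memory:"),
    # "erp_db": lambda: some_oracle_driver.connect(...),  # another vendor
}

def connect(dsn):
    """Application code only knows the DSN name, not the RDBMS behind it."""
    return DRIVERS[dsn]()

conn = connect("engineering_db")
conn.execute("CREATE TABLE parts (id INTEGER, name TEXT)")
conn.execute("INSERT INTO parts VALUES (1, 'bracket')")

# The second constraint remains visible here: even with the driver
# abstracted away, you still must know the schema (table and column
# names) to get anything meaningful out of the database.
rows = conn.execute("SELECT name FROM parts").fetchall()
print(rows)  # [('bracket',)]
```

The driver abstraction solves portability across RDBMS vendors, but it does nothing for the schema coupling – which is exactly where the openness story breaks down.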

The distinct development and end-user models of ODBC ensured a massive ecosystem of ODBC-compliant applications and database connectivity drivers. Unfortunately, RDBMS vendors — the same ones that collectively created the SQL CLI and inspired its evolution into ODBC — also sought to undermine its inherent RDBMS agnosticism. The problem this created lies in the huge number of data-driven applications relying on ODBC data access and claiming data openness as the ability to access, retrieve and (sometimes) update data in the RDBMS. Hidden behind DSNs, databases turned into data silos. Data extracted from a specific database was dead and lost without the context of that database. So-called “openness” became a simple “data sync pipe”. What is more, each DSN remains separate. So if you have a few databases, you are out of luck trying to access the data in a logical way. Applications are pumping data from one database to another, mostly trying to synchronize data between different databases. The amount of duplicated and triplicated data is skyrocketing.
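The “data sync pipe” problem can be shown in a toy example (both schemas are hypothetical): two separate databases, and an application that copies records from one silo into the other instead of linking them.

```python
import sqlite3

# Two isolated silos, each behind its own DSN in a real deployment.
pdm = sqlite3.connect(":memory:")
erp = sqlite3.connect(":memory:")

pdm.execute("CREATE TABLE items (part_no TEXT, description TEXT)")
pdm.execute("INSERT INTO items VALUES ('P-100', 'bracket')")

erp.execute("CREATE TABLE materials (part_no TEXT, description TEXT)")

# The "sync": pump rows out of one silo and into the other.
rows = pdm.execute("SELECT part_no, description FROM items").fetchall()
erp.executemany("INSERT INTO materials VALUES (?, ?)", rows)

# The same record now lives in two places; every future change must be
# synchronized again, or the copies silently drift apart.
copies = erp.execute("SELECT * FROM materials").fetchall()
print(copies)  # [('P-100', 'bracket')]
```

Multiply this by every pair of systems that needs shared data, and the duplicated and triplicated records pile up exactly as described above.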

So, what is the alternative? We need to stop “syncing data” and instead start “linking data”. Think about a simple web analogy. If you want to reference my blog article, you don’t need to copy it to your blog. In most cases you can simply create a link to my blog post’s URL. Now, let’s bring some more specific technologies into this powerful analogy. Maybe you are familiar with the semantic web and linked data. If not, this is the time! Start here and here.

There is a fundamental difference between the old ODBC world and the new way of linking data. You can get some fundamentals here and by exploring the W3C data activity. I can summarize three main principles of linked data: 1/ use of hyperlinks to the source of data; 2/ separation of data abstraction from data access APIs; 3/ conceptual data modeling instead of application-level data modeling. So instead of implementing ODBC drivers and APIs to access data, each data provider (think about a PLM system, for the moment) will implement a linked data web abstraction layer. This abstraction layer will allow other applications to discover data and run queries to get results, or to interlink data with data located in other systems. Linked Data is a fast-developing ecosystem. You can learn more here.
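Here is a minimal sketch of the first principle, in the spirit of JSON-LD (all URIs, record names and fields are made up): every record is identified by a URI and references records in other systems by URI, instead of holding a synchronized copy.

```python
# Stand-in for two systems publishing resources on the web; in practice
# each URI would be dereferenced with an HTTP GET against its owner.
SYSTEMS = {
    "https://pdm.example.com/parts/P-100": {
        "@id": "https://pdm.example.com/parts/P-100",
        "description": "bracket",
        # A hyperlink to data owned by the ERP system - not a copy of it.
        "supplier": "https://erp.example.com/suppliers/S-7",
    },
    "https://erp.example.com/suppliers/S-7": {
        "@id": "https://erp.example.com/suppliers/S-7",
        "name": "Acme Metals",
    },
}

def resolve(uri):
    """Stand-in for an HTTP GET against the system that owns the URI."""
    return SYSTEMS[uri]

part = resolve("https://pdm.example.com/parts/P-100")
supplier = resolve(part["supplier"])  # follow the link into the other system
print(supplier["name"])  # Acme Metals
```

The supplier record lives in exactly one place; consumers follow the link and always see the live data, so there is nothing to synchronize.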

What is my conclusion? We are coming to the point where we need to re-think the way we access data in business systems and start building a better abstraction level that will allow us to stitch data together via linking, as opposed to synchronization. The world wide web and the internet are the ultimate success stories of open standard adoption and implementation. Applying the same approach will simplify access to data and build the value of data connections for the enterprise. Just my thoughts…

Best, Oleg


Amazon is absolutely the market-share leader in cloud computing. Because “cloud” is such a big and vague word these days, we should clarify and say “public cloud”. So you may think that, for most of us, cloud equals Amazon. AWS EC2 allows us to spin up new servers quickly and provides great services to everybody interested in developing SaaS products.

Not so fast… Questions are coming too. I can see two major ones – cost and strategy. I posted Cloud PLM and battle for cost recently. The Amazon public cloud comes with a challenging cost sticker for some of us. The strategy question is connected to many factors – the PLM PaaS opportunity, security and storage alternatives. Finally, with huge respect to Amazon, I’m not sure how many CAD/PLM companies are interested in a Catholic marriage between cloud PLM platforms and AWS. Providing a PLM solution independent from Amazon IaaS and controlling data storage is an interesting option for many vendors and partners. How to do so? I think this is part of the strategy for every PLM vendor these days looking to develop long-term relationships with manufacturing OEMs and suppliers.

My attention was caught by a Gigaom article – Want to beat Amazon in the cloud? Here are 5 tips. Read the article. It provides some interesting thoughts on how to compete with AWS. It makes the point that in 2014 AWS became an elastic service commodity competing on cost. This is an interesting quote explaining that:

But fast-forward to 2014: there are dozens of IaaS providers offering similar capabilities. The selling points — like self-service, zero CAPEX and elasticity — that once made the cloud look exciting are not as appealing anymore, and they are no longer the differentiating factors. In the current context, selling cloud for its self-service capabilities is similar to Microsoft trying to sell the latest version of Windows only for its graphical interface.

Cost is important. However, for the enterprise, value is often even more important. Therefore, speaking from the perspective of PLM players, my favorite passage is related to how to support scale-up and shared storage:

AWS’s philosophy of throwing more VMs at an application is not ideal in many scenarios. It might work wonders for marketing websites and gaming applications but not for enterprise workloads. Not every customer use case is designed to run on a fleet of servers in a scale-out mode. Provide a mechanism to add additional cores to the CPU, more RAM and storage to the VM involving minimal downtime. The other feature that’s been on the wish list of AWS customers for a long time is shared storage. It’s painful to setup a DB cluster with automatic failover without shared storage.

Here is my point. I think CAD and PLM vendors will have to figure out how to provide a balanced and scalable cloud platform. The platform will have to answer the question of how to scale from a solution for small manufacturers and mid-size companies up to enterprise OEMs and Tier 1 suppliers. The border between these segments is vague. It is hard to develop two distinct PLM offerings and support two separate platforms. It was hard in the past with on-premise software, and it is even more complicated in the cloud.

What is my conclusion? PLM providers will have to figure out how to grow beyond AWS-based offerings and develop scalable cloud PLM platforms. These must include diverse options for data storage as well as computing power. So, beating Amazon may not be such a dream option for PLM vendors as it looks at first. Just my thoughts…

Best, Oleg


PLM Return on Mobility Challenges

by Oleg on March 20, 2014


Almost two years ago I published my Mobile PLM gold rush – did vendors miss the point? post. Mobile usage is skyrocketing. It is hard to imagine our lives without mobile devices. Is it a good time to get back to the conversation about PLM, engineers and mobile? What is the special purpose of mobile applications for engineers and product lifecycle management?

I was reading The Future of Enterprise Mobility article earlier this week. The article focuses on research done by 451 Research and Yankee Group on mobile applications in the enterprise. I captured two main challenges – data and device type. The first one is easy and complex at the same time – too much data is flowing through mobile devices these days. IT cannot protect the environment from mobile devices – this is a reality. The other one is related to the diverse set of mobile device makers – Apple, Samsung, Nokia (Microsoft), Blackberry. Here is my favorite passage:

Among the changes Yankee Group looks to take place is that mobile applications will move front and center. It’s only in fairly recent times that the tools to help companies affect this shift have been available. Tools that are both enterprise-grade and that offer the type of agility, scalability and flexibility for enterprises to innovate in a truly mobile-world have not long been a reality. They are beginning to emerge, but enterprises are still being sold either the false promise that traditional approaches have all along allowed this capability or the false compromise that you can’t have both.

Another market change is mobile cloud platforms will look to become the new mobile middleware. New mobile cloud development and infrastructure platforms have emerged during the past 18 months with a steely gaze on the enterprise, on the proliferation of internal and customer-facing applications being considered, and on becoming the new mobile enterprise middleware. By abstracting much of the traditional back-end engineering complexity to cloud-based services, these vendors offer a compelling approach, one that will continue to have market-wide impact and be key to helping enterprises scale not only their applications and projects but also their innovation.

So, what are the potential problems and issues vendors face in developing their mobile strategies? Is it just “another screen” with a slightly different user experience? In my view, this is how mobile devices have been perceived by many enterprise people, including engineers, for the last few years. Mobile applications for engineers and PLM, specifically, used the mobile coolness factor but didn’t deliver much value. Thinking about that, the ROM (Return on Mobility) topic introduced in the article is a good parameter to weigh before deciding about future mobile options for PLM and engineers.

The Return on Mobility scorecard is a new research methodology that calculates the value enterprises achieve from their investments in the platforms they use to develop, deploy and manage mobile technologies and services. It’s an ROI specifically for mobility. With the increasing importance of mobile, social and cloud technologies enabling business success, it’s crucial for companies to make the right, informed decisions concerning the solutions and platforms they use. – The focus of the RoM scorecard goes beyond total cost of ownership to measure ROI for enterprises through benefits such as application integration, employee productivity and customer experience.

It made me think about what an easy ROI from mobile technologies in product development and manufacturing could be. Here is my guess: process speed. By speeding up and optimizing processes, we can improve decision making and information flow. An example is the ECR/ECO process. The cost of an ECO is one of the highest for every manufacturing and development organization. One opportunity is to make the ECO process faster. Email and messaging are the two top-scoring mobile applications. We use them everywhere. They can be a very good way to get people involved in a process and speed it up. Over the last years, we have gathered lots of experience with mobile email. Connecting mobile email to the ECO process with the right context can provide a high ROM.
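As a sketch of what “mobile email with the right context” could look like, here is a small Python example using the standard library email parser. The ECO number, message format and status field are all hypothetical: an engineer replies “APPROVE” from a phone, and the subject line carries the ECO identifier as context.

```python
from email.parser import Parser

# A made-up mobile email reply; the subject line carries the ECO context.
raw_reply = """\
From: engineer@example.com
Subject: Re: [ECO-1234] Bracket material change

APPROVE
"""

# A stand-in for the PLM system's change-order records.
ecos = {"ECO-1234": {"status": "pending approval"}}

def apply_email_decision(raw, ecos):
    """Parse a reply and apply the decision to the referenced ECO."""
    msg = Parser().parsestr(raw)
    subject = msg["Subject"]
    # Pull the ECO number out of the bracketed token in the subject.
    eco_id = subject[subject.index("[") + 1 : subject.index("]")]
    decision = msg.get_payload().strip().upper()
    if decision == "APPROVE":
        ecos[eco_id]["status"] = "approved"
    return eco_id

eco_id = apply_email_decision(raw_reply, ecos)
print(eco_id, ecos[eco_id]["status"])  # ECO-1234 approved
```

The point is not the parsing itself but the turnaround: an approval that used to wait for a desktop login happens from a phone in minutes, which is exactly where the ROM of process speed would show up.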

What is my conclusion? In my view, we have come to the end of the “mobile for the sake of mobile” story. ROM is absolutely the right approach, and it must be taken into account before any mobile application for engineers is developed. PLM process speed, and specifically ECO/ECN turnaround, can be a good application for mobile platforms and tech. Just my thoughts…

Best, Oleg


PLM, Mass Customization and Ugly BOM Vertical Integration

March 19, 2014

A car can be any color as long as it is black. This famous Henry Ford quote speaks about how manufacturing handled customization in the past. That was the era of mass production. The idea of limited customization options combined with a high level of standardization and high volumes of batch production allowed manufacturers to decrease cost and […]


How do engineers find the path from emails and messages to collaboration?

March 14, 2014

We are really bad at managing ourselves. People’s natural behavior is to create a mess. Life is getting more dynamic these days. We are swamped by an ocean of information, data streams, social networks, emails, calls, etc. If you want me to do something, send me an email. I’m pretty sure you are familiar with […]


Why and when to re-think PDM?

March 10, 2014

PDM (Product Data Management) isn’t a new discipline. Nevertheless, I think PDM is going through a time of disruption and renaissance. Cloud, social and mobile technologies are changing the way we’ve been working in the past. From that side, I can see companies that are trying to re-invent PDM with new meaning and technologies. I’ve […]


Does PLM have a chance to win over ERP dominant position?

March 5, 2014

PLM and ERP have a long “love and hate” relationship in the manufacturing world. You can find lots of materials speaking about the complementary roles of PLM and ERP. This is of course true. However, integration between PLM and ERP has never been easy. Despite many technologies, systems and solutions available on the market, customers are often skeptical about […]


PLM Open Source Future – Cloud Services?

February 17, 2014

For the last few years, open source has been one of the major disruptive factors in tech. Open source powers the world’s leading tech companies. Tech giants like Google, Facebook, Amazon and many others would not exist without open source. The success of RedHat put a very optimistic business projection on the future disruption of the industry by […]


Do we need a new TLA for PDM?

February 14, 2014

Product Data Management (PDM) is not a new buzz. Lots of things have been written and said about how to manage CAD files, revisions and related data. Decades were spent on creating a better way for engineers to “collaborate” or work together. Nevertheless, I can see a few companies trying to disrupt the PDM space […]


PLM Software and Open Source Contribution

February 11, 2014

Open source is a topic that has raised much controversy in the last decade, especially if you speak about enterprise software. The trajectory of open source software moved from absolute prohibition to a high level of popularization. In my view, the situation is interesting in the context of PLM software. The specific characteristic of PLM is related […]
