A few months ago, I shared the story of True & Co – a company actively experimenting with and leveraging data science to improve design and customer experience. You can catch up by following this link – PLM and Big Data Driven Product Design. One of the most interesting pieces of the True & Co experience I learned about was the ability to gather a massive amount of data about their customers and turn it into information that improves the product design process.
Earlier this week, the article What’s next for big data prediction for 2015 caught my attention. I know… it is end-of-the-year “prediction madness”. Nevertheless, I found the following passage interesting. It speaks about the emerging trend of Information-as-a-Service. Read this.
The popularity of “as-a-Service” delivery models is only going to increase in the years ahead. On the heels of the success of software as a service models, I believe Information-as-a-Service (IaaS) or Expertise-as-a-Service delivery models are likely the next step in the evolution. The tutoring industry provides a good blueprint for how this might look. Unlike traditional IT contractors, tutors are not necessarily hired to accomplish any one specific task, but are instead paid for short periods of time to share expertise and information.
Now imagine a similar model within the context of data analytics. The shortfall most often discussed with regard to analytics is not in tooling but in expertise. In that sense, it’s not hard to imagine a world where companies express an interest in “renting” expertise from vendors. It could be in the form of human expertise, but it could also be in the form of algorithmic expertise, whereby analytics vendors develop delivery models through which companies rent algorithms for use and application within their own applications. Regardless of what form it takes in terms of its actual delivery, the notion of information or expertise as a service is an inevitability, and 2015 might just be the year IT vendors start to embrace it.
It made me think about how PLM can shift its role from only “documenting and managing data and processes” towards providing services that improve them by capturing and crunching the large amounts of data in an organization. Let’s speak about product configurations – one of the most complicated elements of engineering and manufacturing. The mass production model is a thing of the past. We are moving towards mass customization. How will manufacturing companies be able to bring down product cost and keep up with the demand for mass customization? Intelligent PLM analytics as a service can help here.
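To make this a bit more concrete, here is a minimal sketch (purely illustrative – the data, option names and function are hypothetical, not any vendor’s API) of the kind of analysis such a service could run: counting which configuration options are ordered together most often, so engineering can see which variants actually drive demand.

```python
# Hypothetical illustration: mining ordered product configurations
# to find the option combinations customers actually buy together.

from collections import Counter
from itertools import combinations

# Made-up order records: each order is the set of options the customer picked.
orders = [
    {"frame:alu", "color:red", "motor:250W"},
    {"frame:alu", "color:black", "motor:250W"},
    {"frame:carbon", "color:black", "motor:500W"},
    {"frame:alu", "color:red", "motor:250W"},
]

def popular_option_pairs(orders, top_n=3):
    """Return the option pairs that are most frequently ordered together."""
    pair_counts = Counter()
    for order in orders:
        for pair in combinations(sorted(order), 2):
            pair_counts[pair] += 1
    return pair_counts.most_common(top_n)

if __name__ == "__main__":
    for (opt_a, opt_b), count in popular_option_pairs(orders):
        print(f"{opt_a} + {opt_b}: ordered together {count} times")
```

A real “analytics as a service” would of course run this kind of crunching over millions of orders and feed the results back into configuration and design decisions, rather than just documenting them.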
What is my conclusion? Data is the new oil. Whoever has access to the most accurate data will have the power to optimize processes, cut costs and deliver products faster. PLM companies should take note and think about how to move from “documenting” data about design and processes towards analytical applications and actionable data. Just my thoughts…
Best, Oleg