Posts tagged as:

Data

How PLM can ride big data trend in 2015

by Oleg on December 22, 2014 · 0 comments


A few months ago, I shared the story of True & Co – a company actively experimenting with and leveraging data science to improve design and customer experience. You can catch up by following this link – PLM and Big Data Driven Product Design. One of the most interesting pieces of the True & Co experience I’ve learned about was the company’s ability to gather a massive amount of data about its customers and turn it into information that improves the product design process.

Earlier this week, the article What’s next for big data prediction for 2015 caught my attention. I know… it is end-of-year “prediction madness”. Nevertheless, I found the following passage interesting. It speaks about the emerging trend of Information-as-a-Service. Read this.

The popularity of “as-a-Service” delivery models is only going to increase in the years ahead. On the heels of the success of software as a service models, I believe Information-as-a-Service (IaaS) or Expertise-as-a-Service delivery models are likely the next step in the evolution. The tutoring industry provides a good blueprint for how this might look. Unlike traditional IT contractors, tutors are not necessarily hired to accomplish any one specific task, but are instead paid for short periods of time to share expertise and information.

Now imagine a similar model within the context of data analytics. The shortfall most often discussed with regard to analytics is not in tooling but in expertise. In that sense, it’s not hard to imagine a world where companies express an interest in “renting” expertise from vendors. It could be in the form of human expertise, but it could also be in the form of algorithmic expertise, whereby analytics vendors develop delivery models through which companies rent algorithms for use and application within their own applications. Regardless of what form it takes in terms of its actual delivery, the notion of information or expertise as a service is an inevitability, and 2015 might just be the year IT vendors start to embrace it.

It made me think about how PLM can shift its role from merely “documenting and managing data and processes” towards providing services that improve them by capturing and crunching large amounts of data in an organization. Let’s talk about product configurations – one of the most complicated elements of engineering and manufacturing. The mass production model is a thing of the past. We are moving towards mass customization. How will manufacturing companies be able to drive down the cost of products and keep up with the demand for mass customization? Intelligent PLM analytics as a service will be able to help here.
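To make the idea concrete, here is a minimal sketch of the kind of analytics a PLM service could run over configuration data. The order records and option names are hypothetical; the point is that counting which options customers order together can reveal candidates for pre-built modules, which helps keep mass customization affordable.

```python
from collections import Counter
from itertools import combinations

# Hypothetical order records: each order is the set of options a customer chose.
orders = [
    {"engine-v6", "sunroof", "tow-package"},
    {"engine-v6", "sunroof"},
    {"engine-v4", "sunroof"},
    {"engine-v6", "sunroof", "tow-package"},
]

# Count how often pairs of options are ordered together. Frequently
# co-occurring options are candidates for a pre-built module.
pair_counts = Counter()
for order in orders:
    for pair in combinations(sorted(order), 2):
        pair_counts[pair] += 1

# The most common pairs point at where modularization would pay off.
for pair, count in pair_counts.most_common(3):
    print(pair, count)
```

A real service would of course run this over millions of orders and many more dimensions (region, price band, warranty claims), but the shape of the analysis is the same.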

What is my conclusion? Data is the new oil. Whoever has access to the most accurate data will have the power to optimize processes, cut costs and deliver products faster. PLM companies should take note and think about how to move from “documenting” data about design and processes towards analytical applications and actionable data. Just my thoughts…

Best, Oleg


PLM: from sync to link

by Oleg on October 17, 2014 · 10 comments


Data has an important place in our lives. Shopping lists, calendars, emails, websites, family photos, trip videos, documents, etc. We want our data to be well organized and easy to find. Marketing folks like to use the term “data at your fingertips”. However, the reality is just the opposite. Data is messy. We store it in multiple places, we forget the names of documents and we can hardly control it.

Everything I said above applies to manufacturing companies too. But there it gets even more complicated. Departments, contractors, suppliers, multiple locations and multiple systems. So, data lives in silos – databases, network drives, multiple enterprise systems. In my article – PLM One Big Silo – I’ve been talking about organizational and application silos. The data landscape in every manufacturing company is very complex. Software vendors are trying to crush silos by introducing large platforms that can help to integrate and connect information. It takes time and huge cost to implement such a system in a real-world organization, which makes it almost a dream for many companies.

In my view, openness will play a key role in the ability of systems to integrate and interconnect. It will help to get access to information across the silos, and it leads to one of the key problems of data sharing and identity. Managing data in silos is a complex task. It takes time to organize data, to figure out how to interconnect it, to organize data reporting and to support data consistency. I covered this in more detail in my PLM implementations: nuts and bolts of data silos article.

Joe Barkai’s article Design Reuse: Reusing vs. Cloning and Owning speaks about the problem of data reuse. In my view, the data reuse problem is real and connected directly to the issue of data silos. I liked the following passage from Joe’s article:

If commonly used and shared parts and subsystems carry separate identities, then the ability to share lifecycle information across products and with suppliers is highly diminished, especially when products are in different phases of their lifecycle. In fact, the value of knowledge sharing can be greater when it’s done out of sync with lifecycle phase. Imagine, for example, the value of knowing the manufacturing ramp up experience of a subsystem and the engineering change orders (ECOs) that have been implemented to correct them before a new design is frozen. In an organization that practices “cloning and owning”, it’s highly unlikely that this kind of knowledge is common knowledge and is available outside that product line.

An effective design reuse strategy must be built upon a centralized repository of reusable objects. Each object—a part, a design, a best practice—should be associated with its lifecycle experience: quality reports, ECOs, supplier incoming inspections, reliability, warranty claims, and all other representations of organizational knowledge that is conducive and critical to making better design, manufacturing and service related decisions.

Unfortunately, the way most companies and software vendors solve this problem today is just data sync. Yes, data is synced between multiple systems. Brutally. Without thinking twice. In the race to control information, software vendors and implementing companies are batch-syncing data between multiple databases and applications. Parts, bills of materials, documents, specifications, etc. Data moves back and forth between engineering applications and manufacturing databases. Specifications and design information are synced between OEM-controlled databases and suppliers’ systems. This data synchronization leads to a lot of inefficiency and complexity.

There must be a better way to handle information. To allow efficient data reuse, we need to think more about how to link data together rather than synchronize it between applications and databases. This is not a simple task. An industry that for years took “sync” as the universal way to solve the problem of data integration cannot shift overnight and work differently. But here is some good news. Over the last two decades, web companies have accumulated a lot of experience in managing huge volumes of interconnected data. The move towards cloud services is creating an opportunity to work with data differently. It will provide new technologies for data integration and data management. It can also open new ways to access data across silos. As a system that manages product data, PLM can introduce a new way of linking information and help to reuse data between applications.
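A toy illustration of the difference, using hypothetical part records: a “synced” system keeps its own copy of the data and goes stale between batch jobs, while a “linked” system stores only a reference and resolves it against the source of truth at read time.

```python
# Hypothetical sketch: "sync" copies the value, "link" stores a reference
# that is resolved against the source of truth at read time.

engineering = {"PART-100": {"mass_kg": 1.20}}  # source of truth (e.g. a PDM system)

# Sync: manufacturing keeps its own copy, refreshed only by batch jobs.
manufacturing_synced = {"PART-100": dict(engineering["PART-100"])}

# Link: manufacturing keeps only a reference (here, a system name and a key).
manufacturing_linked = {"PART-100": ("engineering", "PART-100")}

def resolve(link):
    """Follow a link back to the source of truth and read the current value."""
    _system, key = link
    return engineering[key]

# Engineering changes the part...
engineering["PART-100"]["mass_kg"] = 1.35

print(manufacturing_synced["PART-100"]["mass_kg"])           # stale until the next sync
print(resolve(manufacturing_linked["PART-100"])["mass_kg"])  # always current
```

In a real implementation the link would be a stable global identifier (a URI, for example) resolved over a service API rather than an in-memory lookup, but the trade-off is the same: copies drift, references don’t.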

What is my conclusion? There is an opportunity to move from syncing data to linking data. It will simplify data management and help to reuse data. It requires a conceptual rethink of how problems of data integration are solved between vendors. By providing a “link to data” instead of actually “syncing data”, we can help companies streamline processes and improve the quality of products. Just my thoughts…

Best, Oleg


data-interop-iot-plm

One of the most heavily debated topics in the CAD/PLM industry is data interoperability. I remember discussions about data interoperability and standards 10-15 years ago. Even though vendors made some progress in establishing independent data formats, the problem is still here. At the same time, I’m convinced that successful interoperability will play one of the key roles in the future of CAD/PLM. Navigate your browser to my article with a few examples showing how important data interoperability is for building the granular architecture of future applications and collaboration.

IoT (Internet of Things) is a relatively new trend. We started to discuss it only recently. Applications of IoT are bringing lots of interesting opportunities in many domains – smart houses, connected devices, infrastructure operations and many others. However, here is the thing – we can see many companies looking at how to get into the IoT field. By nature, this field is very diverse. I can hardly imagine a single manufacturer supplying everything you need for your “smart house”. So, we are getting (again) into the problem of interoperability between devices, services and processes.

The Forbes article Digital Business Technologies Dominate Gartner 2014 Emerging Technologies Hype Cycle speaks about several business and technological trends. IoT is one of them. The article points to the problem of data interoperability as the one that will most impact future progress in IoT. Here is the passage I captured:

What will slow rapid adoption of IoT? Standardization, including data standards, wireless protocols and technologies. A wide number of consortiums, standards bodies, associations and government/region policies around the globe are tackling the standards issues. Ironically, with so many entities each working on their own interests, we expect the lack of standards to remain a problem over the next three to five years. In contrast, dropping costs of technology, a larger selection of IoT-capable technology vendors and the ease of experimenting continue to push trials, business cases and implementations of IoT forward.

It made me think about two issues. The problem of standardization and data interoperability can only be solved by the business interests of vendors. In the absence of mutual business interests, we will see dumb devices that cannot interconnect and exchange data. The value of IoT solutions will be impacted. The second issue is related to PLM vendors consuming data from multiple devices and services to improve decision making. Standardization in that field can provide an advantage and represents a solid business interest for vendors.

What is my conclusion? We can see an entirely new IoT industry under development these days. Data interoperability is a problem that needs to be resolved sooner rather than later. The roots of data interoperability problems are usually related to the hidden business interests of vendors. Learning from previous mistakes of the CAD/PLM industry can help. CAD/PLM vendors can provide tools that help manufacturing companies build better connected devices. Just my thoughts…

Best, Oleg


4 comments

How to visualize future PLM data?

August 12, 2014

I have a special passion for data and data visualization. We do it every day in our lives. Simple data, complex data, fast data, contextual data… These days, we are surrounded by data as never before. Think about a typical engineer 50-60 years ago. Blueprints, some physical models… Not much information. Nowadays the situation is completely […]

Read the full article →

Importance of data curation for PLM implementations

August 4, 2014

The speed of data creation is amazing these days. According to recent IBM research, 90% of the data in the world today has been created in the last two years alone. I’m not sure if IBM is counting all enterprise data, but it doesn’t change much – we have lots of data. In a manufacturing company, data […]

Read the full article →

PLM security: data and classification complexity

July 30, 2014

Security. It is hard to overstate the importance of the topic. Information is one of the biggest assets companies have. Data and information are the lifeblood of every engineering and manufacturing organization. They are a key element of company IP. Composed of 3D models, Bills of Materials, manufacturing instructions, supplier quotes, regulatory data and zillions […]

Read the full article →

PLM implementations: nuts and bolts of data silos

July 22, 2014

Data is an essential part of every PLM implementation. It all starts from data – design, engineering, manufacturing, supply chain, support, etc. Enterprise systems are fragmented and represent individual silos of the enterprise organization. Managing product data located in multiple enterprise data silos is a challenge for every PLM implementation. To “demolish enterprise data silos” […]

Read the full article →

What PLM Architects and Developers Need to Know about NoSQL?

July 7, 2014

People keep asking me questions about NoSQL. The buzzword “NoSQL” isn’t new. However, I find it still confusing, especially for developers mostly focused on enterprise and business applications. Over the last decade, database technology went from a single obvious choice to a much higher level of diversity. Back in the 1990s, the decision of PDM/PLM developers was more or […]

Read the full article →

PLM One Big Silo

June 9, 2014

Silos are an interesting topic in enterprise software. And they are a very important topic for product lifecycle management. Why so? Because PLM heavily relies on the ability to work and communicate across the organization and the extended value chain. Accessing information in multiple departments, functional domains and applications is part of this story. Silos […]

Read the full article →

Will PLM Vendors Jump into Microsoft Cloud Window in Europe?

April 10, 2014

Cloud is raising lots of controversy in Europe. While manufacturing companies in the U.S. are generally more open towards new tech, their European counterparts are much more conservative. Many of my industry colleagues in Germany, France, Switzerland and other EU countries can probably confirm that. Europe is coming to cloud systems, but much more slowly. I’ve been posting […]

Read the full article →