COFES 2017 – My Favorite Roundtables

The Congress on the Future of Engineering Software (COFES) will take place later this week in sunny Scottsdale, AZ. Timing is everything. For the last ten years, traveling to COFES from snowy and rainy Boston has always been an adventure that shifts your mind into a different mode. So, starting Thursday, I will put myself in a warm nirvana of “thinking” about the future with a group of great people at COFES.

This year, COFES is focusing on the topic of complexity and transformation. Check my earlier blog – PLM complexity and future transformation. Here is how the COFES website describes the problem of complexity in modern engineering software:

The growth of complexity in everything we do is presenting us with new and difficult challenges, from our constantly changing business environment, to conflicting requirements of more simplicity (to the customer) in products that require more complexity to deliver. New phenomena result from complexity, often requiring consideration of things that were not previously an issue. The demands of IoT, the emergence of additive manufacturing. And it’s not just products: emergent properties of complexity occur in processes, in IT, in business models, in politics, and in economies. Rather than focus on how to design complex systems from scratch, our attention will be on interventions that can transform existing systems to mitigate the effects of complexity.

Developing a complex system from scratch is hard. One of my favorite theories is John Gall’s law of building complex systems (read more here): you can only build a complex system by iterating from a simple system to a more advanced one. So, transforming existing systems is an interesting proposal compared to a big-bang replacement of an existing system with a new one.

One of my favorite parts of COFES is the roundtable sessions. Earlier today, I was skimming through the list of roundtables in the COFES program. Here is the list of my favorites. I look forward to attending some of them later this week.

Rethinking Complexity

This is a passage from the COFES program:

We are being asked to solve ever-more-complex problems. However, the record of failed projects reminds us that the best big solutions are composed of several small solutions. “Composed” is the key idea. Systems thinking is the key to this composing. How can systems thinking change the way we address complexity? Complex systems without the ability to adapt are prone to failure. With systems thinking, we can begin to address that need. How do we design in that adaptability? Do we have the right tools? What about embracing complexity by re-imagining the systems with biological analogues? What’s the cost of not using this approach? Rather than driving complexity out of our solutions, can we design systems that embrace complexity to address complex problems?

Rethinking is a word often used these days in the context of startup companies. You rethink XYZ to bring value to ABC. This is a magic formula that has worked for entrepreneurs across many generations. However, these days, large companies are trying to catch up with innovation and “rethink” their business. It will be interesting to see how existing PLM companies rethink the large and monolithic PLM systems they have built over the last 15-20 years.
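To make “composed” concrete, here is a minimal Python sketch. It is my own illustration, not anything from the COFES program: a bigger process built from small, independently replaceable steps rather than one monolithic block.

```python
from functools import reduce

# Each step is a small, self-contained solution with a single responsibility.
def validate(item):
    assert "part_number" in item, "every item needs a part number"
    return item

def enrich(item):
    # Hypothetical enrichment step, e.g., attaching supplier data.
    return {**item, "supplier": "ACME"}

def publish(item):
    print(f"published {item['part_number']}")
    return item

def compose(*steps):
    """Compose small steps into a bigger solution; any step can be swapped out."""
    return lambda item: reduce(lambda acc, step: step(acc), steps, item)

process_item = compose(validate, enrich, publish)
process_item({"part_number": "PN-100"})
```

The point is not the code itself: each small step can be rethought and replaced without a big-bang rewrite of the whole pipeline, which is exactly what a monolith makes hard.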

The Intersection of Model-Based Design and System Engineering

Products are getting more complex these days. A new way to think about manufactured products is as a “computer on wheels” (car) or software wrapped in plastic (gadgets, phones, etc.). Manufacturing these products is a much more complex task than anything the industry has seen in the last century, so it will require new models, approaches, and software. Hence the topic of model-based design and systems engineering.

Designs in mechanical engineering, electronics, interconnect, IoT, and software progress from simple simulations to more complex and realistic simulations, to real prototypes. Model-in-the-Loop runs models on emulated hardware. Software-in-the-Loop runs software on emulated hardware. Hardware-in-the-Loop runs software on actual hardware, typically at the prototype stage. At each simulation/emulation stage, X-in-the-Loop emulation requires content from the other domains as a baseline. How are those system configurations managed across those domains? You might have a dynamic 3D model of mechanical hardware, a prototype of a printed-circuit board, a diagram of the product interconnect, a fully functioning cloud-based system for IoT, and a UML model for software. A month later, the latest system configuration could be very different. Managing that configuration, and simulating it, could be tremendously advantageous. How would we go about doing that? What challenges are still unresolved?
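To make the configuration question in this teaser a bit more concrete, here is a minimal Python sketch of a cross-domain baseline. All class names, fields, and values are my own assumptions for illustration; real X-in-the-Loop configuration management is, of course, much richer.

```python
from dataclasses import dataclass
from datetime import date

@dataclass(frozen=True)
class DomainArtifact:
    domain: str      # e.g., "mechanical", "pcb", "interconnect", "iot", "software"
    name: str        # e.g., "housing.step", "main-board", "firmware"
    version: str     # each domain versions its own artifact

@dataclass(frozen=True)
class SystemBaseline:
    """An immutable snapshot of all domain artifacts at a point in time."""
    label: str
    taken_on: date
    artifacts: tuple  # tuple of DomainArtifact, frozen together

    def diff(self, other):
        """Report which domains changed between two baselines."""
        mine = {a.domain: a.version for a in self.artifacts}
        theirs = {a.domain: a.version for a in other.artifacts}
        return {d: (mine.get(d), theirs.get(d))
                for d in mine.keys() | theirs.keys()
                if mine.get(d) != theirs.get(d)}

march = SystemBaseline("MIL-run-1", date(2017, 3, 1), (
    DomainArtifact("mechanical", "housing.step", "A.3"),
    DomainArtifact("software", "firmware", "0.9.1"),
))
april = SystemBaseline("HIL-run-1", date(2017, 4, 1), (
    DomainArtifact("mechanical", "housing.step", "A.3"),
    DomainArtifact("software", "firmware", "1.0.0"),
))
print(april.diff(march))  # -> {'software': ('1.0.0', '0.9.1')}
```

The baseline-and-diff idea is the essence of the question: a month later, the latest system configuration could be very different, and you need to know exactly which domain moved before you can trust a simulation run against it.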

Handling Massive Streams of IoT Data

I have speculated about whether existing PLM systems will be able to handle huge amounts of data. Read my earlier blog here – IoT data will blow up existing traditional PLM databases. The IoT is coming, and PLM companies are developing strategies to catch the IoT wave. One of the questions that is still unclear: how will IoT redefine PLM?

In his COFES 2007 keynote, Bruce Sterling introduced us to the idea of the Internet of Things. Now, 10 years later, IoT is everywhere. At its core are sensors and communications. Sensors are generating ever-increasing amounts of data, and there is great value in it. The challenge we face is one of scale. The amount of data is increasing at a rate much, much greater than the communication bandwidth available to transmit it. We no longer have the ability to transmit all the data from where it is collected to where it is consumed. In fact, we no longer have the capacity to store all the incoming data. Where does that leave us? What strategies can we deploy, without losing value?
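One family of strategies the teaser hints at is summarizing at the edge: transmit aggregates and anomalies rather than the raw stream. Here is a minimal sketch of that idea; the window size, threshold, and names are my own assumptions.

```python
from statistics import mean, stdev

def summarize_window(readings, anomaly_sigma=3.0):
    """Reduce a window of raw sensor readings to a small summary plus outliers.

    Only the summary and the anomalous points cross the network;
    the raw stream stays (or is discarded) at the edge.
    """
    mu, sigma = mean(readings), stdev(readings)
    anomalies = [r for r in readings if abs(r - mu) > anomaly_sigma * sigma]
    return {"count": len(readings), "mean": mu, "stdev": sigma,
            "min": min(readings), "max": max(readings),
            "anomalies": anomalies}

# 1,000 raw readings collapse into one small payload.
window = [20.0 + 0.01 * i for i in range(1000)]
window[500] = 95.0  # an outlier worth transmitting
print(summarize_window(window))
```

The trade-off is the heart of the roundtable question: which value survives the summarization, and which is lost with the raw data you never sent or stored?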

Evolution of the Digital Twin Vision

Digital twin is a confusing term. Some manufacturers claim they’ve been doing it forever and are actually very surprised that PLM vendors are coming out with a “new” thing called the digital twin. Earlier this week I had a chance to attend the CIMdata forum, which provided a good clarifying story about the digital twin. But the subject is morphing into something bigger (sometimes described as the digitalization trend). Here is an interesting teaser for the digital twin discussion at COFES.

The idea of digital twins came from aerospace, where there is a need to understand the wear and tear on each individual plane, and an opportunity to analyze the stress experienced by that specific plane to influence how it should be maintained. Since then, that sharp vision has been applied in other arenas and the definition expanded and morphed. What is the consensus of the vision for digital twins today? Where is it likely to expand in its value and use? What new opportunities are likely to be presented? How might the vision evolve from here?
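To ground the aerospace origin of the term, here is a minimal sketch of a per-serial-number twin that accumulates an individual airframe’s actual load history and turns it into a maintenance signal. The class, fields, and threshold are my own illustrative assumptions, not any vendor’s API.

```python
from dataclasses import dataclass, field

@dataclass
class DigitalTwin:
    """One twin per physical asset: same design, individual history."""
    serial_number: str
    design_life_cycles: int          # what the design model predicts
    observed_cycles: int = 0
    stress_events: list = field(default_factory=list)

    def record_flight(self, cycles, peak_stress_mpa):
        self.observed_cycles += cycles
        self.stress_events.append(peak_stress_mpa)

    def maintenance_due(self, stress_limit_mpa=300.0):
        # Maintenance is driven by this airframe's actual history,
        # not by the fleet-wide average.
        overstressed = any(s > stress_limit_mpa for s in self.stress_events)
        worn = self.observed_cycles >= 0.8 * self.design_life_cycles
        return overstressed or worn

tail_123 = DigitalTwin("N123AB", design_life_cycles=60000)
tail_123.record_flight(cycles=2, peak_stress_mpa=310.0)  # hard landing
print(tail_123.maintenance_due())  # True: this plane, not the fleet, needs a check
```

The sharp original vision is exactly this separation: one shared design definition, but a distinct, data-fed record of wear and tear for each individual asset.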

What is my conclusion? The PLM industry is stressed by a growing level of complexity and by the need to preserve the value of the investment manufacturing companies have made in PLM systems. PLM vendors don’t have the luxury of acting in a disconnected way. Therefore, we are going to see a lot of discussion about how to provide complementary solutions and transform the industry as we go. Just my thoughts..

Best, Oleg

Want to learn more about PLM? Check out my new PLM Book website.

Disclaimer: I’m co-founder and CEO of openBoM, developing a cloud-based bill of materials and inventory management tool for manufacturing companies, hardware startups, and supply chains. My opinion can be unintentionally biased.
