PLM Prompt: 3D, PLM and Eyetracking Technologies

I think we are moving fast toward a combination of the virtual and physical worlds. How can we get closer and combine our experience in these two worlds? In one of my previous prompts, "Combine virtual and physical map overlay with G1", I discussed how it is possible to merge physical and virtual maps.

Today my prompt is about how we can combine virtual modeling and virtual experience with eye-tracking technologies. This is absolutely cool stuff if you think about it in the context of 3D modeling and virtual prototyping. Here are a few examples from a company called Think.



I also had a chance to see some eye-tracking technology on our corporate blog 3D perspectives, "Eye Tracking Super Power with Tobii".

So, what do you think about it? Does it make sense in mainstream 3D modeling?

Best, Oleg


  • Oleg,
    We have been talking about user interfaces for modeling today, based on the Multitouch video that just came out. My conclusion after some discussion is that we will have a mix of technologies playing the interface between person and machine for modeling. What seems to be missing, though, is the paradigm of taking a physical activity (like modeling in clay), digitizing it, and improving it with computing power. The examples of tracking eye gaze and movement to understand what a person is thinking and viewing make perfect sense to me. But we manipulate things with our hands, and observe with our eyes. How comfortable would we be manipulating objects with our eyes? It doesn’t seem like the right analog to the physical activity. I would predict longer learning curves, less intuitive interfaces, more time spent on the interface than on the activity, and probable eye strain. Having said all of that, I am not an ergonomics or user interface expert, so I would be happy to hear from someone with real experience in this. Maybe the experience of the disabled would be a good place to start researching? And I have to admit that I am typing this with my hands instead of speaking, so there is something to be said for being able to learn how to translate one physical activity into a different function. I am curious how you feel it would work for you. Also, what are your thoughts on the Multitouch modeling paradigm?

  • Jim, Thanks for your comments! For me, eye-tracking is first of all a technology that enables us to develop better products (and not a "control system" for user interfaces). My point on eye-tracking technologies is mostly about trying to quantify or estimate the quality of a developed product. So a straightforward use case is to estimate how visible a product is on the shelf, or a car on the street, etc. Does it make sense? I can hardly believe we can develop eye-driven CAD systems… not yet ;)… best, Oleg

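The product-visibility use case from the comment above (how long does a product on a shelf hold the viewer's gaze?) can be sketched with a toy gaze-data analysis. This is a minimal illustration, not any real eye-tracker API: the sample format, the `dwell_time` function, and the 60 Hz sampling rate are all assumptions made for the example.

```python
# Toy sketch: estimate how much attention a region of interest (ROI)
# attracts, given recorded gaze samples. The data format and ROI are
# illustrative assumptions, not the API of a real eye tracker.

def dwell_time(samples, roi, sample_interval_ms=16):
    """Sum the time gaze points spend inside a rectangular ROI.

    samples: list of (x, y) gaze coordinates, one per sampling tick
    roi: (x_min, y_min, x_max, y_max), e.g. a product on a shelf
    sample_interval_ms: time between samples (~60 Hz tracker assumed)
    """
    x_min, y_min, x_max, y_max = roi
    hits = sum(1 for x, y in samples
               if x_min <= x <= x_max and y_min <= y <= y_max)
    return hits * sample_interval_ms

# Example: gaze wanders, then fixates on a "product" occupying
# the rectangle (100, 50)-(200, 150) in screen coordinates.
gaze = [(10, 10), (50, 40), (120, 80), (130, 90), (140, 100), (300, 200)]
print(dwell_time(gaze, roi=(100, 50, 200, 150)))  # 3 samples inside -> 48
```

In a real study the same idea scales up: aggregate dwell times over many viewers and ROIs to compare design variants, which is exactly the "quantify product quality" angle rather than an eye-driven control scheme.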