Monday, November 10, 2008

I'm just getting around to posting this...

It has been a ridiculously busy few months. While I've been bouncing around a few countries, the guys in the lab have been putting together a few new projects that are kind of cool. This one uses our real-time 3D modeling engine to let people drive the application from a touch screen. I guess what's pretty unique here is that it's taking actual CAD data out of .DXF/.DWG files and letting people manipulate it in real time.
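
For anyone curious what "taking CAD data out of a .DXF file" means at the lowest level, here's a rough sketch in Python. It only pulls simple LINE entities out of an ASCII DXF; the real engine obviously handles far more than this (polylines, arcs, blocks, solids, binary DXF/DWG), and every name below is just illustrative.

```python
def read_dxf_lines(path):
    """Return a list of ((x1, y1, z1), (x2, y2, z2)) segments from an ASCII DXF."""
    with open(path) as f:
        raw = [line.strip() for line in f]

    # An ASCII DXF is a flat sequence of (group code, value) pairs,
    # one per line, alternating.
    pairs = list(zip(raw[0::2], raw[1::2]))

    segments = []
    current = None
    for code, value in pairs:
        if code == "0":                       # group code 0 starts a new entity
            if current is not None:
                segments.append(current.finish())
            current = LineBuilder() if value == "LINE" else None
        elif current is not None:
            current.set(code, value)
    if current is not None:
        segments.append(current.finish())
    return segments


class LineBuilder:
    """Accumulates the start/end coordinates of one LINE entity."""
    # Group codes 10/20/30 are the start point, 11/21/31 the end point.
    CODES = {"10": (0, 0), "20": (0, 1), "30": (0, 2),
             "11": (1, 0), "21": (1, 1), "31": (1, 2)}

    def __init__(self):
        self.points = [[0.0, 0.0, 0.0], [0.0, 0.0, 0.0]]

    def set(self, code, value):
        if code in self.CODES:
            point, axis = self.CODES[code]
            self.points[point][axis] = float(value)

    def finish(self):
        return tuple(self.points[0]), tuple(self.points[1])
```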

We call the application AirTouch. The user does touch the screen, but the image kind of looks like it's floating in the air.

Instead of some pre-rendered camera path or a “generally accurate” gesture input, this is actually rendering the CAD data at 60 Hz on the fly. Yes, it can do 120 Hz for stereo, but that doesn't translate well to a web video.
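
The difference is basically this: nothing is baked ahead of time, every frame re-reads the touch state and redraws the live model. A little sketch of that kind of loop is below; scene, touch, and renderer are stand-ins, not our engine's actual API.

```python
import time

TARGET_HZ = 60          # 120 for active-stereo output
FRAME_DT = 1.0 / TARGET_HZ

def run_viewer(scene, touch, renderer):
    """Fixed-rate loop: touch gestures drive the model frame by frame."""
    while not touch.quit_requested():
        start = time.perf_counter()

        # Apply whatever gestures came in since the last frame.
        for gesture in touch.poll():
            if gesture.kind == "drag":
                scene.rotate(gesture.dx, gesture.dy)
            elif gesture.kind == "pinch":
                scene.zoom(gesture.scale)

        renderer.draw(scene)          # full redraw of the CAD geometry

        # Sleep off whatever is left of the ~16.7 ms frame budget.
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_DT:
            time.sleep(FRAME_DT - elapsed)
```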


Anyway, we see some pretty big markets for this type of technology, including architecture, engineering, medicine and even entertainment. Who knows where it ends up, but as of today it's a good way for people to take things from the virtual world and interact with them in the real world.

Oh, and it also works off proximity detection (thanks, Steve), so you don't need to actually touch anything for it to determine the Z plane. For the doctors out there who want to page through scans in real time without touching anything: here ya go.
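
The idea is simple enough to sketch: map how far your hand hovers above the screen to a slice in the scan stack. The sensor range below (20-200 mm) is a made-up example, not a spec from our hardware.

```python
def slice_from_proximity(hover_mm, stack_depth_slices,
                         near_mm=20.0, far_mm=200.0):
    """Map hover distance above the screen to a slice index in a scan stack,
    so the whole volume can be browsed without touching anything."""
    # Clamp to the sensor's usable range, then normalise to 0..1.
    d = min(max(hover_mm, near_mm), far_mm)
    t = (d - near_mm) / (far_mm - near_mm)
    # Closest hover = first slice, farthest hover = last slice.
    return int(round(t * (stack_depth_slices - 1)))
```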