Monday, November 10, 2008

I'm just getting around to posting this...

It has been a ridiculously busy few months. While I have been in a few countries, the guys in the lab have been putting together a few new projects that are kind of cool. This one actually uses our real-time 3D modeling engine to let people interact with a touch screen and drive the application. I guess what is pretty unique here is that it takes actual CAD data out of .DXF/.DWG files and lets people manipulate it in real time.
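For the curious, part of what makes pulling geometry straight out of the files practical is that ASCII DXF is just a flat stream of (group code, value) line pairs, with codes 10/20 and 11/21 carrying a LINE entity's start and end points. Here's a toy Python sketch of that layout; the group codes are real DXF, but this is an illustration of the format only, not our engine's actual ingest path.

```python
def parse_dxf_lines(text):
    """Extract LINE segments from a minimal ASCII DXF stream: the file
    is a flat sequence of (group-code, value) line pairs; codes 10/20
    hold a LINE's start point and 11/21 its end point."""
    rows = [r.strip() for r in text.splitlines()]
    pairs = list(zip(rows[0::2], rows[1::2]))   # (code, value) pairs
    lines, current = [], None
    for code, value in pairs:
        if code == "0":                         # a new entity begins
            if current is not None:
                lines.append(current)
            current = {} if value == "LINE" else None
        elif current is not None and code in ("10", "20", "11", "21"):
            current[code] = float(value)
    if current is not None:
        lines.append(current)
    return [((d["10"], d["20"]), (d["11"], d["21"])) for d in lines]

# A hand-rolled two-point LINE entity, just to show the pairing:
sample = "0\nSECTION\n2\nENTITIES\n0\nLINE\n8\n0\n10\n0.0\n20\n0.0\n11\n3.0\n21\n4.0\n0\nENDSEC"
print(parse_dxf_lines(sample))  # [((0.0, 0.0), (3.0, 4.0))]
```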





We call the application AirTouch. Although the user does touch the screen, the image kind of looks like it is in air.

Instead of some pre-rendered camera path or a “generally accurate” gesture input, this is actually rendering CAD @ 60Hz on the fly. Yes, it can do 120Hz for stereo, but that doesn’t translate well onto the web.


Anyway, we see some pretty big markets for this type of technology, including architecture, engineering, medical and even entertainment. Who knows, but as of today, it is a good way for people to take things from the virtual world and interact with them in the real world.

Oh, and it also works off of proximity detection (thanks, Steve), so you don't need to actually touch anything to determine the Z plane. So for you doctors out there who want to go through scans in real time without touching anything, here ya go.
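To give a sense of how a proximity reading becomes a Z value: you clamp the sensor's distance into its working range and normalize it. A minimal sketch; the range numbers here are invented for illustration, and the real system is obviously more involved.

```python
def depth_from_proximity(distance_mm, near_mm=50.0, far_mm=600.0):
    """Map a hand-proximity reading to a normalized Z value in [0, 1]:
    0 at the far edge of the sensing volume, 1 right at the screen.
    The near/far bounds are illustrative, not real sensor specs."""
    clamped = max(near_mm, min(far_mm, distance_mm))
    return (far_mm - clamped) / (far_mm - near_mm)

print(depth_from_proximity(325))  # halfway into the volume -> 0.5
```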

Saturday, September 13, 2008

Been busy this month

Alright, this might not really be "out of the lab" stuff, but over the last month, we have done a few things that were pretty interesting. The thought was to take some of the stuff we already have and make it better. We did turn back to the boys in the lab to refine these, so it technically is "out of the lab", but more from an advancement standpoint.

The result is a few projects that we put together over the last month. Here is a quick look and an explanation of why they are cool...or, at least, why I think they are cool.

1. This is one of our standard 360 degree video domes, but the twist on this one is the content. We usually take a mix of CG and still imagery, as there is no high-res source that can capture hemispherical footage. We crafted up this monster of a rig to shoot the world's first 4K hemispherical video. Yup, that is a Nikkor 6mm on a RED ONE.




...And the result.

2. This is a pic of something we did for Google out at Fashion Week in NYC. It is using one of our standard multi-channel HD rigs, but the cool twist is that we embedded the screens behind mirrors and put it in a 3-sided box that was 10' tall. We used one of our 8 channel HD surround rigs behind glass, which gave the appearance of having the imagery floating in space. We "interacti-fied" it by writing an iPhone app that let us drive the media, sequence and content from any of our phones.



3. Not being busy enough, we decided to take our multitouch technology to the next level by figuring out how to integrate our own "deep zoom" type of functionality into the app. Basically, we let people take a trip from space and drill down into a virtual tour of Dubai, all from the comfort of the Javits Center. I would post a video, but I didn't go, so we don't have any video documentation of this thing. Sucks.



Anyway, besides what we did for GM last year, this is the only other time I have ever seen a 6'x4' multitouch wall that the public could play with.
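For anyone curious what a "deep zoom" does under the hood: the common trick is to pre-cut the image into a pyramid of levels, halving the resolution at each step, so the renderer only ever streams the handful of tiles currently in view. This sketch follows the Microsoft Deep Zoom pyramid convention for illustration; it is not our actual implementation.

```python
import math

def dzi_levels(width, height, tile=256):
    """Build a Deep Zoom-style pyramid: the top level is full
    resolution, each lower level halves both dimensions (rounding up)
    down to 1x1, and every level is cut into tile-by-tile tiles.
    Returns (level, width, height, tile_count) per level."""
    max_level = math.ceil(math.log2(max(width, height)))
    levels = []
    w, h = width, height
    for level in range(max_level, -1, -1):
        cols = math.ceil(w / tile)
        rows = math.ceil(h / tile)
        levels.append((level, w, h, cols * rows))
        w = (w + 1) // 2   # halve, rounding up
        h = (h + 1) // 2
    return levels

# A 4K-square source needs 13 levels; only the visible tiles are drawn.
print(dzi_levels(4096, 4096)[0])  # (12, 4096, 4096, 256)
```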

Monday, August 4, 2008

New VisionAire technology from Obscura

Here is the latest. Alright, alright, it is not really "multi-touch", because you don't really touch anything. The system just senses where the presenter's hands are and allows him to interact. Multiple people could be doing this too.

We call it VisionAire. Get it? "Vision" and "air", with a little European flair. Basically, we were looking for a new way to allow a presenter to interface with visual data. This uses our standard multi-touch framework and integrates it with the Musion system we have in house. The result is a truly interactive way to give presentations.



See for yourself.

Friday, July 18, 2008

Quick Studio Tour

There may be some audio issues with the guys down at CNET, but generally, it is a pretty cool little piece on us. Click here to watch the piece.

Wednesday, July 16, 2008

Not your grandpa's QTVR

This may not look like much (unless you know what it is), but it is the latest example of real-time 3D modeling driven by one of our touch screens. The cool part of this type of thing is that the user is navigating through a 3D Studio Max file in real time. This is not pre-rendered; it is outputting a 1400x1050 visualization that is driven by the touch screen. Note the zoom, pan and rotate features. Lots 'o math going on here.
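Some of that math is the standard two-finger gesture decomposition: the ratio of finger spans gives zoom, the change in the angle between the fingers gives rotation, and the motion of the midpoint gives pan. A minimal sketch of the idea, not our actual FireFrame code:

```python
import math

def pinch_transform(p0, p1, q0, q1):
    """Derive camera zoom, rotation, and pan from a two-finger gesture:
    (p0, p1) are the touch points at the start of the frame and
    (q0, q1) at the end. Scale is the ratio of finger spans, rotation
    the change in the finger angle, pan the motion of the midpoint."""
    def span(a, b):
        return math.hypot(b[0] - a[0], b[1] - a[1])
    def angle(a, b):
        return math.atan2(b[1] - a[1], b[0] - a[0])
    def mid(a, b):
        return ((a[0] + b[0]) / 2, (a[1] + b[1]) / 2)

    scale = span(q0, q1) / span(p0, p1)
    rotation = angle(q0, q1) - angle(p0, p1)
    m0, m1 = mid(p0, p1), mid(q0, q1)
    pan = (m1[0] - m0[0], m1[1] - m0[1])
    return scale, rotation, pan

# Spreading two fingers from 2 units apart to 4 doubles the zoom:
# pinch_transform((0, 0), (2, 0), (0, 0), (4, 0)) -> (2.0, 0.0, (1.0, 0.0))
```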


Basically, the guys wrapped the Max application in our FireFrame server and wrote hooks to render at 60fps. This can work with any type of 3D model. The future of visualization has arrived, and is currently on sale in aisle 4 down at Obscura :-)

Saturday, May 10, 2008

The best use of multi-touch I have seen yet

When I got back from the Google deal in NYC, the tech guys had set this up in the studio. I walked in to see Rooney G. going for a new high score. I thought it was cool, so I jumped in to help.

OK, it may be pretty lame, but this is actually a great use of multi-touch. Gaming and entertainment may be the logical commercial uses of this type of technology.



This retro game is given a new lease on life with this type of technology. Oh, and it also plays multiple streams of 720p video, but who's counting.

Sunday, May 4, 2008

This was a tough one...



OK, so when we decided to do this with our client, we all kind of looked at each other and said "it can't be that hard". After all, our technology is rock-solid, the projectors are capable and it is "only" 250,000 sq feet of synchronized video, so how hard could it be? New York City, Meatpacking District, springtime, it will be a breeze.

The way we designed it, we had 6 buildings and 12 surfaces that needed to be lit up. Challenges included (but were not at all limited to):

- The projectors were off angle by about 40 degrees from each surface
- We couldn't project into the windows of the buildings, so as not to disturb hotel guests/patrons (get your head around that)
- There was a ton of ambient light, making projection difficult to see
- It was raining
- There was no power on the roofs
- Each projector had a custom lens solution to accommodate its throw/coverage
- Cranes were not available to move gear onto the roofs, so we had to use boom lifts
- There was no way to run cabling between buildings to carry video signal
- Projection surfaces consisted of brick, glass, steel and cobblestone
- Did I mention that it was raining?

Our solution was to use state-of-the-art 30,000-lumen projectors with our FireFrame servers to make up a 24,000-pixel-wide composition. 20 projectors, 4 locations, 1 mile apart and 6 stories up: difficult. That, combined with the aforementioned "issues", made the challenge monumental.
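On the arithmetic of that composition: with edge-blended projectors, each adjacent pair shares an overlap region, so the total width is less than the sum of the parts. A quick sketch of the bookkeeping; the 1400 px per-projector width and the 210 px blend are assumptions for illustration, not the actual rig specs.

```python
def composition_width(n_projectors, px_per_projector, overlap_px):
    """Total horizontal resolution of a row of edge-blended projectors:
    each of the (n - 1) adjacent pairs shares overlap_px pixels for
    the blend region, so those pixels are only counted once."""
    return n_projectors * px_per_projector - (n_projectors - 1) * overlap_px

# 20 illustrative 1400 px channels with a 210 px blend land right
# around a 24,000 px composition:
print(composition_width(20, 1400, 210))  # 24010
```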

Well, our friends in the city of NYC were actually very cooperative and we got the whole thing permitted and arranged in about 3 weeks. Sure, we had a few obstacles to get around but when it came time to hit the magic "go" button, it all worked out flawlessly.

The art community of NYC turned out in force for the launch of the iGoogle product, and was in awe of our production.

If anyone wants to hear the real challenges (the ones listed were trivial in comparison), please drop me a note. Thanks to Google for actually believing that this could happen (and funding it) and my friend Hashem for setting it up from the get-go.





Wheeew, it worked and was a rockin' success.

Wednesday, March 5, 2008

2πr² never looked so good

I had to think back to high school geometry to figure out how large this projection screen was: the surface area of a hemisphere is 2πr². I think I got a B, so I may be wrong, but I came up with 13,288 square feet of video...now that's a pretty big screen. It appears even bigger when it is wrapped around you. That is how big the screen was that we used for Nakheel, arguably the world's most creative real estate developer.
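Checking the math: 2πr² is indeed the surface area of a hemisphere, so you can run the formula backwards and recover the dome's size from the screen area.

```python
import math

def dome_radius_from_area(area_sq_ft):
    """Invert the hemisphere surface-area formula A = 2*pi*r^2
    to recover the dome radius from its screen area."""
    return math.sqrt(area_sq_ft / (2 * math.pi))

r = dome_radius_from_area(13288)
print(round(r, 1), round(2 * r, 1))  # radius ~46.0 ft, diameter ~92.0 ft
```

So 13,288 square feet of video works out to a dome roughly 92 feet across.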

The idea was to provide an environment where the company could immerse their audience in their brand and the new Blue Communities initiative. It would also serve as an environment to announce their latest development, The Universe.

This video dome was 3' larger than the Google dome, so it was, and is, the world's largest video dome. We used 14 Christie HD video projectors and our FireFrame servers to put this piece of technology together. It was up for over a month in Dubai, and served as a theater for all of Nakheel's visitors.

We incorporated a new audio system too, this time using 8 channels of discrete audio running through a new set of Meyers. Surround never sounded so good.

Saturday, February 16, 2008

Holographic kiosks for GM at the NAIAS

Yeah, it's been a while since the last post. I've been kinda busy. We just got back from Detroit, where we put together a really cool display for GM that consists of 5 interactive HoloPro kiosks integrated with 70 linear feet of holographic glass behind the kiosks. The result was that when people were using the kiosks, the walls behind them lit up with complementary media, kinda like that Tom Cruise movie that everyone talks about...what's it called?