Jeff Han and Phillip Davidson showed off a remarkable “multitouch” computer interface at last year’s conference. They’ve been busy in the year since, expanding their system into a wall-sized (3×8 foot) display. Most touchscreen systems, Han points out, recognize only a single point of contact. That’s fine for conventional computing, because those systems are built around a single mouse cursor.
But there are some really cool things you can do when a screen can detect multiple touch points. Han and Davidson start with a demo of 250 videos – the most popular videos on YouTube. With two hands, Han can zoom, rotate and pan each video. The huge size of the screen supports collaborative work – the two throw videos across the screen from one person to the other. The video is remarkably sharp – better resolution than HD.
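The zoom, rotate and pan Han performs with two hands presumably falls out of the relative motion of the two contact points. A minimal sketch of that math (the function name and point representation here are illustrative, not Han’s actual API):

```python
import math

def two_finger_transform(p1, p2, q1, q2):
    """Derive the scale, rotation (radians) and translation implied by a
    two-finger gesture, where the fingers move from (p1, p2) to (q1, q2).
    Points are (x, y) tuples."""
    # Vector between the fingers, before and after the move
    vx0, vy0 = p2[0] - p1[0], p2[1] - p1[1]
    vx1, vy1 = q2[0] - q1[0], q2[1] - q1[1]
    # Zoom: ratio of the finger separations
    scale = math.hypot(vx1, vy1) / math.hypot(vx0, vy0)
    # Rotation: change in angle of the inter-finger vector
    rotation = math.atan2(vy1, vx1) - math.atan2(vy0, vx0)
    # Pan: displacement of the gesture's midpoint
    tx = (q1[0] + q2[0]) / 2 - (p1[0] + p2[0]) / 2
    ty = (q1[1] + q2[1]) / 2 - (p1[1] + p2[1]) / 2
    return scale, rotation, (tx, ty)
```

For example, fingers moving from (0, 0) and (2, 0) to (0, 0) and (0, 4) imply a 2× zoom and a quarter-turn rotation.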
To retrieve menus, Han makes a circular motion on the screen – it would be a bad idea to have to reach the upper right corner of the vast screen. He uses the menu to pull up Google Maps imagery. With his other hand, he adds data filters on top of the map – he’s able to move these filters over the map, zooming into them, using them as “lenses” to see the data.
Using these lenses with microscopy imagery, you can immediately see the uses of this technology: applying color enhancement and edge detection to real data. While Han manipulates an image of a MEMS device, Davidson enhances images of blood vessels. The performance is pretty amazing.
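The edge detection shown in the lenses is a standard image-processing operation. A generic sketch of one common approach – a Sobel filter over a grayscale image – not the demo’s actual code:

```python
def sobel_edges(img):
    """Sobel edge detection over a grayscale image, given as a list of
    rows of numbers. Returns a same-sized grid of gradient magnitudes
    (borders left at zero). A plain-Python sketch for clarity, not speed."""
    gx_k = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]   # horizontal gradient kernel
    gy_k = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]   # vertical gradient kernel
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(gx_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            gy = sum(gy_k[j][i] * img[y + j - 1][x + i - 1]
                     for j in range(3) for i in range(3))
            out[y][x] = (gx * gx + gy * gy) ** 0.5  # gradient magnitude
    return out
```

Pixels along an intensity step get a large magnitude; flat regions stay at zero, which is what makes vessel and device boundaries pop out visually.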
The last demo we see is of brainstorming – pulling a keyboard up from the screen, Han enters a keyword and retrieves photos from Flickr matching the word. He touches them again and they sprout tags at their sides. Grab one of those tags and it pulls up more images. He throws a set of photos to Davidson, compresses them, and strings them into necklaces of images.
One possible application of the screen is in filmmaking – storyboarding works because anyone can interact with imagery via a tool as simple as a marker. This multitouch technology gives people a chance to do something as simple and as complex as marking up real images together by touching the screen.
I’m very much looking forward to playing with this later today and photographing it tomorrow…
Could you explain what you mean by “better resolution than HD”?
What resolution was the video?
If videos from YouTube were used, the typical resolution is 320 × 240 – how was this scaled to “better resolution than HD”?
Mike, I don’t have a pixel count for the resolution, but I’ll try to get it for you. He was showing 250 YouTube videos at once in very small windows – nothing was being shown full screen…
I think the iPhone will be the first consumer item with this kind of touch interface. It will be interesting to see what the problems and successes are as people get used to this new way of interacting with displays.
No word on availability from Jeff yet? Last I heard he was setting up shop to get these amazing technologies to market ASAP. I really hope that’s still the case. Sure, the iPhone is cool, but Apple can’t do everything. Apple’s greatest legacy is inspiring others to make intuitive interfaces – an ethos Jeff Han’s work is clearly a child of.