Apple Multi-Touch Display?
One video has received a lot of attention: a demonstration by Jeff Han of his research into Multi-Touch displays. The impressive demo shows a user manipulating on-screen objects with multiple fingers and both hands. A closer look at the technology behind the demonstration reveals a very intricate setup.
The system shown uses a technique called Frustrated Total Internal Reflection (FTIR) to accomplish the tasks demonstrated. It involves a transparent screen with the images rear-projected onto it; for touch sensing, a camera placed behind the screen detects the user's interaction. A diagram of the setup is shown here.
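On the software side, the camera in an FTIR setup sees each fingertip as a bright blob where the finger frustrates the internal reflection. The sketch below (a hypothetical simplification, not Han's actual code) shows how touch points could be recovered from a grayscale frame by thresholding and grouping connected bright pixels; the `find_touches` name and the toy 6x6 frame are made up for illustration.

```python
# Hypothetical sketch of FTIR touch detection: fingertips appear to the
# rear camera as bright blobs, so we threshold the frame and take the
# centroid of each connected bright region as a touch point.

def find_touches(frame, threshold=128):
    """Return one (row, col) centroid per connected bright blob."""
    rows, cols = len(frame), len(frame[0])
    seen = [[False] * cols for _ in range(rows)]
    touches = []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] >= threshold and not seen[r][c]:
                # Flood-fill this blob, collecting its pixels.
                stack, blob = [(r, c)], []
                seen[r][c] = True
                while stack:
                    y, x = stack.pop()
                    blob.append((y, x))
                    for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                        ny, nx = y + dy, x + dx
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] >= threshold
                                and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                # The blob's centroid approximates the touch point.
                touches.append((sum(y for y, _ in blob) / len(blob),
                                sum(x for _, x in blob) / len(blob)))
    return touches

# Toy "camera frame" with two fingertips pressed against the screen:
frame = [
    [0,   0,   0,   0,   0,   0],
    [0, 200, 220,   0,   0,   0],
    [0, 210, 230,   0,   0,   0],
    [0,   0,   0,   0, 190,   0],
    [0,   0,   0, 180, 200,   0],
    [0,   0,   0,   0,   0,   0],
]
print(find_touches(frame))  # two touch points detected
```

Tracking those centroids from frame to frame is what turns raw blobs into the multi-finger gestures seen in the video.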
The use of both a projector and rear-camera, of course, is not feasible in any potential laptop or tablet device from Apple... but conveniently enough, Apple has recently applied for a patent on an Integrated Sensing Display (diagram) providing a screen that can be used both as a display and as a camera:
The integrated sensing display includes both display elements and image sensing elements. As a result, the integrated sensing device can not only output images but also input images.
At the time, many users dismissed the need for such an elaborate display, since the built-in iSight already offers a reasonable solution for video-conferencing needs. Of course, this "integrated sensing" display would likely not use the FTIR method described above, but it could offer similar multi-touch functionality.
Apple has also demonstrated a need for a multi-touch display in their recent Gesture patent application, specifically in this image.
Update: Hrmph previously noted a European patent application from Apple for a "Multi-point touchscreen" -- allowing up to 15 simultaneous presses.