A Touch Less Remote: Part 4 of 6
The BBC R&D Prototyping team has been investigating how multi-touch software could support television viewing in the future. This article, written by Fritz Solares, looks at the hardware and software behind the prototypes.
The multi-touch table we used was built for us by the BBC R&D workshops, as nothing commercially available did exactly what we needed. Most devices described as multi-touch only support three or four simultaneous touches, and we wanted several people using the table at the same time. On the flip side, most devices that are truly multi-touch are extremely expensive and use closed technology that doesn't lend itself to rapid prototyping.
There are several ways to handcraft multi-touch tables, all well documented on the web. Ours uses the 'frustrated total internal reflection' (FTIR) technique. FTIR devices are cheap because they can be built from consumer items (a webcam, a projector and an out-of-the-box PC), and in theory they support as many touches as the number of fingers you can fit on the screen. One downside is that the devices are bulky because of the distance needed between the screen and the projector.
For the screen, our device uses a sheet of tough transparent plastic covered in a layer of silicone and then a layer of thin translucent sticky-backed film. Strips of infrared LEDs surround the plastic, shining light across the sheet, where it is reflected internally until the surface is touched; the touch breaks the reflection and scatters the light down out of the sheet. This is much the same effect as when you can see your fingerprints, but not much else, through a full glass of water.
A webcam with an infrared filter is mounted below the plastic to pick up the scattered light, and this is where the software takes over to make sense of the image it receives. We used the CCV application, which takes the video stream from the camera and outputs touch data as a series of coordinates using the TUIO specification, an open framework that defines a common protocol and API for tangible multi-touch surfaces. Among the benefits of CCV: it is open source, cross-platform, quite robust and easy to use. It can be used with many different touch lighting techniques, including FTIR.
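To give a feel for what that touch data looks like on the wire: TUIO messages are encoded using the Open Sound Control (OSC) binary format, typically sent over UDP, and each cursor update arrives as a "set" message carrying a session id and normalised coordinates. The sketch below (Python rather than the AS3 we used, purely for illustration) parses a minimal OSC message of that shape; it handles only the int, float and string argument types that the TUIO cursor profile uses.

```python
import struct

def _read_padded_string(data, offset):
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    end = data.index(b"\x00", offset)
    s = data[offset:end].decode("ascii")
    return s, (end + 4) & ~3  # skip the null(s) and padding

def parse_osc_message(data):
    """Parse one binary OSC message into (address, argument list)."""
    address, offset = _read_padded_string(data, 0)
    tags, offset = _read_padded_string(data, offset)
    args = []
    for tag in tags[1:]:  # the type-tag string starts with ','
        if tag == "i":    # 32-bit big-endian int
            args.append(struct.unpack_from(">i", data, offset)[0]); offset += 4
        elif tag == "f":  # 32-bit big-endian float
            args.append(struct.unpack_from(">f", data, offset)[0]); offset += 4
        elif tag == "s":  # padded string
            s, offset = _read_padded_string(data, offset)
            args.append(s)
    return address, args

def _pad(b):
    return b + b"\x00" * (4 - len(b) % 4)

# Example: a TUIO cursor "set" message for one touch (session id 7)
# at position (0.5, 0.25), with zero velocity and acceleration.
sample = (_pad(b"/tuio/2Dcur") + _pad(b",sifffff") + _pad(b"set")
          + struct.pack(">i", 7)
          + struct.pack(">5f", 0.5, 0.25, 0.0, 0.0, 0.0))
address, args = parse_osc_message(sample)
```

In practice a library such as Touchlib does this decoding for you; the point here is just that each finger on the table becomes a stream of small, easily-parsed coordinate messages.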
The prototypes themselves were developed in Flash using the ActionScript 3.0 (AS3) language. We have a lot of Flash experience in the team and it's really quick to import new graphics and fonts which makes it good for rapid prototyping.
We used the Touchlib library, which listens to the touch data broadcast by CCV and turns it into standard AS3 events that can be used within new or existing applications. Again, Touchlib has the advantage of being open source, and there are quite a few examples available, which makes it easy to get up and running. Unfortunately there isn't much documentation, and there is little built-in support for higher-level interactions such as gestures, which means these need to be coded from scratch. We ended up building our own very simple multi-touch application framework as a basis for developing our prototypes.
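The core of such a framework is small: each finger produces down, move and up events tagged with a stable touch id, and a gesture like "drag" is just a matter of following one id from down to up and accumulating its movement. The sketch below shows that idea in Python (our actual framework was in AS3, and the names here are invented for illustration, not Touchlib's API):

```python
from dataclasses import dataclass

# Touch lifecycle, analogous to the AS3 events Touchlib raises.
TOUCH_DOWN, TOUCH_MOVE, TOUCH_UP = "down", "move", "up"

@dataclass
class TouchEvent:
    kind: str       # TOUCH_DOWN, TOUCH_MOVE or TOUCH_UP
    touch_id: int   # session id, stable for the life of one finger
    x: float        # normalised 0..1 screen coordinates
    y: float

class Draggable:
    """An on-screen object that follows a single dragging finger."""

    def __init__(self, x=0.0, y=0.0):
        self.x, self.y = x, y
        self._touch = None  # (touch_id, last_x, last_y) while dragging

    def on_touch(self, ev):
        if ev.kind == TOUCH_DOWN and self._touch is None:
            # Claim the first finger that lands on us; ignore extras.
            self._touch = (ev.touch_id, ev.x, ev.y)
        elif self._touch is not None and ev.touch_id == self._touch[0]:
            _, last_x, last_y = self._touch
            if ev.kind == TOUCH_MOVE:
                # Move by the finger's displacement since the last event.
                self.x += ev.x - last_x
                self.y += ev.y - last_y
                self._touch = (ev.touch_id, ev.x, ev.y)
            elif ev.kind == TOUCH_UP:
                self._touch = None  # finger lifted; drag ends

# One finger presses, drags, and lifts; the box follows it.
box = Draggable(0.1, 0.1)
box.on_touch(TouchEvent(TOUCH_DOWN, 7, 0.5, 0.5))
box.on_touch(TouchEvent(TOUCH_MOVE, 7, 0.6, 0.55))
box.on_touch(TouchEvent(TOUCH_UP, 7, 0.6, 0.55))
```

Tracking by touch id is what makes this multi-touch safe: a second finger landing elsewhere gets a different id and is simply ignored by this object, so several people can drag different objects at once.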
From a technical point of view the project ran quite smoothly, although we had a couple of specific design constraints to overcome. Because of the silicone layer, the table surface was not as responsive as we would have liked, so as a work-around we built applications around only the simplest gestures, like touch and drag. Our limited time gave us another good reason to stick to simpler interactions. For any future prototypes we would hope to change the surface to something softer and more flexible.
In the next blog post we'll be looking at some of the design challenges in developing these prototypes in more detail.
Touchlib library https://nuigroup.com/touchlib