The main goal of this project was to create an interactive sound experience by making an instrument/tool that the user could play with sound in a simple and fun way.
Any sample (sound file) can be loaded and assigned to a symbol (a "sound line") and activated or deactivated from a palette placed at the top of the screen. The representation of the sound is simple but gives clear visual feedback corresponding to your treatment of each sample: as you stretch and rotate the lines, the sounds change too, processed by different types of filters and digital audio effects.
The interface runs on a multi-touch table based on technology initially developed by Jefferson Y. Han at New York University. Building the touch table occupied most of our time, as it is an extensive project in itself; once the table worked, we also made a simple drawing tool.
The Soundlines Interface
Any sample (sound file) can be loaded and assigned to a symbol of two circles connected by a line. Each sample can be activated and deactivated from a palette placed at the top of the screen. The representation of the sound is simple but gives clear visual feedback corresponding to your manipulation of each sample. You stretch and move the objects by dragging the circles. The symbols can snap together, so the user can play with several samples at once by moving only one point. This makes the interface more unpredictable and playful, as you manipulate many samples at the same time.
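The geometry behind a soundline can be sketched in a few lines. The helper below is illustrative (the name `soundline_params` and the degree convention are assumptions, not the project's actual code): given the two circle endpoints, it derives the length and angle that the text later describes as control parameters for the audio effects.

```python
import math

def soundline_params(ax, ay, bx, by):
    """Length and angle of a soundline, given its two endpoint circles.

    Hypothetical helper: in the real project these values are sent to
    Max/MSP as effect-control parameters; the mapping is not shown here.
    """
    dx, dy = bx - ax, by - ay
    length = math.hypot(dx, dy)                # stretch amount
    angle = math.degrees(math.atan2(dy, dx))   # rotation, -180..180 degrees
    return length, angle

# Dragging one circle from (3, 4) outward both stretches and rotates
# the line, so both sound parameters change at once:
print(soundline_params(0, 0, 3, 4))  # ≈ (5.0, 53.13)
```

Because snapped symbols share a circle, moving that one point changes the length and angle of every attached line simultaneously, which is what makes the interface feel unpredictable.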
How it Works
The project is based on multi-touch table technology using FTIR (frustrated total internal reflection), initially developed by Jefferson Y. Han at New York University. Compared to traditional touch screens, the advantage of this solution is that it accepts multiple simultaneous inputs. This allows bi-manual interaction and lets more than one person control different parameters at the same time. The tactility of using a finger on a screen is more engaging than using traditional computer devices, and using several fingers is both faster and more fun. Making a touch table is a project in itself, and it occupied a lot of our time.
Technical issues we had to manage:
- finding the right materials
- getting a camera to see IR light
- power supply and electronics setup
- getting enough IR light
- coating
- video analysis and tracking
- communication between different computers and software
In addition, we had to build a working prototype and find a suitable visual interface design. From several design concepts, we chose the one that best exploited the possibility of multiple inputs.
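The video-analysis step above amounts to finding bright spots in the IR camera image. The sketch below is a minimal stand-in for the Jitter-based tracking (the function name and threshold are assumptions): it thresholds a grayscale frame, groups bright pixels into 4-connected blobs, and returns each blob's centroid as a finger position.

```python
def finger_centroids(frame, threshold=128):
    """Find bright blobs (finger touch points) in a grayscale IR frame.

    `frame` is a list of rows of pixel intensities (0-255). Returns a
    list of (x, y) centroids, one per connected bright region.
    """
    h, w = len(frame), len(frame[0])
    seen = [[False] * w for _ in range(h)]
    centroids = []
    for y in range(h):
        for x in range(w):
            if frame[y][x] >= threshold and not seen[y][x]:
                # flood-fill one blob, collecting its pixels
                stack, pixels = [(y, x)], []
                seen[y][x] = True
                while stack:
                    cy, cx = stack.pop()
                    pixels.append((cy, cx))
                    for ny, nx in ((cy-1, cx), (cy+1, cx), (cy, cx-1), (cy, cx+1)):
                        if (0 <= ny < h and 0 <= nx < w
                                and frame[ny][nx] >= threshold and not seen[ny][nx]):
                            seen[ny][nx] = True
                            stack.append((ny, nx))
                cy_avg = sum(p[0] for p in pixels) / len(pixels)
                cx_avg = sum(p[1] for p in pixels) / len(pixels)
                centroids.append((cx_avg, cy_avg))
    return centroids
```

A production tracker would also smooth the positions and match blobs between frames so each finger keeps a stable identity, which this sketch omits.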
FTIR
The acrylic sheet is edge-lit by high-power infrared LEDs placed directly against the polished edges. The total internal reflection keeps the light trapped within the sheet, except at points where it is frustrated by a finger, causing light to scatter out through the sheet towards an IR camera placed underneath the screen.
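The trapping condition can be made concrete with Snell's law: total internal reflection occurs for rays hitting the acrylic-air boundary at more than the critical angle, arcsin(n_air / n_acrylic). Using a typical refractive index of about 1.49 for acrylic (an assumed, textbook value, not a measurement from the project):

```python
import math

# Critical angle for total internal reflection at an acrylic-air boundary.
# Assumed refractive indices: acrylic ~1.49, air ~1.00 (typical values).
n_acrylic, n_air = 1.49, 1.00
theta_c = math.degrees(math.asin(n_air / n_acrylic))
print(f"critical angle ≈ {theta_c:.1f} degrees")  # ≈ 42.2
```

Edge-lit rays steeper than this stay trapped in the sheet; a finger pressed against the surface changes the boundary condition locally, frustrating the reflection so light escapes at exactly the touch points.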
Video Tracking, Communication, and Sound
The IR light scattered out by the fingers is tracked using Max/MSP/Jitter. The coordinates of the fingers are communicated to Processing using the Open Sound Control protocol (OSC). The graphics are created in Processing and projected back onto a projection sheet. The different soundline objects' lengths and angles are continuously sent back to Max/MSP/Jitter and used as control parameters for manipulating each sample.
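The OSC wire format used for this round trip is simple: a null-terminated address string padded to a 4-byte boundary, a type-tag string beginning with a comma, then big-endian 32-bit float arguments. A minimal encoder (the `/finger/1` address is purely illustrative; the project's actual OSC namespace is not documented here):

```python
import struct

def osc_message(address, *floats):
    """Pack a minimal OSC message with float32 arguments.

    Shows the framing used between Max/MSP/Jitter and Processing:
    padded strings, then big-endian floats.
    """
    def pad(b):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes
        return b + b"\x00" * (4 - len(b) % 4)

    msg = pad(address.encode("ascii"))
    msg += pad(("," + "f" * len(floats)).encode("ascii"))  # type tags, e.g. ",ff"
    for f in floats:
        msg += struct.pack(">f", f)  # big-endian float32
    return msg

# One finger's normalized (x, y) coordinates as a 24-byte packet:
packet = osc_message("/finger/1", 0.25, 0.75)
```

In practice both ends would use ready-made OSC support (such as an OSC library for Processing and Max's UDP objects) rather than hand-packing bytes; the sketch only shows what travels over the network.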
Other Interfaces
Now that the physical interface is built and working, it is relatively easy to make new software concepts. We have created a drawing tool in addition to the “Soundlines” application.
Future Work
We want to develop new software applications and interface solutions for the multi-touch table, and we may use the table in work on our respective master's theses.
Presentation / Demos
- Ung kunstscene, The National Museum, Oslo, Aug. 24-Sept. 3, 2006
- NoMuTe, Trondheim Oct. 10-12, 2006
- NordiCHI 2006, Aesthetic Artefacts, Oct. 14-18 2006