Martin Stelter — Articulating Gestures

Remote Control — Winter 2015

Wider interaction space
Common smartphones have a GPS unit for location tracking, orientation and acceleration sensors for motion detection, a light sensor for measuring ambient brightness, a magnetic field sensor for finding north, a proximity sensor for detecting whether the display is covered, and cameras for shooting video or taking photos. By combining these features, a much wider range of interactions can be created:

– Grab the phone to give commands, with no unlocking needed. (acceleration, gyroscope and proximity sensors; a sketch of this gesture follows below)
– Use the phone as a pointer to aim the light at a specific place. (acceleration and magnetic field sensors)
– Perform a specific movement to trigger a certain light. (acceleration sensor)
– Change your facial expression to change the light mood. (camera)
– Hold the phone against your chest or cover it to dim the light. (proximity sensor)
– Blow onto the phone to switch off the light. (microphone)
– ...
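
As an illustration of the first gesture above, here is a minimal Kotlin sketch of how grab detection might work on Android: it listens to the linear acceleration and proximity sensors and fires a callback when a short acceleration spike occurs while the display is uncovered. The GrabDetector class, the onGrab callback and the 3 m/s² threshold are assumptions made for this sketch, not the project's actual code.

```kotlin
import android.content.Context
import android.hardware.Sensor
import android.hardware.SensorEvent
import android.hardware.SensorEventListener
import android.hardware.SensorManager
import kotlin.math.sqrt

// Fires onGrab once when the phone is picked up: an acceleration spike
// while the proximity sensor reports the display as uncovered.
class GrabDetector(context: Context, private val onGrab: () -> Unit) : SensorEventListener {

    private val sensorManager =
        context.getSystemService(Context.SENSOR_SERVICE) as SensorManager
    private var covered = false
    private var lastGrab = 0L

    fun start() {
        sensorManager.getDefaultSensor(Sensor.TYPE_LINEAR_ACCELERATION)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_GAME)
        }
        sensorManager.getDefaultSensor(Sensor.TYPE_PROXIMITY)?.let {
            sensorManager.registerListener(this, it, SensorManager.SENSOR_DELAY_NORMAL)
        }
    }

    fun stop() = sensorManager.unregisterListener(this)

    override fun onSensorChanged(event: SensorEvent) {
        when (event.sensor.type) {
            // Most proximity sensors report ~0 cm when the display is covered.
            Sensor.TYPE_PROXIMITY ->
                covered = event.values[0] < event.sensor.maximumRange
            Sensor.TYPE_LINEAR_ACCELERATION -> {
                val (x, y, z) = event.values
                val spike = sqrt(x * x + y * y + z * z)
                val now = System.currentTimeMillis()
                // Threshold and debounce window are rough guesses; tune on a device.
                if (!covered && spike > 3f && now - lastGrab > 1000) {
                    lastGrab = now
                    onGrab()
                }
            }
        }
    }

    override fun onAccuracyChanged(sensor: Sensor, accuracy: Int) = Unit
}
```

In practice the raw spike detection would need per-device tuning; the debounce window only keeps a single grab from firing the callback repeatedly.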

Controlling a moving head
To create a readable and easy-to-set-up scenario, I decided to work with a moving head stage light. The device offers full control over the position, intensity and spread of a light beam, parameters I could map to gestures. I examined many gestures alone, while others were tested with fellow students. At the end of the five-week study, an interactive performance was set up where visitors could grab the phone and play with the light beam. There were no instructions, only the direct feedback of the moving head and a simple interface on the phone. Many users intuitively discovered the gestures and understood how to control the light.
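
To make the gesture-to-light mapping concrete: moving heads are typically driven over DMX512, where pan, tilt and dimmer each occupy one 8-bit channel. The sketch below shows one plausible way to turn the phone's orientation (azimuth and pitch, e.g. from a rotation vector sensor) into those channel values. The 540°/270° movement ranges and the DmxFrame/orientationToDmx names are assumptions for this sketch, since fixture profiles vary between devices.

```kotlin
// Hypothetical mapping from phone orientation to 8-bit DMX channel values
// for a typical moving head. Ranges and channel meanings vary per fixture.

data class DmxFrame(val pan: Int, val tilt: Int, val dimmer: Int)

const val PAN_RANGE_DEG = 540.0   // assumed pan range of the fixture
const val TILT_RANGE_DEG = 270.0  // assumed tilt range of the fixture

fun orientationToDmx(azimuthDeg: Double, pitchDeg: Double, brightness: Double): DmxFrame {
    // Normalize each angle into the fixture's range, then scale to 0..255.
    val pan = (((azimuthDeg % 360.0 + 360.0) % 360.0) / PAN_RANGE_DEG * 255.0)
        .toInt().coerceIn(0, 255)
    val tilt = ((pitchDeg + TILT_RANGE_DEG / 2) / TILT_RANGE_DEG * 255.0)
        .toInt().coerceIn(0, 255)
    val dimmer = (brightness * 255.0).toInt().coerceIn(0, 255)
    return DmxFrame(pan, tilt, dimmer)
}

fun main() {
    // Pointing roughly east and slightly upward at full brightness.
    val frame = orientationToDmx(azimuthDeg = 90.0, pitchDeg = 20.0, brightness = 1.0)
    println(frame)
}
```

The resulting bytes would then be written to the fixture's channels, for instance via a USB-to-DMX interface or an Art-Net node; the transport used in the project is not stated here.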