Controlling music with hand gestures alone has been part of electronic music since the 1920s. The theremin, for example, was (and is) an early electronic musical instrument played without any physical contact from the performer. The theme to the original “Star Trek” TV show is probably the best-known example of the theremin sound on screen (though the melody was in fact sung by a soprano imitating the instrument), forever associated with mid-century science-fiction imagery.
Its Russian inventor, Léon Theremin, patented the device in 1928. The instrument's control section consists of two metal antennas that sense the relative position of the player's hands: one hand controls the oscillator frequency (pitch), the other the amplitude (volume). The electric signals from the theremin are amplified and sent to a loudspeaker.
In motion capture sessions, the movements of one or more actors are sampled many times per second. Early techniques used images from multiple cameras to calculate 3D positions; usually, though, the purpose of motion capture is to record only the movements of the actor, not his or her visual appearance. The animation data is then mapped to a 3D model so that the model performs the same actions as the actor.
From the 1920s world of touch-free music performance and the late 20th-century world of motion capture and motion sensing in movies and video games comes iRing, which lets you control your iPhone, iPad and iPod touch music apps and effects without touching your device.
iRing lets hand gestures directly affect effect parameters and other aspects of your music performance: you simply move your hands in front of your device. In theory, though, it could be used to control physical objects, such as puppets, in the same way. You could wave your hand over your iPad and have those gestures control the movement of a motorized marionette a thousand miles away!
So how does it work?
iRing uses patented image-recognition, motion-control and geometric-positioning technology to determine the exact position of the wearable rings. By recognizing and tracking the dot patterns printed on each iRing, it lets you control various app parameters without touching your device. Your device ‘sees’ your rings and turns their position and range into values that control your settings in real time.
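The actual pattern-recognition algorithm is patented and unpublished, but the basic idea of turning camera pixels into a position can be illustrated with a toy sketch. The sketch below (an assumption for illustration, not IK Multimedia's method) simply finds the centroid of bright “marker” pixels in a grayscale frame:

```python
def locate_marker(frame, threshold=200):
    """Toy marker tracker: return the (x, y) centroid of all pixels at or
    above `threshold` in a grayscale frame (a list of rows of 0-255 values),
    or None if no marker pixels are visible.

    The real iRing recognizes printed dot patterns, which also lets it
    estimate distance and rotation; this sketch only recovers 2D position.
    """
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, pixel in enumerate(row):
            if pixel >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None  # marker not in view this frame
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# A 4x4 frame with two bright pixels on row 2 -> centroid between them:
frame = [[0] * 4 for _ in range(4)]
frame[2][1] = 255
frame[2][3] = 255
print(locate_marker(frame))  # → (2.0, 2.0)
```

Running this over every camera frame yields a stream of positions that can then be smoothed and mapped to controls.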
The iRing app uses the front-facing camera on your device and advanced volumetric positioning algorithms to recognize and determine the exact position of each ring relative to the camera. The app converts this precise position reading into music or MIDI control messages. In essence, iRing and the iRing companion apps track your movements and convert them into useful information your apps use to change things.
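The position-to-MIDI step can be sketched as follows. Assuming a tracked position that has already been normalized to the 0.0-1.0 range per axis (the function name and CC assignments are hypothetical), each axis becomes a standard MIDI Control Change message with a 0-127 value:

```python
def position_to_cc(x, y, z, cc_numbers=(1, 2, 3), channel=0):
    """Map a normalized ring position (each axis in 0.0-1.0) to three
    MIDI Control Change messages, one per axis.

    Returns a list of 3-byte messages as (status, cc_number, value) tuples.
    """
    status = 0xB0 | (channel & 0x0F)  # Control Change status byte on this channel
    messages = []
    for axis_value, cc in zip((x, y, z), cc_numbers):
        clamped = min(max(axis_value, 0.0), 1.0)  # guard against tracking overshoot
        value = round(clamped * 127)              # MIDI CC values are 0-127
        messages.append((status, cc, value))
    return messages

# A ring centered horizontally, near the top of frame, close to the camera:
print(position_to_cc(0.5, 0.9, 0.1))
# → [(176, 1, 64), (176, 2, 114), (176, 3, 13)]
```

In a real app these bytes would be handed to Core MIDI (or a virtual MIDI port) so any MIDI-capable app or hardware can respond to them.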
Because movements are recognized along all three axes, two rings let you control up to six parameters simultaneously. On top of movements, certain gestures (such as rotating the ring) are also recognized, greatly expanding your possibilities for musical expression.
The system could be used in a very wide variety of ways: to play air guitars or air harps, to control lights, or to control projected effects like ghosts on water. A performer could wave their hands over an unseen iPad, and every gesture could appear to make a floating genie or a tiny firefly - or a thousand of them in a cloud - swirl this way and that.
The possibilities for real-time, gesture based performance are endless.
Click here to find out more about IK Multimedia’s iRing: