Motion-Sensing MIDI Controllers for Live Performances

Translating studio-centric electronic music production into a captivating live performance is difficult, to say the least. Clicking away on a laptop will never be exciting to watch; keyboard and pad controllers are a little more interesting, but most of the action still takes place outside the audience’s view. To make your show more engaging, try adding a motion-sensing MIDI controller to your live PA setup. These devices capture the motion of your hands in the air and translate it into MIDI or Open Sound Control (OSC) data, which you can then use to play synthesizers and effects in your DAW.

Open-Source

Building your own motion-sensing MIDI controller requires a fair amount of skill, but it’s the least expensive and most customizable option. These devices are typically based on the Arduino platform, run open-source code, and use readily available electronic components. The Therething, for example, uses a pair of inexpensive ultrasonic sensors to track your hands’ position as you move them around the instrument’s casing. The Arduino processor inside the Therething then translates the sensor data into MIDI note data and sends it to your DAW. The Airharp works on a similar principle, but incorporates a set of pushbuttons that lets you play chord progressions by “strumming” in the air.
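
To give you a feel for what that translation step involves, here is a rough sketch in Python (using the pyserial and mido libraries) of the same idea running on your computer rather than on the Arduino itself: read a distance value streamed over USB serial, quantize it to a note in a scale, and send it out as MIDI. The serial port name, sensor range, and scale are illustrative assumptions, not code taken from the Therething or the Airharp.

# Illustrative sketch only: map a hand-distance reading from an ultrasonic
# sensor to a MIDI note. Assumes the Arduino prints one distance (in cm) per
# line over USB serial; requires pyserial and mido (pip install pyserial mido).
import serial
import mido

SCALE = [60, 62, 64, 67, 69, 72, 74, 76]     # a C major pentatonic run
MIN_CM, MAX_CM = 5, 60                        # assumed usable sensor range

port = serial.Serial("/dev/ttyUSB0", 115200)  # adjust to your Arduino's port
midi_out = mido.open_output()                 # default system MIDI output
last_note = None

while True:
    try:
        distance = float(port.readline())
    except ValueError:
        continue                              # skip malformed serial lines
    if not MIN_CM <= distance <= MAX_CM:
        continue                              # hand out of range: play nothing
    # Quantize the distance into one of the scale steps.
    step = int((distance - MIN_CM) / (MAX_CM - MIN_CM) * (len(SCALE) - 1))
    note = SCALE[step]
    if note != last_note:                     # only retrigger on a new note
        if last_note is not None:
            midi_out.send(mido.Message("note_off", note=last_note))
        midi_out.send(mido.Message("note_on", note=note, velocity=100))
        last_note = note

Route the MIDI port the script sends to into your DAW and it behaves like a crude one-axis theremin.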

Hardware Hacks

A less technically demanding option is to use a piece of hardware that already includes motion sensing as a MIDI controller. There’s still some work involved, but it’s all on the software end, so there’s no need to break out the soldering iron. The Synapse application lets you control Ableton Live with an Xbox Kinect controller via a set of Max for Live patches. Once you’ve set up the system, you can play synths, apply effects, and trigger samples by moving your hands and head.

Even a plain-vanilla webcam can translate motion into MIDI data: Peripheral MIDI Controller turns motion and light intensity into MIDI continuous controller commands, while the Moomvi Max for Live device converts the video captured by your webcam into a simple grid of cells, each of which triggers a MIDI note. The GlovePIE script turns your Wiimote video game controller into a MIDI instrument on a Windows system; the Osculator app does the same on a Mac.

If you have an Android device, iPhone, or iPod touch, you can use its built-in accelerometer and gyroscope to control your DAW. Install a smartphone app like TouchOSC (cross-platform), Control (iOS only), or the aforementioned Osculator, then connect your phone to your computer over an ad hoc Wi-Fi network or (ideally) a USB cable. You can then map the app’s parameters to MIDI controls in your DAW, allowing you to, for example, open and close a filter by waving the device back and forth.
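
The smartphone route in particular needs only a thin layer of glue code on the computer side. Here’s a minimal sketch in Python (using the python-osc and mido libraries) that listens for accelerometer messages from an app such as TouchOSC and turns the tilt of one axis into a MIDI continuous controller; the OSC address, listening port, and CC number are assumptions you would adjust to match your own app’s settings and your DAW mapping.

# Illustrative sketch only: turn smartphone accelerometer OSC messages into a
# MIDI continuous controller (CC 74 is often mapped to filter cutoff).
# Requires python-osc and mido (pip install python-osc mido).
from pythonosc.dispatcher import Dispatcher
from pythonosc.osc_server import BlockingOSCUDPServer
import mido

midi_out = mido.open_output()           # default system MIDI output

def on_accel(address, x, y, z):
    # Map tilt on one axis (roughly -1..+1 g) onto the 0-127 CC range.
    value = int(max(0.0, min(1.0, (x + 1.0) / 2.0)) * 127)
    midi_out.send(mido.Message("control_change", control=74, value=value))

dispatcher = Dispatcher()
dispatcher.map("/accxyz", on_accel)     # the address TouchOSC uses for accelerometer data

# Listen on every interface; point the phone at this computer's IP and port.
server = BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher)
server.serve_forever()

Map CC 74 to a filter cutoff in your DAW and tilting the phone sweeps the filter, which is exactly the kind of gesture that reads clearly from the back of a room.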

Purpose-Built

The final option is to use a pre-made, purpose-built motion-sensing MIDI controller. This is the most expensive and least customizable way to go; on the upside, these devices are more or less plug-and-play, so there’s little effort involved in making them work. The Hot Hand controller consists of a ring with a wireless transmitter and a USB dongle with a wireless receiver. Slip the ring on your finger, map the parameters to MIDI commands, and wave your hand to activate the synth or effect. The Machina MIDI Jacket is a wearable controller that incorporates both motion sensors and pushbuttons; it allows you to control a DAW by waving your arm, flexing your fingers, and manipulating the physical controls. It’s scheduled to become available for purchase in November 2013.

James P is the creator of Producer Tools, the ultimate mobile app for music producers. Read more of his tutorials at Quadrophone.com.
