Expressive Inputs for Live Music Visualization

Luz is getting some seriously cool input options!

Background Theory

Luz is based on the premise that any movement (or more generally, any visual change) can be quantified in the range of 0% to 100%.  Think about your arm, and call it 0% when your hand is at your shoulder and 100% when fully extended.


(Diagram: Luz object relationship)

If you had enough of these 0-100% variables for all your joints, you could describe every position your body can be in.  Then you could animate those variables over time, maybe to the beat, and you’ve got dancing!  (This might explain my dancing…)
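
To make that concrete, here's a tiny sketch (in Python, purely for illustration — the variable name and pulse shape are made up, not how Luz defines them) of a 0.0–1.0 variable animated to the beat:

```python
import math

def bouncy(beat_time):
    """A hypothetical 0.0-1.0 'bouncy' variable that pulses once per beat.

    beat_time is measured in beats (1.0 = one full beat elapsed).
    """
    # A cosine pulse: peaks at 1.0 exactly on the beat,
    # easing down to 0.0 halfway between beats.
    return 0.5 + 0.5 * math.cos(2 * math.pi * beat_time)

print(bouncy(0.0))  # on the beat: 1.0
print(bouncy(0.5))  # halfway between beats: ~0.0
```

Drive a visualization parameter (size, brightness, rotation) from a variable like this and it moves in time with the music.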

Think of your variables as a vocabulary for describing the playing music.  Name them whatever you like: energy, happy, bouncy, scratchy.

As the VJ, you set up your visualizations to respond to these variables in appropriate ways.  (More on this in a future post.)

End result: visualizations that respond to the intent, feeling, and direction of the music, as interpreted by you.


It’s important to find good hardware to serve as inputs.  I’ve found that the MIDI controller I recently bought is far, far more expressive and fun than computer keyboards and mice, but it has some creative and technical drawbacks:

  1. You can’t jump between values; you can only slide or rotate your way there.  Sometimes the music just demands faster changes!
  2. MIDI’s limit of 128 distinct integer values for each slider/knob results in visible steps when the total range of movement is big.  To help counter this, I added adjustable smoothing to variables, which helps tremendously, but of course smoothing adds a slight delay.
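
The adjustable smoothing can be sketched as a simple exponential glide toward the latest value (an assumption on my part — Luz’s actual smoothing code may differ):

```python
class SmoothedVariable:
    """Smooths steppy 0.0-1.0 input (e.g. 7-bit MIDI values) over time.

    'amount' trades smoothness against delay: 0.0 means no smoothing,
    values near 1.0 mean very smooth but laggy.
    """
    def __init__(self, amount=0.8):
        self.amount = amount
        self.value = 0.0

    def update(self, target):
        # Each frame, move a fraction of the remaining distance to the target.
        self.value += (1.0 - self.amount) * (target - self.value)
        return self.value

# A MIDI knob jumps from step 63/127 to 64/127; the smoothed value
# glides across the gap instead of visibly stepping.
v = SmoothedVariable(amount=0.8)
for _ in range(10):
    v.update(64 / 127.0)
```

The delay mentioned above is visible right in the math: the value only ever covers a fraction of the remaining distance per frame, so it always trails the knob slightly.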

Beyond Sliders and Knobs

Sliders and knobs are great for setting and fading between values, but for reason #1 above, I find that they aren’t ideal for following melodic aspects of music.

I really just want to move my body, in whatever way feels natural to the music, and have that translate to meaningful input in Luz.  The obvious choice these days is using a Wiimote, and that is definitely going to happen, but I don’t own one yet.

I thought about using a cheap Wacom pad and just reading the absolute X, Y, and Pressure as the pen dances around its surface to the music.  That also will happen, but I don’t own a Wacom pad yet.

Then I realized: I already have a pressure sensitive X/Y pad!  And so do most people (and all portable VJs): the laptop’s touchpad!  One coffee shop hacking session later, and we’ve got…

Your Touchpad as Expressive Input

Luz now takes absolute X, Y, and pressure data directly from the touchpad driver, performs a little automatic calibration and a few transformations, and produces values from 0.0 to 1.0.
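
That automatic calibration can be sketched as tracking the extremes seen so far and rescaling into 0.0–1.0 (this is my guess at the approach — the real Luz code may calibrate differently):

```python
class AutoCalibratedAxis:
    """Rescales raw hardware readings (touchpad X, Y, or pressure)
    into 0.0-1.0 by tracking the min and max values seen so far."""
    def __init__(self):
        self.low = None
        self.high = None

    def update(self, raw):
        # Widen the known range whenever a new extreme arrives.
        if self.low is None or raw < self.low:
            self.low = raw
        if self.high is None or raw > self.high:
            self.high = raw
        if self.high == self.low:
            return 0.0  # no range established yet
        return (raw - self.low) / (self.high - self.low)

x_axis = AutoCalibratedAxis()
for raw in (1200, 5400, 3300):  # made-up raw touchpad coordinates
    value = x_axis.update(raw)
# 3300 falls halfway between the extremes seen so far, so value is 0.5
```

The nice property is that no per-device configuration is needed — one sweep across the pad and the axis is calibrated.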

These values are also broadcast over UDP in the OpenSoundControl (OSC) format, which opens up some great possibilities when two or more laptops are present: using a touchpad with each hand, or having a friend play the second one while watching the show.  You could even run Luz on multiple laptops with multiple projectors and have both instances respond to input from both touchpads.
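
For the curious, here’s roughly what such a broadcast looks like on the wire, using only the Python standard library.  The address `/luz/variable/energy` and port 8000 are made up for illustration — they’re not Luz’s actual OSC namespace:

```python
import socket
import struct

def osc_message(address, value):
    """Build a minimal OSC message carrying a single float argument."""
    def padded(s):
        # OSC strings are null-terminated and padded to a multiple of 4 bytes.
        b = s.encode("ascii") + b"\x00"
        return b + b"\x00" * (-len(b) % 4)
    # Address, then the type tag string ",f" (one float), then the
    # argument as a big-endian 32-bit float.
    return padded(address) + padded(",f") + struct.pack(">f", value)

# Send a 0.0-1.0 variable over UDP.  For a multi-laptop setup you would
# target the LAN broadcast address instead of localhost.
msg = osc_message("/luz/variable/energy", 0.75)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(msg, ("127.0.0.1", 8000))
```

Because it’s plain UDP, any OSC-speaking software on the network can listen in — that’s what makes the two-laptop tricks possible.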

Sliiiiiide, tap, tap, looooop

So this works fantastically, and it’s super fun interpreting music as X/Y movement and having that turn into visualizations.

Unfortunately, I’ve learned that there are two types of touchpad hardware out there: one made by Synaptics and one by ALPS.  The ALPS touchpad, it turns out, has “smart” firmware that delays taps by maybe 200ms.  And I have an ALPS touchpad.  Frustrating.  So if you want to use a laptop this way, avoid ones with ALPS touchpads.  (You can see which you have by reading /proc/bus/input/devices.)  At least movement is reported instantly, so that still works great.
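
If you’d rather check programmatically, device names in that file appear on `N: Name="…"` lines, so a few lines of (illustrative) Python can pick out the touchpad:

```python
def find_touchpads(devices_text):
    """Return device names from /proc/bus/input/devices text
    that look like Synaptics or ALPS touchpads."""
    names = []
    for line in devices_text.splitlines():
        # Name lines look like: N: Name="SynPS/2 Synaptics TouchPad"
        if line.startswith("N: Name="):
            name = line.split("=", 1)[1].strip('"')
            if "synaptics" in name.lower() or "alps" in name.lower():
                names.append(name)
    return names

# On a real Linux system:
# with open("/proc/bus/input/devices") as f:
#     print(find_touchpads(f.read()))
```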

And Joysticks too!

While I was at it, I added support for joysticks via SDL.  Joysticks are fun because they offer a LOT of inputs, both axes and buttons.
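
SDL reports each axis as a signed 16-bit value from -32768 to 32767, so mapping a joystick axis into Luz’s 0.0–1.0 range is a one-liner (the constants are SDL’s; the function name is mine):

```python
def axis_to_variable(raw):
    """Map an SDL joystick axis reading (-32768..32767) into 0.0-1.0."""
    return (raw + 32768) / 65535.0

# Full left/down maps to 0.0, full right/up to 1.0,
# and a centered stick sits at roughly 0.5.
print(axis_to_variable(-32768))  # 0.0
print(axis_to_variable(32767))   # 1.0
```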

Joysticks are also ideal for art installations and music venues where cheap joysticks, appropriately modded and perhaps absurdly over-sized, would work great for live audience participation in the visualizations.

Back to Playing

All of these input options mean you can kick back on the couch, throw the visuals up on the projector, hand out controllers to your friends, and jam.  Visually speaking.

As I play with the MIDI controller, touchpad, and a joystick with analog pads, to music, it occurs to me that both the manual dexterity and the creative interpretation of the music are skills.  They’re something you can get good at.  I like the idea of performing in Luz being a skill.


One Response to Expressive Inputs for Live Music Visualization

  1. Stefan Kost says:

    I work on something similar as part of buzztard. btic (buzztard input controllers) is a lib that handles ALSA MIDI controllers and HID devices. Maybe we can synergize the work. I was also thinking about the touchpad, but found no way to tell X to release the touchpad when I have a USB mouse. Have you found a way?

