Introducing the Luz Body Tracker

April 25, 2011

Luz is an open-source live motion graphics editor and DMX controller.

I just checked in a major new feature, the Luz Body Tracker:

In this test, the hands control the heights of the bars.  The ‘praise jebus’ variable goes to 100% when both* hands move above the shoulders, and it in turn drives the amount of the Pixel Storm effect on the scene renderer.

*This effect is accomplished by setting the variable’s ‘Combine Method’ to ‘Multiply’, as seen in the screenshot. (56% × 73% ≈ 41%)
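The footnote’s arithmetic is just a product of activations. Here’s a minimal sketch of what a ‘Multiply’ combine method does (not Luz’s actual code, just an illustration of the math):

```python
def combine_multiply(values):
    """Combine several 0.0-1.0 activations by multiplying them,
    so the result is high only when every input is high."""
    result = 1.0
    for v in values:
        result *= v
    return result

# Left hand at 56%, right hand at 73% -- both above the shoulders:
print(round(combine_multiply([0.56, 0.73]) * 100))  # -> 41
```

Because the product collapses toward zero as soon as either input drops, the combined variable only fires strongly when both hands are up.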

With the Luz Body Tracker, any effect in Luz, including DMX lighting, can easily be controlled directly by human body movement.

Want to control a theater’s lighting entirely from body movement? Luz can do that.

Luz Body Tracker was graciously tested by the kids at the after school play space LightTroupe visits. (And apparently the Kinect can track kids on swings. Whodathunkit?)

Luz Body Tracker

The Body Tracker sends values like “Human 01 / Left Hand / X”.

It can track multiple people simultaneously, and you can limit the maximum number of people (to whatever number you choose to support in your Luz project).  It’s all unattended and automatic. The Luz Body Tracker is intended to facilitate museum installations.

The raw x,y,z data for each spot on the body (shown as circles above) is converted to the 0.0-1.0 range and sent to Luz via OpenSoundControl.  Elbow, knee, and hip bend angles (0.0-1.0) are also calculated and sent.
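The wire format isn’t spelled out beyond “normalized values over OpenSoundControl,” so here is a from-scratch sketch of packing a single 0.0–1.0 value into an OSC message. The address and port below are invented placeholders; Luz’s real value names look like “Human 01 / Left Hand / X” but their exact OSC encoding isn’t shown.

```python
import socket
import struct

def osc_message(address, value):
    """Pack a one-float OpenSoundControl message: the address string,
    the ',f' type-tag string, then a big-endian float32.  OSC strings
    are null-terminated and padded to a 4-byte boundary."""
    def pad(b):
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode()) + pad(b",f") + struct.pack(">f", value)

# Placeholder address and port -- not Luz's documented values.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/human_01/left_hand/x", 0.75), ("127.0.0.1", 7123))
```

Each joint coordinate, already normalized to 0.0–1.0, would go out as one such message per frame.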

As in the Input Manager application, where each axis of each device is auto-calibrated, here each body part is individually auto-calibrated.  This means that someone whose elbow can’t physically rise above their shoulder will still be able to send 100% activation for “Elbow / Y”, and that shorter people can still reach a full “Head / Y” value.

Further, elbows and hands are calibrated relative to the shoulders, and knees and feet are calibrated relative to the hips.  In other words, your expressive limbs’ calibrations are in body-space, not stage-space.  It feels great: if you stick your right arm out as far as it’ll go, “Right Hand / X” will always hit 1.0.
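One way to implement that kind of running auto-calibration (a sketch, not Luz’s actual code): track the minimum and maximum ever observed for each signal, and map new readings into that range.

```python
class AutoCalibrator:
    """Track the observed min/max of a signal and map each new reading
    into 0.0-1.0, so every body part can reach the full range."""
    def __init__(self):
        self.lo = None
        self.hi = None

    def calibrate(self, raw):
        if self.lo is None:
            self.lo, self.hi = raw, raw
        self.lo = min(self.lo, raw)
        self.hi = max(self.hi, raw)
        if self.hi == self.lo:
            return 0.0
        return (raw - self.lo) / (self.hi - self.lo)

# Body-space: measure hand X relative to the shoulder, so the
# calibration travels with the performer instead of the stage.
hand = AutoCalibrator()
for shoulder_x, hand_x in [(0.0, 0.1), (0.0, 0.9), (0.1, 0.5)]:
    hand.calibrate(hand_x - shoulder_x)
print(hand.calibrate(0.9 - 0.0))  # full reach again -> 1.0
```

Feeding in the shoulder-relative (or hip-relative) coordinate rather than the raw stage coordinate is what makes a full reach always hit 1.0, regardless of where the performer stands.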

Luz Body Tracker uses OpenNI+NITE for skeleton tracking. Because these aren’t yet provided as Debian/Ubuntu packages, the Luz Body Tracker is a little harder to compile than the rest of the Luz Studio suite.

Big thanks to Ether Snow who began the Luz Body Tracker app, got it fully working, and then handed the code and a Kinect over to me for integration into the Luz world.

Luz has enticed many non-traditional users, including artists, women, and kids, to install Ubuntu. It has the potential to bring thousands of non-technical folks to freedom and fun.

Please support the Luz Project. All contributions go directly to feature development, the creation of tutorials and documentation, and the drive towards a 1.0 release.


Luz – Introducing Shader Snippets

April 19, 2011

Luz is a live interactive motion-graphics editor and performer:

Luz is free and open source software.

Recent Advances

Luz recently got DMX support, allowing you to control venue lighting from gamepads, joysticks, Wiimotes, Wacom Tablets, MIDI devices, Kinects, live audio analysis, the beat of the music, or any app that sends OpenSoundControl.

Using Timelines, you can also schedule precise live visuals+DMX performances:

And now, a major advance in the graphics quality and diversity possible in Luz.

Jump down to the video, or read on for information about OpenGL Shaders and the limitation that was recently overcome.

OpenGL Shaders and the One-Shader Limitation

Shader programs replace much of the graphics pipeline on the graphics card, bending vertices and warping pixels as they pass through. Shaders make possible the beauty and realism of modern games.

Unfortunately, only one shader program can be active while drawing each object. This doesn’t mesh well with the Luz model, where an actor can have any number of effects operating on it at once.

(The same single-shader limitation is present in Quartz Composer and in Microsoft Silverlight.)

Introducing Shader Snippets

Luz’s Shader Snippet technology overcomes this limitation by creating a shader program on the fly: it gathers tiny snippets of shader code from the user’s chosen effects, assembles them with a few tweaks so they fit together nicely, and compiles them into a final program. (More technical details will follow in a separate blog post.)
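As a rough illustration of the idea (not Luz’s actual assembler; the snippet format, function naming, and GLSL details here are invented for the sketch): wrap each effect’s snippet in a uniquely named function, then chain the calls into one generated fragment shader.

```python
# Hypothetical per-effect snippets, each transforming a 'color' value.
SNIPPETS = {
    "invert": "color.rgb = 1.0 - color.rgb;",
    "desaturate": "color.rgb = vec3((color.r + color.g + color.b) / 3.0);",
}

def assemble_fragment_shader(effect_names):
    """Wrap each chosen snippet in its own uniquely named function,
    then chain the calls and emit one compilable fragment shader."""
    functions = []
    calls = []
    for i, name in enumerate(effect_names):
        fn = f"effect_{i}_{name}"
        functions.append(
            f"vec4 {fn}(vec4 color) {{ {SNIPPETS[name]} return color; }}"
        )
        calls.append(fn)
    body = "color"
    for fn in calls:
        body = f"{fn}({body})"          # e.g. effect_1_x(effect_0_y(color))
    return "\n".join(functions) + f"""
void main() {{
    vec4 color = gl_Color;
    gl_FragColor = {body};
}}"""

print(assemble_fragment_shader(["invert", "desaturate"]))
```

Renaming each snippet’s entry point is what lets two effects, or even the same effect twice, coexist in a single compiled program.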

Now, let’s take a look at Shader Snippets in action:

(If you know of other motion graphics software that assembles shader programs at run time, please let me know about it in a comment.)

The Luz Project

How better to challenge the mind than by creating beautiful, dancing, interactive visuals while listening to music?

How better to grow the open-source community than with fun, with play, with totally unique experiences?

I’ve been an advocate and user of the free desktop for my whole adult life, and Luz is an attempt to entice people of all sorts to try the free desktop.

I know this strategy works: many people have already installed Ubuntu to play with Luz, including artists, designers, women, and even children. Recently, one 8th grader in the weekly kids event we do told me he has begun volunteering at FreeGeek to earn an Ubuntu computer, so he can play with Luz at home. How cool is that!!?

Luz is just starting to get global attention, like the recent article on CDM.

Luz is a toy that I want to continue to see grow in quality and popularity.

Quite humbly, I ask for your support in making that happen. I invite you to:

It has taken 6 years and a lot of work to get this far, and there’s lots left to do to create the ultimate open-source live motion-graphics editor!

Thanks for reading, thanks for your support, and I hope you enjoy playing with Luz! 🙂

More on shader snippets and the new Text plugin doing live motion graphics typography:

Luz project page.
Luz tutorials.
Luz videos on YouTube.

Similar Technologies

HLSL Fragments

Luz Project Page gets Simple Install Instructions

March 11, 2011

The Luz Project Page now has simple install instructions; just a single copy and paste into a terminal on any Ubuntu desktop!

Please leave a comment with any feedback on the process.

While you’re waiting, you may like these Luz Tutorials.

And, coming soon…

Driving visuals and DMX lighting from live 3D body motion, including ridiculously easy Johnny Lee-style head tracking using just a Kinect.

If you like where this is going, please consider leaving a tip.  It all goes directly to development of “killer app” open-source software for the Linux desktop!

Luz gets DMX lighting control

February 12, 2011

DMX is a standard for controlling lighting and other show devices (lasers, smoke, fog, and heat machines). Many club and venue lighting systems are DMX, especially the larger ones.

Luz can now output DMX commands via an Enttec DMX USB Pro, a USB-to-serial device that is plug-and-play on Linux.

Luz meshes VERY well with DMX!

Conceptually, DMX is a 512-channel data bus, with each channel holding a 0–100% activation (one byte, 0–255). Each lighting fixture is configured to listen to a subset of those channels (often 3, for RGB). That’s it.

Luz Variables form an unlimited-channel data bus, each holding a 0-100% activation:

Luz Variables get their values from human inputs, or from animations over time or beats.
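The mapping between the two buses is a one-liner: scale 0.0–1.0 activations to bytes. Here is a sketch of that, plus the framing the Enttec DMX USB Pro expects (framing per Enttec’s published API document; verify against your device’s manual before relying on it):

```python
import struct

def to_dmx(activations):
    """Scale 0.0-1.0 activations into DMX's 0-255 byte range."""
    return bytes(min(255, max(0, round(a * 255))) for a in activations)

def enttec_dmx_packet(channels):
    """Frame channel data for the Enttec DMX USB Pro: 0x7E, label 6
    ('Output Only Send DMX'), little-endian payload length, DMX start
    code 0x00, the channel bytes, then the 0xE7 end byte."""
    data = b"\x00" + channels            # start code + up to 512 channels
    return b"\x7e\x06" + struct.pack("<H", len(data)) + data + b"\xe7"

# An RGB fixture listening on channels 1-3, driven by three Luz-style
# variables at 100%, 50%, and 0%:
packet = enttec_dmx_packet(to_dmx([1.0, 0.5, 0.0]))
```

Writing `packet` to the Pro’s serial port each frame would be the whole output path: Luz variables in, lighting levels out.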

Luz as a live DMX lighting controller feels very natural.

What does DMX control get us?

Luz can now make the house lights dance. It can animate color and brightness changes on the beat of the music, via human input on any sort of input device (Wacom Tablet, Wiimote, MIDI piano), via the Spectrum Analyzer, or via any other application that can send OpenSoundControl to Luz over a LAN or Wifi network.

You could control a stadium’s lighting with a Wiimote. You could play the club lights with a MIDI piano.

Below is a video of some early experimentation with projector-lights synchronicity. I directly connected the Spectrum Analyzer’s signals to the color of the primary actor on the projector, and also to the RGB color of four ColorSplash Jr LED DMX lights:

Right now I’m at the point where all the technology works, and I’m deciding how best to integrate DMX into the Luz Studio experience.

I’m thinking that DMX plugins will be in your Directors. So each of your Directors will have a handful of Luz Actors creating the projector image, and a handful of DMX plugins controlling the house lights.

This makes it easy for users to create packaged projector + lighting experiences, and even to crossfade between lighting setups in sync with the projector visuals as Luz transitions gradually between Directors (using Director Cycle or Director Voyage).

I’m open to your thoughts, ideas, questions, and proposals for collaboration (Portland, OR).

Luz is very easy to install on Ubuntu; see the installation instructions on the project page.

Luz Studio is free and open source software. My favorite toy and my gift to you! If you like Luz and would like to donate a small amount or become a $1/month Luz Project Supporter, that would be most appreciated!