Introducing the Luz Body Tracker

Luz is an open-source live motion graphics editor and DMX controller.

I just checked in a major new feature, the Luz Body Tracker:

In this test, the hands control the heights of the bars.  The ‘praise jebus’ variable goes to 100% when both* hands move above the shoulders, and it in turn drives the amount of the Pixel Storm effect on the scene renderer.

*This effect is accomplished by setting the variable’s ‘Combine Method’ to ‘Multiply’, as seen in the screenshot. (56% × 73% ≈ 41%)
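The ‘Multiply’ combine method can be sketched in a few lines (this is just an illustration in Python, not the actual Luz implementation, which is written in Ruby): multiplying 0.0–1.0 activations means the result is high only when all of the inputs are high.

```python
def combine_multiply(values):
    """Combine several 0.0-1.0 activations by multiplying them,
    so the result is near 1.0 only when *all* inputs are near 1.0."""
    result = 1.0
    for v in values:
        result *= v
    return result

# e.g. left hand at 56%, right hand at 73%:
print(round(combine_multiply([0.56, 0.73]), 2))  # → 0.41, i.e. about 41%
```

With both hands above the shoulders (both activations at 1.0), the product hits 1.0 and the Pixel Storm effect runs at full strength.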

With the Luz Body Tracker, any effect in Luz, including DMX lighting, can easily be controlled directly by human body movement.

Want to control a theater’s lighting entirely from body movement? Luz can do that.

Luz Body Tracker was graciously tested by the kids at the after school play space LightTroupe visits. (And apparently the Kinect can track kids on swings. Whodathunkit?)

Luz Body Tracker

The Body Tracker sends values like “Human 01 / Left Hand / X”.

It can track multiple people simultaneously, and you can limit the maximum number of people (to whatever number you choose to support in your Luz project).  It’s all unattended and automatic. The Luz Body Tracker is intended to facilitate museum installations.
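Values like “Human 01 / Left Hand / X” travel over the wire as OpenSoundControl messages. As a rough sketch of what one such message looks like on the wire (the address form shown is an assumption, and Luz itself is Ruby, not Python): an OSC message is a null-terminated address string, a type-tag string, and the arguments, each padded to a 4-byte boundary.

```python
import struct

def osc_string(s):
    """Encode an OSC string: ASCII bytes, null-terminated,
    padded with nulls to a multiple of 4 bytes."""
    b = s.encode("ascii") + b"\x00"
    while len(b) % 4:
        b += b"\x00"
    return b

def osc_message(address, value):
    """Pack a single-float OSC message (type tag ',f',
    big-endian 32-bit float argument)."""
    return osc_string(address) + osc_string(",f") + struct.pack(">f", value)

# Hypothetical address for one tracked value:
packet = osc_message("/human_01/left_hand/x", 0.5)
```

A packet like this would then be sent to Luz over UDP.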

The raw x,y,z data for each spot on the body (shown as circles above) is converted to the 0.0-1.0 range and sent to Luz via OpenSoundControl.  Elbow, knee, and hip bend angles (0.0-1.0) are also calculated and sent.
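The two conversions described above, mapping raw coordinates into the 0.0–1.0 range and turning three joint positions into a bend amount, can be sketched like so (a simplified 2D illustration, not the actual Luz code):

```python
import math

def normalize(value, lo, hi):
    """Map a raw coordinate into the 0.0-1.0 range, clamped."""
    if hi <= lo:
        return 0.0
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def bend_amount(a, b, c):
    """Bend of the joint at b (e.g. the elbow, between shoulder a
    and hand c): 0.0 when straight, 1.0 when fully folded."""
    bax, bay = a[0] - b[0], a[1] - b[1]
    bcx, bcy = c[0] - b[0], c[1] - b[1]
    mag = math.hypot(bax, bay) * math.hypot(bcx, bcy)
    if mag == 0.0:
        return 0.0
    cos_angle = max(-1.0, min(1.0, (bax * bcx + bay * bcy) / mag))
    return 1.0 - math.acos(cos_angle) / math.pi

# A straight arm (shoulder, elbow, hand in a line) gives 0.0:
print(bend_amount((0, 0), (1, 0), (2, 0)))  # → 0.0
```

Each of these 0.0–1.0 values is what gets packed into an OSC message and sent to Luz.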

As in the Input Manager application, where each axis of each device is auto-calibrated, each body part here is individually auto-calibrated.  This means that someone who can’t physically raise an elbow above their shoulder will still be able to send 100% activation for “Elbow / Y”, and that shorter people can still reach a full “Head / Y” value.

Further, elbows and hands are calibrated relative to the shoulders, and knees and feet relative to the hips.  In other words, your expressive limbs’ calibrations are in body-space, not stage-space.  It feels great: if you stick your right arm out as far as it’ll go, “Right Hand / X” will always hit 1.0.
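The auto-calibration described above can be sketched as a running min/max that learns each person’s actual range (again, an illustrative Python sketch with hypothetical names, not the real Luz code):

```python
class AutoCalibratedAxis:
    """Learns the range of values actually observed for one body-part
    axis and maps each new value into 0.0-1.0."""
    def __init__(self):
        self.lo = None
        self.hi = None

    def update(self, value):
        if self.lo is None or value < self.lo:
            self.lo = value
        if self.hi is None or value > self.hi:
            self.hi = value
        if self.hi == self.lo:
            return 0.0  # no range observed yet
        return (value - self.lo) / (self.hi - self.lo)

# Body-space calibration: measure the hand relative to the shoulder,
# so a shorter reach still spans the full 0.0-1.0 range.
right_hand_x = AutoCalibratedAxis()
for hand_x, shoulder_x in [(0.3, 0.3), (1.1, 0.3), (0.7, 0.3)]:
    activation = right_hand_x.update(hand_x - shoulder_x)
print(activation)  # last sample is a half reach → 0.5
```

Because the extremes are learned per person, full extension always maps to 1.0, whatever the performer’s arm length.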

Luz Body Tracker uses OpenNI+NITE for skeleton tracking. Because these aren’t provided as Debian/Ubuntu packages yet, the Luz Body Tracker is a little harder to compile than the rest of the Luz Studio suite.

Big thanks to Ether Snow who began the Luz Body Tracker app, got it fully working, and then handed the code and a Kinect over to me for integration into the Luz world.

Luz has enticed many non-traditional users (artists, women, kids) to install Ubuntu. It has the potential to bring thousands of non-technical folks to freedom and fun.

Please support the Luz Project: every contribution goes directly to feature development, the creation of tutorials and documentation, and the drive toward a 1.0 release.

One Response to Introducing the Luz Body Tracker

  1. Did you see http://info.ee.surrey.ac.uk/Personal/Z.Kalal/ ? Maybe you can use that technology in Luz too?
