An interview with Tyler Freeman about the inspiration for the VJacket at Maker Faire.
Next is a short clip of the installation we made for Maker Faire Bay Area 2011: sensors embedded in the VJacket trigger sounds in Ableton Live and video effects in Resolume Avenue. The video features an avatar created from a live 3D feed from a hacked Xbox Kinect. All the sounds and effects were customizable, and in fact they ended up quite different by the end of the Faire compared to this clip.
Audience participation was a must for us, so we let passers-by put on the VJacket and try it for themselves. Even while one person was using it, the others in line couldn’t wait and ran up to be in range of the Kinect’s camera so their avatars appeared on screen as well! It was a big hit among kids, geeks, and professional performers alike.
The kids who tried on the VJacket picked up on the interface quickly; we noticed that when the sounds were off, the players had a harder time discerning which movements triggered an effect. With the instant and obvious feedback of the sound, however, they quickly learned to master the VJacket.
Some especially energetic youngsters invented new ways to play the VJacket: some would dance with a furious convulsive fervor, whipping the too-long sleeves back and forth, which activated all the piezo triggers in a frantic drum solo. This gave me the idea to play the triggers without hitting them with my hands – instead using arm-twitching gestures and the looseness of the jacket to trigger two effects at once (this way, effects became grouped by left or right side – setting off both the shoulder sensor and the belly sensor as a “combo move”). During a crescendo in the music, more frenzied dancing would create a barrage of video effects appropriate for a climax. Another technique was to swipe the light sensor on one wrist past the LED on the other wrist, modulating one effect, while simultaneously triggering a piezo with a sudden shrug of the shoulder to layer effects.
Discovering techniques like these illustrates the potential virtuosity unique to each wearer/jacket combination, since each person’s body type and jacket size dictates the style of gestures available. Overall, the Maker Faire was amazing, and participants found many interesting and creative uses for the VJacket. Many came back several times to play, and hopefully we inspired some people to build their own!
Diagram of the installation setup
Thanks to Ryan Huber for designing the sounds and helping build the new leather VJacket, Battlehooch for busking in front of the VJacket booth to demonstrate its VJing capabilities, Paul Spinrad for helping bring our vision to the Faire, the VJacket Maker Faire team, and of course all who came out to the Maker Faire! Hope to see you next year!
If you are interested in using the Xbox Kinect in Resolume, you can download the open-source Quartz Composer patch we used here:
This demo shows the individual sensors of the VJacket and how they can be used to control a popular VJ program, Resolume Avenue. A webcam input and a projector on the wall create a video feedback loop, which is mixed with triggered clips and then recorded directly in Resolume.
VJing is a rhythmic manipulation of moving video and animation. A mouse and keyboard are very rigid and unforgiving – not ideal rhythmic controllers. Anyone who’s tried to scrub a video rhythmically with a mouse knows that the mouse is too precise in position and too slow in response for gestures of rhythm, which are innately imprecise in position but very precise in tempo and response. Knobs, sliders and touchpads are faster in response than a mouse, but still require tiny movements for grand rhythmic gestures, and require precise position and attention (most knobs can only be turned by two fingers at once: a precise movement).
A more natural controller for such tempo-centric manipulation would mimic the movements of dancing, since dancing is the ultimate form of rhythmic expression. We see this concept in several new experimental VJ interfaces, including the maraca-like interface Rhythmism (by S. Tokuhisa, Y. Iwata and M. Inakage), and the WiiJ Video interface, which uses the Nintendo Wii remote to control video with sweeping, rhythmic hand gestures.
We sought to extend the gestural concept past just the hands to all parts of the body, enabling a VJ to control video simply by dancing. We embedded sensors into clothing to ensure minimal interference from the interface. The VJacket allows for wide, imprecise movements with a precise rhythmic response. The VJ will not have to fumble for knobs and buttons, will not have to look at the screen to be sure he’s clicking on the right thing – he will be freed to control the video using his body movements alone.
Since it is wireless, the VJ will be free to interact with the audience and musicians – on stage or even walking through the crowd – something which most hermit-like VJs do not usually experience, since they are often relegated to the back corner of the club behind the video inputs and lighting controls. With a wireless system, a VJ becomes not just an engineer behind the curtain, but an actual live performer – one whose movements are directly connected to the video projections. The audience will be able to see the VJ’s gestures in connection with the video, and thus will become more interested in the performance itself.
It is connecting the audience with the VJ that is our ultimate goal, since doing so will create a more legitimate space for VJing in the performance community, and hopefully encourage more people to try VJing themselves. With consumer video production becoming more ubiquitous and projectors becoming cheaper and smaller – even integrated into our cell phones and cameras – soon every rock band, DJ, and karaoke bar will have its own VJ. If they don’t, someone with a pocket projector and a mobile VJ setup will guerrilla VJ anyway. This is the future we imagine, and we are trying to shape it with the VJacket.
Here are some photos of the new VShield. We want to use the jacket for a dance performance and trigger sound effects and videos with it.
We stuck with the clever design of the VShield v1.0 that was developed earlier with Tyler. For this version we adjusted the sensor set to:
- 3 piezo sensors
- 2 bend sensors
- 1 softpot
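The six sensors conveniently fill the Arduino’s six analog inputs (A0–A5). A small Python sketch of how we think about the channels – the pin assignments and body placements here are assumptions for illustration, not the final wiring:

```python
# Hypothetical channel map for the six analog sensors (adjust to your wiring).
SENSORS = {
    0: ("piezo", "left shoulder"),
    1: ("piezo", "right shoulder"),
    2: ("piezo", "belly"),
    3: ("bend", "left elbow"),
    4: ("bend", "right elbow"),
    5: ("softpot", "sleeve"),
}

def is_trigger(channel):
    """Piezos fire one-shot hit events; bend sensors and the softpot stream continuous values."""
    return SENSORS[channel][0] == "piezo"
```

The trigger/continuous distinction matters downstream: hits map naturally to clip launches, while continuous channels map to effect parameters.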
Here is a sketch of the sensor placement:
Here is a breadboard screenshot of the setup:
A photo of the materials:
- 1x 3-pin break-away male header
- 2x 5-pin break-away male headers
- 2x 5-pin break-away female headers
- 1x 3-pin break-away female header
- 1x 2-pin break-away female header
Resistors
- 3x 1 MΩ
- 2x 470 Ω
- 1x 100 Ω
Capacitors (not in the picture)
- 3x 1nF
- 1x Stripboard (12x 20 holes)
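The 1 MΩ resistors sit across the piezos as load resistors, so on the software side a hit can be detected with a simple threshold on the analog reading. A minimal sketch of that idea in Python (the threshold value is an assumption and must be tuned per piezo; the real firmware runs on the Arduino):

```python
ADC_MAX = 1023        # Arduino's 10-bit analog-to-digital range
HIT_THRESHOLD = 100   # assumed cutoff; tune per piezo and jacket

def detect_hit(reading, threshold=HIT_THRESHOLD):
    """Return a hit velocity in 0.0-1.0 if the piezo spike clears the threshold, else None."""
    if reading < threshold:
        return None
    return (reading - threshold) / (ADC_MAX - threshold)
```

Readings below the threshold (sensor noise, jacket rustle) are ignored; harder hits map to higher velocities.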
First we soldered the header pins on the sides. We soldered just the outer pins first and then put the VShield on the Arduino; that way the pins could align to the Arduino board more easily and fitted perfectly!
This is a sketch of a new version of the jacket. It is designed for a local stepper/krumper who wants to use it for his performances. The jacket will include 4 piezo sensors and 2 bend sensors. Stay tuned!
The VJacket is a wearable controller for live video performance. Built into this old bomber jacket are all kinds of sensors to control visuals on the screen: hit sensors, light sensors, bend sensors and touch sliders. This way, the VJ is freed from the boring, cumbersome interface of mouse and keyboard, and can instead use the very clothes on his body to control the videos and effects with precise dance moves, converting convulsing limbs into luscious light shows. We are transforming this bomber jacket, a symbol of war and destruction, into a tool of creative expression and a symbol of peace. We are also going to release all the related hardware and software as open source in order to spread this transformation across the globe.
The VJacket uses a standard Arduino microcontroller board to relay the sensor data to the computer. To take it from there, we built the Arduino2OSC bridge: an easily configurable graphical interface that creates customizable OpenSoundControl messages from the sensor data. It also allows you to adjust the analog input data from the Arduino to your exact needs – scaling input and output values, adding cutoff thresholds, etc. – with enough options to (hopefully) cover all your Arduino input requirements. Whether your sensor is a continuous slider or a one-hit piezo contact mic, and whether you are manipulating a video effect or triggering audio samples, we tried to make it flexible enough that you’re not stuck reprogramming a new patch for every project – just make a new preset and you’re done!
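The scaling and cutoff idea can be sketched like this (illustrative Python only – the actual bridge is a Max/MSP patch, and the function name and defaults here are ours):

```python
def scale_input(value, in_min, in_max, out_min, out_max, cutoff=None):
    """Map a raw sensor value from its input range to a target output range,
    with an optional cutoff threshold below which the output snaps to out_min."""
    if cutoff is not None and value < cutoff:
        return out_min
    value = max(in_min, min(in_max, value))          # clamp to the input range
    span = (value - in_min) / (in_max - in_min)      # normalize to 0.0-1.0
    return out_min + span * (out_max - out_min)
```

A continuous bend sensor might map its 0–1023 range onto an effect parameter’s 0.0–1.0, while a piezo preset would add a cutoff so that sensor noise never triggers anything.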
For the above video demo, we used the VJacket through Arduino2OSC to send OpenSoundControl messages to Resolume Avenue, a popular VJ program. The Arduino2OSC bridge interface is generic enough to send any type of OSC message to any program that accepts them, including other video or audio programs like Arkaos Grand VJ, Max/MSP/Jitter, Kyma, etc. You can even send the messages over the LAN for networked performances!
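On the wire, an OSC message is just a null-padded address string, a type-tag string, and big-endian arguments, carried here over UDP. A minimal Python sketch of building and sending one – the address and port shown are made-up examples, not actual Resolume settings:

```python
import socket
import struct

def osc_message(address, value):
    """Build a raw OSC packet carrying one float: padded address, ',f' type tag, big-endian float."""
    def pad(b):
        # OSC strings are null-terminated, then padded to a 4-byte boundary
        return b + b"\x00" * (4 - len(b) % 4)
    return pad(address.encode("ascii")) + pad(b",f") + struct.pack(">f", value)

# Fire it at a program listening for OSC (hypothetical address and port; set them in the receiver)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/layer1/video/opacity", 0.75), ("127.0.0.1", 7000))
```

Because it is plain UDP, pointing the send at another machine’s IP is all it takes for the networked performances mentioned above.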
We will soon make available the circuit designs, Arduino code, and Arduino2OSC Max/MSP patch/application – all under an open source license – so stay tuned to make your own VJacket!