Monthly Archives: November 2010

VJing is a rhythmic manipulation of moving video and animation. A mouse and keyboard are very rigid and unforgiving – not ideal rhythmic controllers. Anyone who’s tried to scrub a video rhythmically with a mouse knows that the mouse is too precise in position and too slow in response for gestures of rhythm, which are innately imprecise in position but very precise in tempo and response. Knobs, sliders and touchpads are faster in response than a mouse, but still require tiny movements for grand rhythmic gestures, and require precise position and attention (most knobs can only be turned by two fingers at once: a precise movement).

A more natural controller for such tempo-centric manipulation would mimic the movements of dancing, since dancing is the ultimate form of rhythmic expression. We see this concept in several new experimental VJ interfaces, including the maraca-like interface Rhythmism [1] (by S. Tokuhisa, Y. Iwata and M. Inakage), and the WiiJ Video interface, which uses the Nintendo Wii remote to control video with sweeping, rhythmic hand gestures.

We sought to extend the gestural concept past just the hands to all parts of the body, enabling a VJ to control video simply by dancing. We embedded sensors into clothing to ensure minimal interference from the interface. The VJacket allows for wide, imprecise movements with a precise rhythmic response. The VJ will not have to fumble for knobs and buttons, and will not have to look at the screen to be sure he’s clicking on the right thing – he will be freed to control the video using his body movements alone.

Since it is wireless, the VJ will be free to interact with the audience and musicians – on stage or even walking through the crowd – something which most hermit-like VJs do not usually experience, since they are often relegated to the back corner of the club behind the video inputs and lighting controls. With a wireless system, a VJ becomes not just an engineer behind the curtain, but an actual live performer – one whose movements are directly connected to the video projections. The audience will be able to see the VJ’s gestures in connection with the video, and thus will become more interested in the performance itself.

Connecting the audience with the VJ is our ultimate goal, since doing so will create a more legitimate space for VJing in the performance community, and hopefully encourage more people to try VJing themselves. With consumer video production becoming more ubiquitous, and projectors becoming cheaper and smaller – even integrated into our cell phones and cameras – soon every rock band, DJ, and karaoke bar will have their own VJ. If they don’t, someone with a pocket projector and a mobile VJ setup will guerrilla-VJ anyway. This is the future we imagine, and we are trying to shape it with the VJacket.


  1. Satoru dangkang Tokuhisa, Yukinari Iwata, and Masa Inakage. 2007. Rhythmism: a VJ performance system with maracas based devices. In Proceedings of the International Conference on Advances in Computer Entertainment Technology (ACE ’07). ACM, New York, NY, USA, 204–207. DOI: 10.1145/1255047.1255089

Finally, we have a first Instructables guide on how to make your own Arduino Vshield for the VJacket.

Here is the link to the tutorial: