I’ve been a bit quiet lately. The reason is that I’ve been engrossed in some new work with a very exciting company.
Back on April the 9th I got an intriguing direct message on Twitter asking if I would like to go and chat to RjDj about doing some reactive music composition for them. I went along and met up with Michael, the CEO, and Florian, another reactive music composer, who were really great. I started working on a number of fascinating projects almost immediately that afternoon!
RjDj is an amazing iPhone application which creates mind-twisting hearing sensations by weaving your environment into reactive music. Voices, cars, your walking speed… these and many other things can be used to shape your experience. If you have an iPhone or iPod Touch, try it!
I had been interested in RjDj from the moment I heard about it, and helped test it during its beta phase. It’s come a long way since then, with loads of new scenes added. Scenes are reactive musical programs within RjDj which respond to your environment and movements in different ways. Here are some videos of various RjDj scenes:
Not only does it enable all this funkiness, but RjDj also allows users to record their unique mixes and upload them, via the iPhone, to their own user space, Twitter and Facebook.
The scenes themselves are made in Pure Data, a real-time graphical programming environment for audio, video, and graphical processing. As I get further into the wonders of creating reactive musical patches for RjDj, I’m discovering that Pure Data is a truly amazing environment for creativity. I also feel very honoured to be working with the great group of pioneers in this field who work at RjDj.
We are working on a few different projects at the moment, some of which are confidential. However, I am also working on a scene of my own, which I think strongly relates to my previous work in interactive music, particularly the Parsec project. I’m hoping to release it as a scene through RjDj soon.
The parallels between creating programmed RjDj scenes and working with programmatic music in games surprise me every day. Working with this technology gives you the power to associate musical events with any sound or physical movement, much like a scripting language extending out into the air around you.
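To give a flavour of what that association looks like, here is a tiny illustrative sketch in Python (RjDj scenes are actually built as Pure Data patches, and the function names, input ranges, and scale here are my own invented assumptions, not RjDj's API): it maps two environmental inputs, microphone level and walking speed, onto a note choice, the kind of mapping a reactive scene performs continuously.

```python
# Illustrative sketch only -- not RjDj's actual API. Shows the core idea of a
# reactive scene: rescale environmental inputs into musical parameters.

def map_range(value, in_lo, in_hi, out_lo, out_hi):
    """Linearly rescale value from [in_lo, in_hi] into [out_lo, out_hi], clamped."""
    value = max(in_lo, min(in_hi, value))
    t = (value - in_lo) / (in_hi - in_lo)
    return out_lo + t * (out_hi - out_lo)

# Semitone offsets of a minor pentatonic scale (a common choice for
# generative music, since any combination of its notes sounds consonant).
MINOR_PENTATONIC = [0, 3, 5, 7, 10]

def choose_note(mic_level, walking_speed, root=57):
    """Pick a MIDI note from the environment.

    mic_level: 0.0 (silence) .. 1.0 (loud) -> scale degree
    walking_speed: metres/sec, 0.0 .. 2.0  -> octave transposition
    root: MIDI note number of the scale root (57 = A3)
    """
    degree = int(map_range(mic_level, 0.0, 1.0, 0, len(MINOR_PENTATONIC) - 1))
    octave = int(map_range(walking_speed, 0.0, 2.0, 0, 2))
    return root + MINOR_PENTATONIC[degree] + 12 * octave
```

Standing still in a quiet room yields the root note (`choose_note(0.0, 0.0)` → 57, i.e. A3), while running through loud traffic pushes the music up the scale and up two octaves (`choose_note(1.0, 2.0)` → 91). A real scene would feed values like these into oscillators or samplers continuously rather than note-by-note.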
In this respect it really feels close to audio augmented reality, or the idea of the virtual spilling over into reality. After all, visual augmented reality is still kinda tricky, but audio augmented reality is here now. I have reached a point where I’m starting to disregard the term "virtual" as meaning something different from physical reality. For me, everything virtual is becoming part of reality now, everywhere. That said, I’m very excited about the possibilities of linking the technologies of this type of augmented reality with virtual worlds.