NUS Arts Festival. A round-up.
I managed to catch most of the events at the NUS Arts Fest. Here is a review of the events I attended.
Press Play by EML.
Apart from the little ‘adventure’ I had while getting tickets at the door, it was a really fun evening. The theme of the day was machines, and they approached it in chronological order.
They started with ‘old machines’: mostly pieces influenced by old-school synthesized sounds, like the 8-bit sounds of old Nintendo games and other classics like Contra. Some really interesting videos accompanied the sounds.
They followed with contemporary stuff, including sounds made with vacuum cleaners and other interesting devices. Finally, they played some futuristic pieces, which came in various flavours.
While it was not amazing or surprising, I was satisfied to get what I expected from EML. They used many interesting toys to create the sounds, including a couple of Kaoss Pads, a GameBoy sequencer, and lots of MIDI controllers.
Spectral Spaces by Kim Cascone.
This was a very hardcore computer music event. Kim played two live pieces on his Mac, created in his own software built on MAX/MSP. The cool part was that the sounds he used to build his pieces were randomly selected from his huge library of sounds. He then used a simple trim-pot-based controller and some buttons on his Mac to control the sound.
His idea was similar to something found in Indian classical music: “freedom in discipline”. He is free to express himself, but his discipline is the set of sounds his program selected. It brought about some interesting sounds, some melodic, others just noise. I was surprised by how strongly I was drawn to the melodic ones.
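Kim's actual system is his own MAX/MSP-based software, but the random-selection idea itself is simple enough to sketch. Here is a minimal Python illustration (the function name, palette size, and seeding are my own assumptions, not anything from his patch):

```python
import random

def pick_palette(candidates, n=8, seed=None):
    """Randomly select n sounds from a larger library.

    The chosen subset becomes the fixed 'discipline' for a
    performance; the performer's 'freedom' is in how those
    sounds are then shaped and combined live.
    """
    if len(candidates) < n:
        raise ValueError("library too small for requested palette")
    # A seeded generator makes a given 'random discipline'
    # reproducible, e.g. for rehearsing the same palette.
    rng = random.Random(seed)
    return rng.sample(candidates, n)

# Example: draw 3 sounds from a toy library of filenames.
library = ["bow.wav", "hiss.wav", "bell.wav", "motor.wav", "rain.wav"]
palette = pick_palette(library, n=3, seed=42)
```

In a real setup the candidate list would come from scanning a sample directory, and the palette would be loaded into the performance software.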
Kim also conducted a 3-day workshop where the participants played with various sounds and at the end presented an interactive piece, which I must say was very well done. I should really be going for such events. :(
An Interactive Visual Art Concert.
This was basically a project by the Mixed Reality Lab at NUS. The idea was that the movements and sounds of a trio playing Er-Hu, cello and piano were sensed and used to generate graphics.
While the music was interesting, I was unable to link it to the visualizations. I started thinking about mental models and intuitive controls: the mapping between the sound and the graphics was not very intuitive, and it did not conform easily to my mental models.
But then again, it was a good try. Some of the people in the team were undergraduates doing their FYP. I am sure it was a fulfilling experience.
It was interesting to see them use MAX/MSP on a Mac to grab all the control data from the sensors and the sounds, and pass it over Ethernet to another computer that rendered and controlled the video.
Hmm.. possible usage scenario for OSC??
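To answer my own musing: OSC is indeed a natural fit for that sensor-to-renderer hop, since it is just a small binary format typically carried over UDP. A minimal hand-rolled sketch of an OSC 1.0 message with float arguments (the address, port, and sensor names here are illustrative assumptions, not what the lab actually used; a real project would more likely use a ready-made OSC library or MAX/MSP's own udpsend object):

```python
import socket
import struct

def _osc_pad(b: bytes) -> bytes:
    # OSC strings are null-terminated and padded to a 4-byte boundary.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    # Build a minimal OSC 1.0 message carrying float32 arguments:
    # padded address, padded type-tag string, then big-endian floats.
    typetags = "," + "f" * len(args)
    payload = _osc_pad(address.encode()) + _osc_pad(typetags.encode())
    for value in args:
        payload += struct.pack(">f", value)  # big-endian float32
    return payload

def send_osc(msg: bytes, host: str = "127.0.0.1", port: int = 9000) -> None:
    # Fire-and-forget over UDP to a renderer listening on the given port.
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(msg, (host, port))

# Example: send a hypothetical bow-pressure reading to the visual renderer.
msg = osc_message("/sensor/bow", 0.73)
```

The nice part is that the sound-analysis machine and the graphics machine only need to agree on the address namespace, not on any shared code.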