We had our Phonos concert last Thursday (January 31), and although we weren’t 100% prepared, even after a big final-week push, I think it was a success. Probably 30-40 people attended in the “sala polivalent” (multi-purpose hall) at the UPF, and they were treated to a very complex setup and six very different pieces. The setup took most of the day; some of us were there from 10h to 22h, and the concert was at 19h30. Eight speakers in a ring around the audience, with eight tables and one or two performers seated at each table. In addition, eight channels of video, some fed directly and some (for lack of hardware) reshot by cameras pointed at external monitors, which didn’t do the video quality any favours. All those feeds went through video mixers to create two projections, each a 2x2 grid of videos. In some pieces (e.g. Light Scratch) these showed the faces of the performers; in most others they showed the contents of our screens (e.g. Six Pianos). Laptop music can be rough going if the audience has no sense of the connection between what the people sitting behind the computers are doing and what they’re hearing. Hopefully the video feeds helped a bit with that.
But really, it’s all about the music. Minimalist classics, in most cases, reworked to give them our own touch. We played our reinterpretations of six pieces:
In C (Terry Riley) – starting from a Pure Data patch by our director Josep, I reworked this piece to run in SuperCollider. We played it as people came in, to create a mesmerizing ambience to open the concert. The audio is a simple synthesizer sound, so that people can focus on the interesting phasing effects produced by each performer’s place in the score.
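For the curious: the phasing in In C comes from each player repeating each of the piece’s 53 short patterns as many times as they like before moving on, so the players drift apart. Here’s a toy Python simulation of that drift (just an illustration, not our SuperCollider code; the repeat counts are invented):

```python
import random

N_PATTERNS = 53  # In C consists of 53 short melodic patterns

def simulate_performer(rng, min_reps=2, max_reps=8):
    """Return the pattern index played at each repetition for one performer.

    Each performer repeats every pattern a few times before moving on,
    so different performers drift apart -- the source of the phasing.
    """
    timeline = []
    for pattern in range(N_PATTERNS):
        for _ in range(rng.randint(min_reps, max_reps)):
            timeline.append(pattern)
    return timeline

rng = random.Random(1)
performers = [simulate_performer(rng) for _ in range(8)]
# At any given moment, different players are sitting on different patterns:
snapshot = [p[40] for p in performers]
```

Line up the eight timelines side by side and you see the ensemble smearing out across the score, which is exactly what you hear.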
Rimandi (Ivano Morrone) – a piece that uses contact microphones stuck to the laptop, and makes ring-modulated noisy goodness from internal computer noises, fingers tapping, rubbing and scratching the laptop itself.
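Ring modulation itself is simple enough to sketch: multiply the input signal by a sine carrier, which smears the input’s spectrum into sum and difference frequencies, hence the metallic, noisy character of the contact-mic sounds. A toy Python version (the piece itself doesn’t run on this code, of course):

```python
import math

def ring_mod(signal, carrier_freq, sample_rate=44100):
    """Ring modulation: multiply the input by a sine carrier.

    The product of two sinusoids contains their sum and difference
    frequencies, which is what gives ring mod its metallic colour.
    """
    return [s * math.sin(2 * math.pi * carrier_freq * i / sample_rate)
            for i, s in enumerate(signal)]

# A 440 Hz tone through a 300 Hz carrier yields energy at 140 Hz and 740 Hz.
sr = 44100
tone = [math.sin(2 * math.pi * 440 * i / sr) for i in range(sr // 10)]
out = ring_mod(tone, 300, sr)
```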
CliX ReduX (Ge Wang / BLO) – the original piece, which makes rhythmic clicks based on networked typing, was created in ChucK (Ge Wang, Princeton Laptop Orchestra), but I totally redid it in SuperCollider. Our version sounds similar when in “clix mode”, but beyond that it’s barely the same thing. You now work with a longer buffer of characters, can change the rhythm, and can change the sound of the “clicks” to be sample-based. For the ASCII characters which are not letters, we created audio and video samples from fragments of “interesting things” scavenged from various sources; for the letters of the alphabet, we use each performer’s voice (and face) making the letter’s sound. We created a video player, written in Open Frameworks, that plays back “video samples” to complement the sound. You can get an idea of what this can look like (for one performer, at least) from the videos in a previous post. This latest version of the piece was very well received.
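A rough idea of the character-to-sample mapping, sketched in Python (the file layout and the performer name are made up for illustration; the real thing lives in SuperCollider plus the Open Frameworks video player):

```python
# Hypothetical sketch of the letter/non-letter split described above.

def sample_for(char, performer="ana"):
    """Pick a sound file for a typed character.

    Letters map to the performer's recorded voice saying that letter;
    any other ASCII character maps to a scavenged 'found sound' clip.
    """
    if char.isalpha():
        return f"voices/{performer}/{char.lower()}.wav"
    return f"found/ascii_{ord(char):03d}.wav"

buffer = list("CliX!")                      # a short buffer of typed characters
playlist = [sample_for(c) for c in buffer]  # what would get triggered
```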
Light Scratch (BLO) – created by one of our members, Nadine Kroher, it uses the webcam to look for bright light spots, and does funky things with audio samples as the user moves a light source, jams their face up close to the camera, waves their hands, etc. It can be quite entertaining (or frightening) to see a macro view of Enric’s nostrils or Jan’s eye…
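The core detection step is conceptually tiny: scan a frame for the brightest pixel above some threshold, then map where it is to what you hear. A stdlib-only Python sketch of that step (the actual piece works on live webcam frames, so this is only the idea):

```python
def brightest_spot(frame, threshold=200):
    """Find the brightest pixel in a grayscale frame (rows of 0-255 values).

    Returns (x, y, value), or None if nothing exceeds the threshold --
    roughly the information needed before mapping position to sound.
    """
    best = None
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold and (best is None or v > best[2]):
                best = (x, y, v)
    return best

frame = [[10, 10, 10],
         [10, 250, 10],
         [10, 10, 90]]  # one bright spot in the middle
```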
Variations II live (John Cage / William Brent) – this one is a live reworking of an installation piece, created for us by William Brent. It involves each player making a series of very simple sketches (each with six lines and five points), which are treated as mini-scores, sent to a central server, and used to create the audio of the piece. There is also visual feedback of the scores as they are played. William was located across the Atlantic in Washington, contributing sketches remotely along with the rest of us. We also had him on a Skype connection, and placed him on a pedestal (literally) as the piece was performed. This was the premiere of his piece.
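Cage’s Variations II turns such a sketch into sound by measuring distances between its points and lines. Here’s how that measuring step might look in Python (a toy reading of the idea, not William’s actual server code; the random sketch stands in for a player’s drawing):

```python
import math, random

def point_line_distance(px, py, x1, y1, x2, y2):
    """Perpendicular distance from point (px, py) to the line through
    (x1, y1)-(x2, y2) -- the measurements a sketch-as-score yields."""
    num = abs((y2 - y1) * px - (x2 - x1) * py + x2 * y1 - y2 * x1)
    den = math.hypot(y2 - y1, x2 - x1)
    return num / den

rng = random.Random(0)
lines = [tuple(rng.random() for _ in range(4)) for _ in range(6)]   # six lines
points = [tuple(rng.random() for _ in range(2)) for _ in range(5)]  # five points
# 5 points x 6 lines = 30 distances: one reading of the sketch as a score
readings = [point_line_distance(px, py, *ln) for px, py in points for ln in lines]
```

Each distance could then drive a sound parameter (pitch, loudness, duration, …), which is the spirit of Cage’s original measurement instructions.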
Six Pianos cover (Steve Reich / BLO) – this is another favourite, as it is very obvious that we are actively doing something. Each player has a webcam pointing down at their workspace (playspace?), with a small light illuminating the space. An Open Frameworks application uses OpenCV and Gaussian classifiers to detect blobs of colour, with the colour indicating scale degree and size indicating octave (big = low, small = high). The playspace acts like a step sequencer, with time-steps along the horizontal, and the vertical axis used to control volume. It is called Six Pianos because that’s the piece that inspired it. In this concert, we performed an excerpt of Steve Reich’s piece, using this new visual instrument. Each player’s notes are sent to a SuperCollider program that is responsible for playing the synchronized audio. The instruments are high-quality sampled pianos, using NI’s Kontakt, output via a six-channel audio interface, and each output going to the speaker of its corresponding performer.
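The blob-to-note mapping can be sketched in a few lines of Python. All the constants here (grid width, the big/small area threshold, the direction of the volume axis, the scale) are guesses for illustration; the real app is written in Open Frameworks:

```python
# Illustrative mapping from detected colour blobs to sequencer events.

STEPS = 16                       # time-steps across the horizontal axis (assumed)
SCALE = [0, 2, 4, 5, 7, 9, 11]   # one colour per scale degree (assumed major)

def blob_to_event(x, y, area, colour_index, frame_w=640, frame_h=480):
    """Map one colour blob to (step, midi_note, velocity).

    Horizontal position picks the time-step, blob size picks the octave
    (big = low, small = high), and vertical position controls the volume.
    """
    step = int(x / frame_w * STEPS)
    octave = 3 if area > 2000 else 5            # crude big/small split (assumed)
    note = 12 * octave + SCALE[colour_index % len(SCALE)]
    velocity = int((1 - y / frame_h) * 127)     # axis direction is a guess
    return step, note, velocity
```

In the real setup, events like these are what each player’s machine sends on to the central SuperCollider program that keeps everyone’s audio in sync.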
Tonight we have another gig, at Niu (an art centre in the Poblenou neighbourhood of Barcelona). I’m not sure what kind of audience we’ll have, since our concert listing is included in such websites as ClubbingSpain and Le Cool. Ah, if only we truly were. (Cool, I mean.)
Tonight we’ll just perform a few pieces: CliX, Six Pianos and Light Scratch. Even though only a week has passed, two of these pieces have already evolved (software-wise or performance-wise). Tonight we only have two loudspeakers, and a much more intimate space, so we decided not to use pianos but rather six distinct instruments (to distinguish individual players a bit). Also, we’ll jam with these pieces for a bit longer, improvising as we go. We had a good rehearsal last night, where we tried this more “free form” Six Pianos. Take a look (note that the audio level is quite low, so best to listen amplified, or with headphones to get the full effect).