This past Tuesday we were invited, by the Insectotròpics theatre troupe, to participate in an event presenting the Programa Suport a la Creació 2013 (production grants) from FiraTàrrega (an international performing and street arts festival in Tàrrega, Catalunya). The event was held at Barcelona’s beautiful Fàbrica Moritz (historical brewery), recently redesigned by architect Jean Nouvel.
Our friends from Insectotròpics are incorporating our Teatrillu software into their upcoming theatre production, BZZ, and we were delighted to be asked to be there with them at this event. And not only because of the free Moritz beer…
It was great to be part of this event, which gave the public a preview of some early work on the Insects’ next piece, to be premiered at FiraTàrrega this September. Since January 2013, we’ve been invited to some of their rehearsals and production sessions, helping them incorporate our Teatrillu into their show. Among other things, our software will allow them to do live stop-motion animation and interactive visual trickery — allowing bits of paper, blobs of paint and other objects to take on a life of their own.
Here is some more (raw) footage from the Insectotròpics’ Moritz event, some of which shows the Teatrillu in action (thanks to Vicenç):
I finally managed to put together a video of my live coding performance at Niu. Enjoy! (that is, if you have 23 minutes to kill)
2nd annual “Live Coding Sessions” evening at Niu, on March 22, 2013
Coding was done in SuperCollider 3.6, and I set myself the constraint of only using the most basic building blocks of sound: sine waves (as oscillators, LFOs, envelopes, even as arpeggiators driving patterns of scale degrees). I also tried to create everything pretty much from scratch, although I did “cheat” a few times (cheating, perhaps, according to live coding purists). As I’m still wishing for macro functionality in SC’s new IDE (if I had some time I ought to just contribute it myself), I decided to use an external macro program to speed up the coding in a few cases (just a marginally sexier version of cut & paste). It’s too bad you have to strain a bit to read the projected text in the video — that’s the most interesting part of live coding: seeing the code that goes along with the sounds you’re hearing! Otherwise, it can be a bit long… (-;
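For the curious, here’s a minimal SuperCollider sketch (not my actual performance code) of the “sines only” idea: one SinOsc is the audible tone, another acts as an LFO on its frequency, and a third, very slow one, serves as a crude amplitude envelope.

```supercollider
// "Sines only": one SinOsc as the audible tone, a second as a vibrato LFO,
// and a third (very slow, clipped to positive values) as a rough envelope.
(
{
    var lfo = SinOsc.kr(0.5, 0, 20);    // slow sine as an LFO (±20 Hz vibrato)
    var env = SinOsc.kr(0.1).max(0);    // very slow sine, half-wave rectified, as envelope
    SinOsc.ar(220 + lfo) * env * 0.2 ! 2   // audible sine, duplicated to stereo
}.play;
)
```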
Audio was captured on a Zoom H1 recorder, including room ambience (not to mention me pounding the keyboard). I also taped a contact mic to my laptop, giving an enhanced sense of “liveness” to the coding. The sound is probably best appreciated with headphones (for the lower sine wave frequencies).
(Spot the glitch! From about 4:10 to 5:10, in trying to pack a lot into 20 minutes, I make an error, and it takes me a moment to figure out where I’ve gone wrong… I should have scrolled back to read the errors in the post window right away, but out of pride I avoided doing that, to make it seem as if all were under control. Live coding is an interesting mental challenge — it’s easy to code from the comfort of home or office, but your brain works differently (or doesn’t ;-)) under “live” (i.e. people watching you) conditions! It’s not unlike meditation, in that it demands extreme focus and concentration. Turns out I just needed to keep calm and carry on… and add that missing variable name!)
Merci beaucoup à Anna Duriez — for bringing her SLR camera and recording the video (and letting me use it)! It was quite flickery (a strobing effect from the shutter speed interfering with the lights and projector), but I managed to even it out by blending frames together — luckily there’s not a lot of action or camera moves in live coding, so I could get away with it.
El Teatrillu (a Catalan diminutive of teatre, meaning: “little theatre”) is software for the performing arts. It’s an application I’ve been working on with the other members of the Wù:: collective. Alex and Roger had already created the first version of this software — a mix of interactive theatre, live stop-motion animation, puppeteering and digital sleight of hand — when I came along last summer and asked if I could join the party. Once indoctrinated into their “collective”, I helped organize and tidy the code, added some new features, and started thinking about how to take the ideas from their prototype and develop them more completely in a rewrite. Then, in November 2012, we won a grant from Telenoika, a Barcelona-based “creative audiovisual community”, to continue this work and ultimately release a more refined second version to the public as open source software.
Since the start of this year, we’ve been working with the folks from Insectotròpics, with the idea that they use our software in their upcoming theatre production. We’ve found (not surprisingly) that speaking with real users has helped us to discover the possibilities and limitations of our own program, to figure out what works and what doesn’t, and to add to our never-ending list of “cool ideas” we’d like to implement. (Unfortunately, each of us likes to keep many plates spinning at the same time, so work recently has lagged on Teatrillu.)
In order to get more feedback, and to re-energize us, tomorrow we have an “evening of open experimentation” with our current Teatrillu software, to let the public play with it, give us their thoughts, ask us questions. It’s not a workshop — hopefully that will come in the future — but more of an open (play)house. Thanks to Telenoika for offering us their space in el Raval (c/ Sant Pau, 58) to hold this event, Thursday April 4, from 18h to 21h.
Live coding: I know, I haven’t yet posted any comment on my live coding performance of March 22. It went well; a small but enthusiastic crowd of maybe 25-30 people(?) came out. After weeks of trying all kinds of experiments, fretting and rehearsing, I was glad to get on with it, and ended up quite happy with my performance. The reaction from the crowd and comments afterwards were very favourable (Josep, the Laptop Orchestra’s director, even said that it: “…reminded [him] of the image of Bach improvising a fugue” — then again, he’s known for being extremely generous with his praise!). For me, it was a nice, relatively stress-free introduction to this new kind of performance. Thanks to Gerard and Graham for letting me in on this, their 2nd annual event!
I recorded ambient audio in the room, but it’s not too exciting without also seeing what’s going on on the screen at the same time (and even then…). I’m still hoping to get ahold of some video footage that was shot at the event and that shows the screen and code clearly enough. If I get some, I’ll put something up on Youtube or Vimeo, and link to it here.
A nice new video of our January Phonos concert is available now on Youtube (thanks, Sònia!).
Since then, we had a more intimate and playful performance (February 8) at a small art space called Niu — it went down really well (maybe drinks helped — audience and/or performers ;-)). We performed three pieces, including extended and more improvisatorial versions of CliX ReduX and Six Pianos (which didn’t use pianos at all this time, instead Hammond organs, electric guitar/bass and a few other funky things).
I updated CliX to use a synchronized clock (MandelClock) from BenoitLib. Proper synchronization between machines helped the piece a great deal, allowing us to get into some really interesting grooves, especially with the sampled sounds and projected video snippets. Caballé is always a hit… However, we did have a few glitches (still not 100% sure why), where the tempo would occasionally change without warning. It corrected itself within a few seconds, but was quite disconcerting (although several people in the audience claim not to have noticed anything wrong). In subsequent rehearsals the problem wasn’t as severe (I made some changes to reduce network traffic), but did still occur from time to time. I suspect it’s to do with lost or out-of-order OSC messages, which happens regularly on busy WiFi networks.
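For SuperCollider users curious how this works, the gist is that every player’s patterns follow a single shared clock, so a tempo change reaches everyone. The sketch below uses a plain local TempoClock for illustration; in CliX, BenoitLib’s MandelClock (a network-synchronized clock) plays that role.

```supercollider
// Conceptual sketch only: a pattern scheduled on a shared clock. In CliX we
// use BenoitLib's MandelClock in place of the local TempoClock shown here,
// so tempo changes propagate to all machines over the network.
(
~clock = TempoClock.new(120/60);   // 120 BPM, local for this demo

Pbind(
    \instrument, \default,
    \degree, Pseq([0, 2, 4, 7], inf),
    \dur, 0.25
).play(~clock);

~clock.tempo = 140/60;   // with a synced clock, followers pick this up too
)
```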
Most recently, the Barcelona Laptop Orchestra performed (March 8) as “pre-dinner entertainment” at the Polifonia conference (a mostly-European grouping of music conservatories), held in Barcelona. It was located in the restaurant area of the Museu Marítim, in a beautiful stone building that used to be part of the old shipyards. We were only performing CliX ReduX, and had managed to build to a nice “welcome” point after five or ten minutes, when suddenly — BOOM! — our power went out. (Everyone applauded; I assume because it was a particularly dramatic stop, but perhaps they were simply glad they could start eating.) Somewhere, we had tripped a circuit-breaker.
It took at least 15 minutes until we found a functional plug (downstairs, using an extremely long extension cord) and got the projector working. Our laptops waited patiently, chugging along on battery power. But by then, we’d lost some of our vibe, and the audience had moved on to chit-chat, toasting and appetizers. We performed the last section of our show, but I wouldn’t claim it was a huge hit. We did get free dinner out of it, though. I’m sure many of the classical music professors were thinking: “Hah – all this new-fangled technology, what a disaster! That’s why violins, pianos and oboes are better!”
In other news:
I’ve agreed to perform at a live coding event at Niu, on March 22. Yikes, my first time flying solo. I’ve been spending the last few weeks trying things out in SuperCollider, but still (with only a week and a half left to go) haven’t found a good flow. I decided to set myself a constraint — skipping fancier synthesis techniques and only working with sine curves. Well, that’s the plan…
Glen Fraser (Canada) has always preferred “live coding” to dead coding. Although he’s programmed interactive graphics and sound for fun and profit for a quarter century, it’s always been from the relative safety of his home or office. This will be his first time doing it for an audience. In this performance, Glen will use SuperCollider to explore what he calls “Sines and Symbols”. He is currently a member of the Barcelona Laptop Orchestra and of the Wù:: Collective, where he develops technology for the performing arts.
The concert is also mentioned on Modisti (though I prefer my own English translation…)
We had our Phonos concert last Thursday (January 31), and in spite of not being 100% prepared (even after a big final-week push), I think it was a success. Probably 30-40 people attended in the “sala polivalent” at the UPF, and they were treated to a very complex setup and six very different pieces. The setup took most of the day; some of us were there from 10h to 22h, and the concert was at 19h30. Eight speakers in a ring around the audience, with eight tables and one or two performers seated at each table. In addition, eight channels of video, fed directly, or (for lack of hardware) in some cases reshot by cameras from external monitors to capture the output (video quality suffered in these cases). All those feeds were sent through video mixers to create two projections, each with a 2×2 grid of videos. In some pieces (e.g. Light Scratch), these showed the faces of the performers; in most other pieces they showed the contents of our screens (e.g. Six Pianos). Laptop music can be rough going if the audience has no sense of the interaction between what the people sitting behind the computers are doing, and what they’re hearing. Hopefully the video feeds helped a bit with that.
But really, it’s all about the music. Minimalist classics, in most cases, reworked to give them our own touch. We played our reinterpretations of six pieces:
In C (Terry Riley) — based on a Pure Data patch by our director Josep, I reworked this piece to run in SuperCollider. We played it as people came in, to create a mesmerizing ambience to open the concert. Audio is a simple synthesizer, to allow people to focus on the interesting phasing effects produced by each performer’s place in the score.
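For those curious about the mechanics, here’s a toy sketch of the In C idea in SuperCollider (the pattern fragments are placeholders, not Riley’s actual score, which has 53 patterns): each performer loops the current fragment and advances at their own pace, which is what produces the phasing.

```supercollider
// Toy version of the In C mechanic. The fragments below are placeholders;
// each performer loops one and moves to the next whenever they choose.
(
~patterns = [
    [0, 2, 0, 2],    // invented fragments for illustration only
    [0, 2, 4, 2],
    [2, 4, 5, 4]
];

~play = { |which|
    Pbind(
        \degree, Pseq(~patterns[which], inf),
        \dur, 0.25,
        \amp, 0.1
    ).play;
};

~player = ~play.(0);   // each performer calls ~play.(n), stopping the old
                       // stream and starting the next when they feel ready
)
```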
Rimandi (Ivano Morrone) — a piece that uses contact microphones stuck to the laptop, and makes ring-modulated noisy goodness from internal computer noises, fingers tapping, rubbing and scratching the laptop itself.
CliX ReduX (Ge Wang / BLO) — the original piece, which makes rhythmic clicks based on networked typing, was created in ChucK (Ge Wang, Princeton Laptop Ensemble), but I totally redid it in SuperCollider. Our version sounds similar when in “clix mode”, but beyond that it’s barely the same thing. Now you work with a longer buffer of characters, can change the rhythm, and can change the sound of the “clicks” to be sample-based. We created audio and video samples with fragments of “interesting things” scavenged from various sources (for ASCII characters which are not letters); for the letters of the alphabet, we use each performer’s voice (and face) making the letter’s sound. We created a video player, written in openFrameworks, that plays back “video samples” that complement the sound. You can get an idea of what this can look like (for one performer, at least) with the videos in a previous post. This latest version of the piece was very well received.
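To give a rough idea of the core mechanic (this is a drastically simplified sketch, not our actual code): a buffer of characters is played back as a rhythmic stream, with each character’s ASCII code mapped to the pitch of a short click.

```supercollider
// Much-simplified CliX-style sketch: each character in a text buffer
// triggers a short sine click whose frequency derives from its ASCII code.
(
SynthDef(\clik, { |freq = 1000|
    var sig = SinOsc.ar(freq, 0, 0.2)
        * EnvGen.kr(Env.perc(0.001, 0.05), doneAction: 2);
    Out.ar(0, sig ! 2);
}).add;

~chars = "to be, or not to be";
Routine({
    ~chars.do { |char|
        Synth(\clik, [\freq, char.ascii * 12 + 200]);  // arbitrary mapping
        0.125.wait;   // one click per eighth of a beat
    };
}).play(TempoClock.default);
)
```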
Light Scratch (BLO) — created by one of our members, Nadine Kroher, it uses the webcam to look for bright light spots, and does funky things with audio samples based on the user moving a light source, jamming their face up close to the camera, waving their hands, etc. It can be quite entertaining (or frightening) to see a macro view of Enric’s nostrils or Jan’s eye…
Variations II live (John Cage / William Brent) — this one is a live reworking of an installation piece, created for us by William Brent. It involves each player making a series of very simple sketches (each with six lines and five points), which are treated as mini-scores, sent to a central server, and used to create the audio of the piece. There is also visual feedback of the scores as they are played. William was located across the Atlantic in Washington, contributing sketches remotely, along with the rest of us. We also had him on a Skype connection, and placed him on a pedestal (literally) as the piece was being performed. This was the premiere for his piece.
Six Pianos cover (Steve Reich / BLO) — this is another favourite, as it is very obvious that we are actively doing something. Each player has a webcam pointing down at their workspace (playspace?), with a small light illuminating the space. An Open Frameworks application uses OpenCV and Gaussian classifiers to detect blobs of colour, with the colour indicating scale degree and size indicating octave (big = low, small = high). The playspace acts like a step sequencer, with time-steps along the horizontal, and the vertical axis used to control volume. It is called Six Pianos because that’s the piece that inspired it. In this concert, we performed an excerpt of Steve Reich’s piece, using this new visual instrument. Each player’s notes are sent to a SuperCollider program that is responsible for playing the synchronized audio. The instruments are high-quality sampled pianos, using NI’s Kontakt, output via a six-channel audio interface, and each output going to the speaker of its corresponding performer.
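For the technically inclined, here’s a hypothetical sketch of the openFrameworks-to-SuperCollider link (the OSC message name and argument order are illustrative, not our exact protocol): the vision app sends one OSC message per detected blob, and SuperCollider schedules each as a note at its step within the bar.

```supercollider
// Hypothetical OSC receiver: '/blob' and its argument order are invented
// for illustration. Each blob becomes a note: horizontal position -> step,
// colour -> scale degree, size -> octave, vertical position -> amplitude.
(
OSCdef(\blob, { |msg|
    var step = msg[1], degree = msg[2], octave = msg[3], amp = msg[4];
    // quantize to the start of the next 4-beat bar, then offset by step
    TempoClock.default.schedAbs(
        TempoClock.default.beats.roundUp(4) + (step * 0.25), {
            (degree: degree, octave: octave, amp: amp).play;
            nil
        }
    );
}, '/blob');
)
```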
Tonight we have another gig, at Niu (an art centre in the Poblenou neighbourhood of Barcelona). I’m not sure what kind of audience we’ll have, since our concert listing is included in such websites as ClubbingSpain and Le Cool. Ah, if only we truly were. (Cool, I mean.)
Tonight we’ll just perform a few pieces: CliX, Six Pianos and Light Scratch. Even though only a week has passed, two of these pieces have already evolved (software-wise or performance-wise). Tonight we only have two loudspeakers, and a much more intimate space, so we decided not to use pianos but rather six distinct instruments (to distinguish individual players a bit). Also, we’ll jam with these pieces for a bit longer, improvising as we go. We had a good rehearsal last night, where we tried this more “free form” Six Pianos. Take a look (note that the audio level is quite low, so best to listen amplified, or with headphones to get the full effect).
This project started from the recording of a clock. I recorded our kitchen clock, but unfortunately it ended up a bit noisier than I’d have liked. There’s some hiss in there (more obvious once you layer dozens of versions of it!), and there might be the odd muffled street noise. Okay, so I don’t have a silent recording studio. Texture, yeah, that’s what I call it. (-;
I took that single monophonic sample (about 23 seconds long) and then used it as a buffer in SuperCollider, making various drone-like synths, plenty of funky ticking patterns and some weird warping transitions and granular stuff. The opening seconds are pretty much the original sound, albeit layered several times.
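For anyone wanting to try something similar, here’s a rough SuperCollider sketch of the approach (the file path is hypothetical, and the parameters are just starting points): load the recording into a Buffer, then scrub through it granularly.

```supercollider
// Rough sketch: granular scrubbing through a single clock recording.
// The file path is hypothetical; substitute your own recording.
(
b = Buffer.read(s, "~/sounds/kitchen-clock.wav".standardizePath);

{
    var pos = LFSaw.kr(0.05).range(0, 1);        // slow sweep through the file
    GrainBuf.ar(2,
        trigger: Impulse.kr(20),                 // 20 grains per second
        dur: 0.2,
        sndbuf: b,
        rate: LFNoise1.kr(0.2).range(0.5, 1.0),  // gentle pitch warping
        pos: pos
    ) * 0.3
}.play;
)
```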
I wanted to go for a hypnotic, dream-like effect (yes, you are getting very, very sleeeeeepy) where I could move smoothly between different phases. In some, you’re aware of time being slowed down, in others it flies by, and at other moments you don’t even notice it’s there.
Instructions: This week’s project requires you to make a field recording to serve as the source audio. These are the steps:
Step 1: Locate a clock that has an audible, even if very quiet, tick to its second hand. A watch or other timepiece is also appropriate to the task.
Step 2: Record the sound of the clock for at least 30 seconds, and do so in a manner that captures the sound in the greatest detail. A contact mic is highly recommended.
Step 3: Adjust and otherwise filter the recording to reveal the various noises that make up its tick. The goal is to get at the nuance of its internal mechanism.
Step 4: Create an original piece of music employing only layered loops of that sound. These layered loops can individually be transformed in any manner you choose, but at least one unaltered version of the original recording should be included in your piece.
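If you want to attempt Step 4 in SuperCollider, here’s one possible sketch (the file path is hypothetical): several loops of the same recording layered at slightly different playback rates, plus one untouched copy, as the instructions require.

```supercollider
// One approach to Step 4: layered loops of a single recording.
// The file path is hypothetical; substitute your own clock recording.
(
b = Buffer.read(s, "~/sounds/clock.wav".standardizePath);

{
    // one unaltered copy, as the instructions require
    var unaltered = PlayBuf.ar(1, b, loop: 1) * 0.3;
    // eight detuned layers, each shifted by a semitone step around the original
    var layers = Mix.fill(8, { |i|
        PlayBuf.ar(1, b, rate: 2 ** ((i - 4) / 12), loop: 1) * 0.08
    });
    (unaltered + layers) ! 2
}.play;
)
```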
Have been working slavishly on several pieces for the Barcelona Laptop Orchestra. Among them, an optical-recognition piece that took Steve Reich’s Six Pianos as its starting point (or — more accurately — its ultimate goal, and we’re not quite there yet!), and a piece we call CliX ReduX, inspired by Ge Wang and the Princeton Laptop Orchestra’s original CliX.
We have our Phonos concert coming up next Thursday (January 31, 2013), at the Universitat Pompeu Fabra. Full details of the repertoire we’ll be performing are available on another page (only in Catalan, sorry). You can also read a description of the pieces, by director Josep Comajuncosas (also in Catalan) here.
First up, a sample of CliX ReduX, showing several enhancements, such as the audio and video snippets seen in this video. In the video, I just run through the alphabet a few times, giving a taste of how it looks and sounds (when in “Vox” mode). The audio and UI components are written in SuperCollider, the video sampler program in openFrameworks.
Next, I show the more “classic” version of CliX ReduX. This one has sound that’s more in keeping with Ge Wang’s original piece, but adds visual display of “flying letters”, and also the possibility of multiple syncopated character streams per player. The text here is from Hamlet’s famous soliloquy, and runs from: “To be, or not to be, that is the question” through to: “’Tis a consummation / Devoutly to be wished.” At first there is only one stream, so it’s relatively easy to follow the letters (if you know what to expect!), but after a few lines I put it into “syncopated” mode, where more than one letter can play simultaneously. It’s like a spelling bee on steroids…
Finally, here’s another example of the CliX ReduX piece (this one featuring another BLO member, Andrés, “speaking” the first part of the famous Hamlet soliloquy — “Whether ’tis nobler in the mind to suffer the slings and arrows of outrageous fortune, or to take arms against a sea of troubles, and by opposing end them.“):
I recorded myself dropping and shaking ice in a pint glass. Then I used this single sample (of about 7 seconds) to produce all the sounds (percussion, drones, semi-pitched) in this track. Produced entirely in SuperCollider.
Instructions: Please record the sound of an ice cube rattling in a glass, and make something of it.
Background: Longtime participants in, and observers of, the Disquiet Junto series will recognize this single sentence as the very first Disquiet Junto project, the same one that launched the series on the first Thursday of 2012. Revisiting it a year later provides a fitting way to begin the new year. A weekly project series can come to overemphasize novelty, and it’s helpful to revisit old projects as much as it is to engage with new ones. Also, by its very nature, the Disquiet Junto suggests itself as a fast pace: a four-day production window, a weekly habit. It’s beneficial to step back and see things from a longer perspective.
This fall I joined the Barcelona Laptop Orchestra, a technically-savvy musical ensemble founded by folks from the Sonology Group at ESMUC (l’Escola Superior de Música de Catalunya) and the Music Technology Group at UPF (Universitat Pompeu Fabra). They also allow a few of us non-affiliated “outsiders” to join, thankfully…
If you’re up for a bit of Catalan practice, you can read this great blog post/interview about us (essentially, trying to answer the question: “what is a laptop orchestra?”).
If you’re up for more Catalan practice (hey, you can never have enough), there was a TV report about our most recent performance (with wine pairings!) on November 10, at the Claustre Sant Francesc in Vilafranca del Penedès. This was for Vinfonies, a sonically-experimental wine festival. It had several major benefits (for us, at least):
try out new repertoire in a low-stress setting
drink some really good, performance-enhancing wine (for free)
You can watch the report from RTV Vilafranca here.
I am busily (and happily) coding away on the framework for some of our next pieces.
I think we all realize that “laptop orchestra” can be a confusing term…what does it mean? Does it have some dubious connotation (e.g. if the music is lively, might it lead to “laptop dancing”?). Thankfully, my friend Roger (who introduced me to the group) — a wonderful illustrator — has given us this clarifying comic: