There’s nothing more personal than creative expression. And so experimenting with how you make music is more than just novelty: it’s a way to understand the fundamentals of how we relate to machines. And thinking outside the normal avenues means the ability to reach new people, as SoundLab is doing with audiences with learning disabilities. Ashley Elsdon joins us to give us the latest on how the project is going.
A little while ago, CDM kindly posted a piece on our SoundLab project, which aims to help people with learning disabilities make music and collaborate in music creation. That was at the beginning of the project’s 12-month lifespan. Almost five months in, we have learnt a lot and made more progress than I could have hoped for. However, in doing that, we’ve raised both awareness of the project and expectations of it in a big way.
So what have we been up to? The short answer is quite a lot. But here are the highlights since you last heard from us …
NIME: SoundLab were involved in the NIME hack day, where we met some amazing people doing great work in music technology and accessibility. It was a crazy day of playing with new technologies and making connections with some great people, too.
The Liberty Festival: SoundLab were invited to run an installation at the London Mayor’s Liberty Festival at the end of July. It was already a busy time for us, but we wanted to be a part of it, and it was really worth it. We decided to use a setup which had Ableton at the center. We used Live to set the clock for all our devices and act as the backbone of the whole installation. Here’s what we used:
- Ableton Live was running drum loops controlled from an iPad via MIRA. The MIRA patch also controlled the pulse sent to our two [Moog] Thereminis so that they would pulse in time to the drum loops. We used a Max for Live patch to send a volume ramp to the Thereminis to achieve this.
- We had two of our three Moog Thereminis at the event. These we set to the same scale and key, pulsed as I mentioned above. They were incredibly popular, but more on that in a bit.
- We had another iPad running Thumbjam standalone. Thumbjam is such an accessible instrument that it also proved to be incredibly popular.
- Lastly, we had a late addition to the setup, in the form of an AlphaSphere.
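To make the pulsing idea above a little more concrete, here is a minimal sketch of a clock-synced volume ramp. This is an illustration only, not the actual Max for Live patch we used: the function names, the choice of MIDI CC 7 for volume, and the suggested use of the mido library are all assumptions for the sake of the example.

```python
import time

def volume_ramp(steps=16, peak=127):
    """Triangle of MIDI CC values: rise from 0 to roughly `peak` and back."""
    return [round(peak * (1 - abs(2 * i / (steps - 1) - 1))) for i in range(steps)]

def pulse_once(send_cc, bpm=120, steps=16):
    """Send one beat's worth of volume values via the `send_cc` callback,
    spaced evenly across a single beat at `bpm`."""
    delay = (60.0 / bpm) / steps
    for value in volume_ramp(steps):
        # With mido this might be:
        # port.send(mido.Message('control_change', control=7, value=value))
        send_cc(value)
        time.sleep(delay)
```

Looping something like `pulse_once` against the host tempo gives the effect our patch produced: the Theremini’s volume swells and fades once per beat, so its drone sounds in time with the drum loops running in Live.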
How did it go and what did we learn?
Well, it was a huge success. We were due to be at the Liberty Festival for six hours and in that time we probably had about two or three minutes where we weren’t busy, which was unexpected in itself.
The Thereminis were a massive hit with everyone who came to our marquee. Their advantage was that people didn’t see them as a traditional instrument, so they didn’t feel as though they were doing something alien. People who had never made music before found them a joy to play and satisfying to work with, as they could learn very rapidly how to control pitch and volume through fluid movements.
The other huge hit was the AlphaSphere, which was constantly occupied throughout the day. I think its tactile nature and feedback make you think of it as a playful device rather than a musical instrument, so people weren’t put off by it at all. The amazing look of the device alone was a pull for anyone who saw it. Of course, we barely scratched the surface of what it can do, but for our purposes it worked like a dream, and we were so pleased that nu desine brought it along on the day on the off chance that it might be handy.
Of course, the backbone of the whole setup was Ableton, which ran the show — although no one knew that at the time.
The Beautiful Octopus Club: This was our next event and certainly our most ambitious to date. This was the second time that SoundLab had been at the Beautiful Octopus Club and this year we decided to be three times bigger than before. We had three sites inside the Royal Festival Hall; they were:
The Cage: The name sounds awful! But it was where we had two really popular technologies. On one side, we had Native Instruments’ Maschine Mikro. We had three of these, all set up to run the same project. Maschine is a complex technology, and we had to do a lot of work to make it accessible enough for anyone to walk up and just play, while still giving a satisfying experience in even just a few minutes. But it worked, and it worked well.
The Digital Band: This was basically the same setup as for the Liberty Festival but with the addition of another AlphaSphere, which just made it an even more enticing experience. This space was really busy throughout the night.
The Workshop Space: The last, but by no means the least, of the spaces was our workshop space, where we had Dentaku with their Ototo board and Mogees with their amazing app and lots of interesting objects to play with. We were really lucky to have both of these running so well together, and to have the amazing Jo Hinchcliffe running the space. The response to both of these technologies was tremendous.
And if that wasn’t enough, we were also at Music Tech Fest:
For Music Tech Fest, we were lucky enough to be included in their hackathon, setting one of the challenges over the same weekend. I wasn’t able to be at all of MTF this year, but what I did see was truly inspiring, and the responses to our hack challenge were amazing. We were also very happy that FXpansion provided the prize for our challenge.
So, overall, it’s been a busy time for SoundLab. We’ve achieved so much in a really short space of time thanks to the amazing help of the music tech community. Our next challenge is to start to put our research into a code framework to help users and developers connect music technologies together in ways that make sense for them and that enable their creativity. More of that another time, though.
What have we learned?
We’ve learnt a great deal about getting people to engage in making music, and about the barriers that stop people from getting involved.
We used our Thereminis’ sync with Live to get people making music without touching an instrument, as having to touch one is a real barrier to starting.
We also had the whole rig running before people came into contact with it, so that they didn’t feel they had to start making music; it was already happening, which removed another barrier to making music.
In more pure research terms, we’ve learnt about how different technologies can work together – things like CV, gestural and motion-based, and tangible interfaces – and have started looking at how these will be brought together into our eventual code framework.
So we’re halfway through, and that’s an interesting point. We have also learnt a lot about how we run our own workshops and how to make sure that we get the right outcomes from them.
Video of Dentaku / Mogees: