Stress. It’s literally killing us. Which is why a mobile game like ‘Breath of Light’ is so rejuvenating. It’s part zen rock garden, part puzzler, and part ambient music instrument. The game’s audio producer, Matt Ridgway, from Melbourne, Australia, is better known for his cool, instrumental guitar act Winterpark, but the longtime creator broke into mobile games in a big way in 2015. The critically acclaimed casual game was named one of CNET’s Best Games of March 2015, due in no small part to its holistic audio approach. Inspired by a low-tech meditation aid, the Buddha Machine, Ridgway created all of the game’s user interface effects and soundtrack in Live and Push. And it went pretty stress-free.
Composing for Games: Matt Ridgway and ‘Breath of Light’
Are you a big gamer or mobile gamer?
I’m not really a big gamer at all. In terms of iOS games, I don’t use a lot of them, but what I do use is music creation apps. That’s my experience with iOS: Sampler, Animoog, Filtatron, Bloom. That’s how I wanted to approach this: sort of make a music creation tool as well as a game.
This game is an instrument!
Pretty early on I thought that would be a good way to sort of create the soundscape – using interfaces to contribute to the tonal underscore of what’s happening.
What was the brief from game makers Many Monkeys?
That they were making this meditative game and wanted an atmospheric soundtrack. I thought, ‘Oh, that’s obviously the sort of music that I make anyway.’ The brief called for four seasons of music, each meant to have a different tonal landscape: summer, autumn, winter, spring, corresponding to different stages of the game, each with its own color scheme. I proposed the idea that the user interface sounds become an interactive part of that soundtrack and contribute to the tonal underscore.
Did you get inspired by any ambient, yoga, New Age, or meditation albums?
Not particularly. I received one of those Buddha Machines for Christmas. It’s a little plastic box; you turn it on and it just plays a loop over and over again. You can pitch it up and down, and there are maybe 10 or so of these different loops. I thought, ‘Oh, wow, that’s sort of the sound of this.’ It has to be droney, but with drones that don’t just hold one note; they’re harmonically evolving.
What made you turn to Live to execute the idea?
I always just knew I was going to do it in Ableton Live. It’s just the tool that I always use, and for sound creation – it’s killer. I’m most familiar with it and I’ve been using it for 15 years now. I knew I could use Drum Racks to demo the user interface.
I came to the meeting with my Push and a Midi Fighter 3D and said, ‘OK here’s the background underscore idea, and then here’s a Drum Rack.’ Then I told them, ‘When you touch the object in the game, I want you to also press one of these buttons on the Midi Fighter 3D to see what it does and if it works.’ They were like, ‘Oh my god, that’s pretty cool.’ And I think that sort of sold it for them.
I’m struck by the similarity between actuating sound loops in games and in the Live environment.
I had an early version of the game, so I could watch it, play music along to it and get inspired by the look. I initially went through my library of stuff and through a variety of different sample Packs, one of them being Spectral Textures. Sure enough, that Pack has a bunch of sounds that made it into the user interface sounds. There was one patch called Borealis that became the level-end sound. It was just like, ‘Aww, man, this sounds perfect for this blossoming end sound.’
How did Live output to the game’s engine? Was that a challenge?
For each level, I created a scene in Session View; I had 100 scenes of different combinations of these loops, one for each level of the game. We created a Dropbox folder and a Google Sheets spreadsheet for each level. A Google Sheets spreadsheet looks remarkably like the Ableton Live Session View window, so I copied what was in each of those scenes into the spreadsheet, and [Many Monkeys] wrote a script for Unity [the game engine] to link to the Google spreadsheet and pull all the info across.
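To make the hand-off concrete, here is a minimal sketch of that spreadsheet-to-engine step. This is not Many Monkeys’ actual script (theirs ran inside Unity); it’s a Python illustration that assumes the sheet is published as CSV and that each row holds a level name followed by the clip names in that scene. The URL and column layout are hypothetical.

```python
import csv
import urllib.request

# Hypothetical published-CSV URL for the cue spreadsheet.
LEVELS_CSV_URL = "https://docs.google.com/spreadsheets/d/<sheet-id>/export?format=csv"

def load_level_scenes(url=LEVELS_CSV_URL):
    """Map each level name to the list of loop/clip names in its scene."""
    with urllib.request.urlopen(url) as response:
        rows = csv.reader(response.read().decode("utf-8").splitlines())
        scenes = {}
        for row in rows:
            if not row:
                continue  # skip blank spreadsheet rows
            level, loops = row[0], [cell for cell in row[1:] if cell]
            scenes[level] = loops
    return scenes
```

In the actual pipeline this lookup happened on the Unity side, with the engine triggering whichever combination of loops the spreadsheet listed for the current level.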
How did you incorporate randomization in this project and why?
Particularly with the user interface sounds, I didn’t want it to just be the same sound every time. Instead it will randomly choose one of five sounds that are tonally the same instrument, just different notes or chords which match the underscore. The user interface and underscore loops are also slightly different bar lengths; they don’t all loop at 16 bars or 8 bars. Some of them loop at 37 bars, some at 5 bars, so if someone is playing the game for a long time it should tonally evolve as well.
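A rough sketch of that variation idea, with hypothetical file names rather than the game’s actual assets: each interaction picks one of five tonally matched variants, and because the loops have mismatched bar lengths their combination only realigns after the least common multiple of those lengths, so long sessions keep drifting into new tonal combinations.

```python
import random
from math import lcm

# Hypothetical variant files: the same instrument playing different notes
# or chords that all fit the current season's underscore.
TAP_VARIANTS = [
    "ui_tap_1.wav",
    "ui_tap_2.wav",
    "ui_tap_3.wav",
    "ui_tap_4.wav",
    "ui_tap_5.wav",
]

def pick_tap_sound():
    """Choose a random variant so repeated taps never sound identical."""
    return random.choice(TAP_VARIANTS)

# Loops of 5 and 37 bars only line up again every lcm(5, 37) = 185 bars,
# so the underscore keeps evolving over a long play session.
print(lcm(5, 37))  # 185
```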
How did you approach percussion in this project? It seems to function like a “drop” in house music, signifying progress and release.
I wanted each season to be played like a track. There were 25 scenes per season of music. Every scene is a different part of the song. My world is not gaming, my world is music and so I wanted it to be a musical composition. You evolve through the seasons and you evolve through a musical composition.
Check out the great free Ableton Live Racks and production tutorials on Matt Ridgway’s Winterpark music website.