Stress. It's literally killing us. Which is why a mobile game like "Breath of Light" is so rejuvenating. It's part zen rock garden, part puzzler, and part ambient music instrument. The game's audio producer, Matt Ridgway, from Melbourne, Australia, is better known for his cool, instrumental guitar act Winterpark, but the longtime creator has broken into mobile games in a big way in 2015. The critically acclaimed casual game was named one of CNET's Best Games of March 2015, due in no small part to its holistic audio approach. Inspired by a low-tech meditation aid, the Buddha Machine, Ridgway executed the totality of the game's user interface effects and soundtrack in Live and Push. And it went pretty stress-free.
Composing for Games: Matt Ridgway and "Breath of Light"
Are you a big gamer or mobile gamer?
I'm not really a big gamer at all. In terms of the iOS games, I don't use a lot of them, but what I do use is music creation apps. That's my experience with using iOS: Sampler, Animoog, Filtatron, Bloom. That's how I wanted to approach this: sort of make a music creation tool as well as making a game.
This game is an instrument!
Pretty early on I thought that would be a good way to sort of create the soundscape: using the interface sounds to contribute to the tonal underscore of what's happening.
What was the brief from game makers Many Monkeys?
That they're making this meditative game where they wanted an atmospheric soundtrack. I thought, "Oh, that's obviously the sort of music that I make anyway." I was given four seasons of music, each meant to have a different tonal landscape (summer, autumn, winter, spring) corresponding to different stages of the game, with different color schemes within the game. I proposed the idea that the user interface sounds become an interactive part of that soundtrack and contribute to the tonal underscore of the sound.
Did you get inspired by any ambient, yoga, New Age, or meditation albums?
Not particularly. I received one of those Buddha Machines for Christmas: it's a little plastic box, and you turn it on and it just plays a loop over and over again. You can pitch it up and down, and there are maybe 10 or so of these different loops. I thought, "Oh, wow, that's sort of the sound of this." It has to be droney, but drones that don't just make one note; they're harmonically evolving.
What made you turn to Live to execute the idea?
I always just knew I was going to do it in Ableton Live. It's just the tool that I always use, and for sound creation it's killer. I'm most familiar with it and I've been using it for 15 years now. I knew I could use Drum Racks to demo the user interface.
I came to the meeting with my Push and a Midi Fighter 3D and said, "OK, here's the background underscore idea, and then here's a Drum Rack." Then I told them, "When you touch the object in the game, I want you to also press one of these buttons on the Midi Fighter 3D to see what it does and if it works." They were like, "Oh my god, that's pretty cool." And I think that sort of sold it for them.
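The demo rig is easy to picture in code: each pad on the controller sends a MIDI note, and each note lands on one Drum Rack pad holding a UI sound. Here is a minimal Python sketch of that idea using the mido library; the port name and the note-to-pad mapping are assumptions, not details from the interview.

```python
import mido

# Assumed port name; check the real one with mido.get_input_names().
with mido.open_input("Midi Fighter 3D") as pads:
    for msg in pads:  # blocks, yielding incoming MIDI messages
        # Ableton's Drum Rack maps notes to pads starting at C1 (note 36),
        # so every button press would fire one user-interface sample.
        if msg.type == "note_on" and msg.velocity > 0:
            print(f"note {msg.note} -> Drum Rack pad {msg.note - 36}")
```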
I'm struck by the similarity between actuating sound loops in games and in the Live environment.
I had an early version of the game and I could watch it and play music along to it and get inspired by the look. I initially went through my library of stuff and through a variety of different sample Packs, one of them being Spectral Textures. Unsurprisingly, that Pack has a bunch of sounds that made it into the user interface sounds. There was one patch called Borealis that became the level-end sound. It was just like, "Aww, man, this sounds perfect for this blossoming end sound."
How did Live output to the game's engine? Was that a challenge?
For each level, I created a scene in Session View, and I had 100 scenes of different combinations of these loops for each level of the game. We created a Dropbox folder and a Google Sheets spreadsheet for each level. A Google Sheets spreadsheet looks remarkably like the Ableton Live Session View window, so I copied what was in each of those scenes into the spreadsheet, and [Many Monkeys] wrote a script for Unity [the game engine] to link to the Google spreadsheet and pull all the info across.
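The actual Unity-side importer was a script Many Monkeys wrote, and its details aren't described here. As a rough sketch of the pipeline's shape, here is what reading a published Google Sheet as CSV and rebuilding the scene-per-row structure could look like in Python; the URL, column names, and sheet layout are all hypothetical.

```python
import csv
import io
import urllib.request

# Hypothetical CSV export of a level's sheet ("File > Publish to the web").
SHEET_CSV_URL = "https://docs.google.com/spreadsheets/d/<sheet-id>/export?format=csv"

def fetch_rows(url: str) -> list[dict]:
    """Download the sheet and parse each row into a dict keyed by header."""
    with urllib.request.urlopen(url) as response:
        text = response.read().decode("utf-8")
    return list(csv.DictReader(io.StringIO(text)))

def scenes(rows: list[dict]) -> list[list[str]]:
    """One sheet row mirrors one Session View scene: the loop names in its
    track columns are the clips that launch together for that scene."""
    return [[value for key, value in row.items()
             if key.startswith("track_") and value]
            for row in rows]

# Usage sketch: the game engine would read this once per level load.
# for scene in scenes(fetch_rows(SHEET_CSV_URL)):
#     print(scene)
```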
How did you incorporate randomization in this project and why?
Particularly in user interface sounds, I didn't want it to just be the same sound every time. Instead it will randomly choose one of five sounds that are tonally the same instrument, just different notes or chords that match the underscore. The user interface and underscore loops are slightly different bar lengths; they don't all loop at 16 bars or 8 bars. Some of them loop at 37 bars, some at 5 bars, so if someone is playing the game for a long time it should tonally evolve as well.
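Both tricks are simple to model. In the sketch below (Python, with hypothetical file names), each touch picks uniformly from a pool of five tonally matched variants, and mismatched loop lengths mean the layered texture only repeats exactly at the least common multiple of the lengths: loops of 37, 5, and 16 bars, for instance, realign only every 2,960 bars.

```python
import math
import random

# Hypothetical pool of five tonally matched variants for one UI interaction.
TOUCH_SOUNDS = [
    "touch_variant_1.wav", "touch_variant_2.wav", "touch_variant_3.wav",
    "touch_variant_4.wav", "touch_variant_5.wav",
]

def play_touch_sound() -> str:
    """Every touch draws one of the five variants at random, so the same
    gesture yields different notes or chords that still fit the underscore."""
    return random.choice(TOUCH_SOUNDS)

# Mismatched loop lengths: the combined texture repeats exactly only when
# all loops restart on the same bar, i.e. at their least common multiple.
loop_lengths_in_bars = [37, 5, 16]
print(math.lcm(*loop_lengths_in_bars))  # 2960 bars before an exact repeat
```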
How did you approach percussion in this project? It seems to function like a "drop" in house music, signifying progress and release.
I wanted each season to be played like a track. There were 25 scenes of music per season. Every scene is a different part of the song. My world is not gaming, my world is music, and so I wanted it to be a musical composition. You evolve through the seasons and you evolve through a musical composition.
Check out the great free Ableton Live Racks and production tutorials on Matt Ridgway's Winterpark music website.