Clint Sand: Electronic Polymath
You won't find Clint Sand where you last saw him. Constantly exploring the boundaries of electronic music, video and new media, Clint's musical style evolves in step with his varied working process and ever-changing choice of tools. With his latest effort, a dark trip into ambient and drone, Clint combined analog modular synthesis with customized Max for Live devices to forge his unique sound. We caught up with Clint to discuss Ableton Live and Max for Live, as well as his development of www.maxforlive.com, the largest online user library of Max for Live devices.
Clint, you've played and worked in many different bands and projects. Please tell us about yourself and your creative work.
I began my electronic music career creating dancefloor-inspired industrial music as cut.rate.box. After several successful CD releases and multiple US & European tours between 1999 and 2003, I formed Mono Chrome to experiment with melodic, pop-structured electronica. Today, as synnack, I'm focused on fusing IDM/experimental-inspired music with video art installations. I've done five releases under this name so far, and there are many more in the works.
The video work began in 2007 in collaboration with video artist Jennifer McClain to create a custom video backdrop for use in synnack live performances. In 2008 she became a full-time partner in what we call 0xf8 Studios, representing the combination of synnack's music and our multimedia video collaboration. This early collaboration has since morphed into an exciting new direction for synnack.
Can you tell us about the equipment you use and how you arrived at your present set up?
In the early days we were using mostly hardware synths & samplers, and only used a computer for sequencing. As software tools evolved and decent audio support stabilized in affordable DAWs, my studio contained less and less hardware. Today, the combination of Ableton Live, which lets you actually perform electronic music as a writing method, with analog modular synths is the perfect studio. My studio now is essentially just a MacBook Pro, Mackie monitors, a few USB controllers and a Doepfer A-100 P6 rack full of fun stuff.
When you decided to consolidate your studio set up, you chose Ableton Live and Max for Live as your only two software tools. What brought you to this decision?
For years I had been creating different ideas in different files, and when it came time to combine them into coherent songs, it was a nightmare of exporting and importing. The fact that, in Ableton Live, you can just drag and drop songs into each other, and even preview them from the browser, was totally revolutionary, and still is. Another component that is often overlooked is Operator. At first it seems like a basic, simple FM synth; however, once you really delve into it, that simplicity is actually its power. The final thing that really turned me to Ableton Live is the stability of the audio engine. The fact that you can drag and drop effects into a song while it's playing, and even reorder them with no audio dropouts, is insane. It's brilliant.
Many times I thought to myself that it would be amazing if there were a way to leverage the sequencing, global transport, and preset management of Ableton Live in Max/MSP. When Max for Live was announced I couldn't believe it. Not only could you do exactly that, you also get access to the Live API itself. I was sold immediately.
What role does Max for Live play in your music? What kind of things are you getting it to do that are not already in Live?
In my last release, v2.5, I used it as an LFO to modulate parameters of Operator instances that are otherwise static. I also created a Max for Live device that color-codes clips based on their play history. This facilitated a new type of workflow that ended up having a big impact on the musical result.
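Max for Live devices like this are built as visual patches rather than written code, but the underlying idea — an LFO sweeping a parameter that the instrument itself leaves static — can be sketched in a few lines. The following Python is purely an illustration of that logic, not Clint's actual device; all function names and parameter ranges here are hypothetical.

```python
import math

def lfo_value(t, rate_hz, depth, center):
    """Sine LFO: modulated parameter value at time t (seconds).
    center is the parameter's resting value, depth the swing around it."""
    return center + depth * math.sin(2 * math.pi * rate_hz * t)

def scale_to_range(value, lo, hi):
    """Clamp the LFO output into the parameter's legal range,
    as a device must before writing to a Live parameter."""
    return max(lo, min(hi, value))

# Hypothetical example: sweep a cutoff-like parameter between 200 and 2000 units
center, depth = 1100.0, 900.0
for step in range(4):
    t = step * 0.25  # sample the LFO every 250 ms
    v = scale_to_range(lfo_value(t, rate_hz=0.5, depth=depth, center=center),
                       200.0, 2000.0)
```

In an actual device, each sampled value would be written to a mapped parameter through the Live API instead of stored in a variable.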
Recently, I built a series of Max for Live devices that analyze and interpret audio into a series of numbers sent to a Jitter instance to control visual effects. We plan to use these to automate video creation for a collection of videos for the synnack v2.5 release, to create dynamic video projections for synnack live performances, and to drive art installations.
I have Max for Live devices that send information to Jitter, including raw frequency band data, BPM, global transport status, amplitude, length of time since play started, and so on. Jitter interprets this data to generate visuals based on what the audience is hearing. You can read all the details of the new video work that leverages Max for Live on my blog.
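The real devices are Max patches passing messages to Jitter, but the shape of the data they send is easy to picture. As an illustration only, here is a hypothetical Python sketch of the kind of per-frame feature bundle such a device might emit for a visuals engine; the field names and schema are assumptions, not the actual message format.

```python
import json
import time

def feature_frame(bands, bpm, playing, amplitude, started_at):
    """Bundle one frame of analysis data for a visuals engine
    (hypothetical schema mirroring the features named in the interview)."""
    return {
        "bands": bands,          # raw frequency-band magnitudes
        "bpm": bpm,              # current tempo
        "playing": playing,      # global transport status
        "amplitude": amplitude,  # overall level
        "elapsed": round(time.time() - started_at, 3),  # time since play started
    }

frame = feature_frame([0.1, 0.4, 0.9, 0.2], bpm=120.0, playing=True,
                      amplitude=0.63, started_at=time.time())
payload = json.dumps(frame)  # serialize for sending to the visuals process
```

The visuals side would decode each frame and map the numbers onto effect parameters, so what the audience sees tracks what it hears.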
How have Ableton Live and Max for Live played a part in your collaborative work? Have you found other collaborators through patch exchanges?
The moment Max for Live was announced, I started thinking about how open source development methodologies might be applied to Max for Live devices. I created the maxforlive.com site as an attempt to provide a resource for Max for Live users to build a community of sharing and collaborating. As of December 2010, there were hundreds of shared devices and over 10,000 users. I have gotten to know many brilliant Max for Live programmers who are passionate about the possibilities Ableton and Cycling '74 have given us. Maxforlive.com is not just about sharing and consuming; it's also about learning. Generally, if I'm thinking about how to build a device I want, I can find something similar in the site's library and see how someone else thought about the problem. That's a big jumpstart when building your own devices.
You've recently expanded into using modular analog synthesizers. How does Max for Live let you interface Live with these?
For my latest release, I spent a ton of time on sound design with the analog modulars, recording clip after clip of cool sounds. I ended up with hundreds of clips in one massive Live Set. To create the final tracks, I wrote a new Max for Live device that changes the color of a clip when it is played. This way I could easily ensure during the performance that no clip would get fired twice. Then I did a series of live recordings using an Akai APC40, firing clips and applying live, dub-inspired mixing and effecting techniques. These live performances in 0xf8 Studios became the finished tracks. You can read all the details here.
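In Max for Live this kind of device talks to the clips through the Live API, but the bookkeeping behind "never fire a clip twice" is simple on its own. Here is a hypothetical Python sketch of that logic — the class, the clip IDs, and the color constant are all invented for illustration, not taken from the actual device.

```python
PLAYED_COLOR = 0xFF0000  # hypothetical color value marking "already fired"

class ClipTracker:
    """Track which clips have been fired so no clip is used twice
    during a performance."""

    def __init__(self, clip_ids):
        # None means the clip has never been played
        self.colors = {cid: None for cid in clip_ids}

    def fire(self, clip_id):
        """Mark a clip as played; in Live, the device would also
        set the clip's color so the change is visible on screen."""
        self.colors[clip_id] = PLAYED_COLOR

    def unplayed(self):
        """Clips still safe to fire."""
        return [cid for cid, c in self.colors.items() if c is None]

tracker = ClipTracker(["kick-01", "drone-07", "noise-03"])
tracker.fire("drone-07")
remaining = tracker.unplayed()
```

The visual feedback is the point: because each fired clip is recolored, a glance at the Session View during the recording shows which material is still fresh.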
I'd like to thank Ableton for supporting my work. It's not often that you find a vendor so willing to support their user community.