Controlling Live with Oculus Rift’s Virtual Reality
Virtual reality (VR) has, for many years, been a medium suggestive of a far-off future and technologies more likely found in science fiction than in real life. But the past two years have seen the possibilities of VR back in the spotlight, driven in no small part by Oculus VR's Rift headset, a head-mounted display made available to developers interested in exploring the potential of the medium. In 2015, VR is showing early signs of becoming a serious tool for music production.
We recently spotted New Zealander Byron Mallett’s efforts building Pensato, a VR project geared towards hands-free music composition. Using Live as the driving engine, Pensato allows Mallett to pluck parameters out of thin air and rub macros between his fingers. Is this the way of the future? We spoke to the man himself to find out.
What was the initial catalyst for working with VR? Do you have a background in this sort of technology? And more importantly, did you have any specific goals in mind before you set out?
Before seeing the initial Kickstarter campaign for the Oculus Rift, I had always been interested in virtual reality but considered the possibility of designing for it something of a pipe dream. When I finally received my DK1 Rift in July 2013, I quickly started trying to hack it into one of my favorite games, which was a good starting point for figuring out how VR worked. I had just begun research for my Master of Design Innovation thesis at Victoria University of Wellington at that point, and VR was becoming a tantalizing avenue for research. The goal from this stage on was to see if it would be possible to create an interface that could act as a virtual hub for controlling an entire live music performance without having to touch any hardware at all.
I learned a lot about what worked and what didn't when designing Pensato to work in three dimensions. One of the key points was that the interface had to react in ways that would make sense in the real world, so I ended up hacking the Hydra controllers into a pair of gloves that would track my hand position and gestures. This allowed me to design the interface in such a way that I could physically "touch" its components.
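The interview doesn't show Pensato's source, but the core of a hand-tracked "touch" interaction like the one described is typically a per-frame distance test between a tracked fingertip and a virtual control, with edge detection so a held finger doesn't retrigger. The following Python sketch is hypothetical; the `TOUCH_RADIUS` threshold, coordinate conventions, and `VirtualButton` class are assumptions for illustration, not Pensato's actual code.

```python
import math

# Hypothetical sketch of a hand-tracked "touch" test (not Pensato's code).
# Assumes the glove-mounted tracker reports a fingertip position as an
# (x, y, z) tuple in the same coordinate space as the virtual controls.

TOUCH_RADIUS = 0.03  # metres; how close a fingertip must be to "press" a control


def distance(a, b):
    """Euclidean distance between two 3D points."""
    return math.sqrt(sum((ai - bi) ** 2 for ai, bi in zip(a, b)))


class VirtualButton:
    def __init__(self, position, on_press):
        self.position = position  # (x, y, z) centre of the control
        self.on_press = on_press  # callback fired when the control is touched
        self.pressed = False

    def update(self, fingertip):
        """Call once per frame with the tracked index-fingertip position."""
        touching = distance(fingertip, self.position) < TOUCH_RADIUS
        if touching and not self.pressed:
            self.on_press()  # rising edge: the finger just entered the control
        self.pressed = touching
```

Triggering only on the rising edge is what makes a virtual control "react in ways that make sense in the real world": a finger resting on a button presses it once, not sixty times a second.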
Can you explain what sort of relationship VR facilitates with Live? What are some of the possibilities that this interface offers that aren't achievable by other means?
In order to interact with Ableton Live, Pensato functions like a hardware music controller. This allows me to capture the entire Session View of Live, including all the tracks, clips and device parameters that can be controlled as part of a performance, and turn them into virtual controls.
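The interview doesn't specify the wire protocol, but an external application posing as a control surface for Live is commonly built over OSC (via a LiveOSC-style remote script) or MIDI. Here is a minimal, hypothetical sketch of the sending side using the python-osc package; the port and OSC address patterns are illustrative placeholders, not a documented API, and depend entirely on the bridge running inside Live.

```python
from pythonosc.udp_client import SimpleUDPClient

# Hypothetical sketch: a VR front-end talking to Live through an OSC bridge.
# Host, port and the address patterns below are illustrative assumptions;
# the real addresses depend on the remote script acting as the bridge.
client = SimpleUDPClient("127.0.0.1", 9000)


def fire_clip(track_index, clip_index):
    """Launch a Session View clip from the virtual interface."""
    client.send_message("/live/clip/fire", [track_index, clip_index])


def set_device_parameter(track_index, device_index, param_index, value):
    """Set a device parameter, with value normalised to 0.0-1.0."""
    client.send_message(
        "/live/device/param",
        [track_index, device_index, param_index, float(value)],
    )
```

The same channel can run in reverse: the bridge reports the Session View layout (tracks, clips, device parameters) back to the front-end, which is what lets Pensato rebuild them as virtual controls.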
After I pull the Session View layout out of Live and into Pensato, I can interact with tracks, clips and parameters as if they were physical objects. When wearing the Rift and using my gloves, I can pick up a track (represented by an icosahedron) using a grasping motion and place it into a working area hovering in front of me. By pressing my index finger to the object, I can display all of the available device parameters and clips associated with the track. The displayed controls can be operated by using my index finger to press buttons, or by sliding my finger across the surface of a slider to adjust a device parameter. A two-finger gesture (like scrolling on a touchpad) can be used to scroll through hidden controls and bring them within reach.
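To make the slider gesture concrete: sliding a fingertip along a virtual slider amounts to projecting the tracked finger position onto the slider's axis and clamping the result to a 0.0-1.0 range. This is a generic sketch under those assumptions, not Mallett's implementation; the function name and coordinate conventions are hypothetical.

```python
def slider_value(fingertip, slider_start, slider_end):
    """Project a fingertip onto a slider's axis; return a value in 0.0-1.0.

    All arguments are (x, y, z) tuples in world space; slider_start and
    slider_end are the two ends of the virtual slider's track.
    """
    axis = tuple(e - s for s, e in zip(slider_start, slider_end))
    rel = tuple(f - s for s, f in zip(slider_start, fingertip))
    length_sq = sum(a * a for a in axis)
    if length_sq == 0.0:
        return 0.0
    # Scalar projection of the finger onto the axis, clamped to the track.
    t = sum(r * a for r, a in zip(rel, axis)) / length_sq
    return max(0.0, min(1.0, t))
```

The clamped value can then be forwarded to Live each frame through whatever bridge is in use, for example via a call like the hypothetical `set_device_parameter()` sketched above.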
What's next in developing the interface? Are there new features you're working on implementing? Do you see this sort of interface moving into common usage in the future?
Since putting up the demo video of Pensato, I've mainly been concentrating on finishing the written component of this project for my thesis, though I have some future plans for improving the interface, on both the software and hardware sides.
On the software side, I'd love to focus more on reactive visual elements when controlling music. Rather than being just a fancy hardware controller, I'd like to look further into the relationship between music and sound-reactive visuals, and see how a VR environment can enhance both. I also want to look into making the interface more collaborative in the future. I'd love to see multiple people performing in the same shared virtual space, throwing and catching tracks and clips to and from each other.
As for this type of interface becoming more common in the future? Well, to be honest, most of Pensato was designed around my own "what ifs" about how a VR musical interface could work. I'm hoping some of these ideas might prove useful, or encourage others to create their own VR music interfaces. I think there's definitely a future for a VR approach to musical performance, from both a performer's and an audience's perspective.
We’re always interested in how Live is being used to push the boundaries of technology; keep us posted on what you’re using Live for by hashtagging #MadeWithLive.
Follow Byron Mallett on Twitter.