As someone with many talents - director, producer, writer, animator & designer - how did you get into filmmaking?
I studied graphic design and began exploring animation and motion while in school, basically taking what I was learning about typography, color, and composition and making it move. We’d have these bookmaking projects in which we were supposed to create a perfectly bound book with typographic studies on linen paper, and I’d turn in an After Effects animation with some moving type and try to bullshit my way into showing that it was an appropriate substitute for the requested book. Sometimes it worked, sometimes it didn’t.
Wanting to do graphic design for a screen instead of a printed page ultimately led me into Interaction Design, and after graduating I took a job at Apple in the Human Interface Group, where I contributed to the UI and visual design of the iPhone, iPad and Mac OS X. At Apple, I learned from an incredibly talented team of designers about best practices for creating screen media, and how to think like a user when creating experiences.
At the same time, I was exploring how to incorporate this fresh knowledge into other areas of personal creative interest, namely animation, and my growing interest in filmmaking. Some of my animation & video sampling work caught the attention of a few DJs, Diplo among them, who hired me to create the first visuals for Major Lazer’s debut tour. I think I had a week to design and animate 45 minutes of footage, and then spent a sleepless weekend of rendering and re-rendering to get them out on time.
Director Ryan Staake
After creating this project and getting more interest from other DJs, bands and labels, I began to incorporate more live-action filmmaking methods into my animation, working with new collaborators to understand a world of filmmaking that was almost completely new to me.
In 2009, I became so excited by it that I left Apple to start Pomp&Clout, a Brooklyn-based production company/design studio, and have been doing that ever since. We’ve expanded over the years, and the current team includes myself as executive producer, creative director Aaron Vinton, lead VFX artist Pete Puskas, my brother Kevin Staake working in treatment development, and a growing director roster that includes Jay Buim, Anthony Sylvester and myself.
You’ve worked on music videos for a range of genres, how do you decide which artists you want to work with and does the genre of music alter your process?
Regardless of genre, I really enjoy working with artists who embrace new ideas and don’t feel like they have to be front and center in all of their visual output. I remember a quote from music video director turned feature film director Ruben Fleischer in which he basically said music videos should have a “central thing”.
I try to approach all of my music videos with this goal in mind, searching for some deceptively simple but interesting visual theme to exploit for the entirety of the video. Oftentimes, this is a visual effect, a new method of shooting or a camera technology. When I’m lucky and persistent, my process will reveal a visual theme that is interesting to look at on its own, doesn’t overstay its welcome before the end of the video, and expands on the ideas of the song in an unexpected way.
You’ve used different technologies when filming to create interesting visuals, how do you decide on what new or “unconventional” filmmaking tools to use and on which projects to use them?
Most of the time, I’ll find an exciting new technology I’m itching to work with and sit on it for a bit, waiting for the right project to use it on. Take Route 94: I’d always loved the look of night vision and thermal vision, and thought it would make an incredible look for a video, albeit a blurry, low-res one. It wasn’t until I started looking into the technology that I found the resolution and fidelity of the image had matured to beautifully sharp 1080p footage, with all these vibrant color palettes to choose from.
For the Booka Shade video, I’d seen stills of “little planet” panoramas, and did some basic research with my DP TS Pfeffer into how we might create a similar effect in motion. We quickly arrived at 360° GoPro rigs, then saw a need to mount them on a moving rig to achieve a sense of rotation in the little planets. We’d just created a Major Lazer video with a couple of drone shots, and decided to rely fully on this aerial technology to carry a 7x GoPro rig hundreds of feet above the ground.
We shot the video across almost all of California over the course of a week, aiming to get as wide a range of locations and natural environments as we could. Once back with the footage from the different GoPros and working on the stitching process, which maps the individual shots into a single panorama, we became aware of a new product called the Oculus Rift, which our footage was perfectly suited for. More on that later.
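For readers curious about the geometry behind that look: once the GoPro footage is stitched into an equirectangular panorama, the “little planet” effect is essentially an inverse stereographic projection of that panorama, with the ground at the center and the sky wrapping around the edges. Below is a minimal NumPy sketch of the idea; the function name, the zoom parameter `f`, and the nearest-neighbour sampling are illustrative choices of mine, not the production pipeline used for the video.

```python
import numpy as np

def little_planet(pano, size=512, f=0.35):
    """Map an equirectangular panorama (H x W [x C] array) to a square
    'little planet' image via an inverse stereographic projection.
    The nadir (ground) lands at the center; the sky wraps the edges.
    `f` is an illustrative zoom factor controlling how much sphere fits."""
    H, W = pano.shape[:2]
    # Output pixel grid, centered and normalised to [-1, 1]
    ys, xs = np.mgrid[0:size, 0:size]
    dx = (xs - size / 2) / (size / 2)
    dy = (ys - size / 2) / (size / 2)
    r = np.hypot(dx, dy)
    lon = np.arctan2(dy, dx)            # pixel angle becomes longitude
    theta = 2 * np.arctan2(r, 2 * f)    # colatitude measured from the nadir
    lat = theta - np.pi / 2             # -pi/2 (ground) .. +pi/2 (sky)
    # Sample the panorama (nearest neighbour; row 0 = zenith by convention)
    u = ((lon + np.pi) / (2 * np.pi) * (W - 1)).astype(int) % W
    v = np.clip(((np.pi / 2 - lat) / np.pi * (H - 1)).astype(int), 0, H - 1)
    return pano[v, u]

# Demo on a synthetic panorama whose brightness increases toward the ground row
pano = np.tile(np.arange(100)[:, None], (1, 200))
planet = little_planet(pano, size=64)   # center pixel samples the ground row
```

Real stitching software additionally blends exposure and parallax between the individual cameras, which is the “huge pain” part; the projection itself is the easy half.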
For Booka Shade 'Crossing Borders', how did your desire to use the “little planet” panoramas tie in with the artist, lyrics and central concept?
Again, I really like the look of these “little planet” panoramas and wanted to see them in motion. “Crossing Borders” lyrically speaks about getting over our differences, whether in relationships, between cultures, or whatever it might be. I liked the idea of showing the range of places in the world as a way of visualizing that concept, shrinking them to the point that they all appear like tiny playthings.
Was this your first project using a copter/drone for filming or 360 panoramas? What did you learn from the process that made it different from other projects?
It was our first time with 360° panoramas. I learned about the production realities of using these camera rigs, how the subjects need to be a certain distance from the lens, how the stitching is a huge pain in the ass (but critical), and how you occasionally need to hide behind objects to get out of your shot. Once we began converting our footage for the Oculus headset for virtual reality, I learned that the basics we take for granted in traditional filmmaking are all but erased: edits can be jarring, everything feels POV, the viewer can miss whole chunks of what you filmed if they’re looking the wrong way, and sound is extra critical.
In terms of virtual reality, my team and I have been moving more towards virtual reality that makes use of 3D scanning, motion capture and live-rendered 3D via Unity. We’ve basically come to the realization that 360° video has its place but is too limiting for virtual reality. It feels like you’re stuck in space or on a track, when one of the most satisfying elements of VR is seeing something in the distance and deciding to walk up to check it out.
We’re beginning to see more and more immersive content, what emerging technologies are you most excited about? What technologies are you planning to use for future projects?
Augmented reality is incredibly interesting. We’re doing a lot of R&D with live camera tracking and 3D scanning for mobile phones and tablets, exploring how we might add onto the world around us. Specific to the music industry, the idea of music that’s somewhat malleable and influenced by, or controlled by, the viewer or their world is really interesting.
What do you think the potential is for VR in the music industry or beyond?
This idea of having a 2D screen as a window into a digital space is going to quickly evaporate as the viewer is simply brought into the space. Computing and human-computer interaction have consistently followed a paradigm of simplifying and trimming the intermediaries between the user and the data, and this will continue. Personal computing arguably began at the 2D graphical user interface with the Altair and the Mac, allowing us to interact with tangible-looking representations of data via the intermediary of a mouse. Touch screens then cut out the keyboard and mouse and let you touch the data itself with your God-given fingers, while wearables let you wear the data that we touch. VR will basically bring you into a representation of a physical space depicting the data. Who knows exactly what that will be, but to me, that’s the plausible progression of this model of removing intermediaries between the user and the data.
What are you working on next that we should be on the lookout for?
More explorations of exciting fringe technologies, some bleeding edge, some not. For example, we just received a shiny new 3D scanner, as well as a rickety standard-def analog endoscope from some dental supply place on eBay. That excitement about everything from high to low tech, high to low brow, keeps our work evolving, and is at the core of Pomp&Clout’s approach.
Follow Pomp&Clout on Twitter and Facebook, and visit http://www.pompandclout.com/ for the latest updates on upcoming projects.