Ideation / Design / Animation / Colour / VFX / Editorial / Creative Technology / VR / AR
Craft a first-of-its-kind music video using cutting-edge creative technology and a virtual production pipeline to deliver a seamless relationship between the real world and the simulated.
Jake Schreier is known for highly choreographed storytelling that unfolds within a single take, previously seen in music videos such as Chance the Rapper’s ‘Same Drugs’ and Haim’s ‘Want You Back’. To facilitate this method within a fully CG environment, The Mill’s artists and technologists executed an ambitious real-time-rendered shooting technique.
The result is not only a visually stunning promo but an advanced application of real-time technology that pushes the boundaries of creativity, earning an enthusiastic response from the music and technology press. The video has received over 1.3 million views on Vevo.
The video was filmed on a motion-capture stage, where a cutting-edge virtual production pipeline delivered a seamless relationship between the real world and the simulated. A display screen rendering the CG character, landscape, props and structures in real time allowed the director, in effect, to direct the CG character rather than the live-action dancer as she moved through the stylized world, shot with a handheld camera. This let the filmmakers adjust the character’s actions, the lighting, and even environmental textures, all whilst still on set.
The character is a fictional feline Princess dancing to the song in a magic world. For the character development, our design team worked closely with Magnus Høiberg, aka Cashmere Cat, and Jake Schreier on some fascinating references they provided, ranging from Japanese anime to early ’00s games. Ultimately we wanted to craft a cat character that incorporated Nordic influences, as Magnus is Norwegian. Designers Sasha Vinogradova and Sidney Tan produced a couple of rounds of sketch explorations, and Magnus refined what he liked from what was presented. Sasha then sculpted the final character in ZBrush; the sculpt was later translated into Unreal Engine, which helped the director visualize the form and dimension of the character for approval.
“We came to The Mill with a strange idea of a beautiful virtual cat, and I had so much fun working with them to bring her to life”
The concept of the music video is to show the cat dancing alone in her fantasy world before pulling out to a final scene that reveals Magnus (Cashmere Cat) and the performer Margaret sharing a dance in real life. Jake Schreier’s work is all about impressive single takes and single camera moves, so it was crucial to ensure this technique remained possible for Jake even when we cut from the augmented world to live action. Our creative approach, therefore, was to shoot a motion-capture performance and implement a virtual production pipeline that enabled as much creative control on set as possible.
“When Magnus first described what he was looking for to me, he said he “wanted to disappear.” We had been playing a ton of Fortnite together (we are both terrible), so when The Mill proposed working in the Unreal Engine it was a natural fit.”
We knew an ambitious virtual production pipeline would be required to make the relationship between the real world and the simulated seamless for everyone on set. Because the real-time display let us adjust the character’s actions, the lighting, and even environmental textures whilst still on set, we could try out different iterations without lengthy waits for rendering. The process required a two-step approach. The first step was to get the general layout and as many assets as possible in place to establish the look; the environment was then lit, textured and elevated to something as close to finished as possible. Once everyone was happy, we would test the interactivity and refine it.
The process started with customizing the design of the virtual camera system Jake wanted to shoot the music video with, as well as creating a 3D model of the environment, the mocap stage and the practical rigging set pieces the dancer would interact with. The virtual camera system was connected wirelessly to a huge 24-foot 3D Live LED screen showing the real-time Unreal Engine render. The environment layout was carefully planned so that the ‘sunlight’ shining over the lake and lighting the cat emanated from the LED screen onto the set, creating an immersive ambiance close to the 3D scene that helped elevate the performance and inform Jake’s decisions.
“It’s great to see The Mill pushing virtual production in this way. Princess Catgirl represents a shift in music video production, and both Jake Schreier and Cashmere Cat have shown fearless creativity for this innovative result.”
“I’d like to imagine this equals the excitement of early film pioneers when they first produced movie magic! Our team moved virtual boulders, trees, and castles in the far-off sky, all in service of illuminating Jake's vision.”
Alongside development duo Hiro Miyoshi and Mars Wong, The Mill’s Creative Technology team dove headfirst into architecting an elegant technology flow leveraging the best-in-class tools available. Epic Games’ Unreal Engine served as our hub: cued by The Mill’s smart virtual timecode slate, we ‘live linked’ the talent’s performance from the Vicon motion-capture stage and displayed it on the 3D Live LED screen. Practical lighting included custom-built, scene-aware ARRI SkyPanels. There’s a moving moment when the two cats finally embrace, triggering a game event that transforms both the virtual and the real world from day to night, like magic.
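The actual day-to-night event was authored inside Unreal Engine, but the underlying idea of scene-aware practical lighting is simple: when the game event fires, the virtual sun is blended toward night, and the same intensity curve drives the dimmer on the real fixtures so the set transforms in lockstep with the screen. As a minimal sketch of that idea (function names and numeric ranges are ours for illustration, not The Mill's implementation):

```python
def sun_state(t, duration=5.0):
    """Blend the virtual sun from day to night over `duration` seconds
    after the embrace event fires at t = 0."""
    a = min(max(t / duration, 0.0), 1.0)   # 0.0 = full day, 1.0 = full night
    pitch = -45.0 + a * 90.0               # sun dips below the horizon
    intensity = (1.0 - a) * 10.0           # in-engine light intensity
    return pitch, intensity

def skypanel_dmx(intensity, max_intensity=10.0):
    """Mirror the virtual sun on a practical fixture: map the engine
    intensity onto an 8-bit DMX dimmer value for a panel light."""
    return round(255 * intensity / max_intensity)
```

Driving the practicals from the same curve as the render is what keeps the reflected light on the dancer consistent with what the LED wall is showing.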
Given that this project is a music video, timing was everything. The dancer’s movements had to match the beats of the music; every step had to be synced up perfectly. On the virtual pipeline side, one area we had to troubleshoot was cloth and fabric: we were at the mercy of how the engine resolved the simulation, which changed with each iteration, so some trial and error was required to reach a point where the engine replicated takes consistently, pushing it to its limits.
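Checking that a captured step actually lands on the music comes down to comparing frame timestamps against a beat grid. A minimal sketch of that bookkeeping, assuming a fixed-tempo track (the function names and the fixed-BPM assumption are ours, not a description of The Mill's toolchain):

```python
def beat_times(bpm, n_beats):
    """Timestamps (in seconds) of each beat for a fixed-tempo track."""
    period = 60.0 / bpm
    return [i * period for i in range(n_beats)]

def sync_error(frame, fps, bpm):
    """How far (in seconds) a captured frame lands from the nearest beat.
    A large value flags a take where a step has drifted off the music."""
    t = frame / fps
    period = 60.0 / bpm
    nearest_beat = round(t / period) * period
    return abs(t - nearest_beat)
```

At 24 fps and 120 BPM, for example, a beat falls exactly every 12 frames, so any step landing off that grid shows up immediately as a nonzero error.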
This first-of-its-kind music video is a study in what’s possible through creative exploration on set, leveraging motion capture, real-time interactive visualization and interactive set extension. It is a fun and refreshing pursuit of production efficiency, empowered by technology in service of the creative, and a signal of a larger paradigm shift fueled by virtual production. What is captured on set is near-final; the last stage was cleanup plus minor additions such as further character modeling iterations and volumetric mist and fog.
“I’m so thankful for The Mill; we couldn’t have seen our vision through without them. We came to them with a strange idea of a beautiful virtual cat, and I had so much fun working with them to bring her to life. I would also like to give a special shout-out to Millton, the cat who lives in the parking lot of The Mill studio. I love him.”