The Mill crafted a compelling and engaging film for the launch of Sony’s groundbreaking Spatial Reality Display. A leap forward in 3D display technology offering a truly immersive, multi-dimensional viewing experience, this innovative tool gives artists, architects, product designers, and creators the freedom to see their creations in three dimensions while they craft them. With the power of Sony’s high-speed vision sensors and face-tracking technology, the monitor responds to the viewer’s movement in real time, creating a 3D imaging effect not possible on typical 2D screens and displays.
To demonstrate this technology, The Mill was tasked with creating a complex and captivating CG scene, powered by Unity’s game engine technology, that showcases the stunning power and visual fidelity of this unique new tool.
Mill Creative Director Andrew Proctor created an engaging scene featuring a futuristic metropolis and a flying vehicle for the display. Andrew comments, “We wanted to show how far creators could take this technology and explore the limits of Sony’s Spatial Reality Display. Working with Unity, we could craft a seamless scene for the audience to explore. The Sony Spatial Reality Display monitor is a huge leap forward for our industry. I’m excited to see how we can utilize the technology to expand our creative potential even further!”
3D Artists Jason Kim and Troy Barsness led the construction of the three-dimensional world with the flexibility of Unity’s real-time platform. Jason comments, “Using Unity, we were able to see how things were working in real-time as soon as the elements were ingested into the engine. For example, the team changed things such as materials of the hero assets, lighting direction, color schemes, and layout of the environment on the fly, and didn’t have to lean on render farms to see the final output. This was a highly efficient pipeline. Additionally, we had the amazing opportunity to work directly on Sony’s Spatial Reality Display, which was specifically built for Unity integration. As a result, it was possible to play our film directly in the engine and see the output on the Spatial Reality Display unit in real-time, with eye-tracking enabled, and without having to generate a build every iteration.”