3D artists can still use familiar tools such as Maya, Max, ZBrush, and Substance for content creation, and other 3D content such as models, textures, and character rigs can all be integrated into game engine software. Unity and similar engines give the artist real-time visual feedback. This frees up more time to actually be an artist by removing costly, time-consuming tasks: framing cameras and adjusting lighting scenarios now happens in real time with instant visual feedback, while calculations such as shadows, reflections, and camera depth of field, central to traditional rendering workflows, become essentially obsolete. Unity provides us an opportunity to explore new dimensions of design in real time.
It’s often been said that we’re at the dawn of a merging of entertainment mediums, spanning advertising, movies, games, user-based experiences, and much more. Programs like Epic’s Unreal Fellowship are geared toward merging these mediums. It’s clear that real-time tools will continue to improve and advance, and I believe 3D artists can only benefit from learning real-time techniques and software.
Simply creating forty-five minutes of content within a specific timeline is no easy feat. The show required crafting a storyline for each character and musician, given their unique complexities and personal traits. The story ultimately evolved through a lengthy yet thorough pre-visualization process.
It’s difficult to select favorite scenes or shots given the length of the piece. I can say how incredibly grateful I am to all the talented individuals and artists who helped create the visual content for the concert. I’m also extremely proud of the work we were able to produce over the course of a few months.