Develop a title sequence for the 2018 AICP Awards (encompassing the Show and the Next Awards) that embodies innovation.


Utilizing technologies that have never before been combined, our teams of creative technologists and designers developed a piece of custom software that had the power to generate animated content live.


During the 2018 AICP Show and Next Awards, our custom software generated a live title sequence that produced unique visuals and audio with every iteration. The sequence incorporated camera feeds from around MoMA (the Museum of Modern Art), enabling guests to see themselves reflected in the animated design.






Concept / Direction / Creative Coding / Software Development / Design UI & UX

When tasked with crafting the iconic AICP Title Sequence, we wanted to push the boundaries of title sequence design and do something that had never been done before. With an open brief, this was the perfect opportunity for The Mill’s creative and technical minds to join forces and develop something truly groundbreaking.

The project started with a simple question: How can we create a title sequence that is unique each time it’s played? To achieve this, we built a custom piece of real-time software capable of procedurally generating motion graphics.


Laying the foundation

The first step was designing a procedural grid system. Knowing that the majority of the content would be pseudo-random, this grid system provided a reliable backbone for an otherwise unpredictable piece. We were able to dial in parameters such as subdivision depth and cell count, which resulted in a wide variety of grid layouts.
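As an illustration, a procedural grid of this kind can be built by recursively splitting a rectangle into cells; the sketch below is a minimal, hypothetical version (the parameter names `max_depth` and `split_prob` are assumptions, not The Mill's actual implementation), but it shows how a couple of dialed-in parameters can yield a wide variety of layouts.

```python
import random

def subdivide(x, y, w, h, depth=0, max_depth=3, split_prob=0.6):
    """Recursively split a rectangle into grid cells.

    max_depth and split_prob are the kind of dials described above:
    raising them produces denser, more fragmented grids.
    """
    # Stop splitting at max depth, or randomly, to vary cell sizes.
    if depth >= max_depth or random.random() > split_prob:
        return [(x, y, w, h)]  # leaf cell
    cells = []
    if random.random() < 0.5:
        # Vertical split at a random point between 30% and 70% of the width.
        cut = random.uniform(0.3, 0.7) * w
        cells += subdivide(x, y, cut, h, depth + 1, max_depth, split_prob)
        cells += subdivide(x + cut, y, w - cut, h, depth + 1, max_depth, split_prob)
    else:
        # Horizontal split, same idea.
        cut = random.uniform(0.3, 0.7) * h
        cells += subdivide(x, y, w, cut, depth + 1, max_depth, split_prob)
        cells += subdivide(x, y + cut, w, h - cut, depth + 1, max_depth, split_prob)
    return cells

# Each call produces a different layout over a 1920x1080 canvas.
grid = subdivide(0, 0, 1920, 1080)
```

Because every split partitions its parent rectangle exactly, the leaf cells always tile the full canvas, which is what makes the system a reliable backbone for otherwise unpredictable content.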


Populating the grid

After designing the grid system, we created a library of generative modules. These modules included solid shapes, outlines, blur panels, live camera feeds, live text, and audio-reactive shaders.

Once the grid system and modules were defined, we used the software to procedurally generate a wide variety of content, including dozens of interstitials. Each composition was truly a surprise.
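One simple way to sketch this population step is a weighted random assignment of modules to grid cells. The module names below come from the library described above, but the function and its `weights` and `seed` parameters are illustrative assumptions, not the actual software's API.

```python
import random

# Module types drawn from the library described above.
MODULES = ["solid", "outline", "blur", "camera_feed", "live_text", "audio_shader"]

def populate(cells, weights=None, seed=None):
    """Assign one generative module to each grid cell.

    weights biases the draw (e.g. fewer camera feeds than solids);
    seed makes a composition reproducible for review.
    """
    rng = random.Random(seed)
    weights = weights or [1] * len(MODULES)
    return [(cell, rng.choices(MODULES, weights=weights)[0]) for cell in cells]

# Example: favor solid shapes, keep camera feeds rare.
layout = populate([(0, 0, 1, 1)] * 12, weights=[5, 3, 2, 1, 2, 2])
```

Rerunning with a new seed yields a new composition from the same grid, which is what made each generated interstitial a surprise.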

Controlling the chaos

A custom GUI (Graphical User Interface) gave us fine-grained control over a wide range of design decisions. While the goal was to create a system full of unpredictability and surprise, we still wanted to define certain aesthetic parameters. This GUI allowed us to explore different attributes, such as probability, frequency, and color.
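The idea of GUI-exposed aesthetic parameters can be sketched as a small settings object that biases the generator; every name below (`camera_probability`, `split_frequency`, the palette values) is a hypothetical stand-in for the controls the article describes, not the actual interface.

```python
from dataclasses import dataclass, field
import random

@dataclass
class AestheticParams:
    """Hypothetical knobs mirroring GUI controls for probability,
    frequency, and color."""
    camera_probability: float = 0.2   # chance a cell shows a live feed
    split_frequency: float = 0.6      # how often grid cells subdivide
    palette: list = field(
        default_factory=lambda: ["#0b0b0b", "#f5f5f5", "#ff3b30"])

def style_cell(params, rng=random):
    """Pick a fill treatment for one cell, biased by the parameters."""
    module = ("camera_feed" if rng.random() < params.camera_probability
              else "solid")
    return module, rng.choice(params.palette)
```

Constraining the generator this way is what lets a system stay surprising cell by cell while the overall piece keeps a deliberate visual identity.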

The overall visual aesthetic was inspired by the principles of traditional graphic design: grid-based proportions and compositional balance. The custom software was then used to create live content for Base Camp, the AICP Show, and the Next Awards. Each iteration was unique in both visuals and sound, and will never be repeated.

The use of this technology demonstrates the vast potential of human-machine collaboration, a subject that The Mill has been exploring through the use and development of AI, real-time animation, and generative art. It also highlights the beauty of discovery, as no two iterations of the audiovisual narrative will ever be the same.


Emerging Tech The Mill
Director William Arnold
Producer La-Râ Hinckeldeyn
Animation Chet Hirsch
Lead Developer Eric Renaud-Houde
Audio Developer Michael Dunkley
Graphics Developers Jimmy Gass, Joji Tsuruga
Audio Composition Antfood