Spotlight | Will Adams on real-time animation in Sport & Entertainment

Thought | July 15, 2020

Respawn Entertainment | Live from Mirage’s Holo-Day Bash | The Game Awards

Talk us through the world of sports and real-time animation. What’s the result when the two worlds combine?

Sports and real-time animation have overlapped for years now. Many of the earliest sophisticated examples were in games like FIFA and Madden, as sports titles pushed the limits of real-time 3D engines. Over the years the hardware got better, the graphics matured, the animation became more realistic, and the AI drastically improved. Playing the game became much more like watching the real thing. In this way, sports paved the path for what could be possible in realistic game-engine experiences, and created a natural simpatico between the real and the virtual.

In the past decade, motion capture brought an extra level of realism to player movements, though still with some issues, especially as animation cycles blend together. Machine learning has helped to close the gap: we can train models on mocap data to produce realistic, interactive player locomotion. That's an extreme generalization of what's happening, so check out the work done by Sebastian Starke and his team at the University of Edinburgh.
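To make that generalization a little more concrete, here is a minimal sketch of the underlying idea, not the Edinburgh team's actual neural architecture: fit an autoregressive model that predicts the next pose frame from the last two, trained on mocap-style sequences (synthetic sinusoidal joint channels stand in for real capture data here).

```python
import numpy as np

# Toy stand-in for a mocap clip: T frames x J joint channels.
# (Synthetic sinusoids here; a real pipeline would load captured sequences.)
rng = np.random.default_rng(0)
T, J = 500, 12
t = np.linspace(0, 20, T)
mocap = np.stack([np.sin(t * (1.0 + 0.1 * j)) for j in range(J)], axis=1)
mocap += rng.normal(scale=0.01, size=mocap.shape)

# Second-order autoregressive pose model:
# next frame ~ linear function of the current and previous frames.
X = np.concatenate([mocap[1:-1], mocap[:-2]], axis=1)  # (T-2, 2J)
Y = mocap[2:]                                          # (T-2, J)
W, *_ = np.linalg.lstsq(X, Y, rcond=None)              # least-squares fit

def step(cur, prev):
    """Predict the next pose frame from the last two frames."""
    return np.concatenate([cur, prev]) @ W

# Roll the model forward from a seed pose to synthesize new motion --
# the same loop a game would run per frame, conditioned on player input.
prev, cur = mocap[0], mocap[1]
frames = [prev, cur]
for _ in range(100):
    prev, cur = cur, step(cur, prev)
    frames.append(cur)
frames = np.array(frames)

# One-step prediction error on the training clip stays near the noise floor.
err = float(np.abs(X @ W - Y).mean())
```

The production systems swap the linear map for a neural network conditioned on control signals (direction, gait, terrain), but the shape of the problem is the same: learn frame-to-frame transitions from capture data, then drive them interactively.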

The result is fast-paced, natural human motion that accurately responds to interactive events, whether that's driven by joysticks or by live data captured in a stadium. Imagine instant replays of a live NFL game, recreated in real-time using data captured on the field, where the audience can move the camera around freely in ultra-slow or frozen time. Or using that same capture information to train more realistic performance models of players in video games, updating how they move and behave (not just their stats) as the season goes on.


What do you see as the short and longer term future of crowds in sport and entertainment?

We will always have a need for crowd work in visual effects. Doing it in real-time and taking advantage of AI / Machine Learning will only make it better and faster. This all seems like it’s on a pretty steady path for improvement. What is a little more questionable is the fate of live event crowds during the pandemic, so let’s explore that quandary.

We could have real-time crowds respond to action happening in a game or performance, but we should proceed with caution. This starts to get uncomfortably close to the equivalent of a laugh track. We are taking the human element out of the crowd and filling the void with something very artificial.

A crowd is more than just moving bodies in the stands, it’s a collective energy that is highly infectious. In many ways, the crowd has a symbiotic relationship with the action: players and fans feed off each other’s energy. The Seattle Seahawks famously recognize their fans as the 12th player on the field and I’m certain many other teams and clubs around the world have a similar mythos. For better or worse, many players will perform differently without that crowd’s energy coursing through the stadium.

So instead of defaulting to something hollow, we should be thinking about ways of visualizing that energy in a stadium, both for the audience and for the players. We are already seeing this in live streams for musicians and gamers: fans spamming the comments with a salvo of emojis and messages. And we were already seeing these types of requests from stadiums going through digital transformations, looking for ways to extend the venue experience to fans at home. Maybe it's a companion app with a robust feature set for audience participation? Maybe it's as simple as a hashtag on Twitter? Either way, there needs to be an interactive mechanic that connects the thousands if not millions of people around the world to some epic audiovisual system in the venue.
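One minimal version of that interactive mechanic, assuming a hypothetical stream of timestamped fan reactions (the class, weights, and emoji set below are illustrative, not any real platform's API): bucket incoming reactions into a sliding time window and reduce them to a single "crowd energy" score that a venue's audiovisual system could sample every frame.

```python
from collections import deque

# Hypothetical weights: some reactions carry more energy than others.
REACTION_WEIGHTS = {"🔥": 3, "🎉": 2, "👏": 1, "😴": -1}

class CrowdEnergyMeter:
    """Sliding-window aggregate of fan reactions into one energy score."""

    def __init__(self, window_seconds=10.0):
        self.window = window_seconds
        self.events = deque()  # (timestamp, weight), in arrival order

    def react(self, timestamp, emoji):
        """Record one fan reaction; unknown emoji contribute nothing."""
        self.events.append((timestamp, REACTION_WEIGHTS.get(emoji, 0)))

    def energy(self, now):
        """Total weighted energy of reactions inside the window."""
        while self.events and self.events[0][0] < now - self.window:
            self.events.popleft()  # drop reactions that aged out
        return sum(w for _, w in self.events)

meter = CrowdEnergyMeter(window_seconds=10.0)
meter.react(0.0, "👏")
meter.react(1.0, "🔥")
meter.react(2.0, "🎉")
print(meter.energy(5.0))   # 1 + 3 + 2 = 6
print(meter.energy(12.5))  # everything before t=2.5 aged out -> 0
```

The score could then drive lighting intensity, stadium audio, or an on-screen visualization, so the remote crowd's energy is felt in the building rather than faked by a canned track.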

In the short term, I hope we see creative visualizations powered by the action of the game as well as the audience watching around the world. In the long term, fans will return to the stands; there is no virtual experience that replaces the roar of 50,000 people erupting in unison. But many will be returning to venues that also facilitate audience participation from home.


The Mill worked with Respawn to bring the Apex Legends character Mirage to life. What do projects like this teach us about opportunities in other sectors?

The live Mirage performance was amazing, though I think it was hard to tell that it wasn't a pre-recorded video on playback during the show. The audience needed more clues that it was responding live in that moment. Regardless, it's a fine example of the quality we can achieve with virtual characters in a live setting. And Mirage wasn't our first: we have done a variety of different characters for live events, stage shows, broadcast, and streaming, all using similar technologies.

We’re used to seeing characters like these in a scripted context, delivering the same array of responses over and over, most notably in video games and interactive displays. The veil of story immersion deteriorates as we see the same response cued up for the 10th time and we can feel the limits of their programming. Adding a human performance to the character gives the personality endless potential and you begin to lose yourself in their reality. That moment of suspended disbelief, however brief it may be, is magical.


Find out more about Mill Experience and get in touch via

Mill Experience Director, Will Adams