How to... animate the future
Boo Wong, Group Director of Emerging Technology at The Mill, explains how real-time character animation systems like Mill MASCOT are revolutionising filmmaking.
Whether for flight or fancy, we have always been fascinated by motion.
From the superimposed actions of animals in Paleolithic cave paintings and drawn battle sequences on darkened Egyptian tomb murals, through shadow puppets, magic lanterns, zoetropes, hand-drawn cels, armature wire and mysterious wind-blown apes, to the first computer-generated toy characters and the digital humans and photoreal VFX we have today, we have always found a way to capture and then tell the stories that would otherwise not be seen.
Technology has kept up with, and been driven by, this very human need to share a vision and communicate. Today, we have come to expect and rely on physically accurate simulations, photoreal and real-time rendering, and machine learning that superpowers image, motion and data analysis.
Above: Boo Wong, Group Director of Emerging Technology, The Mill
Let’s stop there for a moment.
Last year, as we wrapped production on a multi-month, VFX-filled commercial of a giant furry character tearing through a city, our client circled back to say thanks, nicely done, and now can we talk about creating 30 more spots featuring said creature? But in a fraction of the time? Did we already say three weeks?
Ordinarily, we might’ve said thank you so very much, it was simply amazing to have had the chance to work with you on the one fab spot, but please do come back when you find several more months in your calendar. Instead, as fate, innovation and the state of game engines would have it, The Mill’s Creative Technology team were just then pushing the limits of visual fidelity on interactive AR and VR projects. So we were inspired to bring Monster’s VFX character into the real-time world.
In collaboration with one of our creative directors, Jeff Dates, we built a real-time character animation system, later named Mill MASCOT, that uses live inputs from a puppeteer to drive a photoreal furry creature for our clients at Monster and KBS. We set up a day-long “shoot” where our puppeteer performed and we captured, live on set, over four hours of final-pixel animation, i.e. no post rendering or compositing.
This was a huge advance on a standard CG animation job, where it can take weeks or even months to see a digital character rendered and comped into a scene. Using the Mill MASCOT workflow and a live animation session, our render times ran at 42 milliseconds per frame, which is roughly 24 frames per second, standard film rate, meaning that final-pixel performance and capture were, for all intents and purposes, instantaneous.
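To picture what a live session like that might look like under the hood, here is a hypothetical sketch of a real-time puppeteering loop: sample the performer’s input every frame, map it onto rig controls, and stay inside the 42 ms frame budget. The input device, rig parameters and ranges below are invented for illustration; the article doesn’t describe MASCOT’s actual internals.

```python
import math
import time

# Hypothetical sketch of a real-time puppeteering loop in the spirit of
# MASCOT: each frame, sample the puppeteer's input, map it onto rig
# controls, and render, all inside the 42 ms frame budget. Every name
# and range here is a stand-in, not The Mill's implementation.

FRAME_BUDGET = 0.042  # 42 ms per frame, i.e. roughly 24 fps


def read_puppeteer_input(t: float) -> dict:
    """Stand-in for a live input device; here, a canned head-waggle."""
    return {"head_turn": math.sin(t), "jaw_open": max(0.0, math.sin(3 * t))}


def apply_to_rig(controls: dict) -> dict:
    """Map normalised inputs onto (made-up) rig parameters."""
    return {
        "neck_yaw_degrees": controls["head_turn"] * 40.0,  # +/- 40 degree turn
        "jaw_blendshape": controls["jaw_open"],            # 0..1 weight
    }


start = time.monotonic()
for frame in range(5):  # a few frames of the live "shoot"
    t = time.monotonic() - start
    pose = apply_to_rig(read_puppeteer_input(t))
    # A real system would render the creature here, within the frame budget.
    print(f"frame {frame}: {pose}")
    time.sleep(FRAME_BUDGET)
```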
Anyone who has ever worked in animation knows how hard it sometimes is simply to add a few frames of head and tail to each animated performance. Here, for Monster, we had over four hours of final rendered shots, in multiple versions, for the editor to use as selects from which to cut the thirty spots.
What Mill MASCOT has shown is that real-time technology is not just for interactive products. It has the potential to be the most powerful tool yet in our filmmaking arsenal.
The companies making game engines have very much realised this, too. Both Unity and Epic Games have enterprise divisions targeting film, TV and animation. It’s not just a way of producing content faster, although game engines do that exceptionally well, as MASCOT shows. Real-time technology brings with it all the game logic that allows smart cameras to auto-follow objects in a scene, trigger lighting or cue other behaviours, drive character AI, set up A/B testing for animation and performance, and provide built-in scalable, concurrent workspaces for collaborators.
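To make the “smart camera” idea concrete, here is a minimal, engine-agnostic sketch of auto-follow logic: every tick, the camera eases toward a moving target. Vec3, Camera and the smoothing constant are illustrative stand-ins, not a real engine API.

```python
import math
from dataclasses import dataclass

# Engine-agnostic sketch of "smart camera" auto-follow: every tick, ease
# the camera toward a moving target. All names here are illustrative.


@dataclass
class Vec3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

    def lerp(self, other: "Vec3", t: float) -> "Vec3":
        """Linear interpolation between two points, t in [0, 1]."""
        return Vec3(self.x + (other.x - self.x) * t,
                    self.y + (other.y - self.y) * t,
                    self.z + (other.z - self.z) * t)


@dataclass
class Camera:
    position: Vec3
    smoothing: float = 5.0  # higher = snappier follow

    def follow(self, target: Vec3, dt: float) -> None:
        # Exponential ease-toward: the same feel at any frame rate.
        t = 1.0 - math.exp(-self.smoothing * dt)
        self.position = self.position.lerp(target, t)


# Tick the camera at the ~42 ms budget mentioned above (~24 fps).
camera = Camera(position=Vec3(0.0, 2.0, -10.0))
creature = Vec3()
for frame in range(24):
    creature.x += 0.5  # the creature strides across the scene
    camera.follow(creature, dt=0.042)
    print(f"frame {frame:02d}: camera x = {camera.position.x:.2f}")
```

The exponential ease is a common choice for this kind of follow behaviour because the smoothing stays consistent whether the engine ticks at 24 or 120 frames per second.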
Above: Rogue One's K-2SO was rendered in real time.
A game engine is both a platform of creation and a delivery system. It opens up user-generated content, and it wraps back around to interactive content, which was its original strength.
Beyond MASCOT, we’ve continued to push the limits of game engines to create linear commercials and branded content, as well as to generate visuals for fashion shows and concerts. ILM have worked with Epic Games to render the droid K-2SO from Rogue One in real time, as well as on their ray-traced short, Reflections. Meanwhile, Disney and Unity created three shorts for Big Hero 6, starting with Baymax Dreams of Evil Sheep.
At The Mill, we’re also using game engines and augmented reality on set in virtual production pipelines through Mill Cyclops. Film companies like Weta have brought virtual production into their workflows on projects like Avatar.
The very human drive to invent and entertain has led us to this modern era of animation through the innovative and imaginative work of Émile Cohl, Georges Méliès, Max and Dave Fleischer, Ray Harryhausen, Disney and Pixar, amongst many others. It’s up to us to remember our roots, be inspired and continue to find new ways to tell the stories that would otherwise be impossible.