Framestore: better animation, faster

Oct 30, 2020

As part of the Industrial Strategy Challenge Fund, the Audience of the Future programme, delivered via Innovate UK (part of UKRI), has enabled the development of a number of innovative and ground-breaking projects across the creative industries. This article is one in a series written to highlight the exciting developments and achievements the funding has led to.


Bafta- and Oscar-winning creative studio Framestore has been working on FIRA, the Fast Immersive Rigging and Animation toolset, which enhances and accelerates the production of character animation, reducing the time and cost of producing immersive content. Over a third of the FIRA project's funding came via the Audience of the Future Challenge, and the new tools have already played a role in productions, including the new His Dark Materials series. Framestore's VFX supervisor Theo Jones explained to us how they did it.

What’s the big idea?

Framestore’s Fast Immersive Rigging and Animation project was designed to address two big barriers holding back growth in immersive animated content: the quality of character, creature and performance animation, and the time it takes to create new work.

The project had two strands. The first was research-and-development led, using machine learning to develop faster animation rigs (a rig being the structure in computer animation that gives a 3D character model its full skeletal and muscle definition).

Faster rigs mean animators spend less time waiting for edits to their work to update. It can also improve animation quality when moving a character from a visual effects (VFX) package into a games engine. Historically, this process compromises visual quality in order to run at the correct frame rates.

High end VFX rigs often comprise millions of nodes that describe the complex relationship between a character’s input animation controls and the final output that gets rendered to the screen. FIRA’s technology uses machine learning, or AI, to learn this complex relationship and store it in the form of a neural network. This allows all those millions of nodes to be replaced by a single neural network node, resulting in massively reduced computation times.
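The idea of distilling a heavy rig into a single learned node can be sketched in miniature. The following is a hedged illustration, not Framestore's implementation: `toy_rig` is a made-up stand-in for an expensive node graph, and the network size, sampling range and training loop are all arbitrary choices made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

n_controls, n_outputs, hidden = 4, 6, 32

# Hypothetical stand-in for a rig's node graph: a fixed nonlinear
# mapping from animation control values to deformation outputs.
W_true = rng.normal(size=(n_controls, n_outputs))

def toy_rig(controls):
    # Pretend this call is the slow, million-node rig evaluation.
    return np.sin(controls @ W_true)

# Step 1: sample the rig offline to build a training set.
X = rng.uniform(-1.0, 1.0, size=(2000, n_controls))
Y = toy_rig(X)

# Step 2: fit a small one-hidden-layer network to the sampled mapping.
W1 = rng.normal(0.0, 0.5, size=(n_controls, hidden))
b1 = np.zeros(hidden)
W2 = rng.normal(0.0, 0.5, size=(hidden, n_outputs))
b2 = np.zeros(n_outputs)

lr = 0.05
initial_loss = None
for step in range(2000):
    H = np.tanh(X @ W1 + b1)           # hidden activations
    pred = H @ W2 + b2                 # the network's rig approximation
    err = pred - Y
    loss = np.mean(err ** 2)
    if initial_loss is None:
        initial_loss = loss
    # Plain full-batch gradient descent on the squared error.
    g = 2.0 * err / X.shape[0]
    gW2, gb2 = H.T @ g, g.sum(axis=0)
    gH = (g @ W2.T) * (1.0 - H ** 2)   # tanh derivative
    gW1, gb1 = X.T @ gH, gH.sum(axis=0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

# Step 3: at run time the trained network replaces the node graph.
approx = np.tanh(X[:5] @ W1 + b1) @ W2 + b2
print(f"loss: {initial_loss:.4f} -> {loss:.4f}")
```

The speed-up comes from step 3: once trained, evaluating the character is a couple of matrix multiplies, whatever the complexity of the original node graph that produced the training data.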

The second strand was a partnership with Weightshift, a technical product design agency. Together they are developing new physics-based animation tools that handle the mechanics of animation – foot placement as a character runs, for example – leaving animators more time to work on subtle, complex character performances.

Framestore won its first Oscar (2008 Achievement in Visual Effects) for The Golden Compass

Who is it for?

Framestore’s own animation department and immersive development team are already using the tools. They will also be available to external animation studios interested in buying licences for Weightshift’s technology.

How has Audience of the Future supported the project?

Profit margins in the VFX industry are tight, meaning paying projects often win out over R&D time. Framestore was already discussing ways to test Weightshift’s tools in its production processes, so the funding allowed the two companies to formalise their relationship with appropriate resources. With £230,000 contributed by Audience of the Future towards the total project cost of £660,000, Framestore could also invest more heavily in machine learning R&D.

Is there anything unique about the partnership with Weightshift?

Framestore has given Weightshift full access to the computing power of its render farm – a high-performance computing system typically used to create film and television visual effects – as well as to its animators and riggers. This filled a gap in Weightshift’s expertise and helped mature the technology faster. Framestore, meanwhile, got early-mover access to cutting-edge physics-based animation tools.

What stage is the project at?

The funded project is now complete and Framestore is using Weightshift technology in client work, including Disney’s Lady and the Tramp and the first season of the television show His Dark Materials.

The FIRA project has also included developing a plug-in for the real-time 3D creation platform Unreal Engine, to help get its new machine-learning-accelerated rigs working in real time for immersive projects.

The team is still working on improving rigging specifically for facial animation and has developed a promising prototype and proof of concept. It aims to complete that work by the end of 2020.

What impact has the project had?

The output from the FIRA project is already helping Framestore win more work, including more immersive projects.

The FIRA team surveyed animators already using Weightshift tools in production work and found the new tools save animators up to 50% of their time on certain processes. When FIRA was presented to Framestore’s worldwide teams, many got in touch looking to add the technology to their toolkit.

The impact will also extend beyond the UK: Framestore is part of PRESENT, a three-year EU-funded research and innovation programme under Horizon 2020, and the output from the FIRA project is informing that research.

What next?

A further £175,000 grant from Innovate UK will help Weightshift improve the efficiency of its artificial intelligence so animators can run its next generation AI animation tools on local machines, bypassing the render farm and cutting wait times for processing from 30 to 5 minutes.

Framestore can also now work with Weightshift to develop new crowd-system technology. This will allow Framestore to create computer-generated people for the backgrounds of live-action scenes – particularly relevant at the moment, since the COVID-19 pandemic has placed major restrictions on the number of extras allowed on set.

Theo Jones, VFX supervisor, Framestore

Theo Jones is an Academy Award-nominated VFX Supervisor, based in London. During his time at Framestore he has worked across a wealth of titles and jobs, spanning the Film, Integrated Advertising and Immersive divisions.

Theo is known for his work on Christopher Robin (2018), Guardians of the Galaxy: Vol. 2 (2017) and Doom (2005).