The Lion King

Q&A with Elliot Newman, MPC VFX Supervisor

Technicolor’s MPC Film was a creative partner with the filmmakers of The Lion King from beginning to end: from October 2016, when VFX Supervisor Adam Valdez first pitched a methodology to Disney and director Jon Favreau while they were still wrapping up The Jungle Book, through location scouting and virtual production, and on to responsibility for all VFX and Animation.

MPC VFX Supervisor Elliot Newman discusses how far the VFX and Animation evolved from The Jungle Book.


Q - As VFX Supervisor along with Adam Valdez, how early was your involvement with The Lion King?

A - Basically the first year or so I was supporting our efforts in London while Adam was focusing more on the virtual production side of things in Los Angeles.

Our first concern was the workflow. This was something new, so what would it take to go from the virtual production stage to turning the shots over to post-production, and then delivering those shots? There was some technical work in defining what the pipeline needed to be.

A number of months later, when Adam returned from the shoot in LA, we really started getting into what the final shots needed to be in order to make it into the finished film. The first shots we worked on were the canyon shots for the stampede sequence.


Q - How was working on The Lion King different from The Jungle Book?

A - Because we had the experience of working on The Jungle Book, with the same group of filmmakers, we had the advantage of hindsight moving on to The Lion King.

So it was less about: how do we do this? And more about: how do we do it better and continue to refine it? Working on Jungle Book, we had already figured out a lot of new territory. On Lion King, virtual production was the real new territory. Almost everything else – how we render fur, muscle simulation, all those types of things – was version two from Jungle Book to Lion King.


Q - What was your specific role on The Lion King?

A - With my technical background, Adam and I complement each other quite well. While the shoot was happening [with Adam back in LA], I was figuring out the technology processes we needed to put in place to help make things better, such as improving grass and tree assets, and all the things we needed to get really right on this movie.

I was focused on establishing the workflows for the artist group, the technical group, and the software group. I tried to be the glue, so that once we were in the actual post-production phase, turning over shots – we were ready to receive them, we had a workflow down, and the technology was ready for it.

I was also focused on look development, fur shading, and lighting techniques and processes. Once all those things were established, we had a good grounding for our first final shots. Then the question became: how do we handle the volume – the 1,500 to 1,600 shots we had to deliver?

We decided to build unit teams as a way to handle the division of labor and sharing of the workload. For the last 12 months of the project we were in that phase of unit delivery. Adam and I would regroup and consult during review sessions, looking at each other’s shots and discussing what we were learning from these sequences.


Q - You were closely involved with the lighting. How did you approach it?

A - It started with the brief from Jon [Favreau], and he often said in early reviews that he didn't want it too perfect – the perfect sky, the perfect lighting conditions, the perfect dappled sunlight – because that makes you instantly believe it’s not real. He liked the fact that sometimes in nature you get imperfect situations. It’s the reality that's important. Reality, documentary style, and photographic quality were the big attributes for Jon.

We demonstrated the philosophy of capturing reality on one of the African trips. We had a team of data capture people there mainly to photograph foliage, species of plants and trees, but also to capture lighting environments. We took big 360-degree HDR images of the sky – and figured out exactly how to capture the sun itself, which is quite tricky as you can imagine.
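The interview doesn't detail MPC's capture setup, but the standard technique behind HDR sky domes is merging a bracket of exposures into a single radiance map. The NumPy sketch below is purely illustrative of that idea (the function and its names are not from MPC); it also hints at why the sun is tricky: it saturates even the shortest exposure, so it typically needs extra handling, such as neutral-density filters, on top of a merge like this.

```python
import numpy as np

def merge_exposures(images, exposure_times):
    """Merge a bracket of linearized images into one HDR radiance map.

    images:         list of float arrays scaled to [0, 1], in linear light
    exposure_times: shutter time in seconds for each image

    Illustrative sketch only, not MPC's pipeline. Each frame yields a
    radiance estimate (pixel / exposure time); the estimates are blended
    with a hat weight that trusts mid-tones and distrusts clipped pixels.
    The sun clips in every bracket, which defeats this weighting and is
    why it needs special treatment.
    """
    num = np.zeros_like(images[0])
    den = np.zeros_like(images[0])
    for img, t in zip(images, exposure_times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # 0 at black/white, 1 at mid-grey
        num += w * (img / t)
        den += w
    return num / np.maximum(den, 1e-6)
```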

We built up a big library of these high-resolution images, and that library fed into one of the early tests we did with Rafiki, basically putting him through five or six different lighting designs. We tried to keep the naturalism as close as possible, and it was probably the first moment Jon really responded and said: this is what I can see the movie becoming.

It also demonstrated the technology we'd been working on – the new skin shading and fur shading; the new approach to lighting more realistically; and the scientific approach to capturing nature.


Q - How did virtual production affect the work that you were doing?

A - On this movie, there was no physical stage, so there were no limitations in terms of lighting and things like that. You can really make it as if you’re on location, photographing it, and then capture that as much as possible in the CG rendering.

At the same time, there is a story to tell, and cinematography helps tell that story, so we were striking a balance with what it would be like on a natural location shoot. As we evolved through the process, I think we struck a nice balance: a cinematically pleasing film that doesn't break the reality of it.

With Rob Legato [VFX Supervisor] and Caleb Deschanel [cinematographer] on board, they had a strong vision for every shot and every scene in the movie. There was a lot of handcrafting involved in every shot to tell the story and make sure it was visually appealing. But if you look at the result, you will believe that's the real sun, and that the light is true to being on location in Kenya. We put a lot of research into things like:

- How does hair respond to natural light, and how do we shade that correctly?

- How do we take advantage of new renderers that can produce more physically accurate ray tracing – so that we know the shading model we're putting into the characters is as close as possible to physically correct?

- How do we mimic light scattering and transport through skin layers in a way that's as physically plausible as possible? (A standard building block is sketched below.)
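MPC's production shaders are proprietary, so the following is a rough illustration only of what "physically plausible" means in practice: scattering models built from normalized, energy-conserving components with a few measurable parameters. The Henyey-Greenstein phase function is a textbook building block of this kind for scattering inside media such as skin.

```python
import numpy as np

def henyey_greenstein(cos_theta, g):
    """Henyey-Greenstein phase function (illustrative, not MPC's shader).

    cos_theta: cosine of the scattering angle
    g:         asymmetry in (-1, 1); g > 0 scatters forward, g = 0 is
               isotropic

    Normalized to integrate to 1 over the sphere -- the kind of
    energy-conservation property "physically plausible" implies.
    """
    denom = 1.0 + g * g - 2.0 * g * cos_theta
    return (1.0 - g * g) / (4.0 * np.pi * denom ** 1.5)

# Sanity check: integrate over the sphere; the result should be ~1.
cos_t = np.linspace(-1.0, 1.0, 200001)
p = henyey_greenstein(cos_t, g=0.8)
print(2.0 * np.pi * np.sum(p) * (cos_t[1] - cos_t[0]))  # ~1.0
```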


Q - What was it like getting to work so closely with the cinematographer, Caleb Deschanel?

A - It was quite inspiring to have him here and to spend time with him. It's quite rare that we get to have the DP with us in reviews, talking about what they like and don’t like. And he did it at exactly the right level of detail to extract what you need to know. He has an amazing ability to say in just a few words what needs to change, so you get it straightaway. We were very fortunate to be involved in the creative process, in the filmmaking process, with him.

It was interesting watching someone with such a wealth of experience adapt to the technology and the toolsets we have – working with VR and game engines. He lit every scene on stage through the game engine process, the Unity tools, and saw that translate into the final rendering.


Q - Can you also talk about working with VFX Supervisor Rob Legato?

A - Caleb did a couple of talks with Rob, and it was fun seeing them work together and the respect they have for each other. When Caleb wasn’t there, Rob knew a lot about the cinematography aspect and could help us with staging.

Because it’s fully CG, with a full 3D environment and characters, if a shot wasn't working in the cut, you could go back to the [virtual] stage and it could be reshot – with Rob and the virtual production team back in Los Angeles.

It’s very dynamic in that respect and really opens up the creative scope, because you’re not locked into anything. If the lighting isn't working, if the camera isn't working, you don't have to live with it. And it doesn’t require a big expensive reshoot. We can relight it, or make other changes at any point, and the performances don’t change.

It gives the filmmakers more creative freedom to make better choices.