Avatar: The Way of Water. Blackmagic Used Extensively in All Aspects of Production

Avatar: The Way of Water

As the newest and much-anticipated sequel in Lightstorm Entertainment’s “Avatar” series began pre-production, the team understood that if the story and visuals were to be taken to the next level, so would the technology that supported them. “Avatar: The Way of Water” would push the skills and capabilities of even the most advanced production pipeline, so Geoff Burdick, Lightstorm’s senior vice president of production services and technology, began looking for ways to handle the new demands.

Set more than a decade after the events of the first film, “Avatar: The Way of Water” tells the story of the Sully family (Jake, Neytiri and their kids), the trouble that follows them, the lengths they go to in order to keep each other safe, the battles they fight to stay alive, and the tragedies they endure. Produced by Lightstorm Entertainment’s James Cameron and Jon Landau, the film was directed by Cameron and distributed by 20th Century Studios.

Managing the massive pipeline that comes with an “Avatar” production is about more than data processing; it also means providing the tools to evaluate content as it’s being shot. “We evaluate live camera feeds in a manner as close to the theatrical experience as possible, so we can make real time decisions on set,” said Burdick. “This saves time during shooting, benefits Weta Digital, our visual effects vendor, and helps streamline our post-production and mastering process.”

Production intended to shoot 4K HDR at a 47.952 fps frame rate, which would support the stereoscopic process, but feeding that amount of data on set was a complex ask at the time. “We needed to enable that spec through our entire production pipeline, involving real time feeds to our DCI compliant ‘projection pod,’ which we used to view live camera feeds in 3D 48fps in both 2K and 4K, 3D 24fps in 2K and 4K, and 3D 24fps in HD,” said Burdick. “Obviously, there wasn’t a lot of existing hardware available to support that.”

Burdick and his team contacted Blackmagic Design early on, explaining their goals. “There were no instant answers, but they understood the vision, and had ideas as to the best pathways to make it happen,” added Burdick.

Working closely with the production’s 3D Systems Engineer Robin Charters, Burdick and his team began to drill down on every aspect of functionality. They chose to incorporate the Teranex AV standards converter, Smart Videohub 12G 40×40 router, DeckLink 8K Pro capture and playback card, UltraStudio 4K Extreme 3 capture and playback device and ATEM 4 M/E Broadcast Studio 4K live production switcher as the management hardware for the various feeds.

“During live action photography in 2019 and 2020, the Blackmagic team was in constant contact, ensuring that every piece of their hardware performed perfectly,” noted Burdick.

The pipeline, with real time conversions handled by the Teranex AV and fed through the Smart Videohub 12G 40×40 and the ATEM 4 M/E Broadcast Studio 4K to provide playback and review throughout the set, worked seamlessly. Beyond enabling immediate review of footage, the multi-resolution playback system also served as a necessary quality control solution.

“This is very important as we move into shooting higher resolutions, frame rates and dynamic ranges, with exhibition technologies capable of displaying all this and more,” said Burdick. “As critical as the cutting edge technology is, it’s all in service of the story. The goal is for people not to notice the tech. When the audience loses themselves in the movie, we know we’ve succeeded.”

Colour Grading

Colourist Tashi Trieu has worked with James Cameron’s Lightstorm Entertainment for a number of years as a DI editor, including on the remaster of “Terminator 2” as well as “Alita: Battle Angel.” For “Avatar: The Way of Water,” Trieu moved up to colourist, working closely with Director Cameron.

We sat down with Trieu to discuss his work on “Avatar: The Way of Water” and his grading process in DaVinci Resolve Studio.

I assume you worked closely with Director James Cameron developing looks even prior to production. Can you talk about that process?

I was loosely involved in pre-production after we finished “Alita: Battle Angel” in early 2019. I looked at early stereo tests with Director of Photography Russell Carpenter. I was blown away by the level of precision and specificity of those tests. Polarized reflections are a real challenge in stereo as they result in different brightness levels and textures between the eyes that can degrade the stereo effect. I remember them testing multiple swatches of black paint to find the one that retained the least amount of polarization. I had never been a part of such detailed camera tests before.

The look development was largely done at WetaFX. Jim has a close relationship with them, and as the principal visual effects vendor on the project, their artistry is thoroughly ingrained in everything from live action capture through fully CGI shots. Their approach left a lot of creative latitude for us in the DI, and our show LUT is an elegantly simple S-curve with a straightforward gamut mapping from SGamut3.Cine to P3D65. This left plenty of flexibility to push moments of the film more pastel or into an absolutely photorealistic rendition.
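
As a rough illustration of the kind of transform Trieu describes, the sketch below pairs a generic S-curve with an SGamut3.Cine to P3D65 gamut mapping using the open source colour-science Python library; the curve shape, its parameters and the library choice are assumptions for illustration, not the production LUT.

```python
# Illustrative sketch only: a generic S-curve plus an S-Gamut3.Cine -> P3-D65
# gamut mapping, in the spirit of the show LUT described above. The curve
# parameters are placeholders; the real LUT is not public.
import numpy as np
import colour  # colour-science package

def s_curve(x, mid_grey=0.18, exponent=1.6):
    """Simple sigmoid tone curve: mid-grey maps to ~0.5, highlights roll off."""
    x = np.maximum(x, 0.0)
    return x**exponent / (x**exponent + mid_grey**exponent)

def show_lut_sketch(rgb_sgamut3cine_linear):
    """Map scene-linear S-Gamut3.Cine to P3-D65, then apply the tone curve."""
    rgb_p3 = colour.RGB_to_RGB(rgb_sgamut3cine_linear, "S-Gamut3.Cine", "P3-D65")
    return s_curve(rgb_p3)

# Example: an 18% grey patch lands at roughly 0.5 after the curve.
print(show_lut_sketch(np.array([0.18, 0.18, 0.18])))
```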


Naturally, a lot of this movie takes place underwater. One of our priorities was maintaining photorealism through huge volumes of water. That means grading volume density to convey a sense of scale. Closeups can be clear, contrasty, and vividly saturated, but as you increase distance from a subject, even in the clearest water, the spectrum fades away to blue. This was something we could dial in quickly and interactively in the DI. Anytime we needed to convey depth, we’d add more blue and subtract red and green.
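
To make the idea concrete, here is a minimal numpy sketch of that kind of depth-driven shift toward blue; the depth matte, the per-channel ratios and the strength value are all hypothetical, purely to show the principle of pulling red and green down faster than blue as distance increases.

```python
# Minimal sketch (not the production grade): fade distant areas toward blue by
# attenuating red and green faster than blue. "depth" is a hypothetical matte
# in [0, 1], where 1.0 means far from camera; the ratios are placeholders.
import numpy as np

def underwater_depth_grade(rgb, depth, strength=0.35):
    d = np.clip(depth, 0.0, 1.0)[..., np.newaxis] * strength
    gains = 1.0 - d * np.array([1.0, 0.6, 0.1])  # red falls fastest, blue barely moves
    return np.clip(rgb * gains, 0.0, None)

# A far-away grey pixel keeps most of its blue but loses red and green:
print(underwater_depth_grade(np.array([0.5, 0.5, 0.5]), np.array(1.0)))
```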

What was it like working with Cameron during the mastering process?

I’ve never worked with a director who can so quickly and precisely communicate their creative intention. I was blown away by his attention to detail and ability to instinctually make very thoughtful and creative decisions – he would often voice his rationale for even simple grading and framing decisions. As groundbreaking as the film is, his priorities never stray from the characters, the story, and enhancing the audience’s connection with them.

You grade with DaVinci Resolve Studio. Did your work remain confined to the Colour page or did you use other pages such as Fusion or Edit?

I have a background as a DI editor, so I’m very hands on in the conform and editorial process. I spent almost as much time on the Edit page as I did in Colour. I didn’t go into Fusion on this job, but that’s mostly due to the improvements in the ResolveFX toolset. Almost everything I needed to do beyond the grade could be done with those tools right on the Colour page. This was advantageous because those grades could be easily ColourTraced and propagated across multiple simultaneous grades for different aspect ratios and light levels.

Can you share with us a workflow you use and how it is affected by the stereoscopic work?

I’m a big fan of keeping things simple and automating what I can. I made heavy use of the Resolve Python API on this project. I wrote a system for indexing VFX deliveries once they arrived at the DI so that my DI Editor, Tim Willis from Park Road Post, and I could very quickly load up the latest versions of shots. I could take an EDL of what I currently had in the cut and in seconds have an update layer of all the latest shots so we could make our final stereo reviews in scene context.
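
For a sense of what such automation can look like, here is a simplified sketch using the DaVinci Resolve scripting API; the delivery path, the SHOT_v### naming convention and the overall structure are assumptions for illustration, not Trieu's actual indexing system.

```python
# Simplified sketch of the kind of automation the DaVinci Resolve Python API
# allows (not the actual production tooling): scan a delivery folder for the
# newest version of each VFX shot and bring it into the media pool.
# The path and the "ABC1234_v001" naming convention are hypothetical.
import os
import re
import DaVinciResolveScript as dvr  # ships with DaVinci Resolve Studio

DELIVERY_ROOT = "/mnt/san/vfx_deliveries"  # hypothetical SAN path
VERSION_RE = re.compile(r"^(?P<shot>[A-Z]{3}\d{4})_v(?P<ver>\d{3})$")

def latest_versions(root):
    """Return {shot_name: path_to_newest_version_folder}."""
    latest = {}
    for entry in sorted(os.listdir(root)):
        m = VERSION_RE.match(entry)
        if not m:
            continue
        shot, ver = m.group("shot"), int(m.group("ver"))
        if shot not in latest or ver > latest[shot][0]:
            latest[shot] = (ver, os.path.join(root, entry))
    return {shot: path for shot, (ver, path) in latest.items()}

resolve = dvr.scriptapp("Resolve")
project = resolve.GetProjectManager().GetCurrentProject()
media_pool = project.GetMediaPool()

# Import the newest version folders; AppendToTimeline() could then stack an
# update layer over the current cut for review in scene context.
clips = media_pool.ImportMedia(list(latest_versions(DELIVERY_ROOT).values()))
```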

This film was doubly challenging, not only because of the stereo, but also because we were working in high frame rate (48fps). Even on a state of the art workstation with 4x A6000 GPUs, real time performance is difficult to guarantee. It’s a really delicate balance between what’s sustainable over the SAN’s bandwidth and what’s gentle enough for the system to decode quickly. Every shot was delivered as OpenEXR frames with as many as five or six layers of mattes for me to use in the grade. Ian Bidgood at Park Road had a really clever idea to have WetaFX write the RGB layer as uncompressed data, but ZIP compress the mattes within the same file. This meant we had rock solid playback performance, really fast rendering for deliverables, and the file sizes were barely larger than if they hadn’t contained the mattes.
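
For context, OpenEXR's multi-part format is what makes that split of per-layer compression within one file possible. The sketch below shows the general shape of such a file using the OpenEXR 3.x Python bindings; the exact binding calls, attribute names and part naming should be treated as assumptions to check against the library's documentation, and this is not WetaFX's actual pipeline code.

```python
# Rough sketch of the idea described above: one multi-part EXR where the RGB
# part is uncompressed (fast to decode for playback) and the matte part is
# ZIP compressed (small on disk). Written against the OpenEXR 3.x Python
# bindings; treat the exact calls and attribute names as assumptions.
import numpy as np
import OpenEXR

height, width = 2160, 4096
rgb = {c: np.zeros((height, width), dtype=np.float16) for c in ("R", "G", "B")}
mattes = {m: np.zeros((height, width), dtype=np.float16) for m in ("matte.water", "matte.skin")}

rgb_part = OpenEXR.Part(
    {"compression": OpenEXR.NO_COMPRESSION, "type": OpenEXR.scanlineimage, "name": "rgba"},
    rgb,
)
matte_part = OpenEXR.Part(
    {"compression": OpenEXR.ZIP_COMPRESSION, "type": OpenEXR.scanlineimage, "name": "mattes"},
    mattes,
)

OpenEXR.File([rgb_part, matte_part]).write("shot_sketch.0001.exr")
```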

Were there challenges between the SDR and HDR grades?

We had the relatively unique luxury of working in Dolby Vision 3D from day one. Our hero grade was the Dolby 3D version at 14fL in extended dynamic range. This is a great way to work because you can see everything so well. This is critical in stereo reviews where you need to see if the compositing is working correctly or if there are tweaks to be made.

Once you grade for Dolby Vision, standard digital cinema 2D at 14fL is a relatively simple transformation, with some custom trims. You lose your deep blacks with a traditional DLP projector, but it’s just as bright as Dolby 3D. One of the biggest challenges was creating the 3.5fL grade that, unfortunately, is the standard for most commercial 3D digital cinemas out there. It’s an exacting process to create the illusion of contrast and saturation with so little light. We have to make certain decisions and allow background highlights to roll off early in order to preserve contrast in high dynamic range scenes, like day exteriors. Night scenes are much more forgiving.
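
For a sense of scale, those foot-lambert figures translate to roughly 48 nits for the Dolby 3D and standard 2D cinema grades versus roughly 12 nits for the 3.5fL version, using the standard conversion of 1 fL ≈ 3.426 cd/m²:

```python
# Back-of-envelope conversion of the grading targets mentioned above.
FL_TO_NITS = 3.426  # 1 foot-lambert is about 3.426 cd/m^2 (nits)

for label, fl in [("Dolby 3D / standard 2D cinema", 14.0), ("typical commercial 3D", 3.5)]:
    print(f"{label}: {fl} fL ≈ {fl * FL_TO_NITS:.0f} nits")
# Dolby 3D / standard 2D cinema: 14.0 fL ≈ 48 nits
# typical commercial 3D: 3.5 fL ≈ 12 nits
```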

What’s your “go to” tool in DaVinci Resolve Studio?

ColourTrace was critical for me on this film. Each reel of the film was contained in its own Resolve project, each ultimately containing 11 unique timelines, one for each of our various theatrical picture deliverables. Tim Willis, my DI editor, kept those in editorial parity across cut changes and VFX updates. When we’d lock a grade in one format, I’d ColourTrace those grades to the other formats and trim them further. If we made changes to framing and composition in one, I could easily ripple those changes back through the other formats without overwriting the grading decisions. It’s simple, but the amount of time saved and the elegance of that sort of workflow is what kept us from working too many late nights.

Was there a favourite scene from the movie that you loved grading or that presented a unique challenge?

There’s a “town hall” scene between the Sully family and the Metkayina clan that takes place during a rainstorm. It’s an absolutely gorgeous scene that evokes Rembrandt. The cold, overcast skies wrap around the characters and a subtle warm accent light gives the scene a really nice dynamic. It’s insane how absolutely real everyone looks. You have to actively remind yourself, “everything in this shot is artist generated, none of this beyond the actors’ performances is ‘real’ per se.” It’s truly a generational leap in visual effects artistry and technology. It’s absolutely extraordinary.
