#NABShow 2014 Day 02.02 Shooting and post for EDR/ HDR #edr @dolby

Panel was:
Curtis Clark, ASC (Moderator)
Matt Litwiller, Observatory Media
Collin David, Observatory Media
Travis LaBella, Observatory Media
Rick Taylor, Dolby Laboratories

EDR stands for Extended Dynamic Range, which is Dolby’s High Dynamic Range (HDR) application. The discussion covered EDR in reference to one specific short film that was put through post as an EDR project, Telescope.


Telescope was a short film shot on the F65 that followed an ACES workflow color pipeline (more on ACES here), so it was able to adapt during the finish to an EDR color and mastering pipeline.

EDR has an extended contrast range, viewable on the Dolby Pulsar monitor, which has a 4000-nit peak output (a nit is a measure of luminance, equal to one candela per square metre).

EDR mode on the F65 has a theoretical 19 stops of latitude, which sits comfortably within an ACES colour pipeline, which can handle up to a 30-stop dynamic range. Only a few cameras can capture this kind of dynamic range- the F65 and RED Dragon were chosen for this very specific task.
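
To make those stop figures concrete, here is a quick back-of-envelope sketch in Python- dynamic range in stops is just the base-2 log of the contrast ratio, and the 30-stop figure is the commonly quoted usable range of the ACES 16-bit half-float container:

```python
import math

def stops(contrast_ratio: float) -> float:
    """Dynamic range in stops is the base-2 log of the contrast ratio."""
    return math.log2(contrast_ratio)

# A camera mode quoted at 19 stops spans a contrast ratio of about:
print(f"19 stops -> {2 ** 19:,}:1")                       # 524,288:1

# ACES scene-linear data lives in 16-bit half-float OpenEXR files,
# whose usable range is commonly quoted at around 30 stops:
print(f"30 stops -> {2 ** 30:,}:1")                       # ~1.07 billion:1

# And the Pulsar's 4000-nit peak against a 100-nit Rec.709 reference white:
print(f"4000 vs 100 nits -> {stops(4000 / 100):.1f} stops brighter")   # ~5.3
```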

Both DPs on stage spoke about the fantastic dynamic range, which enables far superior detail in highlight and shadow areas, and noted that the decisions when mastering were about what to leave out, not what you could hope to include. The wide gamut and deep colour saturation gives a very different palette with which to tell the story, and allows a real finesse to the grading of the material shot.

The high latitude allowed for a high degree of practical lighting from the art direction- lighting that comes from the actual set, in this case the LED buttons in the spaceship and the ambient light from fixtures built into the set. This style of shooting will give cinematographers greater freedom because they will not need to add fill light quite so much- they can rely on the latitude a lot more. This will be a gift to night and low-light shooting.

What has not caught up yet is the on-set monitoring and scopes available, so while it was being shot, the film was only monitored in Rec. 709 color space, discarding a lot of the detail. In fact, quite a few of the participants said that they did not realize the extent of the detail and color range available in the EDR shots until they got into the DI and mastering.
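
A toy illustration of why Rec. 709 monitoring discards detail: anything above display white simply clips. This is a deliberate simplification (a real pipeline applies a proper tone map and gamut conversion, not a bare clamp), but it shows where the highlight information goes on set:

```python
# Toy example: previewing scene-linear HDR values on an SDR (Rec.709) monitor
# without a proper tone map clamps everything above display white, so distinct
# highlight values collapse into the same displayed white.
scene_linear = [0.05, 0.18, 0.90, 1.0, 4.0, 12.0, 40.0]   # shadows, mid grey, highlights

def preview_sdr(value: float) -> float:
    clipped = min(value, 1.0)        # the display cannot exceed its own white
    return clipped ** (1 / 2.4)      # approximate display gamma for the preview

for v in scene_linear:
    print(f"scene-linear {v:6.2f} -> monitor {preview_sdr(v):.3f}")
# Everything above 1.0 prints as 1.000: the detail is still in the recorded
# file, but invisible on the on-set monitor.
```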

This goes right into the politics of workflow- how do you manage expectations and get consensus on the ‘look’ of the film when you can’t view even an approximation of the final product while you’re on set? Dolby is showing a prototype at this NAB show to address this- an on-set monitor that displays 2000 nits and would be priced to be practical for on-set use. The VFX unit on Telescope was taking 16-bit linear EXR files anyway, so they were aware of the maximum latitude available, even though they did not yet have the monitoring or compositing techniques to take full advantage of it.

Dolby uses the Dolby Vision system to create the various format deliverables, and this system was able to land onto the many different colour spaces required for a full deliverable list. The grade was still done in P3 colour space, and regulated by ACES. In fact, during this process the ACES RRT (Reference Rendering Transform) needed to be adjusted, as it rolled off dramatically at 14 fL (foot-lamberts), whereas EDR carries usable image information well beyond that point. As a result of this process, the new RRT provides extended headroom to allow for EDR images.
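
For reference, foot-lamberts convert to nits at roughly 3.426 cd/m² per fL, so the 14 fL roll-off point corresponds to the familiar ~48-nit cinema reference white- a small fraction of the Pulsar’s 4000-nit peak. A quick sketch of the arithmetic:

```python
import math

NITS_PER_FOOTLAMBERT = 3.426   # 1 fL = 1/pi candela per square foot ≈ 3.426 cd/m² (nits)

def fl_to_nits(foot_lamberts: float) -> float:
    return foot_lamberts * NITS_PER_FOOTLAMBERT

print(f"14 fL ≈ {fl_to_nits(14):.0f} nits (the familiar cinema reference white)")  # ~48
print(f"4000-nit Pulsar peak ≈ {4000 / NITS_PER_FOOTLAMBERT:.0f} fL")              # ~1168
print(f"headroom above 14 fL: {math.log2(4000 / fl_to_nits(14)):.1f} stops")       # ~6.4
```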

“The beauty of ACES is that it allows for these issues with new formats to be resolved, so it is a constantly evolving workflow tool”– Curtis Clark

 

The promise of this style of shooting is only just being exploited. The panelists imagined, for example, a horror film shot completely in the low end, in the deep blacks of an image, and what that could do for storytelling. This was a major thread of the talk from every panelist- that these techniques need to support the storytelling. This is just the start, and you can imagine how a new generation of filmmakers will take advantage of the new latitude- a generation that never knew 9 or 12 stops of image latitude as a limit in the first place.

 

New Camera Test

Back home in New York this week, I went to a camera test seminar downtown comparing most of the new generation of digital cameras and DSLRs side by side, with two 35mm film stocks thrown in as well.

What was interesting was that the company doing the tests was a camera rental company, who actually stated up front that they were ‘only concerned with image quality, and wouldn’t be covering workflow’ in the discussion.

But if you’re comparing, say, Sony’s new F3 camera against a Red One against an ARRI Alexa against 35mm- all with different colour spaces, gamma curves, processing options, compression and dynamic ranges that can be manipulated via various metadata and after-the-shoot control options, all with a very real effect on the image- then the workflow is absolutely a determinant of the image quality.

To put these images side by side with no discussion of, or even interest in discussing, the workflow used to get to these images is missing the point a bit. And anyone who tells you differently is probably just trying to sell you something.

ARRI Raw Dailies Workflow

This article first appeared in Definition Magazine, UK, in 2011.


ARRI Alexa

What were your thoughts when you first discovered that you were going to be taking an ArriRAW film through dailies?

Sixteen19 was involved with some of the very first ArriRaw jobs anywhere. We shared the workflow execution with another company on Extremely Loud & Incredibly Close in New York in 2010, and then set up Gambit soon after in London.

The Codex recorder had just been certified by Arri to be used as a recording device for ArriRaw, which was a significant development because Arri themselves hadn’t made a recorder for this format- they were relying on a secondary market of companies to build accessories for the camera. There are a few factors that go into investigating any new format.

For feature film work the first thing you need to find out is what the format is doing in its colour space and debayering. Basically, how the raw data will be processed in order to look like film.

In simple terms the colour space is represented by a ‘colour curve’, which basically describes how the image colour tonality is represented within a digital image. Sony has S-Log, Panavision has PanaLog, and Arri has ArriRaw- each of these is a manufacturer-specific way of encoding and representing a digital image. These in turn almost always reference Cineon encoding, which was the original file type that Kodak developed to represent a film frame as a film scan.
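
As an illustration of what such a curve does, here is a minimal sketch of a Cineon-style log encode/decode using the commonly published constants (10-bit code values, white point at code 685, black at 95, 0.002 density per code value, 0.6 negative gamma). The camera curves mentioned above use their own constants, but the shape of the idea- squeezing a wide linear range into a compact log code- is the same:

```python
import math

# Commonly published Cineon constants (10-bit code values).
WHITE_CODE = 685          # code value for nominal white
BLACK_CODE = 95           # code value for black
DENSITY_PER_CODE = 0.002  # printing density step per code value
NEG_GAMMA = 0.6           # negative gamma

_BLACK_OFFSET = 10 ** ((BLACK_CODE - WHITE_CODE) * DENSITY_PER_CODE / NEG_GAMMA)

def cineon_encode(linear: float) -> float:
    """Scene-linear value (1.0 = nominal white) -> Cineon-style 10-bit code."""
    lifted = linear * (1.0 - _BLACK_OFFSET) + _BLACK_OFFSET
    return WHITE_CODE + (NEG_GAMMA / DENSITY_PER_CODE) * math.log10(lifted)

def cineon_decode(code: float) -> float:
    """Cineon-style 10-bit code -> scene-linear value."""
    lifted = 10 ** ((code - WHITE_CODE) * DENSITY_PER_CODE / NEG_GAMMA)
    return (lifted - _BLACK_OFFSET) / (1.0 - _BLACK_OFFSET)

for lin in (0.0, 0.18, 1.0, 2.0):
    code = cineon_encode(lin)
    print(f"linear {lin:4.2f} -> code {code:6.1f} -> decodes back to {cineon_decode(code):.3f}")
```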

The debayering really just means how software further down the workflow will interpret the raw digital image data into a coherent picture image.
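
And a toy demosaic, purely to illustrate what ‘interpreting the raw data into a picture’ means- each photosite records only one colour, and software has to rebuild full RGB pixels. Real debayers (including ARRI’s) use far more sophisticated, edge-aware interpolation than this half-resolution sketch:

```python
def debayer_rggb(raw):
    """Toy demosaic: collapse each 2x2 RGGB cell into a single RGB pixel.

    `raw` is a 2D list of sensor values laid out as
        R G
        G B
    Each photosite holds one colour value; here we just average the two greens
    and pair them with the red and blue of the same cell, producing a
    half-resolution RGB image.
    """
    height, width = len(raw), len(raw[0])
    rgb = []
    for y in range(0, height - 1, 2):
        row = []
        for x in range(0, width - 1, 2):
            r = raw[y][x]
            g = (raw[y][x + 1] + raw[y + 1][x]) / 2.0
            b = raw[y + 1][x + 1]
            row.append((r, g, b))
        rgb.append(row)
    return rgb

# A 4x4 sensor patch becomes a 2x2 RGB image.
patch = [
    [200,  90, 210,  95],
    [ 85,  40,  88,  42],
    [190,  92, 205,  94],
    [ 80,  38,  84,  41],
]
print(debayer_rggb(patch))
```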

Once you’ve worked out how these two elements work together to produce an image out of the camera system, you are ready to use them to give the DP the look he is after on the dailies, either through a DIT colour grading on set, or by having a dailies colourist go a more traditional route and do this work after the shoot is finished.

It’s best not to lose sight of what the dailies pipeline is for- it’s not just about creating pretty pictures, although that is the most immediately obvious role that dailies have. Doing the daily rushes on any film is as much about organising the material that’s coming from set, so that it can be coherently used by an editor and post production team for a period sometimes four times as long as the shoot itself. So the hidden side of any new camera is also working out how to get as much information out of the process as possible. This is called the metadata, the information about the data you hold.
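
To give a flavour of what that organisation looks like in practice, here is a sketch of the kind of per-clip record a dailies system might keep and write out as a manifest. The field names and values are illustrative only, not any specific system’s schema:

```python
import csv
from dataclasses import dataclass, asdict

@dataclass
class ClipRecord:
    # Illustrative fields only -- a real dailies system tracks far more
    # (lens data, applied LUT, colourist notes, camera reports, and so on).
    reel: str
    clip: str
    scene: str
    take: str
    start_tc: str
    end_tc: str
    source_file: str
    checksum_md5: str

def write_manifest(records, path="dailies_manifest.csv"):
    """Write the day's clip metadata out as a simple CSV manifest."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(ClipRecord.__dataclass_fields__))
        writer.writeheader()
        for record in records:
            writer.writerow(asdict(record))

# Usage with made-up example values:
write_manifest([
    ClipRecord(reel="A001", clip="A001C003", scene="24A", take="2",
               start_tc="11:02:14:08", end_tc="11:03:01:12",
               source_file="A001C003_110214_R1AB.ari",
               checksum_md5="md5-of-source-file-goes-here"),
])
```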

How much involvement did you have with establishing the on-set workflow? How important is collaboration between DIT and DoP on the one hand, and equipment on the other?

In my role as Workflow Supervisor, I usually kick the workflow around between the DP, editor and producer/ studio for a few weeks in pre-production. The DIT usually comes into that conversation as well, which is a factor. There are also tests that need to be run, first of all from the set to dailies/ editorial, then right the way through to VFX plate delivery and the digital intermediate finishing for the feature. It’s a mix of technical knowledge, equipment product knowledge and politics. The diplomacy and politics can be measured exactly in the gap between the requirements of the principal creatives and the shortfalls of any budget.

The collaboration starts with how the DP wants to work with a DIT, which can range from a simple monitoring setup that shows people on set the camera images, and can grow into a whole grading setup ‘in the tent’, with full colour management and image control.

This collaboration also extends to the editorial crew, in working out precisely how they want their files collected and organised. The quality control (QC) pass to check each file is intact, sound synching to picture, and laying everything down into the correct bins and structure are all negotiated with the editor and crew.

And finally the archive is made, which is arguably the most important process of all, because this is where the real asset is collected- it’s stored on an LTO5 tape, and is akin to the exposed negative of a film shoot- treat it with extreme care.
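
That status as the ‘real asset’ rests on checksum verification: every copy (and eventually the LTO write itself) is confirmed against the source hashes before anything is cleared. A minimal sketch of the idea, with hypothetical paths:

```python
import hashlib
from pathlib import Path

def md5_of(path: Path, chunk_size: int = 8 * 1024 * 1024) -> str:
    """Hash a file in chunks so large camera files never need to fit in RAM."""
    digest = hashlib.md5()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify_copy(source_dir: str, copy_dir: str) -> list[str]:
    """Return relative paths of any files whose copy doesn't match its source."""
    mismatches = []
    for src in Path(source_dir).rglob("*"):
        if src.is_file():
            rel = src.relative_to(source_dir)
            dst = Path(copy_dir) / rel
            if not dst.is_file() or md5_of(src) != md5_of(dst):
                mismatches.append(str(rel))
    return mismatches

# Only when this comes back empty (and the tape write has itself been verified)
# would the original source be considered safe to clear.
# verify_copy("/mnt/datapack/A001", "/mnt/archive_staging/A001")   # hypothetical paths
```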

The job of Workflow Supervisor is to make sure that the transition between each step is managed and controlled to ensure that every frame gets to editorial and onto an archive tape.

How did you ensure monitor calibration throughout the chain?

You hire a colour scientist to generate the required Colour Management Look Up Tables (LUTs) and calibrate monitors to the required specification. This used to be real voodoo, but now comes down to standards that can be pulled from software. The Arri website even has quite a good LUT generator that can produce LUTs for a few different colour management systems.
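
At its simplest, a 1D LUT is nothing more than a lookup table with interpolation; a 3D LUT for full colour management works the same way in three dimensions. A minimal sketch of applying one (in practice you would load a vendor- or probe-generated LUT rather than type one in by hand):

```python
def apply_1d_lut(value: float, lut: list[float]) -> float:
    """Map a normalised 0-1 input through a 1D LUT with linear interpolation."""
    value = min(max(value, 0.0), 1.0)
    position = value * (len(lut) - 1)
    lower = int(position)
    upper = min(lower + 1, len(lut) - 1)
    frac = position - lower
    return lut[lower] * (1.0 - frac) + lut[upper] * frac

# A hand-made five-point LUT that lifts the shadows slightly. This is purely
# illustrative; a real calibration LUT comes from a probe or a LUT generator.
lut = [0.0, 0.30, 0.55, 0.78, 1.0]
for v in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(f"{v:.2f} -> {apply_1d_lut(v, lut):.3f}")
```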

It’s probably just as important to ensure that all of the key monitors around the set and dailies environment match in terms of calibration. A monitor that’s out of alignment with everything else can cause quite a few problems down the chain, not the least of which is confidence in the pipeline itself. Confidence is sometimes the hardest thing to get back on set, so again, this work should be done during the camera testing period.

This is one of the most important things to stress. The advent of digital cameras trashed the notion that there was any ‘one way’ to execute post production on a feature film. It has actually done the opposite- given filmmakers an unrivalled set of options for post, to improve the production values of the film, so it can compete with films that have double or triple the budget. The camera tests before shoot are no longer just camera tests- they should be extended into the teams that will execute dailies, vfx, and the finishing of the film. One doesn’t have to lock down every detail of every shot before the first day of shoot, but the whole process should be described, agreed on by heads of department and tested before the first day of shoot.

The material came in to you via Datapack – a simple process?

Fairly straightforward. It’s basically just like copying any other data- it just takes time. After a few tests, you start to get a feel for the time factor- 2x runtime, 3x runtime, and this should be communicated to all so people have an idea of the timing.
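
Those runtime multiples come straight from the arithmetic of data rate versus copy speed. A rough sketch, with the ArriRaw rate derived from resolution, bit depth and frame rate (actual figures vary with the format and with how fast your copy target really is):

```python
# Rough copy-time estimate. The frame size is derived for 16:9 ArriRaw
# (2880 x 1620, 12-bit) at 24 fps; actual data rates vary by format.
WIDTH, HEIGHT, BITS, FPS = 2880, 1620, 12, 24

frame_bytes = WIDTH * HEIGHT * BITS / 8          # ≈ 7.0 MB per frame
data_rate_mb_s = frame_bytes * FPS / 1e6         # ≈ 168 MB of material per second

def runtime_multiple(copy_speed_mb_s: float) -> float:
    """Minutes of copying per minute of recorded footage."""
    return data_rate_mb_s / copy_speed_mb_s

for speed in (80, 160, 400):                     # e.g. single disk, RAID, fast SAN
    print(f"copy target {speed:>3} MB/s -> {runtime_multiple(speed):.1f}x runtime")
```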

The more important process is cycling the mags back to set. A film’s budget can only afford a set number of mags, and the camera department shouldn’t have to stop filming just because a mag isn’t available. It’s not just a matter of copying and sending back, however: the digital assets from that datapack should only be cleared once the archive has safely been written out to tape- only then is the asset ‘secure’ and the mag can be wiped and returned to set.

How did the dailies workflow operate?

The private joke about workflow diagrams is that the actual work isn’t in the bubbles, but in the lines.

2013 ARRI RAW Workflow document.

Any special considerations that had to be taken into account due to using the ArriRAW format?

They are not special considerations, but people need to be aware of the file size- ArriRaw is a large, high bandwidth, high latitude format, and so takes considerably more time in data movement and processing than ProRes or smaller compressed files. Be aware also that there are subtle differences between the video line out of the camera that you use for monitoring and the actual file that the camera records. During your testing phases with the DP and crew, you should be able to describe this and factor it into your working methods.
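
To put the size difference in perspective, a back-of-envelope comparison of storage footprint per hour of footage. The ArriRaw rate is derived as in the copy-time sketch above; the ProRes figure assumes roughly 330 Mbit/s, a commonly quoted rate for ProRes 4444 at 1080p, so treat both as approximations:

```python
# Back-of-envelope storage footprint per hour of footage.
ARRIRAW_MB_S = 2880 * 1620 * 12 / 8 * 24 / 1e6   # ≈ 168 MB/s (16:9, 12-bit, 24 fps)
PRORES_4444_MB_S = 330 / 8                       # ≈ 41 MB/s, assuming ~330 Mbit/s

def gb_per_hour(mb_per_second: float) -> float:
    return mb_per_second * 3600 / 1000

print(f"ArriRaw      ≈ {gb_per_hour(ARRIRAW_MB_S):5.0f} GB per hour")      # ~605 GB
print(f"ProRes 4444  ≈ {gb_per_hour(PRORES_4444_MB_S):5.0f} GB per hour")  # ~149 GB
print(f"ratio        ≈ {ARRIRAW_MB_S / PRORES_4444_MB_S:.1f}x")            # ~4x
```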

And finally, say I’m coming to you with my latest movie and I’m planning on shooting it in the format, what are your three top bits of advice?

Give plenty of time for testing during your camera tests – don’t leave it until the first day of shoot.

Consult not only with the DP, but also with the editor and post supervisor down the chain- insist on a workflow test from the camera tests right through to a conform of the data at your DI house, one that takes into account all the stages and people that the data has to pass through.

Don’t have one person do everything – this may be the cheapest way out, but once shooting ramps up, you’ll regret your decision when that guy is hospitalised due to working for days straight. It’s not pretty…