#NABShow Day 03.02 @CodexDigital booth

An evolutionary year at NAB for the guys at Codex Digital, who were featuring two really interesting developments for the company.

Codex Thunderbolt dock + Codex review + Vault


Codex has completely revamped their UI, which now looks really nice, modern and streamlined, and they've added an impressive suite of features. The demo we had showed metadata editing, CDL grading and QC review. The performance from the connected Thunderbolt dock was really impressive (500 MB/sec from dock to Vault), and the playback control from the little touchscreen on the Vault was astonishing once we found out we were looking at 4K playing back. I can see this suite of tools being the first real competition for Colorfront OSD in years.
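To put that throughput in context, here's a rough back-of-envelope calculation- the 512 GB mag size and the sustained rate are assumptions for illustration, not Codex specs:

```python
# Rough back-of-envelope: how long a 500 MB/sec dock-to-vault link takes to
# move a camera mag. The 512 GB capacity and the sustained rate are
# assumptions for illustration, not Codex specifications.
def offload_minutes(mag_gb: float, rate_mb_per_sec: float = 500.0) -> float:
    """Return the approximate transfer time in minutes."""
    return (mag_gb * 1000.0) / rate_mb_per_sec / 60.0

if __name__ == "__main__":
    print(f"512 GB mag: ~{offload_minutes(512):.0f} minutes")   # ~17 minutes
```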

Codex Action Cam.


This is really where you see that Codex has what a lot of other companies only have on their roadmap: a technology platform that lets them explore products outside their 'core business', but which, when you see them, make so much sense and also have that much-sought-after 'cool'.

These little cameras they've developed can be used as a POV angle, a witness cam, or just to give your VFX department a slightly offset element for extending plates. You could see how they could be applied to a 3D shoot as well. They are powered directly off an attached Codex Onboard recorder and give the DP an extra tool to maximise coverage on a shoot.

They look good and are just such a neat idea for the company.

#NABShow 2014 Day 02.02 Shooting and post for EDR/ HDR #edr @dolby


Panel was:
Curtis Clark, ASC (Moderator)
Matt Litwiller, Observatory Media
Collin David, Observatory Media
Travis LaBella, Observatory Media
Rick Taylor, Dolby Laboratories

EDR stands for Enhanced Dynamic Reproduction, which is Dolby's High Dynamic Range (HDR) application. The discussion covered EDR with reference to one specific short film that was put through post as an EDR project: Telescope.



Telescope is a short film that was shot on the Sony F65 and followed an ACES colour pipeline (more on ACES here), so it was able to adapt during finishing to an EDR colour and mastering pipeline.

EDR has an extended contrast range, which is viewable on the Dolby Pulsar monitor with its 4,000-nit peak output (a nit is a measure of luminance, equal to one candela per square metre).

EDR mode on the F65 has a theoretical 19 stops of latitude, and sits comfortably within an ACES colour pipeline, which can handle up to a 30-stop dynamic range. Only a few cameras can capture this kind of dynamic range- the F65 and RED Dragon were chosen for this very specific task.
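To put those stop counts in perspective: a stop is a doubling of light, so N stops of latitude corresponds to a contrast ratio of 2^N between the brightest and darkest detail the system can hold. A quick sketch:

```python
# A stop is a doubling of light, so N stops of latitude correspond to a
# contrast ratio of 2**N between the brightest and darkest detail captured.
def contrast_ratio(stops: int) -> int:
    return 2 ** stops

for stops in (12, 19, 30):
    print(f"{stops} stops  ->  {contrast_ratio(stops):,d}:1")
# 12 stops  ->  4,096:1
# 19 stops  ->  524,288:1
# 30 stops  ->  1,073,741,824:1
```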

Both DPs on stage spoke about the fantastic dynamic range, which enables far superior detail in highlights and shadow areas, and both noted that the decisions when mastering were about what to leave out, not what you could hope to include. The wide gamut and deep colour saturation give a very different palette with which to tell the story, and allow a real finesse in grading the material.

The high latitude allowed for a high degree of practical lighting from the art direction- lighting that comes from the actual set, in this case from the LED buttons in the spaceship and the ambient light from fixtures that are part of the set. This style of shooting will give cinematographers greater freedom because they will not need to add nearly as much fill light- they can rely on the latitude a lot more. This will be a gift to night and low-light shooting.

What has not caught up yet is the on-set monitoring and the scopes available, so while the film was being shot it was only monitored in REC709 colour space, discarding a lot of the detail. In fact, quite a few of the participants said that they did not realise the detail and colour range available in the EDR shots until they got into the DI and mastering.

This goes right into the politics of workflow: how do you manage expectations and get consensus on the 'look' of the film when you can't view even an approximation of the final product while you're on set? Dolby is showing a prototype at this NAB show to address this- an on-set monitor that displays 2,000 nits and would be priced to make sense on set. The VFX unit on Telescope was taking 16-bit linear EXR files anyway, so they were aware of the maximum latitude available, even though they did not yet have the monitoring or compositing techniques to take full advantage of it.


Dolby uses the Dolby Vision system to create the various format deliverables, and this system was able to map to the many different colour spaces required for a full deliverables list. The grade was still done in P3 colour space, and regulated by ACES. In fact, during this process the ACES RRT (Reference Rendering Transform) needed to be adjusted, as it rolled off dramatically at 14 fL (foot-lamberts), whereas the EDR material carried much higher latitude beyond that point. As a result, the new RRT now gives extended headroom to allow for EDR images.
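For reference, foot-lamberts and nits convert directly (1 fL is roughly 3.43 cd/m², i.e. 3.43 nits), which puts the 14 fL roll-off and the 4,000-nit Pulsar on the same scale. A quick sketch:

```python
# Converting between foot-lamberts (fL) and nits (cd/m^2): 1 fL is roughly 3.426 nits.
NITS_PER_FL = 3.426

def fl_to_nits(fl: float) -> float:
    return fl * NITS_PER_FL

def nits_to_fl(nits: float) -> float:
    return nits / NITS_PER_FL

print(f"14 fL ~ {fl_to_nits(14):.0f} nits")      # ~48 nits, the traditional cinema white point
print(f"4000 nits ~ {nits_to_fl(4000):.0f} fL")  # ~1168 fL on the Pulsar
```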

"The beauty of ACES is that it allows for these issues with new formats to be resolved, so it is a constantly evolving workflow tool"- Curtis Clark

 

The promise of this style of shooting is only just being explored. The panelists imagined, for example, a horror film shot completely in the low end, in the deep blacks of an image, and what that could do for storytelling. This was a major thread of the talk from every panelist- that these techniques need to support the storytelling. This is just the start, and you can imagine how a new generation of filmmakers will take advantage of the new latitude- a generation that never knew that 9 or 12 stops of image latitude was ever the limit.

 

#NABshow 2014 Day 2.01- Distributive Creativity

Or: How I learned to stop worrying and love ‘the cloud’..


Speakers:
Josh Rizzo, Hula Post (Moderator)
Joe Beirne, Technicolor/ Postworks NY
Ramy Katrib, Digital Film Tree
Matt Schneider, Technicolor/ Postworks NY

A great session that spoke to a lot of what's going on in post production and computing at the moment.

 

We've had 100 years of filmmaking, 30 years of computing, and about 10 years of trying to integrate the two, with some successes and some failures. The inevitable total merging of these processes is that catch-all concept, 'the cloud'. But in this context the cloud really just means centralised networked storage, combined with the ability to bring creative and technical processes to that storage.

In this environment, post production is facing extreme competition from clients who can and will hire commodity machines to get work done at lower prices. The post facility needs to be smarter than just a for-hire rental house, and needs better technology than the client can source for themselves.

OpenStack

Ramy Katrib from Digital Film Tree took up this theme by talking about OpenStack, an open source cloud project initiated by NASA and Rackspace. OpenStack gives you the ability to set up an open cloud environment that is "like Amazon & Azure, but way cooler". It lets you set up storage and networking in the same place, is highly scalable and avoids vendor lock-in. And because it is open source it engenders what Ramy called "co-opetition": a platform used among highly competitive entities, who develop individually and share common technology at the same time.

In OpenStack you can have many 'stacks' that are geographically distributed but appear as one storage block. It has the intelligence to distribute and sync files to all parts of the stack- in this case the Production Office, Studio Archive, VFX and Editorial may all be separate stacks, and OpenStack works out which files are needed where.
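For the technically curious, here's a minimal sketch of what pushing material into an OpenStack object store looks like, using the python-swiftclient library- the endpoint, credentials, container and file names are placeholders, not a real production setup:

```python
# A minimal sketch of pushing a camera clip and its sidecar metadata into an
# OpenStack Swift object store. The auth endpoint, credentials, container and
# file names are placeholders, not a real production configuration.
from swiftclient.client import Connection

conn = Connection(
    authurl="https://stack.example.com/auth/v1.0",  # placeholder auth endpoint
    user="dailies:operator",
    key="secret",
)

conn.put_container("editorial")  # one logical container, wherever the stack lives

with open("A001C003_140408_R1AB.ari", "rb") as clip:
    conn.put_object("editorial", "day_01/A001C003_140408_R1AB.ari", contents=clip)

# Lightweight metadata can travel ahead of the heavy essence files.
conn.put_object("editorial", "day_01/A001C003_140408_R1AB.json",
                contents=b'{"scene": "12A", "take": 3, "cdl": "applied"}')
```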

Simple stack

 

Each stack can hold all of the data, or a security-restricted subset relevant to the work being done at that location.

Metadata is replicated through all parts of the OpenStack, and then heavier data is drip-fed based on bandwidth and delivery time. The files are allowed to form relationships with each other through the many users and applications interacting with them, building a 'community of data'. The data is managed by rules and algorithms- applications still need to be developed to support decision making on large data sets. An archive information management client that I've been working with refers to the principle of 'disposition': determining what's valuable in a near archive, what gets moved into deeper, longer-term archive, and what gets deleted and disposed of. In future this process will be managed by humans presiding over smart algorithms.
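As a toy illustration of what a disposition rule looks like once it's encoded for an algorithm to preside over- the thresholds and fields here are invented for the example, not taken from that client's system:

```python
# A toy sketch of a 'disposition' rule: decide whether an asset stays in near
# archive, moves to deep archive, or is disposed of. Thresholds and fields
# are invented for illustration only.
from datetime import datetime, timedelta

def disposition(asset: dict, now: datetime) -> str:
    age = now - asset["last_accessed"]
    if asset.get("is_master"):              # camera originals never get deleted
        return "deep_archive" if age > timedelta(days=90) else "near_archive"
    if age > timedelta(days=365):
        return "dispose"
    if age > timedelta(days=30):
        return "deep_archive"
    return "near_archive"

asset = {"name": "A001C003.ari", "is_master": True,
         "last_accessed": datetime(2014, 1, 5)}
print(disposition(asset, datetime(2014, 4, 8)))   # -> deep_archive
```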

Joe Beirne finished by speaking about a principle that is coming through education now with astonishing results- the flipped classroom. The students do the homework with teachers during the day, and then go home to listen to the lectures online. This same principle is now being applied to the post facility: it's only the really intensive data-crunching work, and the work that requires calibrated environments, that needs to be done indoors at the facility. Everything else can be done away from the facility- distributed processing brought to the storage.

But even these calibrated environments can now be built wherever a client wants. Once the data is everywhere, school is out.

 

ARRI Raw Dailies Workflow

This article first appeared in Definition Magazine, UK, in 2011.


ARRI Alexa

What were your thoughts when you first discovered that you were going to be taking an ArriRAW film through dailies?

Sixteen19 was involved with some of the very first ArriRaw jobs anywhere. We shared the workflow execution with another company on Extremely Loud & Incredibly Close in New York in 2010, and then set up Gambit soon after in London.

The Codex recorder had just been certified by Arri as a recording device for ArriRaw, which was a significant development because Arri themselves hadn't made a recorder for this format- they were relying on a secondary market of companies to build accessories for the camera. There are a few factors that go into investigating any new format.

For feature film work, the first thing you need to find out is what the format is doing in its colour space and debayering- basically, how the raw data will be processed in order to look like film.

In simple terms the colour space is represented by a 'colour curve', which describes how the image's colour tonality is represented within a digital image. Sony has S-Log, Panavision has PanaLog, and Arri has ArriRaw- each of these is a manufacturer-specific way of encoding and representing a digital image. These in turn almost always reference Cineon encoding, the original file type that Kodak developed to represent a film frame as a film scan.

The debayering really just means how software further down the workflow will interpret the raw digital image data into a coherent picture.
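To make the idea of a colour curve concrete, here is a deliberately simplified log curve in the spirit of Cineon-style encoding- the constants are illustrative only, not any manufacturer's actual curve:

```python
# An illustrative (not manufacturer-accurate) log encoding curve of the kind
# described above: scene-linear exposure values are compressed into a 10-bit
# code range so that highlight detail survives. Constants are invented for
# illustration only.
import math

def log_encode(linear: float, black_code: int = 95, white_code: int = 685) -> int:
    """Map scene-linear exposure (1.0 = mid-grey reference) to a 10-bit code value."""
    linear = max(linear, 1e-6)                       # avoid log(0)
    code = white_code + 300.0 * math.log10(linear)   # ~90 code values per stop
    return int(min(max(code, black_code), 1023))

for stops_over_grey in (-4, 0, 4):
    print(stops_over_grey, "->", log_encode(2.0 ** stops_over_grey))
# -4 -> 323, 0 -> 685, 4 -> 1023 (clipped to the top of the 10-bit range)
```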

Once you've worked out how these two elements work together to produce an image out of the camera system, you're ready to use them to give the DP the look he is after on the dailies, either through a DIT colour grading on set, or by having a dailies colourist go the more traditional route and do this work after the shoot has finished.

It's best not to lose sight of what the dailies pipeline is for- it's not just about creating pretty pictures, although that is the most immediately obvious role that dailies have. Doing the daily rushes on any film is as much about organising the material that's coming from set so that it can be coherently used by the editor and post production team, sometimes for a period four times as long as the shoot itself. So the hidden side of any new camera is also working out just how much information you can get from the process. This is the metadata: the information about the data you hold.

How much involvement did you have with establishing the on-set workflow? How important is collaboration between DIT and DoP on the one hand, and equipment on the other?

In my role as Workflow Supervisor, I usually kick the workflow around between the DP, editor and producer/studio for a few weeks in Pre Production. The DIT usually comes into that conversation as well. There are also tests that need to be run, first from the set to dailies/editorial, then right the way through to VFX plate delivery and the digital intermediate finishing for the feature. It's a mix of technical knowledge, equipment product knowledge and politics. The diplomacy and politics can be measured exactly by the gap between the requirements of the principal creatives and the shortfalls of the budget.

The collaboration starts with how the DP wants to work with a DIT, which can range from a simple monitoring setup that shows people on set the camera images, to a whole grading setup 'in the tent', with full colour management and image control.

This collaboration also extends to the editorial crew, in working out precisely how they want their files collected and organised. The quality control (QC) pass to check each file is intact, the syncing of sound to picture, and the laying of everything down into the correct bins and structure are all negotiated with the editor and their crew.

And finally the archive is made, which is arguably the most important process of all, because this is where the real asset is collected. It's stored on an LTO5 data tape and is akin to the exposed negative of a film shoot- treat it with extreme care.

The job of Workflow Supervisor is to make sure that the transition between each step is managed and controlled to ensure that every frame gets to editorial and onto an archive tape.

How did you ensure monitor calibration throughout the chain?

You hire a colour scientist to generate the required Colour Management Look Up Tables (LUTs) and calibrate monitors to the required specification. This used to be real voodoo, but now comes down to standards that can be pulled from software. The Arri website even has quite a good LUT generator that can make you LUTs for a few different colour management systems.

It's probably just as important to ensure that all of the key monitors around the set and the dailies environment match in terms of calibration. A monitor that's out of alignment with everything else can cause quite a few problems down the chain, not the least of which is lost confidence in the pipeline itself. Confidence is sometimes the hardest thing to get back on set, so again this work should be done during the camera testing period.

This is one of the most important things to stress. The advent of digital cameras trashed the notion that there was any 'one way' to execute post production on a feature film. It has actually done the opposite- given filmmakers an unrivalled set of options for post, to improve the production values of the film so it can compete with films that have double or triple the budget. The camera tests before shoot are no longer just camera tests- they should be extended into the teams that will execute dailies, VFX, and the finishing of the film. One doesn't have to lock down every detail of every shot before the first day of shoot, but the whole process should be described, agreed on by heads of department and tested before shooting begins.

The material came in to you via Datapack – a simple process?

Fairly straightforward. It’s basically just like copying any other data- it just takes time. After a few tests, you start to get a feel for the time factor- 2x runtime, 3x runtime, and this should be communicated to all so people have an idea of the timing.

The more important process is cycling the mags back to set: a film's budget can only afford a set allocation of mags, and the camera department shouldn't have to stop filming just because a mag isn't available. It's not just a matter of copying and sending back, however. The digital assets from that datapack should only be cleared when the archive has safely been written out to tape- only then is the asset 'secure' and the mag can be wiped and cycled back to set.
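In practice that 'only clear once the archive is verified' rule comes down to checksum comparison- a minimal sketch (the paths are placeholders):

```python
# A minimal sketch of the 'verify before you wipe' rule: only report a mag as
# safe to clear once every file's checksum matches between the datapack copy
# and the archive copy. Paths are placeholders.
import hashlib
from pathlib import Path

def md5sum(path: Path, chunk: int = 8 * 1024 * 1024) -> str:
    h = hashlib.md5()
    with path.open("rb") as f:
        while block := f.read(chunk):
            h.update(block)
    return h.hexdigest()

def mag_is_safe_to_clear(mag_dir: str, archive_dir: str) -> bool:
    for src in Path(mag_dir).rglob("*"):
        if src.is_file():
            dst = Path(archive_dir) / src.relative_to(mag_dir)
            if not dst.exists() or md5sum(src) != md5sum(dst):
                return False   # any mismatch means the mag stays untouched
    return True

print(mag_is_safe_to_clear("/mnt/datapack_A012", "/mnt/lto_staging/A012"))
```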

How did the dailies workflow operate?

The private joke about workflow diagrams is that the actual work isn’t in the bubbles, but in the lines.

2013 ARRI RAW Workflow document.

Any special considerations that had to be taken into account due to using the ArriRAW format?

They are not special considerations as such, but people need to be aware of the file size- ArriRaw is a large, high-bandwidth, high-latitude format, and so takes considerably more time in data movement and processing than ProRes or smaller compressed files. There are also subtle differences between the video line out of the camera that you use for monitoring and the actual file that the camera records. During your testing phases with the DP and crew, you should be able to describe this and factor it into your working methods.
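To see why the file size dominates the planning, a back-of-envelope sketch- the resolution and bit depth used here are assumptions based on the commonly quoted ARRIRAW figures of the time:

```python
# Back-of-envelope data rate for an uncompressed raw format, from resolution,
# bit depth and frame rate. The 2880x1620 / 12-bit / 24 fps figures are
# assumptions based on commonly quoted ARRIRAW parameters of the era.
def data_rate_mb_per_sec(width: int, height: int, bits: int, fps: float) -> float:
    bytes_per_frame = width * height * bits / 8
    return bytes_per_frame * fps / 1e6

rate = data_rate_mb_per_sec(2880, 1620, 12, 24)
print(f"~{rate:.0f} MB/sec, ~{rate * 3600 / 1e3:.0f} GB per shooting hour")
# ~168 MB/sec, ~605 GB per shooting hour
```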

And finally, say I’m coming to you with my latest movie and I’m planning on shooting it in the format, what are your three top bits of advice?

Give plenty of time for testing during your camera tests – don’t leave it until first day of shoot.

Consult not only with the DP, but also with the editor and post supervisor down the chain- insist on a workflow test from the camera tests right through to a conform of the data at your DI house, that takes into account all the stages and people that the data has to pass through.

Don’t have one person do everything – this may be the cheapest way out, but once shooting ramps up, you’ll regret your decision when that guy is hospitalised due to working for days straight. It’s not pretty…

Rise Of The Workflow Producer

This article was first published in Inside Film Magazine, Australia- (http://if.com.au/), 2011

 

We landed at Heathrow on the Thursday, and spent the Royal Wedding weekend in an Ealing Studios rehearsal room, booting up gear, pulling cables, updating software, and going over the last-minute things we needed to buy the next day to set up. The keys to the cutting room came late, and we had a full day's and night's installation to get done to make it to pre-shoot day. Besides, we'd promised the editor it'd be up and running Monday. Our trucks arrived with two burly South African removalists, and two on-loan runners from Technicolor turned up with a trolley. Soho is no joke with its skinny stairwells, and we had to disassemble our server halfway up just to get it up and around a corner. We'd moved in on that first day with two Mac towers, two Avids and a server, and we had a 50-day shoot starting the next day. This is how films are now made.

Earlier this year at the Hollywood Post Alliance Tech Retreat in Palm Springs, a getaway conference for film technologists and the LA post community, the shift was palpable. Across the wide selection of speakers and panelists, everyone was talking about their own special, bespoke, unique 'snowflake' workflows and practices for digital shooting- no two workflows are the same. The proliferation of digital formats, of commodity computers running mass-market software, and of yet another generation of digital cameras coming through the ranks has split the 100-year traditions of filmmaking at the seams, and people are finding all sorts of innovative ways of working that embrace both 'film discipline' and the responsiveness, flexibility and innovation of digital capture.

Locally, we hired a colour scientist who was able to handle all of the colour management for the Alexa camera through a £600 plugin for Shake on his Mac tower. He had worked at a larger VFX house for a few years, but prefers freelancing now and is in demand by smaller workflow companies such as ours, who hire him by the week. We worked with 4KLondon, the Digital Imaging Technician agency, who matched us up with an excellent freelancer who was our eyes and ears on set- he chased the director of photography around with his cart, showing him his beautifully shot images in realtime on a calibrated monitor, and determined the colour balance the DP wanted. And we set ourselves up to pull this all together, to create the dailies and, most importantly, the data archive of the digital negative, the primary asset of the production.

The standardisation of 35mm and 16mm as shooting and printing formats in the early 20th century was a boon to the business of distributing films. It meant leaving behind many formats (9.5mm! 26mm Friese-Greene! Dufaycolor!), but it gave the business an economy of scale. Digital shooting has been a wild west for a little while now, with every camera manufacturer working in a slightly different colour space, resolution and proprietary format. This year at HPA the Academy of Motion Picture Arts and Sciences introduced to the world their IIF-ACES colour workflow for digital cameras, which allows manufacturers to maximise the depth and precision of their images and sensors while providing clear pathways for the careful handling of those images in post production. IIF-ACES is an excellent attempt to institute a standard of colour management and reproduction for digital cameras while still allowing individual camera companies to push the boundaries of what their cameras can do.

The customisation of the camera and its images within the boundaries set out by the standard still gives an enormous range of creativity and technical calibration for the technician and artist. And as the talent that used to be locked up in VFX and post houses goes freelance, smaller, more lightweight 'insurgent' teams are assembling all over the place to get films made. If I worked in a lab, I would say I was worried.

After 10 weeks, we'll move to New Mexico. Well, I say move: the dailies team will stay here on the 3rd floor in Berwick Street, a block up from the markets. An operator we know in LA will go across to New Mexico, where we've booked a high-speed uplink, and will upload the ARRI RAW files directly to the dailies & cutting room in London after shoot every day. This bandwidth is getting cheaper all the time, satellite can be booked like you'd order a book from Amazon, and it's certainly much cheaper than sending the data across state lines to a lab in California, or flying the dailies team into New Mexico. After they've shot for a week, action will resume in London. We'll be all done and gone in 11 weeks, and moved onto the next job, in another location, with another format, another workflow and different problems to solve.

With so many different formats and options, how does one production actually ever get started? There's no one industry-standard workflow anymore. HPA was abuzz with talk of how this works, and of a new position that is gaining prominence: the Workflow Supervisor. This is the person who can deal with all of the technical and creative issues when deciding to shoot digitally. They liaise between the DP, the editor, producer and director, and have to support each position with specific knowledge and a very strongly communicated idea of the way everything fits together. It's a strange mix of networking IT, Avid support, cinematography, VFX and production skills embodied in one person, and these people do exist. They not only take a film right through from the camera tests to the finished Digital Cinema masters technically, but can also assemble small and medium-sized teams wherever they are to get the job done. Some of these people are still directing a facility to the proper workflow, but more and more the Workflow Supervisor is responsible for setting up and managing the workflow from conception to build to execution.

Naturally there are only a few of these people at the moment who can pop up a shop, work together for a short amount of time, and then disband and go do something else, but it's the same thing that people who crew on set have been doing for most of their careers- working in groups that assemble just for the film at hand. Now that digital pipelines have freed the brave to venture out from behind a facility's walls, they can see that their skills and experience are very valuable and can slot straight into a production.

And leading this whole team, putting it together, planning the pipeline, executing and getting the film started and then maintained, is the Workflow Supervisor. If you’re planning a digital shoot, and you haven’t talked to anyone yet who can perform this role for you, you’re nowhere near ready to dive into the shoot. They will not only save you money and time, but they’ll get your film made while they are doing it.

Next month it's Pittsburgh, and then a film in Vancouver. The summer may be a surfing film in Northern California, but by November it's definitely Alaska for the teen comedy. It's a way of working that has long been known in the film industry, but not by the technology people on the back end. The mobile sector is a small but rapidly growing part of the film industry, with a few companies jockeying for position and market share. With contraction evident everywhere else at the big-ticket end of film technology, a lightweight, scalable, movable and robust service that will be wherever the film is, that works directly with the filmmakers, without expensive premises or million-dollar kit, has got to be a way forward.


Going Onset

This article first appeared in IF Magazine, 2009.

With the portability of gear that can now be deployed onset, I think some are losing sight of the overall goals of each stage of the filmmaking process, and are possibly creating more chaos than necessary. Despite recent hype from some players, it is far better to keep a clear distinction in our minds of where each operation best takes place, to hit the goals of the overall production.

What processes are best to come onset to help make the shoot more efficient, and what processes are best kept nearset, just next door even, but away from the intensity and pace of a busy shoot?

not your grandma's grading
The film set is a chaotic place, of course. Money is being spent at a rapid rate, measured in thousands of dollars per minute. Directors are trying to get their shots and keep the actors focused. Producers are trying to keep the shoot rate up and the daily schedule shot. Everybody else is trying to do their jobs and not get in the way.

Into this, because of digital technology, we're now throwing all sorts of processes that used to occur back in the facility. Colour grading. Encoding of dailies. QC. VFX plate approval.

While it makes perfect sense that digital shooting has untethered a lot of these processes and brought them closer to the shoot, it doesn’t mean that these things should be occurring directly onset.

Colour grading, for a start. The time it takes to hand proper, decently graded, consistent dailies back to an editorial department is very different from the shooting schedule of a camera department onset- these two things move at very different rhythms. A dailies grade should give a consistent look for the scenes, as negotiated between the creative team, camera department and dailies colourist. Too often grading onset chases its own tail, has a camera department experimenting with 'looks' that need to be re-graded back at the nearset anyway, and fails to achieve its goal: consistency of colour for an editorial department to start cutting.

And the worst-case scenario is a DP looking at a graded signal on set who isn't quite aware that lifting a light a further 10%, and seeing that change represented through a particular grade on the onset grader's monitor, doesn't reflect the exposure actually being recorded by the camera sensor.

This disconnect can lead to the camera department unknowingly seeing a graded image that appears correctly exposed, while leaving a huge headache for a DI unit further down the process. Things move so fast on set that the risk of a colourist impressing a DP with a look while concealing exposure problems in the captured 'raw' image is considerably increased. I am not theorising here- this is happening on digital shoots as colour grading comes onset, and it is a real issue.

And when you put the colour management issue into the equation, onset grading soon becomes a moot point: is the grader working in REC709, so that the grade is meaningful for the dailies but doesn't represent the entire range of the camera sensor, or is the grader working in P3/XYZ, meaning someone will have to grade a REC709 version for dailies down the track anyway? What is the purpose of all of this time and money being spent onset? A CDL is not a LUT, not now, not ever.
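For anyone wondering why the two aren't interchangeable: a CDL is just ten numbers per shot- slope, offset and power per channel plus a saturation value- applied through a fixed formula, while a LUT is an arbitrary lookup table. A minimal sketch of the CDL transform (the grade values are examples only):

```python
# The ASC CDL is ten numbers per shot: slope, offset and power per RGB channel
# plus a saturation value, applied as out = (in * slope + offset) ** power.
# A LUT, by contrast, is an arbitrary table lookup. Values below are examples.
def apply_cdl(rgb, slope, offset, power, saturation=1.0):
    out = []
    for v, s, o, p in zip(rgb, slope, offset, power):
        v = max(v * s + o, 0.0)      # slope, offset, clamp negatives
        out.append(v ** p)           # power
    # saturation uses Rec.709 luma weights
    luma = 0.2126 * out[0] + 0.7152 * out[1] + 0.0722 * out[2]
    return [luma + saturation * (c - luma) for c in out]

print(apply_cdl([0.18, 0.18, 0.18],
                slope=[1.1, 1.0, 0.95], offset=[0.01, 0.0, -0.01],
                power=[1.0, 1.0, 1.05], saturation=0.9))
```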

Far better to give a DP a set of emulation LUTs, simulating known negative and print stock combinations, let them use the light meter as they would on a traditional shoot to guide them, and then communicate with the dailies colourist near set. Setting your colour management in pre-production and calibrating the camera department to this management will save a lot of costly, valuable time onset, and make the nearset dailies process much more efficient. Far more than putting even the most powerful grading desk onset ever will.

Let's not lose sight of the fact that the dailies process isn't just about grading or making H264s for someone's iPad or iPhone. The main purpose of the dailies process is to bring together all of the picture, sound and now considerable metadata elements from the chaotic shoot environment, and organise them for the filmmaking process going forward. At best this process takes into account the multiple deliverable elements and colourspaces that will necessarily need to be addressed, from executive dailies to VFX plate delivery to DI, and gets everything from the film shoot ready for the much longer post production process ahead.

If dailies gets completely sucked into the chaotic, onset environment, this process of organisation can be compromised, making the rest of the filmmaking process considerably more chaotic and hellish. We’ve all worked on a film that hasn’t quite been organised that well, and know the pain that it can lead to down the track, pain that only compounds and multiplies the longer you leave it to organise these issues.

A well established nearset team, calibrated to work directly behind the shoot whenever the onset team can deliver, located as close as possible to the set, can still be delivering material out to everyone within hours of raw material being received. Let’s not forget this process used to take 12-15 hours only two years ago, and a good nearset team should be able to nail it in a third of that time.

Establish a good nearset team close by, communicate well with them, and the whole process can enjoy the benefits of digital shooting, without the counterproductive crowding-out of a set that seems to be happening at the moment.

No-one's Going To Bring You A Hard Disc

Extending on from the last post, the Old Post businesses actually can see the threat that’s posed from the New Way of working, but wonder how they can get involved to protect their market share. Generally, the approaches are the same, but prioritise the Old Post systems and ways of doing things to the detriment of the New Way.

An example: many managers, in thinking about how to deal with digital cinematography, have looked at the way their existing business is structured around film dailies, and in their minds have only taken the leap as far as exchanging a film can for a hard drive, with everything else staying the same. The trouble is, once you swap film formats for digital acquisition, Everything Changes.

How many times have you seen the following press release in the last few years:

Facility X is adapting its processes to the new digital shooting technology, and would now like to introduce its new Data Lab/ Digital Dailies/ Data Processing/ Digital Data Lab Dailies Processing service to the local market. It works exactly the same way as a film lab- all producers need to do is drop off a hard disk of dailies before 11pm every night, and send a runner around the next morning to pick up another hard disk of editorial files and your DVDs. It's That Easy.

Hey Good Lookin'..
 Fundamentally, this is the Old Post model tarted up a bit to feel sexy. It doesn’t realize yet that:

  • The images are no longer intrinsically tied to the format. This is the most important thing to understand, and one that few traditional places get: unlike 35mm or 16mm film, the hard disk doesn't actually determine the characteristics of the digital file- it's just the transport mechanism, and one that can be interchanged without any loss of quality in the images themselves. Which begs the question: nowadays, with internet speeds what they are (and even once-prohibitive satellite now relatively cheap), why do you physically have to take anything anywhere? And do your bond insurers really want a hard disk getting around town with your images on it? Really?
  • As above, why is your material being delivered back on a physical format? Why on earth are you still making crappy DVDs with burnins to evaluate your images during the shoot? If you needed a physical format at all, why wouldn’t you be able to download a disk image that’s relevant, and burn it on your own laptop?
  • With the cost of the equipment dropping towards commodity prices every day, what's stopping traditional post operators getting out onto set and working directly with production? This person can be processing the images and handing them to your editor straight away after they're acquired onset. With a good pipeline designed beforehand, an editor can be cutting a few hours behind the shoot, which will lower reshoots and continuity problems, and increase shooting speeds. I'll cover some of the great hardware that's coming out now for this purpose in future posts.
  • Why is it an overnight service? This was typically a chemical-lab imposed deadline, because you needed to chemically process everything in a linear fashion all at the same time, and you needed to impose a cut-off so that everything could be delivered and then be set off for the evening. With digital processing now able to be done at least at runtime, why do you need 7 hours to ‘process’ two hours of dailies?
  • Asset management and online services are usually an outside service that is an optional extra, and not at the very core of the system. It’s not about the processing anymore, if it ever was, because everyone with Final Cut can process. It’s all about accessibility now, and online should be at the centre of the service, not at the periphery.

Of course, RED users have been doing this for years with the DIT position on set, but this near-set or cloud workflow is now breaking through to become available for higher-end cameras as well as affordable for indie productions.

Not everyone wants to own gear at the end of their show, but they still want to work in the new way, and there are a variety of businesses now emerging to compete with the existing players on their own terms. Just one example:

I’ve had a few chats with Michela Ledwidge from Rack & Pin (@rackandpin), and think that these guys show that they understand the new environment for post, which isn’t about physical facilities and gear, but networked cloud computing. At their heart, they refer to themselves as a “Cloud service for media productions”, and focus on the client’s interaction with their own material, not the processing. Sure, they have a physical address that you can still drop your hard drive into if you want, but it’s not the office- they find partners in existing storefronts that upload onto R&P’s cloud based server system. And their business isn’t just the walled garden of dailies- as the website says, it’s ‘Useful Screens To Manage Your Data’. Within the one system, they can service dailies, vfx approval, production management etc, and even offer to create bespoke applications and workflows for their clients, so that as new business models emerge from an increasingly digital approach, they can adapt and scale accordingly.

They don't have one hard product to sell you, developed around existing film infrastructure they're still madly trying to keep relevant. They are mobile, adaptable, in the cloud, and are focused more and more on people managing media directly, as a subscription service, wherever the clients happen to be. They and businesses like them are going to have an increasing impact, especially on the Old Way businesses waiting for your hard drive, and on an emerging filmmaking community that is looking for secure web-based services and has grown up with online services.