#NABshow 2014 Day 2.01- Distributive Creativity

Or: How I learned to stop worrying and love ‘the cloud’.

Speakers:

Josh Rizzo, Hula Post (Moderator)
Joe Beirne, Technicolor/ Postworks NY
Ramy Katrib, Digital Film Tree
Matt Schneider, Technicolor/ Postworks NY

A great session that spoke to much of what’s going on in post production and computing at the moment.


We’ve had 100 years of filmmaking, 30 years of computing and a 10-year timeframe to integrate the two, with some successes and some failures. The inevitable endpoint of merging these processes is that catch-all concept, ‘the cloud’. But in this context, the cloud really just means centralized networked storage, combined with the ability to bring creative and technical processes to that storage.

In this environment, post production is facing extreme competition from clients who can and will hire commodity machines to get work done at lower prices. The post facility needs to be smarter than a for-hire rental house, and needs better technology than the client can source for themselves.



Ramy Katrib from Digital Film Tree took up this theme by talking about OpenStack, an open source cloud project initiated by NASA and Rackspace. OpenStack gives you the ability to set up an open cloud environment that is “like Amazon & Azure, but way cooler”. It lets you set up storage and networking in the same place, is highly scalable and avoids vendor lock-in. And because it is open source it engenders what Ramy called “co-opetition”: a platform used among highly competitive entities, who develop individually and share common technology at the same time.

In OpenStack, you can have many ‘stacks’ that are geographically distributed but appear as one storage block. It has the intelligence to distribute and sync files to all parts of the stack: the Production Office, Studio Archive, VFX and Editorial may all be separate stacks, and OpenStack works out which files are needed where.

Simple stack


Each stack in the whole can hold all the data, or a security-restricted subset relevant to the work being done at that location.
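To make the idea concrete, here is a toy sketch of that model: several geographically separate stacks presented as one logical store, with a per-site policy deciding which files each location is allowed to hold. This is not OpenStack’s actual API (its object storage service is Swift); every class and name here is invented for illustration.

```python
# Toy model (invented names, not the real OpenStack/Swift API):
# many stacks, one logical namespace, per-site security filters.

class Stack:
    def __init__(self, name, allow=None):
        self.name = name
        self.allow = allow      # predicate: which paths this site may hold
        self.objects = {}       # path -> bytes

    def accepts(self, path):
        return self.allow is None or self.allow(path)


class LogicalStore:
    """One storage block spanning many distributed stacks."""
    def __init__(self, stacks):
        self.stacks = stacks

    def put(self, path, data):
        # Replicate only to sites whose security policy permits this file.
        for stack in self.stacks:
            if stack.accepts(path):
                stack.objects[path] = data

    def get(self, path):
        # To the user it appears as one block, wherever the bytes live.
        for stack in self.stacks:
            if path in stack.objects:
                return stack.objects[path]
        raise KeyError(path)


office = Stack("production-office")                          # holds everything
vfx = Stack("vfx", allow=lambda p: p.startswith("plates/"))  # plates only

store = LogicalStore([office, vfx])
store.put("plates/sc01_tk03.exr", b"...")
store.put("contracts/cast.pdf", b"...")
```

After these two puts, the VFX stack holds only the plate, while the production office holds both; either file is still reachable through the one logical store.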

Metadata is replicated through all parts of the OpenStack, and then the heavier data is drip-fed based on bandwidth and delivery deadlines. The files are allowed to form relationships with each other through the many users and applications interacting with them, building a ‘community of data’. The data is managed by rules and algorithms, though applications still need to be developed to support decision making on large data sets. An archive information management client I’ve been working with refers to the principle of ‘disposition’: determining what’s valuable enough to keep in a near archive, what gets moved into deeper, longer-term archive, and what gets deleted and disposed of. In the future, this process will be managed by humans presiding over smart algorithms.
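A disposition policy of that kind might look something like the sketch below: human-authored rules that a system applies over a large asset catalogue. The rules, field names and thresholds are all invented for illustration; a real policy would be far richer.

```python
# Hypothetical 'disposition' rules: humans write the policy,
# an algorithm applies it at scale. All field names are invented.

from datetime import date


def disposition(asset, today=date(2014, 4, 9)):
    """Return 'near', 'deep' or 'dispose' for one asset record."""
    age_days = (today - asset["last_accessed"]).days
    if asset.get("is_master"):           # masters are never deleted
        return "near" if age_days < 90 else "deep"
    if asset.get("flagged_for_legal"):   # legal holds override everything
        return "near"
    if age_days > 365:                   # stale working files get disposed of
        return "dispose"
    return "near" if age_days < 30 else "deep"


assets = [
    {"name": "final_master.mov", "is_master": True,
     "last_accessed": date(2013, 1, 1)},
    {"name": "render_v012.exr", "last_accessed": date(2014, 4, 1)},
    {"name": "old_temp.mov", "last_accessed": date(2012, 6, 1)},
]
plan = {a["name"]: disposition(a) for a in assets}
# plan -> {'final_master.mov': 'deep', 'render_v012.exr': 'near',
#          'old_temp.mov': 'dispose'}
```

The human stays in the loop by writing and reviewing the rules; the algorithm does the presiding over millions of files.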

Joe Beirne finished by speaking about a principle that is producing astonishing results in education: the flipped classroom. The students do the homework with teachers during the day, and then go home to listen to the lectures online. The same principle is now being applied to the post facility: only the really intensive data-crunching work, and work that requires calibrated environments, needs to be done indoors at the facility. Everything else can be done away from the facility- distributed processes brought to the storage.

But even these calibrated environments can now be built wherever a client wants. Once the data is everywhere, school is out.



## Database as Film Deliverable?

How long will it be before studios start requiring a database as a deliverable item?

Stanley Kolowski's funhouse.

With digital acquisition comes an increased stream of data: not only picture and sound data but metadata from every device and department. This is significant within a production, but increasingly studios and networks need access to this metadata as well, to sort through the volume of material they get delivered, for legal, compliance, promotional and other internal purposes. As one studio VFX guy said to me the day before a huge 160-day digital shoot was about to kick off, “Are you ready to drink from the firehose?” The data stream is huge, persistent and keeps on a-comin’.

We’re also seeing each of the studios and networks building their own asset management and digital rights management systems, in various ways and at various levels. There is now an acknowledgement of a lifecycle of data: data exists within any one production and across multiple productions, and is then needed after production to sell, market, distribute and archive the films for future sales cycles.

Remember, organising the data manually into hierarchies is so 1999: there’s no longer any way of getting an intern to sit down, file this type of data into an asset management-style hierarchical system and expect to keep up. There’s so much data that the poor intern could go her whole life and not catch up with the data coming in every day, let alone the historical data that was there before she started. It’s more about preparing this data to be fed out to a few different systems, so that it can be formatted in ways suitable for the end use. The studio exec will be pulling the data available on the studio system into a local, department-specific application, which formats it in relevant ways.

Looking forward, it’s obvious that some kind of database/XML stream will be put onto the contract deliverables list to producers, alongside the physical and IP deliverables required, which will post the data and metadata onto a studio system. The studio system will be able to track data from its inception on set, ideally right through to the point of sale over IP, delivering into someone’s house from a shopping cart.
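What might such a deliverable look like? Here is a minimal sketch: per-shot metadata records serialized to XML for ingest into a studio system, using Python’s standard `xml.etree.ElementTree` module. The element names, fields and values are all invented for illustration; a real deliverable would follow whatever schema the studio contract specifies.

```python
# Sketch of a metadata deliverable: per-shot records -> XML stream.
# Element and field names are hypothetical, not an industry schema.

import xml.etree.ElementTree as ET

shots = [
    {"id": "SC01_TK03", "camera": "ALEXA-A", "tc_in": "01:00:10:00",
     "tc_out": "01:00:18:12", "lens": "32mm"},
    {"id": "SC01_TK04", "camera": "ALEXA-B", "tc_in": "01:00:22:04",
     "tc_out": "01:00:29:20", "lens": "50mm"},
]

root = ET.Element("production", attrib={"title": "Example Feature"})
for shot in shots:
    el = ET.SubElement(root, "shot", attrib={"id": shot["id"]})
    for field in ("camera", "tc_in", "tc_out", "lens"):
        ET.SubElement(el, field).text = shot[field]

deliverable = ET.tostring(root, encoding="unicode")
```

The resulting string is the kind of machine-readable stream a studio ingest system could validate and post straight into its asset management database, no intern required.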

The studio system would not be one huge, honking software program, but a platform of related software products that are all open and interoperable, created for the studio to develop its own digital strategies on. The advantage for a studio is huge: with everything speeding up and becoming more fluidly digital, there is a need for systems, not people, to track this data, and the requirement will come from the studio to production to supply it in pre-formatted ways. Once this is a studio requirement, it will quickly be commercialised, first as a service, then as a commodity adjunct to the existing gear hire and workflow requirements production places on its vendors. And it will be smart vendor companies that develop turnkey products in this domain, translating individual, gear-specific formats into a standard data stream, ready to import onto the studio data and asset management system.

Just as cities opening up access to bus and train timetables created a whole slew of apps for consumers to navigate their city, once this kind of data is available to studio personnel there’s no telling what new applications they will need to navigate this ocean of data. All I know is that there are huge opportunities for the right companies and partners to help them deal with this new problem, one piece of the puzzle at a time.