This is a report on how we produced the 4K DCP of Tears of Steel for theatrical distribution. Our tools of choice were:
Blender (generation of the 16-bit TIFF frames)
OpenDCP (conversion of the frames to JPEG2000 and wrapping of picture and sound)
EasyDCP Player (checking the integrity and specs of the DCP)
This simplified DCP mastering process consists of a few linear steps:
Gathering raw images and sound in a DCI (Digital Cinema Initiatives) compliant format
Wrapping them separately into MXF containers
Indexing and inserting them into the Digital Cinema Package
First off we need to generate an image sequence of the whole short film, using an appropriate image format, such as 16-bit sRGB TIFF. In our case that means 17,616 frames, taking up 850 GB of space. The DCI-compliant 4K resolution for our aspect ratio is 4096×1716 px.
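To put those figures in perspective, here is a quick back-of-the-envelope check of the numbers above, assuming 24 fps playback and uncompressed 16-bit RGB pixel data (the raw payload alone comes to roughly 740 GB; the 850 GB on disk also includes file-format overhead):

```python
# Sanity-check the numbers quoted above, assuming 24 fps and
# uncompressed 16-bit RGB frames.

WIDTH, HEIGHT = 4096, 1716   # DCI 4K "scope" container
FRAMES = 17616
FPS = 24

# Duration of the short film
seconds = FRAMES / FPS
print(f"duration: {seconds // 60:.0f} min {seconds % 60:.0f} s")  # 12 min 14 s

# Aspect ratio of the DCI 4K scope container
print(f"aspect ratio: {WIDTH / HEIGHT:.2f}:1")  # 2.39:1

# Raw payload of one 16-bit RGB frame (3 channels x 2 bytes per pixel);
# the TIFF files add headers and metadata on top of this
bytes_per_frame = WIDTH * HEIGHT * 3 * 2
print(f"per frame: {bytes_per_frame / 2**20:.1f} MiB")
print(f"sequence:  {bytes_per_frame * FRAMES / 10**9:.0f} GB raw payload")
```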
Once the export has succeeded, we can convert that image sequence into a 12-bit JPEG2000 XYZ sequence (with a .j2c extension). For this operation we use OpenDCP, a great free tool that offers both a GUI and a CLI. Once converted, the image sequence for the short film shrinks to around 12 GB.
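What that conversion does per pixel can be sketched in a few lines. This is a simplified illustration, not OpenDCP's actual code (the real pipeline also handles chromatic adaptation options and more): undo the sRGB transfer curve, apply the standard sRGB (D65) matrix to get XYZ, normalize to the DCI white luminance (48/52.37), apply the DCI 2.6 gamma, and quantize to 12 bits:

```python
# Simplified sketch of the per-pixel sRGB -> 12-bit X'Y'Z' transform
# performed during the TIFF -> JPEG2000 conversion step.

SRGB_TO_XYZ = (  # standard sRGB (D65) primaries
    (0.4124, 0.3576, 0.1805),
    (0.2126, 0.7152, 0.0722),
    (0.0193, 0.1192, 0.9505),
)
DCI_NORM = 48.0 / 52.37   # normalize to the DCI reference white luminance
DCI_GAMMA = 2.6

def srgb_to_linear(c):
    """Undo the sRGB transfer curve (c in 0..1)."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz12(r, g, b):
    """Map one sRGB pixel (0..1 floats) to 12-bit X'Y'Z' code values."""
    lin = [srgb_to_linear(c) for c in (r, g, b)]
    xyz = [sum(row[i] * lin[i] for i in range(3)) for row in SRGB_TO_XYZ]
    return tuple(
        round(4095 * min(max(v * DCI_NORM, 0.0), 1.0) ** (1 / DCI_GAMMA))
        for v in xyz
    )

print(srgb_to_xyz12(1.0, 1.0, 1.0))  # sRGB white -> (3883, 3960, 4092)
```

Note how even pure white lands below code value 4095: the headroom comes from the 48/52.37 luminance normalization in the DCI encoding.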
In order to speed up the process and skip the 16-bit TIFF generation, we tried to export the edit in JPEG2000 XYZ directly from Blender (with the benefit of using a render farm), but that image format was not accepted by OpenDCP for the MXF wrapping. Hopefully this will be fixed in the future, as it would save quite some time and storage space.
Hello everyone! We are approaching the end of the Tears of Steel 4K project. It has been very exciting, and some aspects of the movie have been visually improved, along with some great features added to Blender to enhance the 4K compositing experience. A specific post about these topics is on the way!
Right now we would like to ask for support in testing our DCP pipeline output. Here is a test 4K DCP that any owner of a Digital Cinema media server can download and check out. Feedback on the quality and on any issues encountered would be much appreciated.
In this brief post I will illustrate an important part of our pipeline: shot creation and development. The process described happens for every shot in the movie and is slightly simplified (footage input and simulations are not taken into account). Here we have a picture of the process, plus a few notes about how production files are organized (which layers objects should be placed on, and some naming conventions).
Everything starts with a blendfile where the shot is tracked and solved (the track file). This file is then duplicated and used as a starting point both for keying and masking (the masking file) and for the main file, where layout, simulations, effects, lighting and compositing happen. As soon as the main file is created, all the required libraries are linked in from the environment and character files, so that the tracked camera can be placed in the right spot in the scene. Once the layout is approved, it is possible to proceed with basic lighting and compositing by creating the proper render layer setup. If simulations or effects like gun blasts, explosions or haze are needed, they are added to the same file, but in separate scenes. This system even allows us to use both the Blender Internal render engine and Cycles at the same time!
Given the size of our team it makes sense to keep workflow steps as compact as possible, since we can coordinate who is working on which shot. A separate file for animation is created only when needed, and it mostly contains libraries or objects, such as the camera, linked from the main file. At the same time, the main file links in masks and actions, so that they can be used in the compositor and on rig proxies.
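The library-linking step described above can also be scripted through Blender's Python API. Here is a minimal sketch, using hypothetical file and datablock names (`character.blend`, `character_rig`) rather than our actual production paths, and the 2.6x-era API; it runs inside Blender, not as a standalone script:

```python
# Sketch: link a character group from a library file into the main
# shot file, then instance it in the scene. Names are hypothetical;
# run inside Blender (the bpy module is not available standalone).
import bpy

# link=True keeps the data in the library file, so any fix made to
# character.blend propagates to every shot that links it.
with bpy.data.libraries.load("//lib/character.blend", link=True) as (src, dst):
    dst.groups = [name for name in src.groups if name == "character_rig"]

# Instance each linked group via an empty (group duplication, 2.6x API)
for group in dst.groups:
    empty = bpy.data.objects.new(group.name, None)
    empty.dupli_type = 'GROUP'
    empty.dupli_group = group
    bpy.context.scene.objects.link(empty)
```

Linking rather than appending is what makes the single-shared-armature setup maintainable across many shot files.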
This system allows a good degree of freedom and flexibility, and it is proving itself quite reliable.
Other posts about how we deal with footage, simulations and editing will come in the future. :)
Today I will demonstrate what we do when something goes wrong or has to be fixed inside a blendfile! Recently we changed the way the armguns file works in order to make it more efficient (long story short: there used to be multiple proxies referring to multiple armatures, which were automatically generated with a script; now a single armature is shared by all the proxies).