At yesterday’s weekly meeting we had visitors from Chile, the USA and Germany! Always great to get fresh feedback on your work… they all confirmed they were absorbed by what goes on on the screen – even with unfinished greenscreen shots and without (good) sound.
Below is the usual random selection of the past week’s work:
Everyone’s busy… we’re on target to be finished by the end of August. That’s a mere 5 weeks! Doing interesting blog posts in crunch time is too demanding, so I’ll just be slamming some stuff online to feed your curiosity :)
Here are images I copied over the past day from our renderfarm UI. It’s all WIP and tests!
This will probably be the most boring video of the whole project, but maybe someone finds it interesting. It is a timelapse demonstration of how we convert the RAW Sony F65 footage into something more useful, which in our case is OpenEXR in the ACES color space. For that first conversion we have to use Sony’s F65 Viewer, which only runs on Windows and OSX, and is also not the most pleasant software to work with. Anyway, once we have the ACES EXRs we convert them to linear Rec.709 with OpenColorIO, which is Blender’s native colorspace, and we stay there as long as we can.
The whole workflow is maybe not very elegant, but so far it works quite well.
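Under the hood, the ACES-to-Rec.709-linear step that OpenColorIO performs boils down to a 3×3 matrix transform on each pixel. Here’s a minimal numpy sketch of just that color math; the matrix values are approximate ACES2065-1 (AP0) → linear Rec.709 coefficients quoted from memory for illustration, and in production OCIO applies the exact transform from its config:

```python
import numpy as np

# Approximate ACES2065-1 (AP0, D60) -> linear Rec.709 (D65) matrix,
# chromatic adaptation included. Illustrative values only; OpenColorIO
# uses the exact transform defined in the ACES config.
ACES_TO_REC709 = np.array([
    [ 2.52169, -1.13413, -0.38756],
    [-0.27648,  1.37272, -0.09624],
    [-0.01538, -0.15298,  1.16835],
])

def aces_to_rec709(pixels):
    """Convert an (..., 3) float array of ACES RGB to linear Rec.709."""
    return pixels @ ACES_TO_REC709.T

# Each matrix row sums to ~1.0, so reference white maps to white.
white = aces_to_rec709(np.array([1.0, 1.0, 1.0]))
```

Note that wide-gamut ACES colors can land outside Rec.709, producing negative or >1 components – one reason to stay in ACES as long as disk space allows.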
We also have to deal with three different naming conventions: the camera’s naming of the clips, the shot numbers used on set (seen on the clapperboard) and the shot numbers that we use here in the studio. Part of my job is therefore to keep track of frame numbers, shot numbers, in-points, out-points, folder sizes, and so on. I have to find out which shot is used in the edit, which part of the shot is used and how long it is, in order to export and convert only what’s necessary.
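That bookkeeping can be sketched as a small table keyed by the three naming schemes, plus a frame-count helper for deciding what to export. Everything below – clip names, shot numbers, in/out points, the handle length – is made up for illustration, not our actual data:

```python
# Hypothetical bookkeeping linking the three naming conventions:
# camera clip name -> clapperboard shot number -> studio shot number,
# with the in/out points of the part used in the edit.
shots = [
    # (camera_clip, set_shot, studio_shot, in_point, out_point)
    ("C0001", "scene12_take3", "01_01_A", 120, 360),
    ("C0007", "scene12_take5", "01_02_A", 48, 200),
]

def frames_to_export(in_point, out_point, handles=12):
    """Frame count for the used part of a shot, plus safety handles
    on both ends so the edit can still be trimmed slightly."""
    return (out_point - in_point) + 2 * handles

total = sum(frames_to_export(i, o) for _, _, _, i, o in shots)
```

Exporting only the used ranges (instead of whole clips) is what keeps the 50 MB-per-frame ACES stage manageable at all.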
After the conversions are done, and linear HD proxies have been generated, we erase the ACES files, because otherwise we would run out of disk space very soon. Each frame is 50 MB.
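To see why the ACES files have to go, a quick back-of-the-envelope check (assuming a 24 fps frame rate, which is my assumption, not stated above):

```python
MB_PER_FRAME = 50
FPS = 24  # assumed frame rate

def gigabytes_per_minute(mb_per_frame=MB_PER_FRAME, fps=FPS):
    """Storage burned per minute of footage, in GB (1000 MB = 1 GB)."""
    return mb_per_frame * fps * 60 / 1000.0

# ~72 GB per minute of footage: a 10 TB array holds only a little
# over two hours of ACES frames.
per_minute = gigabytes_per_minute()
```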
This is the second in a series of videos about my environment art work on project Mango (wait – actually: Tears of Steel!).
It’s an overview of the dome model library, the tileable and shot-specific textures, and the greeble kits.
For the actual modelling and texturing there’s a lot more to say; specific videos will follow on individual areas and topics.
This is mainly about asset organization: naming, grouping and linking – how I split things into scenes, named objects and materials to sort them, and grouped objects so that they could be used as detailing greeble or as set pieces to link the sets into the final shots.
Of course, the whole asset management pipeline is much bigger than what you see here. Plus, things are still evolving and being optimized during production and the creation of actual shots; other team members like Francesco Siddi and Sebastian Koenig have a better technical/organizational overview and know the pipeline way better than me.
Still, this will be useful for modellers and texture artists looking for info on how to sort and manage their assets.
Here’s a little timelapse of the general workflow for our keying shots.
First we track the camera and save that as a blendfile. From that file we generate a new file that is then used as the base for masking and keying. The tracking markers can often be re-used to mask out parts of the footage.
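Re-using tracker positions as a garbage mask essentially means rasterizing the polygon the markers form. Here’s a stdlib-only sketch of that idea; the ray-casting test is just a stand-in for Blender’s actual mask rasterizer, and the marker coordinates are invented:

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting point-in-polygon test; polygon is [(x, y), ...]."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            # x coordinate where this edge crosses the horizontal ray
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

def marker_mask(width, height, markers):
    """Binary mask (rows of 0/1) from tracked marker positions,
    sampled at pixel centers."""
    return [[1 if point_in_polygon(x + 0.5, y + 0.5, markers) else 0
             for x in range(width)]
            for y in range(height)]

# Four markers surrounding a square region to be masked out
mask = marker_mask(8, 8, [(2, 2), (6, 2), (6, 6), (2, 6)])
```

In practice the marker positions are animated per frame, so the mask follows the footage for free – which is exactly why re-using the tracks beats hand-drawn roto for simple garbage mattes.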
The cleaned footage is then saved as 4K OpenEXR files with a premultiplied alpha channel.
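Premultiplied (“associated”) alpha just means each RGB value is already scaled by its alpha before it is written to the EXR, which is what compositing operations expect. A tiny numpy sketch of the difference:

```python
import numpy as np

def premultiply(rgba):
    """Scale RGB by alpha (associated alpha, the OpenEXR convention)."""
    out = rgba.astype(float).copy()
    out[..., :3] *= out[..., 3:4]
    return out

# A half-transparent pure-green pixel: straight (0, 1, 0, 0.5)
# becomes premultiplied (0, 0.5, 0, 0.5).
pixel = premultiply(np.array([[0.0, 1.0, 0.0, 0.5]]))
```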
A third blendfile is created as a base for the layout, swapping the footage for the clean plates and setting up the final shot dimensions (1920×800). That file is then handed over to the person doing the layout for it. The scene layout for this scene was originally done by Andy.
Usually, after the main composite is done, I fix some alpha-blending issues that cannot be solved in the pre-key outside the main composite.
The key that you see here is still a little bit too harsh on one side of the head, but for the demo I didn’t want to spend too long tweaking it.