During these weeks before filming and the actual work on the shots start, we are running several tests to check out the production tools. One of them is, obviously, Cycles. In this particular test we wanted to see how far we could go with integrating the robot hand into a shot.
Of course the hand is obviously fake (after all, it's a robot hand), so we also put in objects that could in theory actually be standing on the table. We went for a mirror ball (there was an actual mirror ball in the scene, so it was easy to compare) and a plastic toy.
One of the things we learned from the kickoff project was that animating the parameters of modifiers for a bunch of objects can be very, very time consuming. So we asked Sergey if he could make a button that copies the animated parameters of the active object to the whole selection. Five minutes later he had a fully working feature that did exactly that :)
After finishing the Track, Match, Blend DVD this week I could finally start doing something more useful for the production.
In order to help with the reconstruction of the set at the Oude Kerk, I have started tracking various shots that I took there, getting a more or less good geometry reconstruction out of them. There is always a slight error margin in these tracks, so the resulting geometry is never 100% perfect. Still, it is very useful for getting a sense of scale and proportion from actual footage. The nice thing about Blender's tracking system is the flexibility to link in several shots at once and align the cameras manually, getting multiple views aligned and thereby a pretty big point-cloud reconstruction of the scene.
Just wanted to share one script which might be useful for almost everyone (and, after some further improvements, for everyone :)
This script checks which files in the svn repository were moved or renamed using the `svn mv` command, and updates the paths used by datablocks in all blend files in the repository, preventing "dead links" and the manual work of repairing all this stuff.
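The actual script works on svn metadata and Blender's blend files, but the core idea can be sketched in two steps: build an old-path to new-path mapping from the repository's move records, then rewrite any datablock path that points at a moved file. Below is a minimal, hypothetical sketch; the function names and the simplified `(moved from ...)` status line format are assumptions for illustration, not the real script's code:

```python
def parse_moves(status_lines):
    """Build {old_path: new_path} from simplified svn status lines of the
    form 'A  +  new/path  (moved from old/path)'.
    Real `svn status` output is more varied; this is a simplification."""
    moves = {}
    for line in status_lines:
        if "(moved from " in line:
            new_path = line.split()[2]
            old_path = line.split("(moved from ")[1].rstrip(")")
            moves[old_path] = new_path
    return moves


def remap_paths(datablock_paths, moves):
    """Replace every path that points at a moved file with its new
    location, leaving untouched paths as-is."""
    return [moves.get(p, p) for p in datablock_paths]
```

The real tool would then open each blend file and apply such a remap to its linked datablock paths instead of working on plain string lists.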
Apparently you can't do computer graphics without using computers. But exactly which computers, and how many of them, are we using?
A couple of different hardware configurations are used in the studio, by the artists and by the renderfarm, and this short post describes exactly which hardware configurations we're using for the Mango project. In addition, benchmark results of CPU and GPU rendering on those systems are provided at the end of this post.
We didn't forget what Mango is really about; which is of course to help improve Blender!
Last Monday we had two meetings with the devs & artists, on cycles and general issues. Yesterday we discussed pipeline designs. This morning we worked out a design idea for curve editing for masks. Time for an update :)
Yesterday I sat down with Brecht and Sergey to go over the main development topics, checking if we’re still on track and still have the big picture in mind. Because of the current workshop week we didn’t go over issues extensively with the artists, for that we’ll have plenty of time later. Here’s a short summary of what we discussed.
Motion tracker: is in good shape already, a new solver is underway to test. No bottlenecks.
Cycles render: will be seriously used. Brecht is unsure how fast it'll be in our production setup. We will do GPU and CPU (farm) comparison tests. Missing features are known topics (like shadow & ID passes). He'll also check volume rendering. Antialiasing and sampling (FSA) is an issue. We'll do a more detailed Cycles review here with the team in 1-2 weeks.
We will need light probes or environment mapping (and stitching). Also worth investigating are efficient methods to extract light conditions from footage. Sergey would love to dive into this.
3D viewport: Brecht will check on overlay methods to enhance selection/active info, especially in rendered display.
Compositor project: some nodes required by tracking still need porting to OpenCL. Might become a bottleneck.
Green Screen Keying: we will investigate best practices and state-of-the-art articles on this. My suggestion is to connect keying (mask extraction) to the clip editor, using markers, tracking info, temporal filter options, etc. Jeroen Bakker and Pete Larabell are interested in helping too.
Depsgraph: we'll try to focus on solving the crucial failures, like the 'dependency cycle conflict' for piston cases and essential driver updates. As a bonus, when there's time, we can try multi-threaded animation updating. The "proxy armature" will also have to get attention.
Getting Alembic to work would rock too… it would allow combining a lot of real-time characters in a shot for animators and shading/lighting artists.
Color pipeline: the confused code for alpha and color spaces will have to become stable and useful (also on UI side, to clearly communicate things). OpenColorIO needs to be investigated still by the team.
Asset management: continue working with Andrea Weikert on it (or a GSoC student?), or help out ourselves.
We’ll keep you posted, next week we can do an artists’ version of the above :)
Sebastian Koenig – our masterful 3D camera and motion tracker – is currently finishing a new training DVD for the Blender store. It has been written with a general audience in mind; even filmmakers with no real Blender experience should be able to learn how to track and match. And blend!
As usual for Blender Open Movie Workshop titles, all the content will be free to share and spread as CC-by, and all revenues will help out realizing Blender Foundation projects. Like this movie :)
As you all know, Mango is not only meant to create an awesome short film, but also a way to focus and improve Blender development. We already have great new tools, but for a real open source VFX pipeline we need a lot more!
Here are the main categories of development targets we'd like to work on in the next 6 months (in random order):
Camera and motion tracking
Photo-realistic rendering – Cycles
Green screen keying
Color pipeline and Grading tools
Fire/smoke/volumetrics & explosions
Fix the Blender deps-graph
Asset management / Library linking
How far we can bring everything is, as usual, quite unknown; typically deadline stress will take over at some point, forcing developers to work only on what's essential and not on what's nice to have or had been planned. Getting more sponsors and donations will definitely help though! :)
Below are per-category notes that I gathered during the VFX roundtable at Blender Conference 2011 and in discussions with other artists such as Francois Tarlier, Troy Sobotka, Francesco Paglia, Bartek Skorupa and many others, plus some of my own favorites.
I have to warn you. This post is looong. :)
Camera and Motion Tracking
Even though the tracker is already totally usable, including object tracking and auto-refinement, there is still room for improvement.
One major feature that we are waiting for to be included is planar tracking. Some of you might know Mocha, a planar tracker widely used in the industry, with which you can do fast and easy masking, digital makeup, patching etc. In a lot of situations you don’t really need a full-fledged 3d track just to manipulate certain areas of your footage. All you need is a tracker that can take into account rotation, translation and scale of the tracked feature in 2d space, for example in order to generate a mask that automatically follows the movements and transformations of the side of a car as it drives by.
Keir Mierle has something in the works that would allow such workflows. Obviously that would be tremendously helpful for masking and rotoscoping as well.
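The kind of 2D track described above amounts to estimating, per frame, how the tracked feature has translated, rotated and scaled relative to a reference frame. As a hedged illustration, here is a generic least-squares similarity fit over point correspondences (a Procrustes/Kabsch-style solve); this is not Blender's or Mocha's actual algorithm, and full planar trackers typically solve a homography rather than just a similarity transform:

```python
import numpy as np

def fit_similarity(src, dst):
    """Least-squares 2D similarity transform (scale, rotation, translation)
    mapping src onto dst; both are (N, 2) point arrays, so that
    dst_i ≈ scale * R @ src_i + t."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    s, d = src - mu_s, dst - mu_d
    # SVD of the cross-covariance gives the best-fit rotation (Kabsch).
    U, S, Vt = np.linalg.svd(s.T @ d)
    sign = np.ones(2)
    if np.linalg.det(Vt.T @ U.T) < 0:   # guard against reflections
        sign[-1] = -1.0
    R = Vt.T @ np.diag(sign) @ U.T
    scale = (S * sign).sum() / (s ** 2).sum()
    t = mu_d - scale * (R @ mu_s)
    return scale, R, t
```

Given a handful of tracked marker positions on a flat surface in two frames, such a fit yields the motion needed to make a mask follow that surface.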