As you all know, Mango is not only about creating an awesome short film, but also a way to focus and improve Blender development. We already have great new tools, but for a real open source VFX pipeline we need a lot more!
Here are the main categories of development targets we would like to work on in the next 6 months (in random order):
- Camera and motion tracking
- Photo-realistic rendering – Cycles
- Green screen keying
- Color pipeline and Grading tools
- Fire/smoke/volumetrics & explosions
- Fix the Blender deps-graph
- Asset management / Library linking
How far we can take everything is, as usual, quite unknown; typically the deadline stress will take over at some point, forcing developers to work only on what's essential and not on what's nice to have or had been planned. Getting more sponsors and donations will definitely help though! :)
Below are per-category notes that I gathered during the VFX roundtable at Blender Conference 2011 and in discussions with artists like Francois Tarlier, Troy Sobotka, Francesco Paglia, Bartek Skorupa and many others, plus some of my own favorites.
I have to warn you. This post is looong. :)
Camera and Motion Tracking
Even though the tracker is already totally usable, including object tracking and auto-refinement, there is still room for improvement.
One major feature that we are waiting for to be included is planar tracking. Some of you might know Mocha, a planar tracker widely used in the industry, with which you can do fast and easy masking, digital makeup, patching etc. In a lot of situations you don’t really need a full-fledged 3d track just to manipulate certain areas of your footage. All you need is a tracker that can take into account rotation, translation and scale of the tracked feature in 2d space, for example in order to generate a mask that automatically follows the movements and transformations of the side of a car as it drives by.
Keir Mierle has something in the works that would allow such workflows. Obviously that would be tremendously helpful for masking and rotoscoping as well.
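To make the idea concrete, here is a minimal sketch in plain Python (with hypothetical function names, not Blender's actual API) of the translation/rotation/scale case described above. A real planar tracker like Mocha solves a full four-corner homography, but even a two-marker similarity transform is enough to make a mask follow a feature:

```python
import cmath  # complex numbers encode 2D rotation + scale compactly

def similarity_from_tracks(p0, p1, q0, q1):
    """Derive the 2D similarity transform (scale, rotation, translation)
    that maps a marker pair (p0, p1) in frame A onto (q0, q1) in frame B.
    Points are (x, y) tuples in normalized frame coordinates."""
    a0, a1 = complex(*p0), complex(*p1)
    b0, b1 = complex(*q0), complex(*q1)
    # the ratio of the marker-to-marker vectors gives scale and rotation
    sr = (b1 - b0) / (a1 - a0)
    # the translation maps the first marker onto its new position
    t = b0 - sr * a0
    return sr, t

def apply_to_mask(points, sr, t):
    """Move every mask control point along with the tracked transform."""
    return [((sr * complex(*p) + t).real, (sr * complex(*p) + t).imag)
            for p in points]
```

For example, if two markers at (0, 0) and (1, 0) move to (0.1, 0.1) and (0.1, 1.1), the solved transform is a 90-degree rotation plus a small translation, and every mask point rides along automatically.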
Another thing that will be important for tracking in Mango is the use of survey data. That means that the user can take measurements on set, for example the size of objects, the distance between features, the height of the camera etc., and feed this information into the solver. That way the solution can not only be improved, but also constrained to certain requirements. Most likely in a production like Mango there are different shots of the same scene, with the same set, but from different camera angles. As a matchmover you have to make sure that the different cameras adjust to that scene so that you can easily reuse the same 3d data for it. Being able to set certain known constraints for the solution can make that process much easier.
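As an illustration of the simplest use of survey data, the sketch below (plain Python, hypothetical names, not the actual solver API) rescales a finished reconstruction so that the distance between two solved bundles matches a measurement taken on set. Real solver constraints would go further and feed such knowledge into the bundle adjustment itself:

```python
import math

def survey_scale(bundles, feature_a, feature_b, measured_distance):
    """Scale a reconstructed scene so that the distance between two
    tracked features matches a measurement taken on set.
    `bundles` maps track names to reconstructed (x, y, z) positions."""
    solved = math.dist(bundles[feature_a], bundles[feature_b])
    s = measured_distance / solved
    # apply the uniform scale to every bundle in the solution
    return {name: (x * s, y * s, z * s)
            for name, (x, y, z) in bundles.items()}
```

The same idea extends to other measurements: a known camera height or object size pins down the otherwise arbitrary scale of the reconstruction, which is exactly what you need when several shots must share one set.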
There are a few other things I would like to see in the tracker, but these are mainly smaller things like marker influence control and usability improvements such as visual track-quality feedback and marker management, which can probably be sorted out in a few minutes of some coder's free time. Yes, Sergey, that's you! :)
Photo-realistic rendering – Cycles
Just as important as tracking is, of course, rendering. The plan is to fully harness the insane Global Illumination rendering power of Brecht's render-miracle Cycles. With that in our toolset we can create and destroy Amsterdam as photorealistically as it gets.
Still there are some things we need.
For example, we need a way to efficiently create and use HDR light maps, not only for realistic lighting but also for correct environment reflections.
Another thing that will have to be solved is noise.
Even though Cycles is already incredibly fast there is always room for improvements.
Besides pure render performance, one thing that is critical for Mango is good render passes. Not only the passes needed to re-combine the image from the separate render elements, but also passes that extract the render data we need to composite the rendering on top of the footage.
Two of the most common passes for that are lamp shadows and ambient occlusion. Despite not being really physically accurate, they provide a fast and easy way to integrate objects into the live action plate. Often just the contact shadow together with the rendered object and a little bit of color grading is enough to create the illusion that the object is part of the footage.
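That integration trick boils down to a couple of per-pixel operations: darken the plate by the shadow and occlusion passes, then place the rendered object over it. A minimal sketch, per pixel in plain Python with hypothetical names (and assuming the passes use 1.0 for fully lit/unoccluded):

```python
def comp_over_plate(plate, shadow, ao, obj_rgb, obj_alpha, strength=0.8):
    """Quick'n'dirty integration of a render over live footage:
    darken the plate by the shadow and AO passes, then alpha-over
    the rendered object. `plate` and `obj_rgb` are (r, g, b) pixels;
    `shadow` and `ao` are scalars in [0, 1], 1.0 = fully lit."""
    def darken(c, p):
        # blend between the original value and a darkened one,
        # so `strength` controls how heavy the shadow appears
        return c * (p + (1.0 - p) * (1.0 - strength))
    shaded = tuple(darken(darken(c, shadow), ao) for c in plate)
    # standard "over" operation using the rendered object's alpha
    return tuple(o * obj_alpha + s * (1.0 - obj_alpha)
                 for o, s in zip(obj_rgb, shaded))
```

With `strength` exposed as a slider and a grade node after it, this is essentially the contact-shadow workflow described above.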
So that's the quick'n'dirty way of doing it. But Brecht already suggested that he might find ways to do it much more elegantly, by extracting the possible light and shadow contribution of Cycles light paths to the live action scene. Personally I have no idea how he will do that, but it sounds awesome! Maybe even by using the footage, camera-mapped to textures, thereby being 100% realistic? In any case, I am looking forward to things to come!
In addition to the passes it should be possible to have the movie clip playing back in the viewport while the Cycles render preview is running, ideally with a live shadow pass being calculated. Think of it! Realtime photorealistic viewport compositing!
Related to compositing, and also one of the most important features for any VFX work, is the ability to do quick and efficient masking. The current system of 3d curves on different render layers with different object IDs has to be replaced with something more accessible.
There is quite a lot of work to be done, and workflows have to be tested to find a user-friendly, manageable, efficient UI for masks.
Blender's mask system should allow quick and rough masking for color grading as well as detailed, animated and even tracked rotoscoping without the hassle of going through a render layer. There are some good ideas to make masks and mask editing available in various places, not only in the Compositor but also in the Image Editor, Movie Clip Editor and Video Sequence Editor. Masking should be as accessible, dynamic and powerful as possible. Being a VFX short film, Mango will most likely be a masking orgy!
Luckily Sergey Sharybin is already on that task, supported by Pete Larabell, who also coded the Double Edge Mask.
Sergey has created a wiki development page that sums up the top level design for what is planned: http://wiki.blender.org/index.php/User:Nazg-gul/MaskEditor. And if you remember how fast and awesome the camera tracking module has been coded by him, you can be sure that masking in Blender will be great!
Green Screen Keying
Keying has to be improved, that's for sure. The channel key is pretty nice already, but the other keyers can go right into the trashcan: Color Key and Chroma Key are just plain unusable for any serious keying. Since Mango will be filmed mostly in front of green screen, certainly with different lighting conditions, different camera settings etc., it must be possible to pick the exact keying color rather than rely exclusively on the pure green channel. What and how this will be achieved is still a bit unsure, but Pete Larabell, who also works on masking, will look into it.
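To sketch what "picking the keying color" could mean in practice, here is a toy color-difference matte in plain Python (hypothetical names, and not necessarily what Pete will implement) that generalizes the classic green-channel key to whatever key color the user picks:

```python
def color_difference_matte(pixel, key, balance=0.5):
    """Color-difference key generalized to a picked key color:
    measure how much the pixel's dominant key channel exceeds the
    other two, normalized by the same measure for the key color
    itself. Returns foreground alpha in [0, 1]."""
    # find which channel dominates the picked key color (0=R, 1=G, 2=B)
    k = max(range(3), key=lambda i: key[i])
    others = [i for i in range(3) if i != k]

    def spill(c):
        # classic color difference: key channel minus a weighted
        # mix of the other two channels
        return c[k] - (balance * c[others[0]] +
                       (1.0 - balance) * c[others[1]])

    s = spill(pixel) / max(spill(key), 1e-6)
    return min(max(1.0 - s, 0.0), 1.0)
```

A pixel matching the picked screen color gets alpha 0, a foreground pixel with no excess in the key channel gets alpha 1, and mixed edge pixels fall in between, which is exactly where a keyer earns its keep.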
Color pipeline and grading tools
Fire/smoke/volumetrics & explosions
Doing Mango without a good amount of kick-ass destructions, dust, debris and detonations is probably not an option.
Blender does have nice smoke, particles and rigid bodies, but so far these simulations mostly work best in a secure test environment and do not interact with each other. Controlling these effects can sometimes be a nerve-wracking and tedious experience. Lukas Toenne is doing great work on node particles, which should make much more possible than what we can do with particles now. But to make smoke, fire, simulations and explosions really communicate and influence each other in a complex animated FX shot, a lot more work has to be done!
Also the setup of these effects can be streamlined. Setting up fracture shards, for example, is still a bit clunky. We need to find a way to easily control and tweak all the different parameters. This might also be a good time to finally move rigid-body simulation from the Game Engine into Blender proper and make it a modifier!
Fix the Blender deps-graph