With the production start of project Mango getting close, it’s a good idea to test your own workflow a bit now; once you’re swamped with tons of work, you can’t spend time thinking about how to make things smarter and faster. So I tried to put together some questions and topics that have always bothered me when I do environment models and shaders. Nothing amazing or innovative, but hopefully good for speeding up work. Here’s what I came up with:
You’ve got to love a carefully painted custom texture map, with all those subtle (or strong) weathering effects placed in the right spots just for that object.
Only issue: painting textures takes a lot of time! :)
Speaking of environments, that’s a real issue: your environment is often made of tons of different pieces, all quite detailed as models but not important enough by themselves to afford the time for accurate custom painting. In my experience, in arch.viz. you can only afford tileable textures and no custom painting, in games you have to custom paint, and in movies you ‘simply’ need the best quality and realism…
Still, in any case it could be handy to have some kind of automation to get weathering effects (based on the shape of the object) without custom painting, while keeping the unwrap phase reasonably fast. ‘Automation’ for this stuff won’t help quality much, but it speeds things up. So it’s good for minor objects or as a base for important ‘hero’ pieces.
That’s the idea behind the tests below, dealing with: batch-baking of AO/dirt maps, unwrapping multiple objects together, and node shaders (Cycles in particular).
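To give an idea of what “weathering based on the shape of the object” means, here is a minimal pure-Python sketch of the concavity idea behind dirt maps (the same intuition as Blender’s Dirty Vertex Colors): crevices where neighboring geometry sits above a vertex’s tangent plane collect more dirt. The function name and the simple mesh representation are my own illustration, not Blender’s API:

```python
import math

def vertex_dirt(verts, normals, edges):
    """Approximate per-vertex 'dirt' from local concavity.

    verts: list of (x, y, z) positions; normals: unit normal per vertex;
    edges: list of (i, j) index pairs. Returns a dirt value in [0, 1]
    per vertex: ~1 in concave crevices, ~0 on exposed convex ridges.
    """
    neighbors = {i: [] for i in range(len(verts))}
    for i, j in edges:
        neighbors[i].append(j)
        neighbors[j].append(i)

    dirt = []
    for i, v in enumerate(verts):
        n = normals[i]
        total = 0.0
        for j in neighbors[i]:
            d = [verts[j][k] - v[k] for k in range(3)]
            length = math.sqrt(sum(c * c for c in d)) or 1.0
            # positive dot product: neighbor lies above the tangent
            # plane, i.e. the surface is concave here -> more dirt
            total += sum(n[k] * d[k] / length for k in range(3))
        avg = total / max(len(neighbors[i]), 1)
        dirt.append(max(0.0, min(1.0, 0.5 + 0.5 * avg)))
    return dirt
```

Baked into a texture (or vertex colors) and mixed into a shader, a value like this gives you the “dirt in the crevices” look for free, without any custom painting.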
Camalot AV Facilities is the Netherlands’ most renowned digital film camera rental and servicing business. One of the owners – Philippe Vié – today confirmed we’ll get their full support for equipment and services. Having worked with our Director of Photography Joris Kerbosch before, they’re confident we’ll make good use of their Red Epic cameras :) (the same camera Peter Jackson is using for The Hobbit; it delivers 5K)!
I’m really proud to have triggered the enthusiastic interest of the top fashion graduate of last year’s Rietveld Academy in Amsterdam. Pablo is currently doing his Masters in London and will meet with us in the second week of March to discuss clothing and costume design.
He’s been showing us artwork he did for a post-apocalyptic comic book as well. His style is amazing and will add a lot of credibility to our project. Maybe he can do some virtual robot fashion for us too! :)
As you all know, Mango is not only meant to create an awesome short film, but also a way to focus and improve Blender development. We already have great new tools, but for a real open source VFX pipeline we need a lot more!
Here are the main categories of development targets we’d like to work on over the next 6 months (in random order):
Camera and motion tracking
Photo-realistic rendering – Cycles
Green screen keying
Color pipeline and Grading tools
Fire/smoke/volumetrics & explosions
Fix the Blender deps-graph
Asset management / Library linking
How far we can bring everything is, as usual, quite unknown; typically the deadline stress will take over at some moment – forcing developers to just work on what’s essential, not on what’s nice to have or was planned. Getting more sponsors and donations will definitely help though! :)
Below are per-category notes I gathered during the VFX roundtable at Blender Conference 2011 and in discussions with other artists like Francois Tarlier, Troy Sobotka, Francesco Paglia, Bartek Skorupa and many others, plus some of my own favorites.
I have to warn you. This post is looong. :)
Camera and Motion Tracking
Even though the tracker is already totally usable, including object tracking and auto-refinement, there is still room for improvement.
One major feature we are waiting for is planar tracking. Some of you might know Mocha, a planar tracker widely used in the industry, with which you can do fast and easy masking, digital makeup, patching etc. In a lot of situations you don’t really need a full-fledged 3D track just to manipulate certain areas of your footage. All you need is a tracker that can take into account rotation, translation and scale of the tracked feature in 2D space, for example in order to generate a mask that automatically follows the movements and transformations of the side of a car as it drives by.
Keir Mierle has something in the works that would allow such workflows. Obviously that would be tremendously helpful for masking and rotoscoping as well.
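The “rotation, translation and scale in 2D space” part is small enough to sketch directly. Assuming you already have two tracked features in two frames, the similarity transform between the frames falls out of a single complex division; the function below is my own illustration of that math, not code from Blender’s tracker:

```python
import cmath

def similarity_from_tracks(a1, a2, b1, b2):
    """Recover the 2D translation/rotation/scale mapping two tracked
    feature positions (a1, a2) in frame 1 onto (b1, b2) in frame 2.

    Points are (x, y) tuples. Returns (scale, angle_radians, (tx, ty)).
    Treating 2D points as complex numbers, scale * e^(i*angle) is just
    the ratio of the displacement vectors between the two features.
    """
    za1, za2 = complex(*a1), complex(*a2)
    zb1, zb2 = complex(*b1), complex(*b2)
    r = (zb2 - zb1) / (za2 - za1)   # rotation + scale as one complex number
    t = zb1 - r * za1               # translation that fixes the first point
    return abs(r), cmath.phase(r), (t.real, t.imag)
```

Apply that per-frame transform to a mask’s control points and you get exactly the “mask that follows the side of the car” behavior described above; a real planar tracker additionally estimates perspective, which needs four points and a homography rather than two points and a similarity.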
Nothing exciting to mention really, but I know unexciting news on progress is also welcome :)
The first half of the past month went into getting an application for the Netherlands Film Fund ready. With help from David Revoy (artwork for the Mango script presentation), Anja (budget spreadsheets) and Rob (Sintel + Institute report), I delivered two booklets with about 150 pages of content to the Fund two weeks ago (see image). A Film Fund budget would be *very* welcome to lift the quality of our filming work. Fingers crossed!
The time schedule is still the same as well. Starting February 18th, Ian Hubert and David Revoy will work here for two weeks on a final storyboard for the film. They’ll then present it to the team on March 3, when everyone is here (apart from Jeremy, who arrives two weeks later). The idea for the kick-off in the first week of March (3-8) is to make a short film together, complete from start to finish in 5 days. It will be a great exercise to figure out what we can do, and what Blender can do even! :)
In the past weeks I’ve also done paperwork for contracting, been contacting potential sponsors and studios, booked flights for everyone to Amsterdam, visited apartments where they can live, checked on where to get good bicycles. Also had a meeting with DP Joris Kerbosch, I’ve already booked in several experienced VFX supervisors as consultants, contacted camera sponsors (want Red epic!), visited greenscreen studio, checked on locations for filming (incl old factories).
The actual filming still depends on a lot of variables. The current estimate is to do it a bit later rather than too early: instead of ‘2nd half of April’ it is now more like ‘mid May’. Final decisions on this will be made with Ian & Joris here, in about 3 weeks.
So: there’s a lot in the pipeline; as soon as there’s tangible news I’ll post it here immediately. Expect more updates here from other team members, and of course from Ian and David when they’re storyboarding. (Ian asked “Can I still change the script?” Yeah, sure! Not for long!)
I’ve spent a lot of time doing various rigid body and smoke tests, just to get comfortable with physics in Blender.
I also tried to stress-test the tools by using large amounts of active rigid body objects. That last clip with the round church tower collapsing consists of just below 10,000 active objects simulated by Bullet via the game engine. Both Bullet and Blender handled that many objects surprisingly well!
In his spare time – while finishing Project London vfx and getting pestered by me for Mango – Ian worked with his mate Scott Hampson on a crazy funny short film… just because! It’s a real zero-budget film; made with passion and shiploads of talent! Effects and 3d have been done in Blender, obviously.
The actual start of Mango-production is getting closer and closer and time is flying.
Since we have an awesome script now and even some great concept art, we can really start to think about HOW we are going to make things happen, technically. I mean, camera tracking is all good and fine, but how will we, for example, do the makeup? Can we afford some actual makeup artists, who will turn our actors into jaw-dropping VFX characters, covered with gore, blood, bolts, cables, jelly or whatever is required?
Or can we do that all digitally?
The basic idea behind digital makeup is easy, as long as you don’t think about it too long. Just have a digital double; track the head, body, limbs or whatever; apply the textures, props or clothes to that digital version of your actor; and composite it over the actual footage of your actor. Here’s an example:
This was not too hard to do, but it was a perfect situation:
Enough markers, OK lighting, lots of perspective shift, no fast movements, everything quite in focus, not too far away. And most importantly: no deformation! As soon as the markers move and deform relative to each other, you can forget about object tracking.
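The final “composite it over the actual footage” step of this pipeline is, at its core, the alpha-over blend. A minimal per-pixel sketch (the function name is mine; it assumes premultiplied-alpha color channels, the convention Blender’s renders use):

```python
def alpha_over(fg, bg):
    """Composite a foreground pixel over a background pixel.

    Pixels are (r, g, b, a) tuples with premultiplied color channels.
    Where the rendered digital makeup is opaque (a=1) it replaces the
    footage; where it is transparent (a=0) the footage shows through.
    """
    a = fg[3]
    # premultiplied alpha-over: out = fg + bg * (1 - fg_alpha),
    # applied to the alpha channel as well
    return tuple(f + b * (1.0 - a) for f, b in zip(fg, bg))
```

In practice you never write this loop yourself: the compositor’s Alpha Over node does it for every pixel, but it’s worth knowing that this simple formula is all that stands between the tracked render and the final shot.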
“When Thom decides he’d rather be awesome in space than keep dating roboticist Celia, he never imagined he was planting the seeds of Earth’s destruction. Twenty years after this tumultuous romance he has to go back to the Amsterdam scene of his breakup with Celia to save the world. But are a high-powered robotic disguise and a time traveling battle fleet enough to fix a broken heart?”
So we do, in fact, already have a script! For a month or so, actually. As always, it’s impossible to know how much of it will make it through to the final film, but it’s a good start! I was looking at David’s concept art the other day thinking, “If we make a film that looks as good as this, I’ll be more than happy.” But man- after the whole team puts 6 months of work and imagination into it? This thing is going to be even more incredible. The best thing about this project is going to be all the layers of ideas; it’s already happening.