Weekly – April 13

by Ton - 2012/04/14 - Artwork, Production

This week was spent on a lot of modeling and testing efficient ways to recreate photo-realistic parts of Amsterdam. How much to do with photo/footage projection? How much is real modeling here? A lot depends on the final shots of course, but it’s crucial to master the art of recreating realism in Blender asap, before actual filming. The tests I’ve copied above are just first trials; it’s research in progress. As soon as good results or conclusions can be shown, they’ll be here on the blog within a day. :)

We’re also very happy to have the help of production designer Romke Faber for the sets! Ian produced an extensive briefing. Together we should visit the studio asap to inspect the possibilities there.

For next week the main question to answer is “Are we going to film the bridge breakup scene in Amsterdam on location, or in a studio with greenscreen?”. Lots of factors to weigh here, including financial ones!

Oh: anyone know where to get the best-quality chrome balls to photograph reflection maps?

-Ton-

  1. Tycho says:

    waaa so cool

  2. Tycho says:

    Why chrome balls specifically? Has anyone tested something like what’s used in astronomy: a glass ball with a thin layer of silver or aluminium on the surface?
    Glass balls aren’t hard to find and it’s not expensive to put aluminium on them …
    PS: can you upload the blend files for the 6th render?

    • PhysicsGuy says:

      We have a small vapor deposition chamber for chrome and gold. I can see if I can dig up a glass sphere and coat it.

  3. Luca Napoletano says:

    I love the robot hand integration test. I looked really hard to try to see it as fake.
    Maybe the stylish design tells your brain it’s not real, but visually it looks like it’s there in the real world.

  4. Armando says:

    I can see the reflection of 2 chairs in the silver ball and 1 chair in the vase. So they are real… or not? The robot hand on the table looks great. Just like a real one.

    • – Pretty much exactly what we used for a commercial shoot in the Kenyan desert in 2010; it worked really well and was cheap enough to have spares in case of disaster…

      https://picasaweb.google.com/lh/photo/azXiVFsAhzY2RXCgFKf629MTjNZETYmyPJy0liipFm0?feat=directlink

      shiny balls in action :)

      • Marcelo says:

        I think the problem with chrome balls is that you will also need to edit out your own reflection from each image taken, besides doing panorama building…

        I agree that for specular light maps, a standard panoramic view will be the best.

        For mapping both the diffuse and ambient light contributions, a matte sphere could still be useful.

        • – Yeah, you just have to clone yourself out. Use as long a lens as you can (this gets you as far from the reflection, and as small in it, as possible), and make yourself small (as I am in the picture).

          The nice bit I never understood about shiny balls is that you get a lot more information than you think; the edges contain lighting data from the whole 180 degrees. For this shoot we did 2 shots, each about 45-60 degrees apart, giving us pretty much full light coverage. As the light in this location was fairly stable, we ended up using only 3 different lightmaps: one from early day, one from midday (hot) and one from late afternoon (even though we did take lightmaps for pretty much every setup). We also reversed all the camera angles as the lighting changed, making sure we shifted to a roughly matching BG to avoid continuity errors. Of course this is easier in a natural environment such as this, and probably impossible in most urban locations…
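          (For anyone wondering how the unwrap itself works: below is a rough numpy sketch of remapping a mirror-ball photo into an equirectangular latlong image. It assumes a square crop of the ball and a roughly orthographic view; file names and resolutions are just placeholders.)

            import numpy as np
            import imageio.v2 as imageio  # any float-image loader with EXR support works

            ball = imageio.imread("ball_shot.exr").astype(np.float32)  # square crop of the mirror ball
            H, W = 512, 1024  # output latlong (equirectangular) resolution

            # A direction for every latlong pixel: phi = azimuth, theta = polar angle from "up".
            phi = (np.arange(W) + 0.5) / W * 2 * np.pi - np.pi
            theta = (np.arange(H) + 0.5) / H * np.pi
            phi, theta = np.meshgrid(phi, theta)
            d = np.stack([np.sin(theta) * np.sin(phi),      # x
                          np.cos(theta),                    # y (up)
                          -np.sin(theta) * np.cos(phi)],    # z (camera looks along -z)
                         axis=-1)

            # For a mirror ball seen (roughly) orthographically, the surface normal that
            # reflects the view ray (0, 0, -1) into direction d is the half vector:
            n = d + np.array([0.0, 0.0, 1.0])
            n /= np.linalg.norm(n, axis=-1, keepdims=True) + 1e-8

            # The ball-image coordinate is simply the normal's x/y (flip if your axes differ).
            # Directions pointing straight back degenerate at the rim, which is exactly why
            # two ball shots taken from different angles get blended in practice.
            u = ((n[..., 0] * 0.5 + 0.5) * (ball.shape[1] - 1)).round().astype(int)
            v = ((0.5 - n[..., 1] * 0.5) * (ball.shape[0] - 1)).round().astype(int)
            latlong = ball[v, u]  # nearest-neighbour lookup; good enough for a test
            imageio.imwrite("latlong.exr", latlong)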

  5. Ben says:

    I thought you had had the robot hand 3D printed… (which might not be a useless idea actually. Are you considering that for any props?)

  6. ZetShandow says:

    Well, if what you want are reflection maps, it wouldn’t be harder than taking a picture of what’s behind the cameraman while recording. Sorry for my English.

  7. Marcelo says:

    Be sure to also grab a matte sphere, to better capture diffuse lighting behavior.

    Usually these spheres are called “light probes”, and there’s plenty of references out there:

    http://www.pauldebevec.com/Probes/
    http://gl.ict.usc.edu/Data/HighResProbes/
    http://gl.ict.usc.edu/HDRShop/tutorial/tutorial5.html
    http://www.unparent.com/photos_probes.html

  8. ZetShandow says:

    A question: what will you do if Cycles doesn’t support the movie clip texture needed for the reconstruction?

    • ton says:

      Then we tell Brecht to implement it :)

      • ZetShandow says:

        Oh right. Well, once the missing movie clip texture is fixed, you could use the technique of recording what’s behind the cameraman; I use it a lot in camera tracking, the so-called “back to back” method. If you had used chrome balls for the reflection map, you would have noticed the problem with glass material objects: the reflection map itself is easy, but the IOR is where the trouble starts. Once the mistake is corrected, please let me know and I can send a corrected version. Sorry for my English, and sorry for posting this three times; apparently my computer glitched a bit. Thanks.

  9. Armando says:

    I’m from Belgium and I would like to share this link to an amazing Belgian ad. Thanks. But Mango will be better, yes?

    http://www.youtube.com/watch?v=316AzLYfAzw

  10. yellow says:

    Chrome balls? What about a fisheye on a nodal pan head? It’s just too much work cleaning up and merging exposures off chrome balls.

    If you really must use them, google hollow polished steel bearings; you can get them in all sizes, they’re tough, they don’t shatter, and as they’re hollow they’re light.

    Nodal pan + fisheye lens + Canon 5D is the way to go.

    • Nate Wiebe says:

      I agree here; taking photos and merging them with hugin may also give good results.
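      (For reference, hugin also ships command-line tools, so the merge can be scripted; a rough sketch of one possible pipeline is below. File names are placeholders and options may vary between hugin versions.)

        import glob
        import subprocess

        shots = sorted(glob.glob("pano_shots/*.tif"))  # hypothetical source images

        def run(cmd):
            print(" ".join(cmd))
            subprocess.run(cmd, check=True)

        run(["pto_gen", "-o", "project.pto"] + shots)                        # project from images
        run(["cpfind", "--multirow", "-o", "project.pto", "project.pto"])    # find control points
        run(["autooptimiser", "-a", "-m", "-l", "-s",
             "-o", "project.pto", "project.pto"])                            # optimise geometry and photometrics
        run(["pano_modify", "--projection=2", "--fov=360x180",
             "--canvas=AUTO", "-o", "project.pto", "project.pto"])           # equirectangular output
        run(["nona", "-m", "TIFF_m", "-o", "remapped_", "project.pto"])      # remap each image
        run(["enblend", "-o", "pano.tif"]
            + sorted(glob.glob("remapped_*.tif")))                           # blend the seams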

  11. FrnchFrgg says:

    Why not do an equirectangular HDR panorama with a good tripod and a good head? Sure, 12 shots for the horizon + 7 * 2 for the up and down rings + 3 * 2 for the poles = 32 shot positions * 6 exposures is a lot, but when a tripod is used, hugin’s automatic aligning does wonders…
    And you have the RAM to merge all these into a 400000*400000 EXR :-)

    Perhaps overkill, but IIUC that’s how they made their light probes at http://gl.ict.usc.edu/Data/HighResProbes/ (with fewer images).
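    (As a quick sanity check on those numbers, a tiny rule-of-thumb calculator; the FOV and overlap values are just assumptions, not what the team uses:)

      import math

      def shots_per_ring(hfov_deg, overlap=0.33):
          """Rule of thumb: cover 360 degrees with roughly one-third overlap per frame."""
          return math.ceil(360.0 / (hfov_deg * (1.0 - overlap)))

      print(shots_per_ring(45))        # a ~45 degree horizontal FOV -> 12 shots on the horizon ring
      positions = 12 + 7 * 2 + 3 * 2   # horizon + two tilted rings + the poles, as above
      print(positions, positions * 6)  # 32 positions, 192 exposures at 6 brackets each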

    • FrnchFrgg says:

      I should have said that I personally tested the method in my own living room with handheld photos, and it already worked great.

  12. Nice. Though I did miss the robot arm in the reflection of the chrome ball and vase.

  13. JaydenB says:

    The best and cheapest option that works is using those garden chrome balls. They even come with little things on the bottom that a tripod could hold (with a small attachment)!

  14. Phil2.0 says:

    I’m really astonished at the way you work: you go on with modeling and scene layout without real, strong art direction or any good, detailed, finished concept artwork.

    You do everything on the go. Or maybe that’s just the impression this blog gives; I can’t say.

    You have what, 3 or 4 months left, and a lot to do. Don’t you think this blog is going to slow you down?

    • kjartan says:

      … no, not really. We only post when it feels right to do it, so it’s not that time consuming.

  15. mastermind says:

    I hope the explosion will look ok…… and the fracturing :D

  16. Luciano says:

    buy a styrofoam ball and send it to a bike shop to be chromed, it’ll be PERFECT and cheap.

    • PhysicsGuy says:

      Styrofoam is not smooth enough to begin with. This means the chrome on top of it will also be bumpy.

  18. 3pointedit says:

    Mirror spheres: lots of old FX blogs talk about this. Most try large Christmas baubles. Others use big bearings. But none of them take volumetric lighting into account :( that is, pools of light that you walk through.

    And I cannot believe that you got a hi-res volumetric render in just 7 SECONDS!! :O

  19. Christoph Pöhler aka Dracio says:

    I don’t have a chrome ball.
    I use a hand-blown Christmas ball (this is serious)
    because it is made of glass and works like a real mirror,
    radius of ca. 15 cm.

  20. delic says:

    I would rent a panoramic camera designed for that, or hire a company that owns one to take the needed sphericals in HDR.

    Surely you can find one in NL.

  21. Tycho says:

    I just realized something: no matter what type of reflective sphere you use, you’re only going to get a hemispherical environment.
    That’s not a problem for reflective shaders, since whatever they reflect (assuming there aren’t pairs of reflective objects facing each other, in other words only direct glossy/diffuse bounces) most likely comes from the hemisphere centred on the camera and facing the scene.
    But what about transmissive shaders? The back half will just be black (I mean there will be the footage but not the lighting), and even if you film in HDR, it would be hard to join the front lighting (from the ball) with the back without a visible seam (showing up on inclined objects).
    So I think a better option is another camera with a fish-eye lens (it doesn’t need to be 4K since it’s just for lighting, and for highly reflective, low-roughness reflections the quality would still be better than a sphere-based environment, even one filmed in 4K).
    What about such a solution?
    But anyway, it would be cool if Blender could support both solutions (fish-eye lens for high quality, a normal chrome sphere for everyone else).
    Keep up the good work.

  22. sozap says:

    Correct me if I’m wrong, but the nice thing about a chrome ball is that you shoot it with the same camera that takes your shot, so you have the same exposure and white balance as the footage you’re filming, right? That’s why chrome balls are used instead of fisheye lenses.
    Do you plan to have one chrome ball per shot, or per scene? And will it be used as a texture in the world (in Blender) or directly in the materials?
    There are plenty of applications for chrome balls; I’m interested to know how you will use them…

    And the robot hand integration is very clean, and the other images show you’re progressing nicely. I’m looking forward to the next blog posts and advancements!

    Keep up the good work!
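    (If it ends up as a world texture, a minimal Cycles setup in Blender Python could look roughly like the sketch below; the image path is only a placeholder, and the scene is assumed to already have a world. For the per-material route, the same image would instead feed an environment/image texture inside the object’s material.)

      import bpy

      world = bpy.context.scene.world
      world.use_nodes = True
      nodes = world.node_tree.nodes
      links = world.node_tree.links

      # Environment texture node pointing at an equirectangular HDR (placeholder path).
      env = nodes.new("ShaderNodeTexEnvironment")
      env.image = bpy.data.images.load("//probes/shot_01_latlong.exr")

      # Feed it into the default Background shader so it lights and reflects the whole scene.
      links.new(env.outputs["Color"], nodes["Background"].inputs["Color"])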

    • yellow says:

      They’re looking at shooting raw for both motion and stills. White balance has nothing to do with raw, no? Perhaps they’ll consider UniWB as well? This is 4K delivery.

  23. Christoph Pöhler aka Dracio says:

    By the way, the picture with the hand is so brilliant it’s madness, because
    my very first thought (after WOW…..) was
    “I hope they sell this in the Blender Store!!!”

  24. Matt says:

    You’re much better off taking panos for reflection maps. You can do it either with a couple of shots from a fisheye, or with several more wide angle shots.

    http://www.fxguide.com/fxguidetv/fxguide-142-nuke-skies-for-happy-feet-2/

    If you don’t have the time or patience to do your HDRIs this way, you can probably get away with assembling those from a chrome ball, but the bumpy reflections in a ball won’t be high enough quality to use for reflection maps, especially when you take distortion, camera reflection removal, etc. into account. A chrome ball (and 18% gray ball) is still a good idea for test frames shot from your actual footage camera, so you can line up exposure and lighting by eye too.

    PS. careful with that ‘west-facade’. The cracks don’t look like they’re made from bricks at all.
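    (The gray-ball lineup is essentially: measure the ball in the linear render or plate and compare it to 18% reflectance. A toy example of turning that into an exposure offset, with made-up numbers:)

      import math

      def exposure_offset_stops(measured_linear_value, reference=0.18):
          """How many stops to shift the CG lighting so the gray ball matches the plate."""
          return math.log2(reference / measured_linear_value)

      # If the gray ball in the linear CG render reads 0.25 instead of 0.18,
      # the CG lighting is roughly half a stop too bright:
      print(exposure_offset_stops(0.25))   # -> about -0.47 stops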

    • Dexter2999 says:

      Not sure that I follow the correlation between the NUKE article (which I have not watched) and HDR for use in Blender. If the skies are done for NUKE then that is the compositing stage, isn’t it? How would that help in attaining a match for the 3D elements to the live plate elements? And do these techniques translate to Blender’s compositor?

      Didn’t watch the article (yet) as I wasn’t terribly fond of the original HAPPY FEET, at least not enough to watch the sequel.

  25. Hubberthus says:

    Showed the robot arm pic to my wife, she just looked at it and said: What?
    I smiled which made her say: What? Did they print it out?
    Had to tell her that the arm was rendered in Blender because she didn’t have a clue why I was still smiling. :D

  26. Sam Schad says:

    Ton-
    Fly me over from the States and I’ll do all your HDRI panos on-site the week of shooting. (I’m only half joking!)

    Back to the question at hand: any chance of someone posting a screenshot of the node setup on the hand? Would love to take a look at that.

  27. Sam Schad says:

    Ton-
    Have you ever met this gentleman? Looks local to you, and does some really great stuff.

    http://www.bobgroothuis.com/blog/category/dutch_skies_360_online_shop/?lang=en

  28. roofoo says:

    I have a chrome juggling ball that works well. You can get them from juggling supply stores, or here. http://www.jugglingstore.com/store/detail.aspx?ID=809

  29. Jon N/A says:

    *Looks at the Musical Instrument as if it was a cake* Yum!

  30. andreas says:

    Just started my own HDRI environments and texture collection that you might want to check out: http://www.hdri-hub.com

    I make them at pretty high resolution, so I can also use them as backplates.