So, I've done some googling in the past, but mostly come up dry for what I want to know. I thought maybe there were some polycounters with more expansive knowledge they might want to share.
Games these days have post-processing that lets us make tweaks and modifications to the final frame. The kind of stuff I've seen boils down to:
- Tone mapping and HDR settings within the renderer that affect the brightness and contrast of the final scene. Basic exposure stuff.
- Photoshop adjustments such as Hue/Sat, Selective Color, Color Balance (nice to have, but they are cumbersome)
- A rough color mapping technique that lets you bake a bunch of Photoshop adjustments into a texture. It's a grid of colors that the game uses as a lookup (for example, if a color in the final frame is [255, 0, 0], look to this coordinate in the texture; it's colored [240, 20, 0], so use that color instead). This lets you use any color or value adjustment you can find in Photoshop, but it's coarse-grained.
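To make that last technique concrete, here's a minimal Python sketch of the lookup idea, assuming a 16-steps-per-channel grid (the names and the 16-step size are just for illustration, not from any particular engine):

```python
# Sketch of the color-mapping idea above: remap a pixel's color by looking
# it up in a coarse grid of colors. Names here are hypothetical.

LUT_SIZE = 16  # 16 steps per channel -> a 16x16x16 grid of colors

def identity_lut():
    """Build a LUT that maps every color back to itself (no change)."""
    step = 255.0 / (LUT_SIZE - 1)
    return {
        (r, g, b): (round(r * step), round(g * step), round(b * step))
        for r in range(LUT_SIZE)
        for g in range(LUT_SIZE)
        for b in range(LUT_SIZE)
    }

def apply_lut(color, lut):
    """Snap an 8-bit RGB color to the nearest LUT cell and return its value."""
    key = tuple(round(c / 255 * (LUT_SIZE - 1)) for c in color)
    return lut[key]

lut = identity_lut()
print(apply_lut((255, 0, 0), lut))  # -> (255, 0, 0) with an identity LUT
```

The coarse-grain problem is visible right in `apply_lut`: nearby input colors snap to the same cell, so two colors only ~2 levels apart can come out identical.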
What I want to know is: What are the tools Filmmakers have and how do they use them?
I have it in my mind that Cinematographers have this whole library of techniques available to them to get interesting looks to their films, and I want access to that library.
I've heard of "color timing" software. What is this and how does it work? It can't just be Photoshop adjustment layers over time, right?
There was some special chemical film development process used on Saving Private Ryan that gave it its desaturated, contrasty look. What other techniques like this are there?
What are some of the common approaches to photography that we should mimic with our HDR and tone mapping settings?
Replies
But the basics are:
- Scan your video, analog or digital.
- Edit. It's actually similar to working in layers in Photoshop, though the limits depend on the app (and on your imagination, of course).
- Bring in your sound and edit it.
- Output the final file.
Alex
I mentioned the Saving Private Ryan thing because it's a cool effect to try and mimic.
I'm just wondering if anyone is aware of any techniques beyond what I mentioned and beyond what you can find in Photoshop.
Yeah, ok. I'm not talking about how to edit clips or sound. I'm talking about color timing, photography, and post processing. Not what the editor or sound guy does, what the cinematographer does.
Cinematographers really don't do those things, as far as I know. All of the stuff you're referring to is done after the film is shot. There's a whole slew of people devoted to this aspect of filmmaking. Some people, like a Colorist, handle the aspect you're referring to in Saving Private Ryan, setting and correcting the colors of the movie; then you have editors for timing, other people who take care of green-screen cleanup, etc. There are a variety of programs used, some for each specific job, and besides what's already been mentioned there are also Flint and Inferno.
It's more complicated with color film, but for black and white the 'timing' really just affects the final exposure of the film. For color film I'd imagine you have more control over individual color ranges... not sure though, since I've never dealt with any color film developing.
The more modern stuff I think is classified as Color Grading.
Here check this out:
http://en.wikipedia.org/wiki/3D_LUT
That seems to be the hi-res version of what I've seen in a couple of games.
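The 3D LUT article explains why the hi-res version doesn't have the coarse-grain problem: rather than snapping to the nearest cell, you interpolate between the eight surrounding LUT entries, which is what the GPU's trilinear texture filtering gives you for free. A pure-Python sketch of that sampling, assuming a tiny 4x4x4 identity LUT (all names and sizes here are made up for illustration):

```python
# Trilinear sampling of a 3D LUT: blend the 8 LUT entries surrounding the
# input color instead of snapping to one cell. Hypothetical names/sizes.

def lerp(a, b, t):
    return a + (b - a) * t

def sample_lut_trilinear(color, lut, size):
    """color: RGB floats in [0, 1]; lut: dict (i, j, k) -> RGB tuple."""
    coords = [c * (size - 1) for c in color]
    lo = [min(int(c), size - 2) for c in coords]  # lower cell corner
    t = [c - l for c, l in zip(coords, lo)]       # fraction inside the cell
    out = []
    for ch in range(3):
        i, j, k = lo
        # interpolate along red, then green, then blue
        c00 = lerp(lut[(i, j, k)][ch],     lut[(i+1, j, k)][ch],     t[0])
        c10 = lerp(lut[(i, j+1, k)][ch],   lut[(i+1, j+1, k)][ch],   t[0])
        c01 = lerp(lut[(i, j, k+1)][ch],   lut[(i+1, j, k+1)][ch],   t[0])
        c11 = lerp(lut[(i, j+1, k+1)][ch], lut[(i+1, j+1, k+1)][ch], t[0])
        out.append(lerp(lerp(c00, c10, t[1]), lerp(c01, c11, t[1]), t[2]))
    return tuple(out)

# Identity 4x4x4 LUT: each cell stores its own normalized coordinates,
# so sampling it should return the input color (nearly) unchanged.
size = 4
lut = {(i, j, k): (i / (size-1), j / (size-1), k / (size-1))
       for i in range(size) for j in range(size) for k in range(size)}
```

Because of the interpolation, even a small cube gives smooth results for colors that fall between cells.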
also:
http://en.wikipedia.org/wiki/Color_grading
It appears Lustre is closest to what I'm talking about. Flint and Inferno appear to be compositing apps.
http://usa.autodesk.com/adsk/servlet/index?id=7668806&siteID=123112
watch the "Color Grading Master Techniques" vids - good stuff.
also:
http://www.speedgrade.com/onset/
http://www.speedgrade.com/di/
Anybody ever used this stuff before?
Tools: I mentioned software because it sounded like you wanted to recreate these effects in a game engine, so I'm thinking in those terms. The look a film has depends greatly on the lens, and then there's how the Director wants the camera moved. The biggest tool film has that CG doesn't is light and atmosphere.

Other things used that most 3D software has too: the ability to use gels and masks on lights. Gels add color; masks don't. This is kind of like the projectors used in UnrealEd. There's also depth of field. To get contrasty video you can do it with software to a degree, or go film at the right time of day. You can also put filters on your camera, and if you want to try some experimental crap, put Vaseline on the lens, or just dirty the thing up. I'm not sure of all the bells and whistles professional video cameras have, but I imagine you can set things like aperture and film speed or some equivalent.

My photography professor used to do all kinds of crazy shit to get his shots. For example, he cut a hole into a frozen lake around sunrise and set the camera on a self-timer. He got into the frozen lake, swam to the hole he'd made, held his hand up at the right time, and when the sun was just shining on the apple he was holding to cast an awesome highlight, the camera went off and he had his shot.
Nowadays, Color Timing refers to the practice of color-adjusting each shot to ensure color continuity.
Bleach Bypass is a good one. Check out Cross Processing.
At the company I used to work for, we'd try out effects in Photoshop on still frames, get a look we liked, and then emulate that with Final Cut or Color.
It's a very cumbersome way of grading the scene over time, though, and it requires those quite extensive toolsets.
A game engine should give you all that info for free as part of the rendering process. It's just a matter of exposing that info and connecting it to post-processing effects in an artist-friendly way, I'd assume?
AFAIK, they used Colossus on LOTR, which now goes by the name of... Lustre. Autodesk assimilates them all!
Each corner of the box is a different primary color: RGB, plus white and black, and I can't remember the last ones, maybe yellow? This is the setup for no color change.
Then, to get the color changes I'd like, I take some screenshots of the game, add Adjustment Layers for the changes I want, then apply those layers to the cube bitmaps.
You can do a hell of a lot this way. We can even add noise to the images, which does some interesting things.
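The "bake the adjustment into the cube" workflow can be sketched in a few lines: run the identity cube's colors through the same adjustment you applied to the screenshot, and the resulting LUT reproduces that look in-game. The desaturate-plus-contrast grade below is a hypothetical stand-in for whatever Adjustment Layers you'd actually use:

```python
# Baking a Photoshop-style adjustment into an identity colorcube.
# The specific grade (50% desaturate + mild contrast) is made up.

def adjustment(rgb):
    """Example grade: partial desaturation plus a contrast push."""
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b  # Rec. 601 luma weights
    graded = []
    for c in (r, g, b):
        c = luma + (c - luma) * 0.5           # 50% desaturate
        c = (c - 0.5) * 1.2 + 0.5             # mild contrast boost
        graded.append(min(max(c, 0.0), 1.0))  # clamp to [0, 1]
    return tuple(graded)

size = 16
identity = {(i, j, k): (i / (size-1), j / (size-1), k / (size-1))
            for i in range(size) for j in range(size) for k in range(size)}
baked = {key: adjustment(rgb) for key, rgb in identity.items()}
```

Anything you can express as a per-color function — including one sampled from a graded screenshot — can be baked this way; the LUT doesn't care how the adjustment was produced.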
We're also using 1D gradients for simple things like a thermal imaging look, night vision, that kind of thing. Same technique as the volume texture, just less control. Someone else posted some examples of 1D gradients not long ago, I think it was poopinmymouth?
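The 1D-gradient variant is even simpler to sketch: collapse each pixel to a single luminance value, then look that up in a one-dimensional color ramp. The black-to-purple-to-orange-to-white "thermal" ramp below is invented for illustration:

```python
# 1D gradient lookup: map luminance through a color ramp, e.g. for a
# thermal-imaging look. The ramp colors here are made up.

RAMP = [(0.0, 0.0, 0.0), (0.3, 0.0, 0.5), (1.0, 0.5, 0.0), (1.0, 1.0, 1.0)]

def sample_ramp(t):
    """Linearly interpolate the ramp at t in [0, 1]."""
    t = min(max(t, 0.0), 1.0)
    pos = t * (len(RAMP) - 1)
    i = min(int(pos), len(RAMP) - 2)
    f = pos - i
    a, b = RAMP[i], RAMP[i + 1]
    return tuple(ca + (cb - ca) * f for ca, cb in zip(a, b))

def thermal(rgb):
    """Replace a pixel's color with the ramp color for its luminance."""
    r, g, b = rgb
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    return sample_ramp(luma)
```

This is why it's "less control" than the volume texture: two colors with the same luminance always map to the same output, no matter their hue.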
Other film-like effects are now being mimicked with HDR/Tone Mapping and screen space shaders (DOF and Vignette).
The last thing I think would be fun would be being able to draw masks from the game (like, here's a sky mask, or here's a character's head mask, or here's a depth mask), and apply the LUT based on that. It would let you, for example, shift the scene towards a simplified color palette in the distance.
Or you could use the scene depth pass to modulate between untouched framebuffer and LUT'd framebuffer. We do something similar here for a particular visualization mode, anything within a certain area is shaded normally and everything outside it uses a totally different draw mode, basically just using an opacity-mapped mesh as the mask between them. Can't wait to play around with this more.
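The depth-modulated version reduces to a single lerp per pixel: use the depth value as the blend weight between the untouched framebuffer color and the LUT'd color. A sketch, assuming depth has already been normalized to a [near, far] range (the parameter names are hypothetical):

```python
# Depth-modulated grading: blend original and LUT'd color by scene depth,
# so distant pixels take the full grade. Names/ranges are illustrative.

def depth_blend(original, graded, depth, near=0.0, far=1.0):
    """Returns original at depth <= near, graded at depth >= far."""
    t = min(max((depth - near) / (far - near), 0.0), 1.0)
    return tuple(o + (g - o) * t for o, g in zip(original, graded))
```

Swapping `depth` for any other mask value (sky mask, character mask) gives the mask-driven variant from the previous post with the same one-liner.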
I dug up the colorcube we use for the 3D LUT. The far corner is white, so that gives you all three primaries, the three secondaries, and black and white, for a nicely adjustable color range.
You just have to slice it up such that the edge pixels stay as 100% values, not watered down. Like if you take a 16x16 pixel slice, the corners need to be adjusted to be solid colors. Dunno if that makes sense.
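The edge-pixel point can be made concrete with a little generator: if each axis runs i/(N-1) instead of (i+0.5)/N, the first and last texels of every slice land exactly on 0 and 255, so the corners stay solid primaries. A sketch of building a 16x256 identity strip (16 slices of 16x16 laid side by side; the layout is one plausible convention, not necessarily the one the poster's engine uses):

```python
# Build an identity colorcube as a 2D strip: 16 slices of 16x16, one slice
# per blue level, red across each slice, green down. Edge texels use
# i/(N-1) so they stay at exact 0/255 values, as described above.

N = 16

def identity_strip():
    """Return rows of a 16x256 identity LUT strip as (r, g, b) byte tuples."""
    rows = []
    for g in range(N):                  # vertical axis: green
        row = []
        for b in range(N):              # one slice per blue level
            for r in range(N):          # horizontal within a slice: red
                row.append(tuple(round(c * 255 / (N - 1)) for c in (r, g, b)))
        rows.append(row)
    return rows

strip = identity_strip()
```

With this layout the top-left texel is pure black, the bottom-right texel of the last slice is pure white, and the slice corners hit the exact primaries — the "100% values, not watered down" requirement.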
http://www.videocopilot.net/tutorials.html?id=46