Does anyone have any experience trying to create an HDR sphere from a single photo? Most of the info I've found has to do with going out and taking pictures at various exposures, but what about faking it from a single pic?
This seemed to be kind of close:
http://www.luminous-landscape.com/tutorials/hdr.shtml but again it looks like these were photos taken on location w/ known f-stops. Hrrmm.
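One way to fake it from a single shot is inverse tone mapping: leave midtones alone and ramp the brightest LDR values up past 1.0. A rough Python sketch of the idea (the curve, threshold, and boost values are all made up for illustration, not from any particular tool):

```python
def ldr_to_fake_hdr(pixel, boost=4.0, threshold=0.8):
    """Expand a 0-1 LDR value into a pseudo-HDR value."""
    if pixel <= threshold:
        return pixel  # shadows and midtones stay put
    # Remap the top of the range so 1.0 LDR becomes (1 + boost) HDR
    t = (pixel - threshold) / (1.0 - threshold)
    return pixel + boost * t * t  # quadratic ramp into the "overbright" range

image = [0.1, 0.5, 0.85, 1.0]  # one scanline of grey values
hdr = [ldr_to_fake_hdr(p) for p in image]
```

You lose the real radiance information a bracketed set would give you, but it can be enough to drive blooming highlights from a single source photo.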
Replies
This should be applicable to packages other than Max as well.
If you compile these things into a single HDR image, they can then be used in any environment that supports them.
1. HDR does not mean bloom.
2. You do not need any form of HDR to render bloom
3. Tone mapping or exposure control is usually the only benefit HDR rendering has in a game, you can blind people when they look at the sun and then adjust the exposure as their "eye" adjusts.
4. I bet most game artists would be hard pressed to tell the difference between xbox360 games rendered in full HDR or a game rendered with an "overbright" solution in a 0-2 range.
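The "eye adjusts" effect in point 3 is usually just the exposure drifting toward whatever maps the current scene's average luminance back to mid-grey. A minimal sketch, with made-up constants:

```python
def adapt_exposure(current_exposure, avg_luminance, dt, speed=2.0, key=0.18):
    """Move exposure toward the value that maps avg_luminance to mid-grey (0.18)."""
    target = key / max(avg_luminance, 1e-4)
    # Exponential ease so the "eye" takes a moment to adjust
    t = 1.0 - pow(0.5, dt * speed)
    return current_exposure + (target - current_exposure) * t

# Player turns from a dark room (avg ~0.05) to stare at the sun (avg ~8.0):
exposure = 0.18 / 0.05          # fully adapted to the dark room
for _ in range(60):             # one second at 60 fps
    exposure = adapt_exposure(exposure, 8.0, dt=1.0 / 60.0)
```

For that first second the screen is blown out, then the exposure settles and detail comes back, which is the blinding effect described above.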
For example, would I create an HDR for a character diffuse? If so, I'm not sure how the processes shown above would apply to creating an HDR from its LDR, since it isn't an environment and isn't photo-sourced.
One thing I think is being confused here: you never have to make HDR images for a game to use HDR lighting. The code can simply allow lights to have a range beyond 0-1, and the renderer will just adjust the exposure to the light.
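In code, "adjust the exposure to the light" just means lighting is accumulated without clamping, then a tone-mapping step squeezes the result back into displayable 0-1. A sketch using Reinhard's simple operator (any given engine may do something else):

```python
def tone_map(hdr_value, exposure=1.0):
    """Map an unbounded HDR value into displayable 0-1."""
    v = hdr_value * exposure
    return v / (1.0 + v)  # Reinhard: maps [0, inf) into [0, 1)

# A light with intensity 6.0 is legal if the renderer doesn't clamp to 0-1:
lit = 6.0 * 0.9   # light intensity * surface albedo
bright = tone_map(lit)     # bright but never blown past 1.0
shadow = tone_map(0.02)    # shadows stay dark
```

No HDR textures anywhere in that pipeline; only the light values and the framebuffer need the extra range.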
Another interesting lighting demo I saw recently (listening in on GDAlgorithms). Not HDR, but a possibly robust realtime global illumination system. Nice looking, would solve a lot of problems, but I'm really skeptical it could be used in a real game.
[ QUOTE ]
Putting HDR bitmaps into a game is not practical yet, as they are 32-bit images, so you eat up the texture RAM quickly.
[/ QUOTE ]
You can do some fancy tricks though - i.e. putting an "exponent" value into the alpha channel lets you drive the brightness much higher, and the texture can still be saved as a regular DXT or whatever. Just a little pixel shader math to bring it back to near full quality. You lose a bit of detail (banding) due to quantizing the data - but it can look pretty good. Mostly just used for skyboxes anyway, and overdriving the brights.
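A sketch of that alpha-as-exponent trick (similar in spirit to "RGBM"-style encodings); the max-brightness constant is an assumption, and real DXT storage would quantize these channels to 8 bits, which is where the banding comes from:

```python
MAX_BRIGHTNESS = 8.0  # how far past 1.0 the encoding can reach (assumed)

def encode(r, g, b):
    """Pack an HDR color into LDR-friendly RGBA, all channels in 0-1."""
    m = max(r, g, b, 1e-6)
    m = min(m, MAX_BRIGHTNESS)
    a = m / MAX_BRIGHTNESS                # the brightness multiplier goes in alpha
    return (r / m, g / m, b / m, a)

def decode(r, g, b, a):
    """The 'little pixel shader math' that brings it back."""
    m = a * MAX_BRIGHTNESS
    return (r * m, g * m, b * m)

rgba = encode(4.0, 2.0, 0.5)    # an overbright sky pixel
restored = decode(*rgba)        # back to (4.0, 2.0, 0.5)
```

The decode is cheap enough to inline in a pixel shader: one multiply to recover the scale, one to apply it.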
An HDR texture usually combines a few different exposure times of a single still image. Combined, they cover several light ranges to mix between. Use it as a cube map in an engine or renderer that supports those extra light ranges and you get all the nifty blinding-light effects.
Correct me if i'm wrong.
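That's roughly it. The merge step divides each bracketed shot by its shutter time to estimate radiance, then averages the estimates, trusting well-exposed pixels most. A toy sketch for one pixel (real tools, e.g. Debevec-style recovery, also solve for the camera response curve, which is skipped here):

```python
def merge_exposures(pixels, times):
    """pixels: one scene point's 0-1 value in each bracketed shot;
    times: the matching shutter times in seconds."""
    total, weight_sum = 0.0, 0.0
    for p, t in zip(pixels, times):
        w = 1.0 - abs(2.0 * p - 1.0)   # trust mid-range values, not clipped ones
        total += w * (p / t)           # radiance estimate from this exposure
        weight_sum += w
    return total / max(weight_sum, 1e-6)

# The same bright scene point shot at 1/4s, 1/16s, and 1/60s:
radiance = merge_exposures([1.0, 0.9, 0.4], [0.25, 1 / 16, 1 / 60])
```

The fully clipped 1/4s sample gets zero weight, so the merged radiance comes from the shorter exposures - that's how the combined image can hold values far above what any single shot could.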
We use the tech Whargoul mentioned, alpha is the 0-200% exponent, for skybox/reflect/transmit cubes and for emissive-glow fx. Then we have a camera-like global exposure control that uses those alphas and any high-exponent light values to influence the over/under exposure. Not fully working yet, but pretty cool so far.
Here's a cool little demo of HDR exposure control, etc. Fun to play with, if you have a decent card.
I've been painting on a Linear Dodge layer, then moving that into the alpha. Or just copying the RGB and Levelling it down some. It's just not very intuitive I guess.
btw, I saw this paper today. Nice writeup of how/why bloom/flares occur with the human perceptual system.
Physically-Based Glare Effects for Digital Images (3MB PDF)
Suffers a bit from programmer art syndrome, but the ideas are very readable.