A normal map stores normal vectors. Each RGB channel in your image stores one of the vector's components (XYZ). Some shaders expect the vectors constructed from these values to have a length of 1. However, if you just paint a normal map by hand, there's no way for you to tell how long the vector a certain pixel represents really is.
"Normalizing" takes care of that. It adjusts the pixel's RGB colors so that the resulting vector has a length of 1.
Some shaders don't care if the map is normalized or not, so you don't see a difference when loading un-normalized and normalized maps.
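A minimal sketch of that adjustment (Python; assuming the usual convention of mapping 0–1 color channels to -1..1 vector components — check your own pipeline's encoding):

```python
import math

def decode(rgb):
    # Map each 0..1 color channel to a -1..1 vector component.
    return tuple(2.0 * c - 1.0 for c in rgb)

def encode(vec):
    # Map -1..1 components back to 0..1 colors.
    return tuple((v + 1.0) / 2.0 for v in vec)

def normalize_pixel(rgb):
    x, y, z = decode(rgb)
    length = math.sqrt(x * x + y * y + z * z)
    # Scale every component by the same factor so the vector has length 1.
    return encode((x / length, y / length, z / length))

# A hand-painted pixel: the decoded vector usually isn't unit length.
painted = (0.6, 0.55, 0.9)
x, y, z = decode(painted)
print(math.sqrt(x * x + y * y + z * z))  # not 1.0
print(normalize_pixel(painted))          # colors adjusted so the length is 1
```

The channel values change slightly, but the direction the pixel's normal points in stays the same.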
But to a layman, what does that actually mean, Kwramm? I.e. what would a vector length of 2 look like? To me the only difference is that when you normalize the map it seems to add something to the blue channel.
I also haven't seen a huge difference when the maps are off, though they usually aren't off by a lot. I have had normalization be a requirement for tech reasons though.
At my previous company, we used performance-optimized textures that only saved 2 channels' worth of normal information. So for example, the texture only used the red and green channels of the normal map, and we used the blue channel to store the spec map. That way we could use a single DXT1 texture instead of using a separate spec and normal texture. Knowing that the vector's length had to be 1.0, the engine could reconstruct the missing channel from the other two (z = sqrt(1 - x² - y²)). However, if you had manually painted your normal map without re-normalizing, and your pixel was .5, .3, .7, the engine would see .5 and .3 and calculate its own blue value, which wouldn't be the .7 you actually wanted, so your in-game normal no longer matches the one you painted.
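A quick sketch of that two-channel reconstruction (Python; treating the values as already-decoded -1..1 vector components, which is an assumption for illustration):

```python
import math

def reconstruct_z(x, y):
    # The stored R and G channels give x and y; the unit-length
    # constraint gives back z = sqrt(1 - x^2 - y^2).
    # max() guards against negative values from un-normalized input.
    return math.sqrt(max(0.0, 1.0 - x * x - y * y))

# If the painted normal was unit length, the reconstruction matches:
x, y, z = 0.36, 0.48, 0.8          # 0.36^2 + 0.48^2 + 0.8^2 = 1.0
print(reconstruct_z(x, y))          # 0.8, same as the painted z

# A hand-painted, un-normalized pixel: the painted z is simply lost.
x, y, painted_z = 0.5, 0.3, 0.7
print(reconstruct_z(x, y))          # ~0.812, not the 0.7 you painted
```

This is exactly why a two-channel pipeline forces normalization: the engine never sees your blue channel at all, so the only way your painted normal survives is if it was unit length to begin with.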
Ruz: the effects you encounter would depend on what the shader or your pipeline does with the map.
It could be stuff like PredatorGSR mentioned, or artifacts, or wrong-looking normals which are too strong or too light... some shaders/pipelines can handle non-normalized maps, others don't.
Most viewport shaders just don't seem to care - for personal work I don't bother normalizing. But it's different when your stuff ends up in a game, where it's processed by build tools and stuff gets converted and packaged, etc.
If you're in doubt, just normalize your map to be on the safe side, I'd say...or just ask the TAs in your company what to do.
But to a layman, what does that actually mean, Kwramm? I.e. what would a vector length of 2 look like? To me the only difference is that when you normalize the map it seems to add something to the blue channel.
Essentially, normalising takes the vector (in this case RGB, representing XYZ) and multiplies each channel by the same value so that the length of the vector becomes 1.
That means that the normal value of that pixel still points in the same direction (each value has been multiplied by the same amount), just that its length is the same as every other pixel's (1). Otherwise you get things like overbrightening when the length is more than 1, or overdarkening when the length is less than 1.
So it'll affect the blue channel because it's being multiplied by the same amount as the R and G channels for that pixel. As the blue channel is 1 by default (in tangent space, at least) then it'll more than likely get darkened a little, which'll be most noticeable on the blue.
Some engines will want normalised maps coming in, some will normalise them themselves, some won't care.
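Sticking with a small Python sketch of the points above (assuming a tangent-space map where blue defaults to 1, i.e. a flat normal of (0, 0, 1)): dividing every component by the same length preserves the direction, and since z tends to sit near its maximum, it's usually the channel that visibly dips.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    # Every component is divided by the same factor, so the
    # direction is unchanged -- only the length becomes 1.
    return tuple(c / length for c in v)

# A slightly "too long" tangent-space normal (z at its default of 1.0).
v = (0.3, 0.2, 1.0)
n = normalize(v)
print(n)  # same direction, unit length -- z has dropped below 1.0
```

Encoded back to a color, that lower z is a slightly darker blue channel, which matches what Ruz noticed when comparing the maps.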
Great, thanks for the info everyone. So I asked our rendering engineer what the point is, and he said exactly what you guys said. Interestingly, though, he also said he renormalizes all the textures in the runtime shader, so basically it doesn't matter, as he has to do that step anyway to account for trilinear/anisotropic filtering at run time. I'll put a normal map side by side in the engine today and see if there is a difference; seems like I don't have to do this step in our engine.
At my previous company, we used performance optimized textures that only saved 2 channels worth of normal information.
... in this case the shader is calculating the blue channel data for you, so you automatically get normalized normals.
The artifacts I've seen were only apparent when the shader was written to not renormalize, which I think is pretty rare. When it happened, my specular highlight rendered as a noisy mess.
It's still good to normalize a map even if the shader normalizes at the end. For example, Unreal Engine normalizes the result of whatever nodes are plugged into the "Normal" input. The problem is it doesn't fix any combining, mixing, multiplying, or whatever you're doing before that end result. So if you have one normal map with crazy values mixing with a normalized normal, you could skew the normals in a way that you're not expecting.
Generally, if you're blending your maps in a smart way, there is really no reason to re-normalize. But if you put a normal from a bump image over your baked normals and just set it to overlay, without tweaking the blue channel, you definitely will want to re-normalize.
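To make the blending point concrete, here's one illustrative way to combine a detail normal with a base normal (a simple UDN-style blend — a hypothetical choice for this sketch, not the overlay blend mode itself), in Python with -1..1 components:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def blend_detail(base, detail):
    # UDN-style blend: add the detail's x/y tilt onto the base
    # and keep the base z.
    combined = (base[0] + detail[0], base[1] + detail[1], base[2])
    # Without this final step the combined vector's length drifts
    # away from 1, which skews lighting downstream.
    return normalize(combined)

base = normalize((0.1, 0.2, 1.0))
detail = normalize((-0.3, 0.05, 1.0))
print(blend_detail(base, detail))  # blended and re-normalized
```

The takeaway matches the post above: whatever blend you use, the raw combined vector is generally no longer unit length, so the re-normalize at the end is the part that keeps the result well-behaved.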
I just do it out of habit when making normal maps. To me it's akin to triangulating a mesh before exporting to an engine.
I'm not sure if I've ever seen any of the problems listed in the wiki article but I haven't not renormalized any normal maps in a very long time...