How to: Directional lightmaps with 3dsMax/Vray

Noors polycounter lvl 8
Hey ppl :)

I'm in the process of improving our lighting workflow at work.
Problems:
No one here really has skills in advanced lighting techniques.
Also, I'm a simple CG artist and technical papers are beyond me.
I'm very sorry if my technical vocabulary is approximate and my post as messy as my brain.

We need to stay "mass appeal" (also web appeal :s), so we can't use full-blown dynamic lighting with SSAO and the like. So I've looked at the Radiosity Normal Mapping (aka Directional Lightmaps) technique from Valve's HL2, which looks like a good compromise for us.

Basically, it allows you to use normal maps on totally prelit models, with 3 lightmaps storing the light incoming from 3 directions.
In other words, the normal map influences the diffuse of lightmapped models.

As described here

and improved here with self-shadowing normal maps

I've found a lot of information, mainly from the research of Ali Rahimi,
who generates the 3 lightmaps in Maya with Turtle there

The latest version of Lumonix ShaderFX includes a radiosity normal mapping node.

The shader needs a modified normal map called ssbump in order to speed up calculations. An ssbump can be generated with this

The normal map then can't be used for specular calculations anymore, but that's another problem.

I was looking for a way to generate the lightmaps with Vray in Max. Our editor doesn't support lightmap generation *shrug*
From what I understand, there are 2 ways to achieve it:

Skewing the mesh normals with a MaxScript so they all point in the right direction before each lightmap bake.

Using a NormalBump/VrayNormalMap in the bump channel of the max/vray material, with special normal maps based on the Valve paper. I'm using maps from Viik in this thread

I'm mostly failing at the moment, because light doesn't seem to come from the right directions on my normal map. Actually, it's like there's no logic to it: on the same angled plane, the normal map will react differently :poly132: I bet it has to do with UVs and tangent/binormal stuff. I tried swapping the RGB maps, but no success. It's a bit complex for me.

I know Neox from Polycount used directional lightmaps, generated with Vray, on Airborn, so I guess it's possible :p

QUESTIONS ARE

Do you use the same process to produce directional lightmaps in 3dsMax? Can you describe it? Yeah Neox, you!

I don't understand which transformations I have to apply to my mesh normals to skew them the right way. Does someone have a little script to do it, based on the vectors described in Valve's paper?
I read that UDK doesn't use this technique anymore. What could be the alternative for using lightmaps and normal maps at the same time, at a low calculation cost?

Thank you very much for your time.


For the record: UDK uses the same technique for their 3-component lightmaps, though it is coupled with a shadow map to hide specular in shadowed areas.

Their "simple lightmap", used in the Citadel demo, is just a basic HD lightmap with the normal map relief baked into it; thus, there's no normal map in the final file (and no specular, I guess).

Replies

  • Eric Chadwick
    Sorry I don't have any info to share, but this is a cool post, hope you work it out.
  • o2car
    We used Radiosity Normal Maps in Mirror's Edge. Very memory-consuming though; that's why they changed it in the UDK and later versions of Unreal 3.
  • hijak
    xNormal can bake radiosity normal maps, although I've never had a reason to try it or seen it done.
    I would think, depending on your situation, you could do something like this: create bent normal maps and a lightmap cube, and have it sample lighting from the cube. Might yield decent results and transition well into self-shadowing.
    Also, if you plan to use baked lightmaps, why not just bake the lightmaps with the normal maps applied? This might yield decent results. Just throwing ideas out there.
  • commander_keen
    I would say you should try adding normal map and specular map info into the screen buffer in a separate pass, like this:

    Pass 1: render the object with diffuse and lightmap.
    Pass 2: render the normal map and specular highlights (using actual light positions, or perhaps prebaked per-lightmap-pixel light normals).
    Pass 3: render the normal map and specular shadow.

    then combine them like this:
    finalColor = (Pass1 + Pass2) * Pass3

    This method lets you easily scale it for lower-end cards: you can simply disable the last 2 passes if they are not supported by the graphics card.

    edit: the last 2 passes could be combined into a single pass too, as long as you don't have to save the pass to a texture for later use.
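    The combine step above can be sketched numerically. A minimal Python illustration (the pass values are hypothetical; everything is assumed to be linear RGB):

```python
# Hypothetical per-pixel combine for the three passes described above.
# pass1 = diffuse * lightmap, pass2 = additive highlights,
# pass3 = specular shadow mask (1.0 = fully unshadowed).

def combine_passes(pass1, pass2, pass3):
    """finalColor = (Pass1 + Pass2) * Pass3, clamped to [0, 1] per channel."""
    return tuple(min(1.0, (a + b) * c) for a, b, c in zip(pass1, pass2, pass3))

# Example: a lit base color, a small highlight, a half-strength shadow.
result = combine_passes((0.4, 0.3, 0.2), (0.2, 0.2, 0.2), (0.5, 0.5, 0.5))
print(tuple(round(c, 3) for c in result))  # (0.3, 0.25, 0.2)
```

    Note how the fallback path is free: with pass2 = (0, 0, 0) and pass3 = (1, 1, 1) the result is just Pass1, i.e. the plain lightmapped render.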
  • Ali Rahimi Shahmirzadi
    If you only have 1 light (sunlight, for example), there is no need for RNM. But if you have lots of lights, RNM is the best choice. As far as I know, no other render engine supports RNM (only Turtle).
  • jazznazz
    Hi everyone, first post here. Nice forum, a helpful one too :)
    @Noors - I'm using exactly the same workflow you described: max + VrayRawTotalLight bake element + those 3 R.tiff, G.tiff, B.tiff bitmaps set as the normal map in the material. Like you, I couldn't find a different approach. Right now the programmer is writing the shader; we'll see what comes out of it. I'll post screens later. Good luck.
  • jazznazz
    Hi again. Vray is the best renderer out there in my opinion, but I couldn't make it render DLM :( I switched to Maya/Turtle; so far it's working just great. Noors, any luck with Vray?
  • Noors
    Hey guys, thanks for your replies.

    Nope, I didn't do further tests, though I'm pretty sure it's possible to skew the mesh normals with MaxScript and then render each directional lightmap correctly.
    Yeah, it looks like it's a lot easier with Turtle, so I'll run tests with that.

    Oh btw, thanks Ali, I didn't notice your reply; I'm littlebigkebab on the ShaderFX forum ^^

    I will look more deeply into all your advice, but now I'm also interested in how the lightmaps interact with normal maps in UDK. Looks like they only use an average direction of all lights, right?
    Bah, I wish our engine did that stuff automatically ^^
    Thanks!
  • Noors
    Hello there!
    Here is, finally, how I've done it with Vray. It's fairly simple, but since I'm bad at understanding technical papers, I struggled for a long time.
    So I've simply overridden the mesh normals by applying 3 different materials, each with a special color as the normal map in the bump channel. This is the technique I was talking about in the first post, but sadly the colors were wrong, and I didn't have a clue how that guy found them.
    They are just based on the Valve paper and its tangent basis. The formulas, which can be found in the Valve paper, are:
    X { sqrt(2/3), 0, 1/sqrt(3) }
    Y { -1/sqrt(6), 1/sqrt(2), 1/sqrt(3) }
    Z { -1/sqrt(6), -1/sqrt(2), 1/sqrt(3) }

    Once normalized to 0-1 (add 1, then divide by 2), this gives:

    X/red
    0.908248
    0.5
    0.788675

    Y/green
    0.295876
    0.853553
    0.788675

    Z/blue
    0.295876
    0.146447
    0.788675
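    As a sanity check, those colors can be recomputed directly from the Valve basis vectors; any last-digit differences from the list above are just rounding:

```python
import math

# The three Valve RNM tangent-space basis vectors (from the HL2 paper),
# remapped to 0-1 "normal map colors" via (v + 1) / 2.
BASIS = {
    "X/red":   ( math.sqrt(2.0 / 3.0),  0.0,                  1.0 / math.sqrt(3.0)),
    "Y/green": (-1.0 / math.sqrt(6.0),  1.0 / math.sqrt(2.0), 1.0 / math.sqrt(3.0)),
    "Z/blue":  (-1.0 / math.sqrt(6.0), -1.0 / math.sqrt(2.0), 1.0 / math.sqrt(3.0)),
}

for name, vec in BASIS.items():
    color = tuple(round((c + 1.0) / 2.0, 6) for c in vec)
    print(name, color)
# X/red (0.908248, 0.5, 0.788675)
# Y/green (0.295876, 0.853553, 0.788675)
# Z/blue (0.295876, 0.146447, 0.788675)
```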

    So basically, create 3 Vray materials, named Red, Green and Blue.
    Assign a VrayNormalMap in the bump channel (set its amount to 100). Use the same map channel as your future normal/diffuse map, so basically 1. In the VrayNormalMap, create a VrayColor and fill it with the values above.
    For example, fill the Red VrayColor with:
    red : 0.908248
    green : 0.5
    blue : 0.788675
    Then do the 3 bakes:
    Apply the Red material, do your render to texture.
    Apply the Green material... and so on.

    Then, ShaderFX has a radiosity normal map node. So my SFX graph looks like this:
    rnmyzy.jpg
    There are 2 multiplies and 1 add because I use modulate 2x, but you can simply use RadiosityNormalMap * Diffuse.
    Note: the RNM node uses a regular normal map, not an ssbump map. It does the conversion itself.
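    For reference, the core lighting math an RNM node performs can be sketched like this, following the Valve paper: dot the tangent-space normal with each basis vector, clamp, and use the results to weight the three lightmaps. Whether ShaderFX also renormalizes the weights (as done here) or squares them is an assumption on my part:

```python
import math

# Valve RNM tangent-space basis (same vectors as in the post above).
BASIS = (
    ( math.sqrt(2.0 / 3.0),  0.0,                  1.0 / math.sqrt(3.0)),
    (-1.0 / math.sqrt(6.0),  1.0 / math.sqrt(2.0), 1.0 / math.sqrt(3.0)),
    (-1.0 / math.sqrt(6.0), -1.0 / math.sqrt(2.0), 1.0 / math.sqrt(3.0)),
)

def rnm_lighting(normal, lm_x, lm_y, lm_z):
    """Blend three RGB lightmap texels by the normal's clamped dot products."""
    weights = [max(0.0, sum(n * b for n, b in zip(normal, basis)))
               for basis in BASIS]
    total = sum(weights) or 1.0       # guard against degenerate normals
    weights = [w / total for w in weights]  # renormalize so they sum to 1
    maps = (lm_x, lm_y, lm_z)
    return tuple(sum(weights[i] * maps[i][ch] for i in range(3))
                 for ch in range(3))

# A flat normal (0, 0, 1) weights the three lightmaps equally.
print(rnm_lighting((0.0, 0.0, 1.0), (0.9, 0.0, 0.0),
                   (0.0, 0.9, 0.0), (0.0, 0.0, 0.9)))
```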

    ALSO, the tangent basis is tied to the UVs. If you ever rotate your channel 1 UVs afterwards, you have to regenerate the lightmaps because the basis has changed!

    And tadaaa, a shitty test with 1 red and 1 blue light.
    So, the result in Max: it's only lightmapped, with 3 64x64 lightmaps, 1 normal map and 1 diffuse. No dynamic light.
    rnm2.jpg

    Same with a flat normal map:

    rnmno.jpg


    Have to add specular now !

    Now, this can be optimized (1 texture with only the lighting directions in each channel, and 1 texture/vertex color for the light color), and I'm not sure that's the way UDK handles it. Still, it's a first step.
  • pailhead
    Hi, what kind of output are you using in the ShaderFX grab above? If I understand correctly, whatever your lighting is when you are baking with the normal-override materials, you should set all your lights to white? I figure simple desaturation could work too, but is that what is happening; should the above tree be combined with a color texture on top of everything?
  • Noors
    Mmh, no, the 3 lightmaps carry the color information. I was thinking of a way to optimize this: instead of 3 RGB maps, use 1 RGB map for the color and 1 RGB map with 1 "desaturated" lightmap in each channel.
    But someone here with the highest skills (Oscar, I think?) told me they tried that on Mirror's Edge and it wasn't looking as good as 3 RGB maps.

    I didn't push it any further, but the ShaderFX radiosity node is meant to be used with 3 RGB maps. You could probably use desaturated lightmaps and add a color node on top, though.

    Most engines have advanced baking tools so you don't have to deal with all this (for artists' sanity :) )
  • pailhead
    Right now I'm struggling to actually output the lightmaps. I've been using the 2.2 gamma correction as a way of pumping more light into my renderings for half a decade, but I don't really work in linear; as in, other than rendering in linear, all my output was always in 2.2.

    I can't, for the love of me, output what I want when I have gamma correction off in Max. I had to invert the exr twice to get the right result.

    I just realized that I asked kind of a dumb question, as the lightmaps definitely showed up with color. But I was under the impression that something was wrong with the normals until I color-corrected it. It was late last night when I was experimenting :)

    http://wiki.polycount.com/RadiosityNormalMap
    All three RGB channels are used to store the light vectors, so the light maps could not also store the colors of the lights. Apparently they used vertex color to store the light colors.

    Does this refer to a single RGB lightmap, not three inputs as ShaderFX handles it?

    I was even able to color-correct my lightmaps this way, shift some hues etc., but I kept the value of each pixel at what came out of the bake. Maybe that's how it samples the 3 channels it needs for the direction, HSL or something. If desaturating these and then adding the color info as you've said, would you render a 4th map for the color information, with a regular material instead of the normal override?

    I'm trying to understand what exactly is happening in the whole process. I definitely have it working, but I'm not sure if it's working right.
  • Noors
    Yeah, again, I'm only an artist with light knowledge of the technical stuff.

    Gamma correction is also a different issue. In a linear workflow, the light calculation is done in gamma 1.0 with diffuse textures at gamma 1.0, and only at the end is the gamma 2.2 applied.

    I do think the lightmaps shouldn't be gamma-corrected. Only the final buffer should be.
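    A tiny sketch of why that matters, using the simple pow-2.2 approximation of the sRGB curve (not Vray's actual implementation): math done on gamma-encoded values drifts away from the same math done in linear space.

```python
# Linear workflow in miniature: decode to linear, do the math, re-encode
# only at the very end for display.

def to_linear(c, gamma=2.2):
    return c ** gamma

def to_display(c, gamma=2.2):
    return c ** (1.0 / gamma)

# Averaging two texels in linear space, then encoding the result, does not
# match naively averaging the already gamma-encoded values.
a, b = 0.2, 0.8
correct = to_display((to_linear(a) + to_linear(b)) / 2.0)  # ~0.596
naive = (a + b) / 2.0                                      # 0.5
print(round(correct, 3), round(naive, 3))
```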

    Yes, the wiki refers to a single RGB map with light intensity, the light color information being stored in vertex color. At least that's what we assumed looking at the picture.
    Yes, that would require baking the color, but I'm not sure how (or extracting it from the lightmaps).
    The map shouldn't contain any grey shading, only pure color.
  • o2car
    Noors wrote: »
    mmh no, the 3 lightmaps have the color information. [...] ShaderFx radiosity node is meant to be used with 3 rgb maps. You could probably use desaturated lightmaps and add a color node on top of it tho. [...]

    That's right.
    I strongly believe you lose some color contrast using the "monochrome lightmaps", at least if your resolution is low (you only have one color per texel).

    Here is an example of a simple viewport shader in Maya, baked with Turtle using 3 RGB lightmaps.
  • kaz2057
    I'm reopening this discussion to find help with RNM using Vray.

    I'd rather replicate the Beast/Turtle DirLightmaps, using the compressed version.
    The important part is to render each of the 3 lightmaps and then copy them into a single DirLightmap (X vector in the R channel, Y vector in the G channel, Z vector in the B channel).
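    That channel-packing step could be sketched like this in Python (a hypothetical illustration; nested lists of per-texel intensities stand in for bitmaps):

```python
# Merge three single-channel lightmaps into one RGB "DirLightmap",
# texel by texel: X -> R, Y -> G, Z -> B.

def pack_dir_lightmap(lm_x, lm_y, lm_z):
    return [[(x, y, z) for x, y, z in zip(rx, ry, rz)]
            for rx, ry, rz in zip(lm_x, lm_y, lm_z)]

# 2x2 example maps.
lm_x = [[0.9, 0.1], [0.5, 0.0]]
lm_y = [[0.2, 0.8], [0.5, 1.0]]
lm_z = [[0.0, 0.3], [0.5, 0.7]]
packed = pack_dir_lightmap(lm_x, lm_y, lm_z)
print(packed[0][0])  # (0.9, 0.2, 0.0)
```

    For this to work, the three bakes must share the exact same UV layout, resolution and edge padding; otherwise the channels end up offset relative to each other.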

    But I found a big problem... my blue lightmap is offset when I merge the 3 lightmaps together. Can anybody help me?
  • kaz2057
    Does anybody know if it's possible to render just one directional lightmap, including all three vectors, in 1 baked map?

    In Vray, rendering the whole scene 3 times takes too long...
  • Noors
    You could save the irradiance map and light cache to file and re-use them, as they shouldn't change between renders. That would speed up the rendering, but no, you have to render the scene 3 times, since you have to change the normals.
  • kaz2057
    Dear Noors,

    I hope for your support with a problem that is causing me nightmares.

    In your tutorial, you advise using channel 1 in the VrayNormalMap node.

    But I have a problem with overlapping.

    In UV channel 1 on my geometry I have a lot of overlaps. When I render my RNM_x there are a lot of errors...

    But I can fix it, for example on a plane with many subdivisions creating internal squares, by making a unique fitted planar projection without overlapping.

    I see that when I import my geometry into Unity, using real-time lighting, I get the same error. It seems like a tangent error...

    Any advice?
  • Noors
    You assign the normal map to channel 1, but you render the lightmap on a second channel where you have no overlapping.
    I'm not sure I follow what you're testing in Unity.
  • kaz2057
    Dear Noors,

    thanks for the quick support.

    However, I encountered the problem before the bake (I bake on UV channel 2 and use VrayNormalMap on channel 1). I get the error in a simple render.
    If you can help me: I've uploaded my beta scene.

    https://mega.co.nz/#!QYQB0AQa!NqkvZCxQvm8Cav_QC2BzpW9rVdlEYL0cT7ndyltYvkg

    In the max scene, you will find the RNM material assigned in the Vray render override slot.
    I use Vray 2.30.1.

    Try to open the scene and click render... you can see my tangent-space error...
    If you just add a planar, fitted UVW Map to the plane, the error is fixed...

    Please help me :)
  • Noors
    I can't open the scene (it's newer than 2012). I'm not sure what you want to achieve by rendering the scene; post a screenshot instead. But considering the altered normals, you will probably get a strange result with a direct render anyway. There's really no point in doing it, imo.
    Set up your lights and render options with no VrayNormalMap, then just bake 3 times with the 3 VrayNormalMaps.
    I honestly think the process is a pain, considering that decent editors handle it relatively transparently for the user (Unity included), and Vray isn't really designed for this, but I guess you have your reasons.
  • kaz2057
    Dear Noors,

    after some tests, my problem also appears in the Unity lightmap baker (Beast); in fact it is a TANGENT problem with the geometry.
    As soon as possible I'll re-upload my max file (2012 or earlier) and I hope you have time to help me.
  • kaz2057
    Dear Noors,

    I fixed my tangent problem with the geometry.

    It was a rotation problem with the UV islands. I needed to rotate each UV island to the same orientation, so the tangent is the same for each face of the geometry :)
  • kaz2057
    I have just another question...
    Your tip to save the IR and LC maps before baking is very good.

    Can I also use these maps when I bake the VrayRawLightmapMap?

    The correct pipeline would be to render, from a Vray camera (in my scene I use a Vray camera), a test render at the maximum square resolution (like 2048x2048, or similar to the render-to-texture size), and then when I bake, choose "load from file" for the IR and LC.

    I think the Vray camera FOV needs to include all the objects in the scene to cast correct ray samples...

    Is that correct?

    Thanks
  • Noors
    It's weird that UV rotation causes issues. It does change the tangent/binormal pair, but your bake should take that into account.

    Ah, I think the VrayRawLighting map is just the direct lighting; it doesn't use the GI from the IR and LC. But both (GI + direct) are calculated when you perform a GI rendering, so just output both.

    As for generating the IR and LC: indeed, I'm not sure render to texture generates the maps? Otherwise, you'll probably need to animate the camera over a few frames, placing it in different spots so samples are calculated everywhere and not only from a single point of view, and choose multiframe incremental or fly-through mode.
    This is what is used for rendering animations.
  • kaz2057
    Noors wrote: »
    [...] As for generating the IR and LC, indeed, i'm not sure the render to texture generate the maps ? Else you'll probably need to animate the camera on a few frames [...] And chose multiframe incremental mode of fly through.

    Sorry, I typed it wrong; I was talking about the VrayRawTotalLightingMap.

    However, you confirm that multiframe incremental mode is not the right way to bake lightmaps?!

    Then, if you can help me, I have another question.

    Because I bake lightmaps with Vray, I use a linear workflow (LWF) with gamma 2.2.

    When you set the color for the RNM, do you modify the value on the LWF color scale?

    I use a 32x32 texture colored in Photoshop instead of the VrayColor node.

    Do you think I need to override the bitmap gamma to 1.0? (I use input gamma 2.2 in Preferences.)

    Thanks
  • Noors
    I've never used multiframe, as I relaunch the computation every time. I was just guessing at a way to optimize that by storing the light cache. But maybe the different normals affect the light cache, so it has to be computed each time.

    Yeah, if you're talking about the color you put in the bump slot, you have to override its gamma to 1 so Max doesn't correct it. Now my head aches. Just check with a real normal map; you'll see if it's off or not.

    Now, I think an easier pipeline would be to generate the lightmaps in UDK or Unity. You'd get direct feedback. It's such a pain to make lightmaps in Max.