
Displacement + Normal or just Displacement?

Yerus
polycounter lvl 4
I have a 6-UDIM UV set of a humanoid. I've baked the pores into normal maps and kept just the overall shape in the displacement maps (2048x2048).
As is well known, normal maps don't carry a great performance cost, but they are six more maps to be loaded, whereas using displacement alone (with the pores baked into it) adds cost through dynamic subdivision as it gets more detailed.
So, which workflow do you deem best?
Thanks!
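For a rough sense of the memory side of the trade-off, here is a back-of-the-envelope sketch (the formats are assumptions for illustration: uncompressed 8-bit RGBA normal maps versus 16-bit single-channel displacement maps; real renderers may compress, tile, or mip these):

```python
# Hypothetical memory estimate for 6 UDIM tiles at 2048x2048,
# assuming uncompressed textures (an illustration, not renderer-accurate).
def texture_bytes(width, height, channels, bytes_per_channel):
    """Uncompressed in-memory size of one texture."""
    return width * height * channels * bytes_per_channel

UDIMS = 6
normal_maps = UDIMS * texture_bytes(2048, 2048, 4, 1)  # RGBA, 8-bit
disp_maps   = UDIMS * texture_bytes(2048, 2048, 1, 2)  # grayscale, 16-bit

print(f"6 normal maps:       {normal_maps / 2**20:.0f} MiB")  # 96 MiB
print(f"6 displacement maps: {disp_maps / 2**20:.0f} MiB")    # 48 MiB
```

The point is only that six extra 2K normal maps are on the order of tens of megabytes uncompressed, which is usually small next to the subdivision cost displacement adds at render time.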

Replies

  • gnoop
    gnoop polycounter
    In games, displacement typically just displaces vertices and does nothing to the normal vectors, so you need a normal map to see the shading detail.
    In other words, without a normal map you see your displacement only in silhouettes/profiles, and none of your pores at all. Besides, displacement is usually not very high-res in games, due to the lack of vertices available for displacing.


    In offline renderers, the displacement shader recalculates per-pixel normals on the fly, so you don't need a normal map. But that takes more shader calculations at render time than reading an already pre-baked normal map.
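The vertex-shift behaviour described above can be sketched as a toy function (an assumption-laden illustration, not any engine's actual code): each vertex is pushed along its stored normal by the sampled height, while the shading normal is left untouched, which is why the pore detail still needs a normal map.

```python
# Game-style displacement sketch: move a vertex along its pre-stored
# normal by the sampled height. The normal is NOT recomputed, so the
# shading (and thus any pore detail) is unchanged by the shift.
def displace(position, normal, height, scale=1.0):
    """Shift a vertex along its stored normal; shading normal untouched."""
    return tuple(p + n * height * scale for p, n in zip(position, normal))

v_pos    = (0.0, 0.0, 0.0)
v_normal = (0.0, 1.0, 0.0)   # per-vertex normal stored in the mesh
new_pos  = displace(v_pos, v_normal, height=0.25)
print(new_pos)  # (0.0, 0.25, 0.0) -- moved, but still shaded with v_normal
```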


  • Eric Chadwick
    I think each sentence you wrote there is fundamentally incorrect.

    Most games don't use displacement at all, because the displacement maps must be high bit depth and uncompressed to work properly. That eats up a ton of precious memory; it's generally cheaper just to use a lot of vertices up front with no displacement map.

    If a game uses displacement, they tessellate the mesh. Not sure about the new normals, though I'd bet they're recalculated.

    Offline renderers don't convert displacement into "per pixel normals"; they tessellate the mesh to varying amounts depending on the underlying mesh, your settings, camera distance, etc.

    Displacement is avoided as often as possible in renders because it's slow. An additional bump map helps reduce render time, so it's used a lot. We only displace when significant silhouette or parallax detail is needed; it's for fairly large-scale "bumpage". Bump maps then handle the smaller-scale bumps.

    There are no "shader real time calculations" in an offline renderer.

    If you're making a portfolio piece, don't rely on displacement. Model in the detail needed, and bake a normal map for the medium-range shading.

    If you need pores for a game mesh, that means a separate LOD for cinematic close-ups only, at which point you're likely using a high-res mesh, and high-res normal maps.
  • Yerus
    Yerus polycounter lvl 4
    Oh! I see, but I forgot to mention: it's for DCC rendering, not real-time! I have decent but limited resources. I thought of combining normal maps with displacement to save some performance, but that's six more maps to load. Is it worth it?
  • Eric Chadwick
    Too many variables, and no visuals, to offer an informed opinion. Try it and see. 
  • gnoop
    gnoop polycounter

    Eric Chadwick wrote:
    > If a game uses displacement, they tessellate the mesh. Not sure about the new normals, though I'd bet they're recalculated.
    >
    > Offline renderers don't convert displacement into "per pixel normals"; they tessellate the mesh to varying amounts depending on the underlying mesh, your settings, camera distance, etc.
    >
    > There are no "shader real time calculations" in an offline renderer.


    Well, I really shouldn't have said "in games typically", since I haven't worked on that many games. So maybe it's not like that everywhere.

    But as far as I know, normals are not recalculated (beyond the usual per-pixel tangent-to-world-space conversion of tangent normal maps). Game displacement just shifts vertices along the vertex normals stored in the mesh. New vertices get interpolated normals BEFORE the shift.
    It works pretty much the same in Substance Designer, where you see no shading difference (or any pores) on a displaced mesh without a normal map.

    Also, game displacement works pretty well with 8-bit height stored in the alpha channel. Compression is an issue for sure, as with every RGBA texture.

    If the tessellated mesh got new vertex normals based on its new shape after the initial shift, the normal map would instantly become irrelevant and geometry degradation wouldn't work properly.


    As for offline renderers, I just meant that they do more calculations on the fly, computing per-pixel normals from the grayscale displacement image instead of picking them from a separate normal map. So a normal map is not needed. And they do need high bit depth for proper per-pixel normal calculation in smooth gradients and such.
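The per-pixel-normal idea above can be illustrated with a toy finite-difference sketch (assumed math for illustration, not any renderer's code). With only 8-bit heights the gradient becomes heavily quantized, which is the reason offline displacement prefers 16/32-bit maps for smooth gradients:

```python
# Derive a tangent-space normal from a grayscale height field using
# central differences. This mimics, very loosely, how a renderer can
# shade displacement without a separate normal map.
def normal_from_height(h, x, y, strength=1.0):
    """Central-difference gradient -> unit tangent-space normal."""
    dx = (h[y][x + 1] - h[y][x - 1]) * 0.5 * strength
    dy = (h[y + 1][x] - h[y - 1][x]) * 0.5 * strength
    length = (dx * dx + dy * dy + 1.0) ** 0.5
    return (-dx / length, -dy / length, 1.0 / length)

# A plane sloping up along x: height = 0.1 * x
height = [[0.1 * x for x in range(5)] for _ in range(5)]
print(normal_from_height(height, 2, 2))  # tilts away from +x, z dominant
```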



  • oglu
    oglu polycount lvl 666
    If it's not for real-time, why are you using only 2K maps? Could you show some images of what you are doing? To get pore detail in displacement you need a lot of texture resolution.
  • Yerus
    Yerus polycounter lvl 4
    4K maps and up get too expensive; I have limited hardware and lots of physics going on. I'm also using hair cards with cloth physics applied, which is more expensive than dynamic curves but more accurate with collisions.
    The texel density is OK, I think; I've done a good job squeezing those UVs in.