
Managing consistent texel density

arcitek polycounter lvl 2

I have lately been trying to better understand texel density within a modular environment, because of a project I am working on where the light bakes in Unreal are slightly off between adjacent surfaces.

There have been some really good threads on here, and I think I am getting close, but I am still trying to reach a full understanding.

I created a simple box object in 3ds Max and did a UV unwrap. In the attachment, you can see I set up the unwrap using a checker texture that covers 400 cm square with 2048 pixels. In the UV options, I set up the grid accordingly.

From reading threads here, I downloaded a basic texel density script that is meant to help manage the TD of objects, but I can't seem to get it to work. You can see how I unwrapped the basic box object, but I cannot get the texel density to match the 2048 image. My understanding is that if I unwrap 50 unique objects, each with its own UV set, and the TD of each one matches the 2048 TD, I should be good to go with my light bakes in Unreal. As a side note, I am using a second UV channel for the light baking.
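To make sure I am checking the right numbers, here is the arithmetic as I understand it (a small Python sketch; the face size and UV span below are made-up examples, not from my actual scene):

```python
# Target texel density from the checker setup: a 2048 px texture
# mapped to 400 cm in world space.
texture_px = 2048
world_cm = 400.0
target_td = texture_px / world_cm            # 5.12 px/cm (512 px/m)

# Checking one unwrapped face: its measured density is the UV span of
# the face times the texture size, divided by its world size.
face_world_cm = 250.0                        # hypothetical face size
face_uv_span = 0.625                         # hypothetical UV extent
measured_td = face_uv_span * texture_px / face_world_cm
print(measured_td == target_td)              # True when the face matches
```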

Anyway, can anyone help me out or point me to another resource that explains this well? I know I can get third-party scripts and plugins, but as always, I like to learn by using what is available in the program I am using.


Replies

  • Fabi_G insane polycounter

    Hi! From reading the text, I got the impression this is about static lighting in general, less about texel density? Of course a sufficient lightmap density is part of it, but there are more aspects. Some screenshots of the "slightly off" light bakes would be nice for context.

    I learned a lot about Lightmass by watching this video from a technical artist at Epic, following along with archviz courses on Unreal Academy (like this one), and looking at a sample project by the artist "koola". This was a few years back; surely there are newer resources out there today.

    Some points I remember:

    • Adjusting scene scale and quality parameters can help remove differences in baked lighting values. Obviously there are also differences between preview and production light bakes.
    • A Lightmass Importance Volume is necessary for controlling the number of bounces and the volumetric lightmap.
    • Lightmap resolution can be overridden on a per-object basis for more detailed shadows in specific areas (see the sketch after this list).
    • Test bake modules in a small scene first to reduce baking time.
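    For example, the per-object override can also be scripted through Unreal's editor Python API. A minimal sketch - the property names follow Unreal's bindings for bOverrideLightmapRes / OverriddenLightMapRes, but verify them against your engine version:

    ```python
    import unreal

    # Override the lightmap resolution on every selected static mesh actor
    # so specific areas get more detailed baked shadows.
    for actor in unreal.EditorLevelLibrary.get_selected_level_actors():
        comp = actor.get_component_by_class(unreal.StaticMeshComponent)
        if comp:
            comp.set_editor_property("override_lightmap_res", True)
            comp.set_editor_property("overridden_light_map_res", 256)
    ```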

    In the context of a larger project, these would be questions for a tech/lighting artist.

    Much success!

  • killnpc polycounter

    i'm seeing a lot of emphasis being placed on texel density. this must be a popular cred gatekeep, as if there's some texel density problem. it would need to be pretty damn egregious in order to be noticeable or to impact performance. i assume this is more a concern for VR assets: large textures / extreme close-ups. it makes things clean to check, but i don't think it's really all that important to specify in a task. but if the cool kids driving the standards wagon say it's important, best placate the nerds; those cool kids can cause a fuss.

    anyway, essentially there's a mild restriction on how much control you have over texel density. it's pretty anchored to how well the UVs pack and the texture size. adjusting UVs to target a texel density by reorganizing sh*t to mirror, "squanching" an edge in, or splitting elements out into their own separate textures to increase texel density seems to me like a pretty damn wasteful use of production time. a checker pattern does the job imo, no need to break out the damn calipers here.

  • poopipe grand marshal polycounter

    It's not gatekeeping. It's a technical requirement defined for a number of very good reasons that you have clearly failed to grasp.


    I honestly can't be bothered picking your post apart, so I'll ask you one simple question.

    If it's not a good idea, why did everyone independently arrive at the same solution?

  • Alex_J grand marshal polycounter

    @poopipe

    What killnpc explains makes sense to me. In the sort of 3D work I've done, doing anything more than eyeballing for consistent density would seem like a total waste of time.

    I've seen you explain the reasoning for TD requirements before, though I don't remember the exact details because it wasn't a concern for me at the time. But the gist I got was that on a large team (presumably, like, the largest AAA teams?) it becomes important because you don't have easy, direct communication with every member, you are pushing the software to its limits, and you can't blindly trust every member to know the general scope of the project in every department.

    In other words, you need to be able to give the grunts a hard metric to follow, and this is all to avoid problems endemic to large teams.

    Is that right?

  • poopipe grand marshal polycounter

    That stuff covers the importance in terms of collaboration. I wouldn't say huge teams only - small teams benefit from efficiency too. If you only ever work on your own and everything you make is in total isolation, then it's all irrelevant.

    Also tbf you might just not give a shit - which is perfectly valid if you're in charge of the project.


    The other part is resource usage. You should be aiming to set texel density to a value that gives you the smallest texture size required to get the fidelity you need on screen at your closest view distance. This is good for memory and is easy to understand.
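    As a rough illustration of that sizing logic (a toy sketch; the formula is basic projection math, and the numbers are made up):

    ```python
    import math

    # Estimate how many screen pixels a surface covers at the closest view
    # distance, then pick the smallest power-of-two texture that matches.
    def required_texture_px(surface_m, distance_m, h_fov_deg, screen_px):
        visible_m = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
        covered_px = surface_m / visible_m * screen_px
        return 2 ** math.ceil(math.log2(covered_px))

    # A 1 m wide surface, viewed no closer than 2 m, 90 degree FOV, 1080p:
    print(required_texture_px(1.0, 2.0, 90.0, 1920))   # 512
    ```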


    There are some more esoteric issues around runtime resource usage as well, though.

    I'm struggling to find a single source that explains how it works, and tbh I'm not clever enough to explain it with actual confidence, cos I'm a tech artist, not a render engineer.

    But ..

    mip selection is based on the visible portions of a texture and the rate of change across their related UVs - it's not just 'object far, use smaller texture'

    If the camera is pointing at an area of particularly low texel density on an object, then it's reasonable to assume it will try to load a higher mip than it would when looking at an area of higher density. The point of the exercise is to match screen pixels to texels as closely as possible, after all.
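    Roughly, the selection follows the UV derivatives across a screen pixel - this is the textbook formula, simplified, not any particular engine's actual code:

    ```python
    import math

    # Simplified mip selection: the chosen level follows how fast the
    # texture coordinates change across one screen pixel.
    def mip_level(du_dx, dv_dx, du_dy, dv_dy, texture_px):
        dx = math.hypot(du_dx, dv_dx) * texture_px   # texel footprint in x
        dy = math.hypot(du_dy, dv_dy) * texture_px   # texel footprint in y
        return max(0.0, math.log2(max(dx, dy)))

    # Lower texel density -> smaller UV derivatives -> lower mip index,
    # i.e. a higher-resolution mip stays resident.
    print(mip_level(0.001, 0.0, 0.0, 0.001, 2048))   # ~1.0
    print(mip_level(0.004, 0.0, 0.0, 0.004, 2048))   # ~3.0
    ```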

    mip0 is four times the memory footprint of mip1, so in a streaming situation you have just forced the system to render three other textures at half resolution to compensate for the one you forced to render at double.
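    The memory arithmetic behind that (each mip halves both dimensions, so each level is a quarter of the previous one's footprint; uncompressed RGBA8 assumed purely for easy numbers):

    ```python
    # Footprint of a mip level for a square texture at 4 bytes per texel.
    def mip_bytes(size_px, level, bytes_per_texel=4):
        s = size_px >> level
        return s * s * bytes_per_texel

    print(mip_bytes(2048, 0))   # 16,777,216 bytes (16 MiB)
    print(mip_bytes(2048, 1))   #  4,194,304 bytes (4 MiB) -> mip0 = 4x mip1
    ```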


    I'll add a disclaimer here in case it isn't obvious.

    It is not this simple; you would not be able to measure such behaviour with some boxes and a copy of Unreal, and as always, I may have misinterpreted the code or be unaware of new developments.

    If I'm wrong, please show me what to read so I can be right

  • Alex_J grand marshal polycounter

    @poopipe

    I really just wanted to get an explanation showing why it is not gatekeeping :) I don't doubt that you know how to do your job at all, lol.

    I think what you've described falls under a case where games beyond a certain production budget (e.g. big enough to warrant a full-time tech artist) necessitate this degree of resource management, whereas if a developer like myself were thinking about texel density to that degree in my game, I'd never finish. Too many other, larger concerns. And without a knowledgeable tech artist, I'd probably just be doing stupid shit anyway - thinking I understood.

    But if the OP is trying to get a job at a place like where poopipe works, I don't think learning this stuff would be gatekeeping. Probably a lot of people who want to sound smart but don't actually know will spread misinformation, the same way as the old "n-gons are bad" myth - but that doesn't mean the original premise is wrong.

  • poopipe grand marshal polycounter

    hah! - misunderstood.

    Gatekeeping doesn't even come into it. Many (most) studios will see this as part of the required skillset.


    Encouraging newcomers to develop required skills is constructive; telling them it's a waste of time is destructive.

    In this case, it's simply a good thing to understand because it will make your own life easier - even if you can't see it now, you'll find out why eventually.



    @arcitek

    When working with texel density, it really helps to ignore the number of pixels (and, in fact, texel density numbers). Instead, think in terms of the world size of your box and the world size of a single tile of the texture.

    How big is the box in that example? If it's a 1 m cube, each face should obviously span a quarter of the 0-1 range of the UV editor on each axis, right? (Numbers below.)
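    In numbers (a quick sketch; the 4 m tile here matches the 400 cm checker described earlier):

    ```python
    # World-size reasoning: how much of the 0-1 UV range one face should span.
    tile_world_m = 4.0    # one repeat of the texture covers 4 m (400 cm)
    face_world_m = 1.0    # one face of a 1 m cube

    uv_span = face_world_m / tile_world_m   # 0.25 -> a quarter of 0-1 per axis
    print(uv_span)
    ```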

    I'm assuming you understand all that, so let's look at the issue (assuming I understand what the problem is).


    You're in 3ds Max, so a discrepancy in the reported texel density when you press GET may be due to UV vertices being offset in the W axis - this is quite common when using the UVWMap modifier and is a "feature" of plenty of other tools.

    To ensure this is not happening, switch the UV editor to UW or VW mode and check whether any vertices are out of line with the others - if they are, select everything, set the W scale to 0, and test the density again (a scripted version is sketched below).
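    A scripted version of that fix via 3ds Max's Python (pymxs) - a sketch that assumes an Unwrap_UVW modifier is the currently selected modifier in the stack:

    ```python
    from pymxs import runtime as rt

    # Zero the W component of every UV vertex on the current Unwrap_UVW
    # modifier so stray W offsets stop skewing the reported density.
    unwrap = rt.modPanel.getCurrentObject()   # expects an Unwrap_UVW modifier
    t = rt.currentTime
    for i in range(1, unwrap.numberVertices() + 1):
        p = unwrap.getVertexPosition(t, i)
        unwrap.setVertexPosition(t, i, rt.Point3(p.x, p.y, 0.0))
    ```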

    A discrepancy caused by this issue will always be an increase over the actual texel density, so there is something else going on here. In this case, the box is probably more than 1x1x1 m and you scaled the UVs down to fit the texture page - don't do that if you're trying to match a specific density. Instead, try to pack the faces into the space you have available, and if that fails, use a larger texture or split it into two.

  • arcitek polycounter lvl 2

    I did not realize there were responses to my original post, but now I see there were several. There are some good points here, but I guess I should clarify why I asked for help. My understanding of TD is that if I want my materials to be at the same scale (resolution) across multiple objects in a scene, I need to make sure my TD is defined and handled correctly. TD was also brought up in relation to creating modular wall assets and having the materials and light bakes look consistent between adjacent modules.

    Since this post, I found a way to hit the intended target TD in 3ds Max by using a TD script that is almost identical to the tool found in Maya. As for getting better results on the adjacent floor planes, I was able to improve things and minimize the seams by changing some Lightmass settings in World Settings.

  • gnoop sublime tool

    I have only worked on racing titles, and for the last couple of decades on just a single one, actually. And no, consistent texel density is not a requirement there. A car cockpit has a smaller texel size than the car exterior, then the track surface, then the surrounding terrain and trackside objects.

    I recall we once had a calculator for how much smaller the texel size should get depending on the distance from the main road. And UV efficiency has always been a priority - nowadays too. Otherwise we would never have anything hi-res enough up close, and everything would repeat like crazy.

    So honestly, I am still a bit surprised why consistent texel density suddenly became a big deal, maybe 10 years ago. I myself try to keep texel size more or less close, but never perfectly the same, for a number of reasons - mostly the nature of a material and how well it repeats across open spaces and big objects. I very much doubt our players would ever notice the subtle inconsistency.

    My guess is that it's genre-specific or something, but I am not sure I really understand why it's so important in so-called modular approaches.

    And by the way - texel density of what layer? Our objects, and the environment in general, have tileable textures plus macro or unique textures like edge wear and dents. Almost every surface has texture layers with slightly different texel sizes depending on context: regular and macro normal maps, for instance. Many objects are too big for UDIMs, like huge rock walls laser-scanned from an actual place. You often have to use slightly different texel densities to better hide visually repeating patterns. Some extra rock geometry over those huge rocks uses unique textures and also varies subtly in scale, so it is never perfectly the same texel size either. For bricks, yes, they need the same texel size. Dirt, rocks, or a heavily weathered plaster wall - not at all.

    No problems in production. So I am still a bit puzzled.

  • killnpc polycounter

    Sorry for not replying sooner - I didn't see your comment until now and wish I had; I didn't mean to leave you stewing.

    The tone of your response makes it seem as though I've very much missed the mark on something here worth learning more about. I hope to find the time to learn more about it soon.

    Based on the comments of yours I've read on other topics, you've displayed a good grasp of a great many aspects of game dev. So, my apologies - I think my stance came from a disconnect from where this trend originates and what its deeper advantages are.

    Anyway, I think your response here is valid, especially since my comment might have deterred someone else from a useful path.

    Cheers.

  • poopipe grand marshal polycounter

    @killnpc np - also sometimes i sound grumpy :D


    @gnoop

    consistent does not mean constant. a racing game is a perfect example of where texel density rules are important: trackside objects need more because they're closer, background objects get less. you set standards for the various types of object based on their expected distance from the camera and off you go (a toy version of the calculator you mention is sketched below).
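    something like this - a toy sketch with made-up numbers, not any studio's actual standard:

    ```python
    # Toy density standard: scale a reference texel density by expected view
    # distance, clamped so far objects still get a minimum amount of detail.
    REFERENCE_TD = 512.0     # px per metre at the reference distance
    REFERENCE_DIST = 2.0     # metres, e.g. cockpit / nearest track surface

    def tier_td(distance_m):
        td = REFERENCE_TD * REFERENCE_DIST / max(distance_m, REFERENCE_DIST)
        return max(32.0, td)

    for d in (2, 10, 50, 200):
        print(d, tier_td(d))   # 512 at 2 m, 102.4 at 10 m, 32 beyond ~32 m
    ```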

    And yes, you can do all sorts of clever tricks to increase perceived density, deal with tiling, and so on, but the vast majority of these tricks are based around UV coords. If you do not use a consistent texel density on your objects, you need to change parameters on your materials to suit the UV scaling of each object they are used on. At worst this creates a complete new unique material (breaking batching and thus introducing "drawcalls"), and at best it costs memory you wouldn't need otherwise.

    There are always going to be special cases that don't fit a system - that doesn't mean the system is invalid. However, if everything you make is a special case, it's a pretty good indication you're doing it wrong.

  • gnoop sublime tool

    I usually use slightly different texel sizes on different materials. Asphalt has a slightly stretched-ahead texel size, for example (around 30% stretched in the V direction), so it can tile at longer intervals, which lets me put some irregularities into the asphalt seam line, tarred seams, small scratches, and so on. Besides repeating at a longer interval, it just looks better from a car cockpit that way, when you view the surface at a sharp, tangent angle. The grass texture has a 15-20% smaller texel size in general, because it needs more pixels to represent grass blades properly - more than asphalt grains need. Our editor allows us to do this easily by material kind, by TGA, by surface physics properties, etc. And I highly doubt anyone ever noticed the difference. At more than twice the difference, I agree, it becomes eye-catching and bad-looking.

    Same for uniquely unwrapped things: I see no reason to allocate another half-empty texture page if something could be squeezed into the first one. Nobody really notices, IMO.

    Could you please explain more about how it could create more draw calls?

  • poopipe grand marshal polycounter

    This is engine-dependent, so mileage will vary, but simply put...

    altering any parameter of a material - e.g. a float, a texture, etc. - can potentially cause the creation of a new material at runtime, which is a "drawcall"


    most engines have mechanisms to avoid this happening all the time, but you can't solve the problem entirely.
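    In Unreal terms, the usual per-object pattern looks like this (a sketch; "UVTiling" is a hypothetical parameter name on your material, not a built-in):

    ```python
    import unreal

    # Per-object fix for inconsistent texel density: a dynamic material
    # instance carrying its own tiling value. Every unique instance is extra
    # render state to switch between, which is what breaks batching.
    def fix_tiling(component, tiling):
        mid = component.create_dynamic_material_instance(0)   # element 0
        mid.set_scalar_parameter_value("UVTiling", tiling)    # hypothetical
        return mid
    ```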

  • Klunk ngon master

    "So honestly, I am still a bit surprised why consistent texel density suddenly became a big deal, maybe 10 years ago..."

    it was important 25 years ago too, possibly more so: small, overly stretched textures next to overly tiled textures on very low poly counts were a recipe for disaster / a right bloody eyesore!

    stuff like this was not uncommon... [image attachment]

    ...when it should look like... [image attachment]


    hasn't anyone produced a script for this, or even a shader? Set the target ratio to mid-grey; anything where real-world size is greater than UV size goes red, and vice versa goes blue... the more monochrome the view, the more uniform the density.
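    Something like this mapping, presumably (a sketch of the colour ramp only; a real tool would still have to measure actual_td per face from its UV and world areas):

    ```python
    # Map actual/target texel density onto a debug colour: mid-grey on
    # target, towards red when under-dense (stretched), towards blue when
    # over-dense (compressed).
    def lerp(a, b, t):
        return tuple(x + (y - x) * t for x, y in zip(a, b))

    GREY, RED, BLUE = (0.5, 0.5, 0.5), (1.0, 0.0, 0.0), (0.0, 0.0, 1.0)

    def td_debug_colour(actual_td, target_td):
        ratio = actual_td / target_td
        if ratio < 1.0:
            return lerp(GREY, RED, 1.0 - ratio)          # under-dense
        return lerp(GREY, BLUE, min(ratio - 1.0, 1.0))   # over-dense

    print(td_debug_colour(5.12, 5.12))   # (0.5, 0.5, 0.5) - on target
    print(td_debug_colour(2.56, 5.12))   # halfway to red
    ```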


  • gnoop sublime tool

    Nobody really did it that way. What we actually did for such a building 25 years ago was make the roof just a thin strip in UV space with roughly parallel vertical grooves (stretched-out pixels) rather than actual tiles, which would require far more pixels. By doing that we got much better detail for the facade, doors, and windows - and nobody would really see the difference. People still do it for mobile games, as far as I know.

  • poopipe grand marshal polycounter

    @Klunk - doing it as a pure shader solution is tricky, since you're only ever looking at the currently loaded mip, and that can't give you accurate texel density. You can get the rate of change across UVs in terms of screen space, but that's only really useful when the textures are the same size.


    Obviously a tool could be written to show the required information, but I suspect there'd be quite a lot of data wrangling involved - I can't see it being very snappy to operate.
