It's 2016, so what app do you use to bake your normal, displacement, AO, vertex color, cavity and other maps?
I am currently looking into a workflow that enables baking high-res maps at 32K, 64K, or even 128K.
Are you still using what you used 8 years ago, or have you switched to a more recent app to handle this task?
Replies
By 32, 64 or 128k do you mean 32,000x32,000 pixels and up? If so, what are you doing that requires such high resolution?
Substance Painter for me. So simple to set up and iterate, and be instantly ready to texture.
I'm also curious why you'd want to bake 128K? This would be where UDIMs come in. None of the game-industry-aimed apps can do it (Substance Painter's UDIMs are not truly integrated yet).
ZBrush or Mudbox can bake at that res using UDIMs.
I'm testing Toolbag 3 as well.
I second MightyBake... it's also able to bake UDIMs...
@musashidan I am guessing it would consume a lot of memory, which is why I decided to ask here. I am using a realistic renderer. This workflow isn't for game art, so I am thinking 128K for detailing a complete unit in a scene that requires it might not be that bad?
UDIM is efficient, but at a point managing the textures can be a handful, like 32 tiles for a wall facade. I wonder about Ptex. Ptex didn't take off, as it is not used in most studios' workflows. Why aren't we striving for a resolution-independent texture system rather than having to have multiple tiles and materials?
At some point, UV texturing is going to get old, especially with films and games getting more advanced graphically. There has to be something being developed to improve the texturing workflow.
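To be fair, the tile-management side of UDIM is at least easy to script. A minimal sketch of the standard UDIM numbering convention (tile number = 1001 + integer U offset + 10 × integer V offset, ten tiles per row); the 32-tile facade layout here is just an illustration:

```python
def udim_tile(u, v):
    """Standard UDIM tile number for a UV point:
    1001 + int(u) + 10 * int(v), ten tiles per row in UV space."""
    return 1001 + int(u) + 10 * int(v)

# Illustrative 32-tile wall facade: 8 tiles wide, 4 rows high.
# Sampling the center of each tile gives its UDIM number.
tiles = [udim_tile(u + 0.5, v + 0.5) for v in range(4) for u in range(8)]
print(tiles[0], tiles[7], tiles[-1])  # 1001 1008 1038
```

So renaming, reordering, or batch-exporting 32 tiles is a one-liner, even if painting across them is still a pain.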
Realtime or offline rendering, you would be much better off creating a layered material with multiple high res tiles that you blend together to transition between materials and create variation.
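The layered approach boils down to this: each material layer tiles at high frequency, and a much lower-res mask drives the blend per pixel. A toy single-channel sketch (the layer names and values are made up for illustration):

```python
def lerp(a, b, t):
    """Blend layer value a toward layer value b by mask weight t in [0, 1]."""
    return a * (1.0 - t) + b * t

# Hypothetical albedo values for two tiling layers, blended by a mask sample:
concrete, plaster, mask = 0.25, 0.75, 0.5
print(lerp(concrete, plaster, mask))  # 0.5
```

The point being that the mask can be tiny relative to the tiling detail maps, which is where the memory savings over one giant unique map come from.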
To make such a high resolution texture, you would want to do this anyway. There is very little benefit to baking it down to some crazy mega resolution unique map, and very serious drawbacks, like memory usage and general content management.
A 128K texture is 256× the area of an 8K one. An 8K texture is 192 MB uncompressed; a 128K texture is about 50 GB. That's roughly 200 GB for a simple diffuse/spec/gloss/normal set, and that's for a single material. Are you gonna build a render machine with 16 TB of RAM or something?
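Quick sanity check on those numbers, assuming uncompressed 8-bit RGB at 3 bytes per pixel (higher bit depths or extra channels only make it worse):

```python
def texture_mib(res, bytes_per_pixel=3):
    """Uncompressed size in MiB of a square res x res texture."""
    return res * res * bytes_per_pixel / 2**20

print(texture_mib(8192))                # 192.0 MiB for one 8K map
print(texture_mib(131072) / 1024)       # 48.0 GiB for one 128K map
print(texture_mib(131072) * 4 / 1024)   # 192.0 GiB for a four-map set
```

And that's before mips, which add roughly another third on top.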
I completely agree about the memory issues and the optimisations necessary to make things run smoothly on the rendering end. I guess UDIM is the way to go till some genius comes up with something better.
All of them are UV-based and baking in the end...
And Ptex is geometry-bound... no reusing... no remodeling on already-textured stuff... very dangerous...
Idk, for me, what you're trying to do sounds way more complicated lol.
If you remodel geometry already painted with Ptex, all you need to do is repaint that part. Like I said, Ptex seems tailored towards texture-painting artists. Tbh, Ptex does demand a lot of flexibility from the pipeline, so I clearly understand why it can't be used for games, especially when you need to keep the number of textures, as well as their file sizes, as low as possible.
@NoRank Lol, I know. I am here to learn. Yeah, I will be sticking to UDIM. I was just thinking of photogrammetry or scanned models and approaching that from a texturing viewpoint.
What exactly are you trying to do, anyway, that you think you need 128K of res? Even if your final output is an extreme closeup on a 4K screen, 128K would still be a bit crazy. Not to even mention the memory footprint of such a texture!!
I said maybe even 128K, and that was only if the need ever arises. I knew there would be issues if this were done, so I decided to ask whether it's even possible. xNormal's highest baking output is 32K; I was just curious why that is. Like I mentioned earlier, back to UDIM.
I know, imagine a 1-million-pixel-wide texture, the CPU might blow up... lol.