Hello. I've been hearing about this for some time now and I just don't get it. I've heard about people using Render To Texture in Max to "re-bake" their textures onto new UV layouts. Can anyone shed some light on this for me?
Basically you simply unwrap your model to one specific UV channel, texture it the way you want, then create another UV layout in, say, channel 2. In the RTT window options you choose the "from" and "to" channels. No need for a duplicate of the model or anything - everything happens within one single object. I don't have Max here atm but everything is rather self-explanatory.
Note that you can also save the UV layout to a file, which can be handy in the process.
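To make the "from channel / to channel" idea concrete, here is a minimal sketch of what that bake boils down to, written outside Max in plain NumPy. Every name and the brute-force texel loop are illustrative assumptions, not Max's actual implementation: rasterize each face in the new channel-2 layout and, for every covered texel, fetch the colour that same surface point already has through its channel-1 UVs.
[ CODE ]
# Conceptual sketch only of a channel-1 -> channel-2 re-bake; Max's Render To
# Texture adds edge padding, antialiasing and supersampling on top of this idea.
import numpy as np

def barycentric(p, a, b, c):
    """Barycentric coordinates of 2D point p in triangle (a, b, c)."""
    v0, v1, v2 = b - a, c - a, p - a
    d00, d01, d11 = v0 @ v0, v0 @ v1, v1 @ v1
    d20, d21 = v2 @ v0, v2 @ v1
    denom = d00 * d11 - d01 * d01
    if abs(denom) < 1e-12:                      # degenerate UV triangle
        return None
    v = (d11 * d20 - d01 * d21) / denom
    w = (d00 * d21 - d01 * d20) / denom
    return np.array([1.0 - v - w, v, w])

def sample_bilinear(tex, uv):
    """Bilinearly sample an (H, W, C) texture at a 0-1 UV coordinate (V up)."""
    h, w = tex.shape[:2]
    x = np.clip(uv[0] * (w - 1), 0, w - 1)
    y = np.clip((1.0 - uv[1]) * (h - 1), 0, h - 1)
    x0, y0 = int(x), int(y)
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    top = tex[y0, x0] * (1 - fx) + tex[y0, x1] * fx
    bot = tex[y1, x0] * (1 - fx) + tex[y1, x1] * fx
    return top * (1 - fy) + bot * fy

def rebake(src_tex, faces, uv_ch1, uv_ch2, out_size=512):
    """For every output texel covered by a face in the channel-2 layout, look up
    the colour that surface point already has through its channel-1 UVs."""
    out = np.zeros((out_size, out_size, src_tex.shape[2]), dtype=np.float64)
    for face in faces:                                       # face = 3 vertex indices
        tri2 = [np.asarray(uv_ch2[i], float) for i in face]  # target layout
        tri1 = [np.asarray(uv_ch1[i], float) for i in face]  # painted layout
        us, vs = [t[0] for t in tri2], [t[1] for t in tri2]
        x0, x1 = max(int(min(us) * out_size), 0), min(int(max(us) * out_size) + 1, out_size)
        y0, y1 = max(int((1 - max(vs)) * out_size), 0), min(int((1 - min(vs)) * out_size) + 1, out_size)
        for py in range(y0, y1):
            for px in range(x0, x1):
                uv = np.array([(px + 0.5) / out_size, 1.0 - (py + 0.5) / out_size])
                bary = barycentric(uv, *tri2)
                if bary is None or bary.min() < 0.0:         # texel centre outside this face
                    continue
                src_uv = bary[0] * tri1[0] + bary[1] * tri1[1] + bary[2] * tri1[2]
                out[py, px] = sample_bilinear(src_tex, src_uv)
    return out
[/ CODE ]
Channel 1 here is the layout the texture was painted for, channel 2 the layout you want to ship. The real RTT also dilates the result past the UV borders (edge padding), which is why you don't normally see background colour creeping in at the seams.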
Well, the only time I used this technique from start to finish was on my Blizz entry. I had to complete the armor very VERY fast, hence I couldn't afford to UV map each bit, arrange everything in a cool layout, and then paint the little dudes (with the risk of inappropriate UVs popping up here and there, time flying away, and a potentially incomplete texture map at the end).
Hence I simply drew each armor pad I needed straight on a temporary texture sheet, and when I had a nice pad completed I slapped it on a 3D plane and shaped it in place, then moved on to the next. Since it was all done in a rush, that made the process faster and less annoying, because I knew I wouldn't be left with everything still to paint if time ran short. That way, whatever happened, I knew I was gonna have, say, 75 to 90% of the design actually modeled AND UVed on time. When I reached something I liked enough, I stopped, arranged the UV bits I had made in time into a nice sheet, and baked everything to a better layout.
Yeah, well, not really production-friendly, but it helped. Note that you can also use this technique to fix annoying texture seams.
(edit)
Hmmm, in fact it can also be very helpful in production when you need to make assets uniform, like on MMOs where many assets share the same layout but some old ones need to be adjusted to fit a new layout system. I've used it on two projects already, and it made the person who was about to manually edit the layout in Photoshop VERY happy!
Another good use: you have a high-poly model without UVs and you texture it in ZBrush. Once you have a low-poly model with UVs and you're ready to bake your normal maps, you can bake out the diffuse to the low-poly UVs at the same time.
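For that high-poly case the lookup goes through the surface instead of through a second UV channel: each texel of the low-poly layout becomes a point and normal on the low-poly surface (rasterized just like in the sketch above), and that point is pushed along its normal until it hits the high poly. Since the high poly has no UVs, the colour comes from its per-vertex polypaint. A brute-force sketch with made-up names, no cage and no acceleration structure; a real baker uses both.
[ CODE ]
# Illustrative only: resolve one low-poly sample against a high poly that has
# per-vertex colour (ZBrush polypaint) instead of UVs. Brute force, no BVH.
import numpy as np

def ray_triangle(orig, direction, v0, v1, v2, eps=1e-9):
    """Moller-Trumbore ray/triangle test; returns (t, u, v) or None."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1 @ p
    if abs(det) < eps:
        return None
    inv = 1.0 / det
    s = orig - v0
    u = (s @ p) * inv
    if u < 0.0 or u > 1.0:
        return None
    q = np.cross(s, e1)
    v = (direction @ q) * inv
    if v < 0.0 or u + v > 1.0:
        return None
    return (e2 @ q) * inv, u, v

def sample_polypaint(point, normal, hp_verts, hp_faces, hp_vcols, max_dist=0.1):
    """Search a short distance outward and inward along the (unit) low-poly
    normal and return the interpolated vertex colour of the nearest hit."""
    best_t, best_col = None, None
    for face in hp_faces:
        v0, v1, v2 = (hp_verts[i] for i in face)
        for sign in (1.0, -1.0):
            hit = ray_triangle(point, normal * sign, v0, v1, v2)
            if hit is None:
                continue
            t, u, v = hit
            if 0.0 <= t <= max_dist and (best_t is None or t < best_t):
                c0, c1, c2 = (hp_vcols[i] for i in face)
                best_t = t
                best_col = (1.0 - u - v) * c0 + u * c1 + v * c2
    return best_col          # None means the ray missed, i.e. a texel to pad over
[/ CODE ]
The normal map and the diffuse come out of the same set of ray hits, which is why baking both in one pass costs almost nothing extra.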
I sometimes use one UV channel for procedural Max textures like Noise, Smoke, Dent, Electricity etc., then render those out and use them when painting textures.
Right now I'm using a "Max Shader" to scatter dust around the objects I'm working on. I blend my diffuse with a procedural dust texture to settle dust on top of everything, then bake out my texture. Areas whose normals don't point up the Z axis don't receive the dust material (gotta love Falloff) - see the sketch further down. That saves me from having to figure out where dust would settle. The procedural dust texture uses a separate UV channel. I also render out just the dust with an opacity mask, so I can overlay the dust on my texture and keep painting without having to re-bake.
I also used this technique to blend snow over complex areas.
I also have a stock set of Max materials I keep on hand; they work as a good base to start painting from. They also use a separate UV channel from my final layout. This helmet was textured 100% in Max materials in about 30 min, and it works well as a base to start painting on.
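A tiny sketch of the falloff idea behind that dust/snow mask: the amount is just "how much does the surface face world-up", sharpened with a power curve. Plain NumPy with invented names; Max's Falloff map has its own curve controls, so treat this as the concept only.
[ CODE ]
import numpy as np

def dust_mask(normal_ws, tightness=3.0):
    """0-1 dust amount from a world-space normal: full dust on surfaces that
    face straight up (+Z in Max), none on walls and undersides."""
    n = np.asarray(normal_ws, float)
    n = n / np.linalg.norm(n)
    up_facing = max(n[2], 0.0)             # negative Z component = facing down
    return up_facing ** tightness          # power curve tightens the transition

def settle_dust(diffuse, dust_colour, normal_ws, amount=1.0):
    """Blend the dust colour over the diffuse using the mask - the same blend
    you would bake out, or export on its own as an overlay with an opacity mask."""
    m = dust_mask(normal_ws) * amount
    return (1.0 - m) * np.asarray(diffuse, float) + m * np.asarray(dust_colour, float)

print(dust_mask((0, 0, 1)))       # floor: 1.0, full dust
print(dust_mask((1, 0, 0)))       # wall: 0.0, stays clean
print(dust_mask((0, 0.3, 0.95)))  # gentle slope: ~0.87, partial dust
[/ CODE ]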
Other reasons:
Being able to work on high-res textures that the machine otherwise couldn't handle. For some assets you just want really high-res textures, and since 4k and larger images are a little hard to work with on today's machines, I prefer to split across several 2k textures and separate parts as much as possible initially. Yet for game export it all needs to be condensed onto a single texture page, and rebaking makes that easy.
Another thing: suddenly there's a request for higher resolution on a certain part of an asset that's already textured. No problem, if you painted at a higher res than the intended output: just modify the UVs to give the prominent parts more pixel space and rebake (the texel-density arithmetic sketched below shows how much you gain).
And: imagine some asset that's complicated to unwrap. It spreads out like mad on your texture sheet, wasting space like nothing else. Just texture it that way, then reshape the UVs into something more efficient, rebake, and voila - no need to compensate for stretching/seams while painting.
Gotta be careful with normal maps when doing this, though.
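The "more pixel-space" point is easy to sanity-check with some quick texel-density arithmetic (illustrative numbers and function name only): density scales with the square root of the UV area a shell owns, so quadrupling its share of the sheet doubles its resolution, provided the source you painted was high-res enough to feed it.
[ CODE ]
import math

def texel_density(world_area, uv_area_fraction, map_size):
    """Pixels per scene unit for a UV shell: share of the map it owns (0-1)
    versus the surface area it has to cover."""
    return math.sqrt(uv_area_fraction) * map_size / math.sqrt(world_area)

# a prominent part: 4 m^2 of surface squeezed into 5% of a 1024 map
print(texel_density(4.0, 0.05, 1024))   # ~114 px per metre
# give it 20% of the sheet instead and rebake: density doubles
print(texel_density(4.0, 0.20, 1024))   # ~229 px per metre
# only worth it if the texture you rebake from was painted at 2048 or more,
# so there are real pixels to pull in rather than upscaled blur
[/ CODE ]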
[ QUOTE ]
Also, how bad is this on normal maps? Won't it wreak havoc on everything?
[/ QUOTE ]
If you neither rotate nor stitch UVs, you're usually OK. If you change the orientation, or you change the UV vertex count, you're probably gonna cause new seams.
The way I understand it, you're altering the tangent space inadvertently, because it's tied to the UVs. So you'll likely see new seams until you re-cast the normalmap with the new UVs.
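A small sketch of why the orientation matters: the tangent basis a tangent-space normal map is decoded in is derived from the UVs (the standard edge-delta construction below, shown on a toy triangle), so rotating a UV shell rotates its tangent, and a map baked against the old basis no longer lines up.
[ CODE ]
import numpy as np

def face_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Per-face tangent from positions and UVs: the usual edge-delta formula
    that bakers and engines derive the tangent basis from."""
    e1, e2 = p1 - p0, p2 - p0
    du1, dv1 = uv1 - uv0
    du2, dv2 = uv2 - uv0
    r = 1.0 / (du1 * dv2 - du2 * dv1)
    t = r * (dv2 * e1 - dv1 * e2)
    return t / np.linalg.norm(t)

p = [np.array([0., 0., 0.]), np.array([1., 0., 0.]), np.array([0., 1., 0.])]
uv = [np.array([0., 0.]), np.array([1., 0.]), np.array([0., 1.])]
print(face_tangent(*p, *uv))      # [1. 0. 0.]

# rotate the UV island 90 degrees, (u, v) -> (-v, u): same geometry...
uv_rot = [np.array([-w[1], w[0]]) for w in uv]
print(face_tangent(*p, *uv_rot))  # [ 0. -1.  0.]  ...but the tangent rotated too
[/ CODE ]
That is also why pure translation and uniform repacking of shells usually survive a re-bake, while rotated or mirrored shells (or stitching that changes the UV vertex count) show up as lighting seams until the normal map is re-cast.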
[ QUOTE ]
The way I understand it, you're altering the tangent space inadvertently, because it's tied to the UVs. So you'll likely see new seams until you re-cast the normalmap with the new UVs.
[/ QUOTE ]
As a sloppy workaround, you can project a normal map from a copy of the same mesh with the first normal map/UVs applied. In theory that should be perfectly fine, and you can rotate and manipulate the new UVs to your liking. You will end up with a slightly blurred map, of course. Deactivating filtering helps, but then the pixels will shift slightly on your map compared to a normal map projected from the high poly.
Noren, do normalmaps actually alter the raycasting? That's interesting. I guess they would since you can put a bump map on your highpoly mesh, and that gets factored into the new normalmap. Intriguing.
Just to clarify: you have two UV sets, channel 1 and channel 2. Your existing texture/UVs are on channel 1, your modified UV set/texture on channel 2.
Render channel 2 to texture, adding a diffuse element only, with no lighting or shadows.
That should work
Further to this, is there any way to pump out a PSD using this method, with the layers still intact?
You can bring a layered PSD into Max, but it seems you can't render one out.
Yeah Ruz, that's exactly what I was doing.
I tried it on another model and it seemed to work though..
Perhaps the original one was corrupt or something.
Dunno how relevant this is, but on my Dom War entry the skin, jacket, gloves, face and hair were all rendered from a normal-mapped mid-poly object to a low-poly object with different UVs. I did it because my machine was choking, and so I could apply a material with bump to the mid poly (fake high poly) but still easily edit UVs on it. The normal map on the mid poly was higher resolution, so it was rendering down; it didn't seem to cause a quality issue (as long as the mid poly was detailed enough to describe most of the shape).
Eric, yep it works actually really well.
I did the same as rooster on several occasions, and also recently rebaked a normal map from a similar low poly to a variation with different UVs. Since I had to cast from the high poly anyway, I had a chance to compare. They matched up almost perfectly, at least visually. As already mentioned, the one from the low poly is slightly blurred, of course. If you turn off filtering for the "source" normal map, the second one is crisper and closer to the one from the high poly, but unfortunately distorted by one or two pixels in some places.
edit: I should add that I'm quite a pedant, and with filtering on you see almost no difference at 100%. The blurring becomes more apparent at 200 or 300%. So in-game, at some distance, with filtering and whatnot, you'll hardly ever notice a difference.
Vassago, that's interesting. Working mainly with EPoly, I would have expected the whole object to end up in the new UV channel anyway, no matter whether only some polys are visible in the Edit UVWs window due to a face selection lower in the stack. But it seems that if you do the same in EMesh, only the selected faces get added to the new UV channel. (You can check in Channel Info.)
So I think you found the solution to the problem, and why it works with EMesh and not EPoly: use Edit Mesh to select and move those offending UVs away.
I can fix the issue completely in Epoly, as long as I move all of those ch2 UVs away. But in this tutorial, the person used epoly and didn't seem to have this issue. Very frustrating.
I can fix it, but it's a longer work routine, so it doesn't seem as streamlined.
Actually I wouldn't bother with a stack like that at all.
One Unwrap UVW modifier set to channel 2 should do the trick as well. Move all your UVs out of the 0-1 region as a first step, then arrange the ones you need back in there.
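In miniature, that "park everything outside 0-1 first" step looks like this (invented names; the point is just that RTT only renders what sits inside the 0-1 square of the target channel, so anything parked outside it stays out of the bake):
[ CODE ]
import numpy as np

def park_shells(uvs, shell_of_vert, keep):
    """Offset every UV shell except the ones to bake by a whole number in U,
    pushing them outside the 0-1 square so they don't land in the render."""
    uvs = np.asarray(uvs, float).copy()
    for vert, shell in enumerate(shell_of_vert):
        if shell not in keep:
            uvs[vert, 0] += 2.0        # whole-unit offsets keep each shell intact
    return uvs

# verts 0-2 form shell 0 (keep in place), verts 3-5 form shell 1 (park it)
uvs = [[0.1, 0.1], [0.4, 0.1], [0.1, 0.4], [0.6, 0.6], [0.9, 0.6], [0.6, 0.9]]
print(park_shells(uvs, [0, 0, 0, 1, 1, 1], keep={0}))
[/ CODE ]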
I'll message ya tonight then.
This all worked to the point where I had to rebake the remapped UVs back into the base texture. When I did that, it gave me a solid-color texture map. Proper UVs, but the texture wasn't there.