@Farfarer good point, I'll have a look at PyInstaller. Actually I'm quite surprised how many people have problems with the Python 3.3 install requirement... (I looked at py2exe a while ago, but didn't have enough motivation to dig deeper, as we don't have any module dependencies at the moment.)
-Also, is there a way to add a custom file format for the lowpoly?
Yes, at the moment it's located in xnormal.py, line 10. (You might not be aware that the ".py" files are all just plain text files; you can edit them directly.)
But like other settings, I really should put these somewhere more accessible... I might add a config file or something.
-Can we bake using cgf/chr (Crytek) formats?
Whatever works with xNormal should work with Backstube; it's really just an XML settings manipulator.
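(Conceptually that is a tiny amount of code. Here is a minimal Python sketch of the idea - note that the element and attribute names ("HighPolyModel", "Mesh", Filename) are assumptions about the xNormal settings schema for illustration, not Backstube's actual source:)

```python
import xml.etree.ElementTree as ET

def retarget_settings(template_xml, high_paths, low_paths, out_xml):
    """Clone an xNormal settings file, swapping in one batch folder's meshes."""
    tree = ET.parse(template_xml)
    root = tree.getroot()
    for parent_tag, paths in (("HighPolyModel", high_paths),
                              ("LowPolyModel", low_paths)):
        parent = root.find(parent_tag)
        for old in parent.findall("Mesh"):  # drop the template's mesh entries
            parent.remove(old)
        for path in paths:                  # add one entry per batch mesh
            ET.SubElement(parent, "Mesh", Filename=path)
    tree.write(out_xml)
    # xNormal can then be pointed at the new file from the command line,
    # e.g. subprocess.run(["xNormal.exe", out_xml])
```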
//Update
I just moved the settings to config.py - open it with the text editor of your choice.
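(A hypothetical picture of what such a config.py could look like - every variable name below is made up, so check the real file for the actual ones:)

```python
# config.py - illustrative layout only.

# Mesh formats picked up from the bake folders; anything xNormal can load
# (even CryEngine .cgf/.chr) could be added here.
MESH_EXTENSIONS = [".obj", ".sbm", ".ply"]

# Prefixes that mark a file as high- or low-poly inside a batch folder.
HIGHPOLY_PREFIX = "hi_"
LOWPOLY_PREFIX = "lo_"

# Tangent basis calculator to ask xNormal for (see the MikkT discussion below).
TANGENT_BASIS_PLUGIN = "Mikk - TSpace plugin"
```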
Hm, I'm wondering whether releasing a stripped-down template PSD for AO, normals, vertex colors etc., without the way they are arranged, could work. Hope I have time to test this today.
Then you guys would at least have the combination and padding part automated.
Hi guys, just wanted to say thanks for this, saves a buttload of time! Is there a way to add a floor mesh in for AO generation? I tried adding it to the bake folder as 'hi_floor' but it doesn't seem to make a difference and I just get overblown AO. Thanks
EDIT: Had a derp moment - I had the floor mesh named 'Floor', which should've been 'hi_floor'. Fixed the problem. Cheers!
Hehe, I was about to say that. It should read anything you throw at it as long as you use the right prefixes. I think it only reads one lo_ though; not sure, I'll have to test.
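(The prefix collection could look roughly like this - a sketch, not the actual Backstube code; note this version happily collects several lo_ meshes, which may or may not match what the tool really does:)

```python
import os

def collect_batch(folder, hi_prefix="hi_", lo_prefix="lo_"):
    """Split one bake folder's meshes into high- and low-poly lists by prefix."""
    high, low = [], []
    for name in sorted(os.listdir(folder)):
        if name.startswith(hi_prefix):
            high.append(os.path.join(folder, name))
        elif name.startswith(lo_prefix):
            low.append(os.path.join(folder, name))
    return high, low
```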
About the PSD assembly: we recently started "licensing" customized versions to clients, so I'm afraid that for a while it would make no sense to release that.
Does Backstube support the Mikk-TSpace plugin setting? Does xNormal or the settings file save this setting? Mikk-TSpace is important for the upcoming version of UE4:
Here's an update on the Xnormal / Synced normals pipeline.
We found 2 steps to better sync normal map rendering:
The first step is a setting in xnormal you can change to make it better synced with our rendering:
Click the plug icon on the bottom left
Click Tangent basis calculators tab
Select Mikk - TSpace plugin
Click Configure button
Check the Compute binormal in pixel shader box.
( I will make sure the docs are updated with this)
The second step was a change to our renderer and will be available in the next release. The good news is you can go ahead and make this xnormal change and when the next version of Unreal Engine is available to you, your art will look better.
Isn't the mikkT plugin the standard tangent basis calculator in xNormal?
Anyways, I added a way to change tangent basis plugins in config.py the last time I updated Backstube - but I hope you don't have to change anything at all at the moment.
xNormal itself remembers the setting... so maybe it is used or not used depending on whether it's on or off.
As we used mikkT for a Ubisoft production, I'm pretty certain it is the default setting and it will just bake with that.
So I just rechecked xNormal to be sure: there is only MikkT as an option out of the box, unless you have something else installed. Which other tangent basis should it use if not this?
We never received any complaint from Ubisoft, but it is possible that the results were merely "good enough" - we had no engine to test mikkT in, so we assumed all was correct.
I think he's referring to UE4 -- Jordan just confirmed the best method for synced normals there, and you need to have the "Compute binormal in pixel shader" switch in the mikkT space plugin turned on, which is under the config section of the plugin:
"Here's an update on the Xnormal / Synced normals pipeline.
We found 2 steps to better sync normal map rendering:
The first step is a setting in xnormal you can change to make it better synced with our rendering:
Click the plug icon on the bottom left
Click Tangent basis calculators tab
Select Mikk - TSpace plugin
Click Configure button
Check the Compute binormal in pixel shader box.
( I will make sure the docs are updated with this)
The second step was a change to our renderer and will be available in the next release. The good news is you can go ahead and make this xnormal change and when the next version of Unreal Engine is available to you, your art will look better."
It's from: http://www.polycount.com/forum/showpost.php?p=2039738&postcount=1404
Lots of stuff is hidden in the registry. It's really horrible that Santi uses XML AND the registry to save settings; I never really understood that, and it doesn't make things very accessible - as if any of these settings couldn't be used on a per-asset basis.
C:\Users\username\Documents\xNormal. There should be a file called mikktspace.dat. Unfortunately it's a binary file. Also, nice to see you here kary :P
I'm here a lot, I just rarely post ;P
Ahh, thanks guys - didn't know there were any options buried there.
Well, to come back to the question: you can save this in xNormal and it should just work as expected - for now we don't change anything there, so it just keeps what's already set up.
Okay, how about now? It is unpolished and not tested much; I would like to add a bit of graphics to the input window down the road.
Put it into presets/scripts/airbornstudios - there might be more coming.
How it works is simple:
Let's assume you baked everything with xNormal (or with Backstube, to not have to touch xNormal); your files will always have the same suffixes, like _occlusion, _vcols, _normals and whatnot.
It will collect your files and pack them, based on those suffixes, into layer groups.
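(The real script is a Photoshop jsx; here is the suffix-grouping idea sketched in Python purely for illustration, with the suffix list taken from the post above:)

```python
import os
from collections import defaultdict

SUFFIXES = ("_occlusion", "_vcols", "_normals")  # one layer group per suffix

def group_maps(maps_folder):
    """Bucket the baked maps by suffix; each bucket becomes a layer group."""
    groups = defaultdict(list)
    for name in sorted(os.listdir(maps_folder)):
        base, _ = os.path.splitext(name)
        for suffix in SUFFIXES:
            if base.endswith(suffix):
                groups[suffix].append(os.path.join(maps_folder, name))
                break
    return groups
```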
You also have the option to turn on padding; however, you will need xNormal's dilation filter to be able to use that. We will add a routine that checks whether that filter is installed, but right now you have to take care of that yourself.
Note: always bake without padding and let dilation do the job, because then you will never ever have to touch up overlaps caused by baked-in padding. Padding is a postprocess in xNormal as well, so stay non-destructive and keep it a post effect in your PSDs too.
I'll give this a shot, thanks for the dilation info.
\/ Okay, thought it was a typo... funny extra stuff? Nah.
Packstube packs for you.
There are not many more words that would fit this pattern - maybe Lackstube, which textures for you, but dDo will do that. We will try to support dDo as much as possible to get our own workflow integrated.
Well, and Kackstube, which would shit for you - not sure how to solve that one, though.
I did exactly as mentioned at the start of the thread, but the normal map came out as if the entire low poly and high poly were baked together, not in passes.
Here is what I did:
1) Exported all high and low polys in .OBJ format into a directory, with the prefix "hi_xx" for each high poly and "lo_xx" for the corresponding low poly.
2) Then opened xNormal, imported one high poly and one low poly, and set the rear and frontal ray distances to 3.
3) Saved the settings.
4) Opened Backstube and:
- copy-pasted the path for the model directory
- copy-pasted the path for the settings directory
- copy-pasted the path for the output directory
5) Then clicked Bake now.
I also tried exporting all the high polys as .OBJ and the low poly in the .sbm file format, but the result remained the same.
What's the result in xNormal? It should be exactly the same, as we do nothing but read the xNormal settings and apply them to the files in your folder. Could you show your results or upload your data?
The high poly has no detail on the tail like in this picture. The details of the mesh above it have been projected onto the tail, as they would be if we baked the entire low and high poly mesh together.
Below is the picture of the high poly.
Do you have the passes set up as folders? Every pass needs its own folder: say, a folder called body with just the body lowpoly and the corresponding highpolies, and another folder called scales with just the scale lowpoly and the corresponding highpolies. I would also suggest baking only every second scale in one batch, to avoid intersections between the scales.
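(An example of such a layout - all folder and file names are made up - with every second scale split into its own batch folder:)

```
bakes/
    body/              # one folder per pass
        lo_body.obj
        hi_body.obj
        hi_straps.obj
    scales_odd/        # scales 1, 3, 5, ...
        lo_scales.obj
        hi_scale_01.obj
        hi_scale_03.obj
    scales_even/       # scales 2, 4, 6, ...
        lo_scales.obj
        hi_scale_02.obj
        hi_scale_04.obj
```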
I had to create 36 folders, name them properly, and put the high polys and low polys into their respective folders.
Now I have 36 normal maps that need to be assembled in Photoshop.
This would be almost equal to the time spent in xNormal checking each high poly against its respective low poly, providing a name for the output file, and pressing Generate.
This one, for instance, had 6 folders:
http://airborn-studios.com/projects/2011/portfolio/owl_huntress/frenja_14.jpg
If you start your project in ZBrush with proper naming, it's just mass-exporting them and sorting them into folders.
And look at page one - we gave you a tool to assemble the files.
Also, you baked just a normal map; now bake AO and vertex colors - you can't bake those in xNormal in the same run. Now think of doing this 100+ times for 100+ characters. Backstube doesn't care whether you bake 1, 10 or 100 characters.
The biggest chunk I ever baked at once was 5 characters with 5 batches each. That would be at least 25 rounds of setup; if you take the vcols/AO bug into account, it's at least 50 times touching xNormal.
Of course you can explode your model, so it will be just 5 bakes. But exploding means you have to do it in sync between high and lowpoly, and it's just another error source you add; the workload stays roughly the same. To me it always felt horrible to export my high-poly base to Max, move it along with the lowpoly, get it back into ZBrush and export for baking - so I always preferred baking in batches.
Thanks for sharing :thumbup:
Sorry for the late reply, I got caught up in something.
I just need 2 folders: one for the body and a second one for the entire model. The reason I had so many folders was that I was trying to bake each shell on this creature's tail individually, which was not necessary.
What do you prefer to do in cases where the normal map doesn't come out the way it's supposed to? (Like 2 meshes very close to each other - on this creature, that little hanging flap of flesh on his mouth - there is no way to get a clean normal map in 1 pass.)
I prefer rebaking: split that part off, rebake with that mesh only, and combine in Photoshop.
I prefer investing the bit of time for a cage, because then it will be perfect.
But I'd definitely use batches to avoid intersecting lowpoly meshes; that's why I said put every second scale into a batch. So you have 1, 3, 5, 7 etc. in one batch and 2, 4, 6, 8 etc. in the other - enough room for ray distance stuff.
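(In code terms the split is trivial - a throwaway sketch:)

```python
def split_alternating(scales):
    """Every second scale into its own batch, so neighbours never intersect."""
    return scales[0::2], scales[1::2]  # (1, 3, 5, ...) and (2, 4, 6, ...)
```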
Because if you explode it, you wouldn't really need Backstube. Well, of course you can automate that as well, but the setup time is significantly shorter in xNormal because you already did the setup in your 3D app of choice.
We created the tool because I never ever explode anything but bake in batches instead; this way I remove the need to move stuff in sync, by hand, between Max and ZBrush.
Thank you very much!
I swapped to a workflow where I export separate groups of non-intersecting meshes instead of exploding. Baking was going to be tedious until I remembered this program! Thank you for sharing this lovely tool with us.
Hey guys! Packstube is exactly what I needed for my workflow where I just keep everything in one master PSD for use with QuickSaveMaps. But I ran into two little snags.
The first one is that it doesn't seem to recognize .tif files, which is a problem since I bake everything in .tif for 16-bit precision. I just get the "error: Maps folder does not contain image files" message.
The second one is that it doesn't seem to recognize "_bent_normals" as a separate suffix: it grouped my bent normal map with my tangent-space normal map. Or is that intentional?
Also, can we pretty please get a directory browser rather than just copy pasting in the path?
And it would be amazing if it could detect matching file names and put them in separate documents. So, for example, armor_normals, armor_occlusion, and armor_heights would be put in a separate document from weapon_normals, weapon_occlusion, and weapon_heights. That would make sense as an option checkbox.
Thanks for the great tools guys!
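(The requested behaviour, sketched in Python rather than the script's actual jsx - function and suffix names are illustrative only. Note that longer suffixes have to be tested first, or _bent_normals would be swallowed by _normals, which is exactly the collision mentioned above:)

```python
import os
from collections import defaultdict

# Longest suffixes first, so "_bent_normals" is not caught by "_normals".
MAP_SUFFIXES = ("_bent_normals", "_occlusion", "_normals", "_heights", "_vcols")

def group_by_asset(maps_folder):
    """armor_normals, armor_occlusion -> 'armor'; weapon_* -> 'weapon'."""
    documents = defaultdict(list)
    for name in sorted(os.listdir(maps_folder)):
        base, _ = os.path.splitext(name)
        for suffix in MAP_SUFFIXES:
            if base.endswith(suffix):
                asset = base[: -len(suffix)]   # strip the map-type suffix
                documents[asset].append(name)  # one PSD per dict key
                break
    return documents
```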
1. The tif issue should be fixable. I'll have to look into it at work, but one of the settings files should list all the data types it supports, so it should just be a matter of adding ".tif" to the config.
2. Well, that's mostly xNormal's problem; we just let the script search for suffixes. Of course, exceptions to the pattern should be possible to code, but the cleaner way would be for xNormal not to use the same suffix for different maps.
3. I will ask Benjamin if he has the time; as I always have the folders I work in open anyway, this is no real issue to me.
4. Hm, well, internally we use prefixes to separate the object-space normals from the tangent-space ones, thanks to xNormal using the same suffixes. Not an option you have, but a clean solution to a messy output format. Changing that would remove the prefix option, though.
1. Great to hear! Is this something I can do on my own, or will I need to wait for an update?
2. I'm not familiar with JavaScript, but is it really that hard to have "bent_normal" detected as its own suffix, as an explicit special case?
3. You're right, it's not a huge hassle since most of the time the folder will be open. It's just kinda annoying for it to be inconsistent with every other UX. I understand if you'd rather put your resources elsewhere.
4. Ah, I see. Well, like I said, having it be a checkbox option ought not to cause that conflict? Or am I misunderstanding?
I added ".tif" to the supported formats in the script, and this should work with any other format you may be working with - like those crazy kids that only bake in .gif these days...
However, I ran into another bug. If one of your maps has an alpha channel that is just pure white, the script will stop and pop up the error "Warning: No pixels were selected" followed by "Error: General Photoshop error occurred. This functionality may not be available in this version of Photoshop. - The command "Delete" is not currently available."
I assume this is because it uses the alpha to delete the masked pixels for use with padding, so when it finds nothing to delete, Photoshop gets confused and stops. I'm not sure what the best way of fixing this would be; the simplest solution I can think of is to simply not run the delete-pixels step when you're not using padding.
Of course, I could just make sure that my maps don't have an empty alpha channel. But some programs (I'm looking at you, Maya...) like to add one, and going through a stack of files to delete their alpha channels is just as tedious as importing the maps by hand, which sorta defeats the purpose of automation.
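(One way to automate exactly that outside Photoshop - a sketch assuming Pillow is installed (pip install Pillow) and can read your files; its 16-bit TIFF support is patchy, so test before trusting it:)

```python
import os
from PIL import Image

def strip_empty_alphas(folder):
    """Drop alpha channels that are pure white, so the jsx never hits an
    empty selection."""
    for name in os.listdir(folder):
        if not name.lower().endswith((".tif", ".tiff", ".png")):
            continue
        path = os.path.join(folder, name)
        img = Image.open(path)
        if img.mode != "RGBA":
            continue
        lo, hi = img.getchannel("A").getextrema()
        if lo == 255 and hi == 255:  # alpha carries no mask at all
            img.convert("RGB").save(path)
```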
Hey Fingus. I took a look at the jsx, and it should already do what you're talking about: it only runs "AddPadding( layerSet );" in AssembleTextures if you check "Add Padding". Photoshop usually tells you what line it's erroring out at.
I've done everything in the steps right, and I get this when I hit bake: