I guess most of you have seen this document on The Order: 1886's lighting and material creation: http://blog.selfshadow.com/publications/s2013-shading-course/rad/s2013_pbs_rad_slides.pdf
I was just wondering whether it's possible in Substance to blend a lot of materials onto one object, like the water pump in the document?
I now know you can use the Multi Material Blend node with a mesh ID mask; however, the one in the document is slightly different.
The workflow used by RaD for The Order is exactly the same as using a series of Material Blend nodes with base materials and masks in SD.
I think that they could have used SD instead of creating their own in-house shader system if these features had made it earlier into the tool.
That's exactly the workflow some of our customers working on new consoles use Substance for though.
Yes, Substance can do the same workflow as The Order. The only caveat is that the tile blending happens in the shader, not within Substance: in Substance you can get your tiling, but once the whole material blend is complete you are still restricted to a single 1:1 map, like 2048x2048. You could also write a shader that supports tiling.
For example, tiling the rust or dirt base textures needs to be done via a shader.
In general I tend to use Substance as a quick-results viewer, then export my element textures such as masks and base textures... and then do the material compositing in a PBR engine like Marmoset Toolbag.
You can make that step more linear if you make tools / scripts for it though...
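That scripted compositing step can be sketched outside Substance too; here's a minimal example (plain NumPy, with made-up stand-in textures) of blending exported base textures through grayscale masks, the way a chain of Material Blend nodes would:

```python
import numpy as np

def blend(base, layer, mask):
    """Linear blend: where mask is 1, show layer; where 0, show base."""
    mask = mask[..., None] if mask.ndim == 2 else mask  # broadcast over RGB
    return base * (1.0 - mask) + layer * mask

# Stand-ins for textures exported from Substance (float arrays in 0..1).
metal = np.full((4, 4, 3), 0.6)
rust = np.full((4, 4, 3), 0.3)
dirt = np.full((4, 4, 3), 0.1)
rust_mask = np.zeros((4, 4)); rust_mask[:2] = 1.0   # rust on the top half
dirt_mask = np.zeros((4, 4)); dirt_mask[:, :1] = 1.0  # dirt on the left column

# Chain the blends the same way you'd chain Material Blend nodes.
diffuse = blend(blend(metal, rust, rust_mask), dirt, dirt_mask)
```

A real version of this would load the exported maps with an image library and write the result back out for the engine.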
Are you talking about the vertex color realtime shader blending? Because the standard system from RaD bakes the blended materials down to a single texture, which is exactly what Substance would do.
Hey guys! I'm quite new to Substance Designer and Unity and would really appreciate your help.
I have a few different models that should use the same textures. For this purpose I've created a substance that takes a color mask and AO as inputs, and outputs diffuse, specular, and glossiness in the alpha channel of the specular map (I read somewhere this is the way Unity handles glossiness), but when I import the model and apply the substance, glossiness has no effect in Unity.
So the first question is:
What is the right way to output glossiness for Unity?
The second question is: how do I efficiently apply my substance to multiple models? Should I duplicate it for each model and feed in model-specific maps, or is there another way?
If something is unclear I can share my substance and model with maps.
I wanted to add I'm not reading this thread as closely as I use to, if there is anything you think should be added to the original post, please PM me and I'll get to it right away.
Any news on exporting .sbsar files with no compression? The JPEG compression shrinks the file size significantly but produces a lot of artifacts on the maps too. I'd like to use uncompressed files for the presentation renders, but I have to export all my maps/graphs individually to do that, as publishing raw files crashes the exporter.
Yes, there is significant change to the textures when publishing as a .sbsar file, as you can see in this image. Especially noticeable was the darkening of the diffuse map.
Top row: exported maps and a hand-built Unity shader.
Middle row: maximum JPEG compression (low quality).
Bottom row: minimum JPEG compression.
I'm having a hard time comparing that... If anything, there's barely a difference between min and max quality, and detail seems perfectly retained in the JPG compressed ones.
You probably just have a gamma correction issue with textures generated from a Substance file, which probably has more to do with Unity and how you use its shaders, than it does with compressing textures inside the SBSAR.
edit:
I just tried with a Substance full of JPG masks (in Marmoset), and I can't see any quality drop. Maybe there would be if you replaced your entire diffuse with just one JPG, but in that case you're using Substance wrong. I think it's clear something is going wrong in Unity.
Every time I link many 2K maps, my Substance RAM usage eventually goes crazy and the program freezes. Any tips on how to put many bitmap pictures into Substance without upsetting the memory?
It works in Photoshop; is it because instead of linking them I need to embed them in the substance file?
Bumping this as it seems you didn't get an answer yet. I guess you get heavy memory usage simply because you are linking the bitmaps directly into the graph, which triggers their display and therefore their loading into RAM.
Try adding them via the Explorer into a resources folder instead (beside your graph). You can still drag and drop your bitmaps, but this time release the mouse over your substance in the Explorer view.
@Xoliul I tried doing the tests in Toolbag with the same results. The published archive seems to be adjusting the gamma values in the texture, or maybe it's some part of the graph setup which is adjusting it?
Bottom two rows are published archives, top is exported maps from graph
Comparison between the diffuse map outputs
edit: I thought this might be the JPEG compression algorithm; however, exporting my map from the graph as JPEG looks the same as a TGA export.
Yeah, like I said, it has very little to do with JPG, I think.
Did you try what it does in Substance Player? Just trying to see what's actually the culprit here, and Player is the only real official viewer. Unity has some bugs sometimes, and I noticed today that Marmoset doesn't play too nicely with Substances either. UDK should be pretty good with them too, btw.
I'm thinking we need to create a substance package or library to share with other artists to help get some more information sharing and to make this package more powerful for people new to the program. I'm thinking we either need to do a mini contest or find out the most requested or desired substances for artists, and make them. Any ideas or suggestions?
Okay, welp, I finally got started learning Substance Designer and failed in the first minute of the video. When I try to set the color of my Uniform Color node, it instantly turns the map into 0x0.
...What?
Yeah, I think we should do something like this. They're gonna release a new version soon that makes sharing libraries much easier; this would pretty much be a requirement for it to work.
I built most of our library at work; I can remake a bunch of the good ones from that and we can start from there. I believe we shouldn't get too specific with things though. A free-for-all vote sounds a bit silly: things will be all over the place, and inexperienced users might request things that don't make sense. Another good starting point is matching some Ddo stuff, since people seem so used to that. Perhaps go a bit beyond it, since Ddo stuff is kind of rigid; you could make nodes that do several effects just by allowing more control.
Sharing it could work by creating a shared Dropbox account for it, which contributors get write access to or something? That way you could stay up to date automatically I guess.
Working as intended I think? When you change a node while not actively viewing it, and it is not connected to an output, it goes into an "uncalculated" mode, showing as 0x0. Either doubleclick it to force it to recalculate, or start using it as diffuse output or something. They do this so that unused nodes aren't recalculated constantly, slowing things down.
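For what it's worth, that "uncalculated" behavior is ordinary lazy evaluation; here's a toy sketch (plain Python, nothing Substance-specific) of why an unviewed, unconnected node can sit at 0x0 until something pulls on it:

```python
class Node:
    """Toy graph node: recomputes only when someone actually asks for pixels."""
    def __init__(self, compute, *inputs):
        self.compute = compute
        self.inputs = inputs
        self.cache = None            # None == the "0x0 / uncalculated" state
        self.calls = 0

    def invalidate(self):
        self.cache = None            # a parameter tweak just clears the cache...

    def value(self):                 # ...work happens only on demand (view/output)
        if self.cache is None:
            self.calls += 1
            self.cache = self.compute(*(i.value() for i in self.inputs))
        return self.cache

color = Node(lambda: (255, 0, 0))
blur = Node(lambda c: tuple(v // 2 for v in c), color)

color.invalidate()                   # change the color: nothing recomputes yet
assert color.cache is None           # still "0x0" until viewed
result = blur.value()                # double-click / connect to an output: now it computes
```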
Awesome, I think it'd also be good if we tagged substances as hand-painted-look, PBR ready, etc. Looking forward to a request board.
Yeah, we can host it on Dropbox and mirror it on Google Drive, and maybe my website or something else if it gets too big. I haven't looked to see how big substances get; I'll probably wait until they add the sharing features anyway.
Yeah we should just categorize things properly. Again before we get into requests there's some 'low hanging fruit' type of things to tackle. I think it's important to sort of preach a 'clean and correct' workflow to people with this project, instead of just pumping out sloppy nodes. That means some educational videos would go well with it.
Regarding size: if you don't use too many (uncompressed) textures, they are super small. 2Gb will go a long way.
You 2 dudes want to discuss this more in here, or on Skype or something? it's just gonna be up to someone like us to sit down and do it. I have thursday and friday off so I could get this started.
If you're doing PBR-ready substances, could I request that you make two variants of each? I've made a full specular shader (it's on the first post of this thread). It would be useful, I think, to have both metalness- and specular-based substances for PBR workflows. I'll help as much as I can, also =]
I ran into a problem that maybe someone has a fix for:
Substance crashes whenever I try to expand a specific folder within my package. The folder has about a dozen or so graphs for all of the objects for my scene. I can browse every other folder, like the base materials I've created and even all of my resources, just fine. I should note that the folder contains graphs only, no resources of any kind.
Anyone have any ideas?
Update: I ended up reverting to one of the autosave backups and dividing my package up from there. (Very easy.) In the end I only lost about 20 minutes worth of work. Yay autosave! If anyone else runs into this, I suggest limiting the number of graphs within a folder to about ten. It seems that Substance does a massive calculation when graphs in a folder become visible, either for computing outputs or just thumbnails within each graph, which can lead it to crash.
@Xoliul I just tried using the published substances in the player, it's looking much better and gives the expected output. The package seems to be publishing correctly then, but the texture maps are being adjusted with whichever module unity/marmoset uses to handle the .sbsar files. I will just stick to exporting flat maps for now as I don't need to modulate any of the parameters in the engine.
Indeed, I agree. I only have weekends, and usually those weekends are stuck doing housework, haha. I usually get an hour per day at home for something PC-related though. But yep, it might be nice to recreate some of those nodes we know we need and love before diving into a request board.
Has someone come up with a way to easily switch between metalness and non metalness textures for PBR?
If you author your workflow with full specularity in mind to begin with, then create your metalness mask, and use that mask to copy/paste from the specular map onto the diffuse map, that's usually quickest.
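That mask-driven copy/paste is mechanical enough to script; a sketch (NumPy, with stand-in values) of deriving a metalness-workflow base color from diffuse/specular maps plus a metal mask:

```python
import numpy as np

def spec_to_metalness(diffuse, specular, metal_mask):
    """Where the mask says 'metal', base color comes from the specular map;
    elsewhere it stays the diffuse color. Metalness output is the mask itself."""
    m = metal_mask[..., None]                     # broadcast mask over RGB
    base_color = diffuse * (1.0 - m) + specular * m
    return base_color, metal_mask

diffuse = np.full((2, 2, 3), 0.05)                # metals have near-black diffuse
specular = np.full((2, 2, 3), 0.95)               # ...and bright/colored specular
mask = np.array([[1.0, 0.0], [0.0, 1.0]])         # 1 = metal, 0 = dielectric

base_color, metalness = spec_to_metalness(diffuse, specular, mask)
```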
1. Is there an easier way to blend multiple textures together than blending them in pairs, and then blending those pairs? It becomes a bit cumbersome and unwieldy. I suppose I could create my own node/substance that just does that behind the scenes, but is there a blend node in SD somewhere that already does this?
2. I know 3D procedurals have been discussed, but in the meantime: would blended box mapping be something you could add? Here's a good read about them: http://www.neilblevins.com/cg_education/blended_box_mapping/blended_box_mapping.htm
And on that topic, would it be possible to get a node that overrides the UVs? This way you could calculate a specific procedural texture on a particular UV projection (simple ones like box mapping done in SD, or a pre-made UV channel from the modeling software) and rebake that to the regular UVs before the output.
Don't think so, though Transform 2D shows tiling past borders, and you can paint with wrapping in bitmap mode.
1. You want to blend 3 or more textures together with one mask? This has to be a color mask then, no? Like random, contrasting RGB values? I think 'Multi Material Blend' is what you're after. You can blend as many channels as you want with that, with a multi-color mask. But maybe i'm misunderstanding you.
2. You could do that quite easily with a Worldspace normalmap. Should be fairly simple, just use the channels separately as masks to blend 3 procedurals.
You're still tied to UV's however. Maybe you want real 3D-procedural baking. I personally never felt the need for this.
I think you're kind of forgetting that Substance works purely in 2D, independent of UVs even. There is currently no way for a 3D mesh to influence a substance at runtime, except by translating info to 2D by baking maps. They'd have to make some fundamental changes to enable "UV-agnostic" nodes.
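For the curious, the worldspace-normal blending suggested in point 2 boils down to using the absolute value of each normal axis as a blend weight, which is the core of blended box mapping. A rough sketch (NumPy; a real setup would read a baked world-space normal map):

```python
import numpy as np

def box_blend_weights(ws_normals, sharpness=1.0):
    """Per-pixel weights for 3 planar projections from a world-space normal map.
    ws_normals: (..., 3) array with components in -1..1."""
    w = np.abs(ws_normals) ** sharpness   # higher sharpness = harder transitions
    return w / w.sum(axis=-1, keepdims=True)

# One pixel facing straight up, one at a 45-degree edge:
n = np.array([[0.0, 1.0, 0.0],
              [0.7071, 0.7071, 0.0]])
w = box_blend_weights(n)

# Blend three procedural textures (here just flat gray values) by those weights:
tex_x, tex_y, tex_z = 0.2, 0.5, 0.8
blended = w[:, 0] * tex_x + w[:, 1] * tex_y + w[:, 2] * tex_z
```

Raising `sharpness` tightens the transition zones where projections overlap, which is the usual knob for hiding seams.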
I'll look into that. I was using masks to apply gradient maps to multiple areas, and things got pretty unwieldy.
If you've got a minute, could I pick your brain about how you'd use worldspace maps to simulate blended box mapping? I'm still wrapping my head around how things work, and while I love SD already, the seams do still bother me. Real 3D procedurals would be amazing, but I saw blended box mapping as a fairly decent stopgap solution that would be more realistic to hope for. I see I misunderstood some parts though. Rather than UV-agnostic nodes, wouldn't extra maps to bake (blended box to UV, UV to UV, etc.) work? The procedural maps then wouldn't strictly be non-destructive anymore, but since rebaking is so easy, you could tweak settings and rebake, right?
Of course I'd prefer a node that does this, but baking could work.
Maybe I just need to work with it some more to better understand things.
On another note, I miss a few usability things I'm used to from other modular (audio) software, but I might have overlooked them:
-dragging nodes over wires inserts them
-grouping nodes into a sub-container (I suppose starting a new substance and copy-pasting the nodes works, but that's no longer self-contained, and kind of a hassle)
-dragging from an input/output onto the canvas pops up the 'Add Node' dialog, filtered to only show the ones that can be connected to the i/o node you're dragging from. Selecting one from the dialog then automatically places it where you let go on the canvas, and connects it.
-a rightclick dialog on a wire. It could have 'delete' and 'add node here', mainly.
I'd also prefer to be able to connect multiple outputs to one input and just automatically get blending options on the receiving node, but that's mostly about having easier to read signal flow for me (and again: just something I'm used to.)
Hey, I was just trying it a bit, until I realized I missed one important part: you can do the blending fine, but not the (re)projecting.
Thinking about it now, it seems like the best way to easily enable this would be to have a baking option that allows baking any channel (diff, spec, etc.) or bitmap from one UV set to another (or to something like a box-projected auto-UV). That's one thing Max or Maya can do that Substance can't. Doesn't sound like a crazy feature request.
Your suggestions all sound useful, even though I already think Substance is one of the nicest node editors there is (I've used some pretty horrible ones).
The (proper) group container thing is something coming soon though, check one of the previous pages for it.
edit: and again, related to your gradient masking; if it gets unwieldy, just invest some time in creating a proper node that does it for you. Doesn't take long and goes a long way. This sort of mentality is really what's gonna make you much more proficient with the program.
Xoliul: I already started making my own multi-blend thing; it's far easier than I initially thought! Thanks for the help so far, I'll keep experimenting with it.
Xoliul - being able to project some things from a box-projected auto-UV would be amazing for alleviating some of the noise-pattern problems at seams too. This feature is one I've been thinking about a lot, and you're right that it's the one thing traditional 3D apps have that Substance doesn't. We need this.
I would be interested in helping build a library - but I think the logistics are hairy. I'm with you on needing to get people on the same page, we need standards if we are going to build anything useful that actually works together.
I wonder if we could go the open-source route, like using Git or something, so that we could add substances, change each other's, etc., and have the changes be curated/approved, free to everyone, and so on. I don't know how to use open-source tools that well, and I think a lot of artists might be scared off by it - but it seems like an ideal philosophy/workflow if we could get past that.
SBS vs SBSAR?
One is just a working file and the other an export? If I move one of the images I am using to another folder, does this make my SBS stop working? I am confused as to how these two work differently.
ETA: Is there a way to only export the sbsar when you publish?
SBS: the source file, editable in Designer. SBSAR: a compiled + compressed version of your SBS. It also includes the resources needed by the graph (like bitmaps).
The SBS format is the raw format, like a PSD file for Photoshop, while an SBSAR is a compressed object, like a JPEG/PNG image.
The SBSAR is a single file that you can easily share, but you can't modify it further than the parameters included with it. So your tweaking is sort-of limited (it depends on the parameters exposed). Bitmap 2 Material, for example, is a very complex substance with a lot of parameters.
The SBS format uses relative paths, so you can move it where you want if you take care of moving the resources linked with it too. However, only Designer can work with the SBS format. For Unity, the Substance Player, or even the UDK, you will have to use the compiled version, which is the SBSAR.
Allegorithmic reads this, so I hope they'll consider this UV-transfer baking.
Actually, what would be EVEN BETTER: baking UV deltas into a texture. Like a position map or world-space normal map, you could bake a "UVW delta" map that contains, per pixel, the offsets between UV sets. Then a node that just does the transfer in realtime based on this map. That would be much, much better and only involve a single bake instead of constant rebakes. You would however need some very high accuracy; 8-bit wouldn't cut it at all...
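The accuracy concern is easy to put numbers on: a UV delta stored in N bits quantizes to steps of 1/(2^N - 1) of the 0..1 UV range, and that step has to stay well under one texel. A quick back-of-the-envelope check (assuming a 2048px target texture):

```python
def uv_quantization_error_px(bits, resolution):
    """Worst-case positional error, in texels, of a UV offset stored in `bits` bits."""
    step = 1.0 / (2 ** bits - 1)       # smallest representable UV increment
    return 0.5 * step * resolution     # max rounding error, converted to pixels

err8 = uv_quantization_error_px(8, 2048)    # roughly 4 px of error: visible smearing
err16 = uv_quantization_error_px(16, 2048)  # well under a texel: fine
```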
And regarding deployment, yeah...
You want a select group to have write access to it, with perhaps even a review step when adding to it. But then you want easy deployment and updating for the public: no write access, but an easy process to get the latest version, preferably automatic.
I don't think a system like that exists yet. Closest I can think of is something like an installer that deploys SVN and some Python scripts. I guess I could create something like that....
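A minimal version of that deploy/update idea, assuming the library lives in a Git repo somewhere (the URL and folder name below are placeholders): clone on first run, fast-forward pull afterwards.

```python
import os
import subprocess

LIBRARY_URL = "https://example.com/substance-library.git"  # placeholder URL
LIBRARY_DIR = "substance_library"

def library_cmd(path, url):
    """Pick the right git command: clone on first run, fast-forward pull after."""
    if os.path.isdir(os.path.join(path, ".git")):
        return ["git", "-C", path, "pull", "--ff-only"]
    return ["git", "clone", url, path]

def update_library(path=LIBRARY_DIR, url=LIBRARY_URL):
    """Fetch the latest shared library; check=True raises if git fails."""
    return subprocess.run(library_cmd(path, url), check=True)
```

Wrap `update_library()` in a double-clickable script and contributors' pushes reach everyone on the next run.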
Okay, I wanted to start working on compiling a materials library to easily apply things to models I make. But any SBS that I try to reopen comes up blank, even if I haven't moved the texture anywhere. It's kind of frustrating to have to bring the texture back into the window over and over, every time I try to work with it.
ETA: Does this only happen when you publish things? SBS files that I haven't published still contain the texture..?
We have a new "UV node" in the works that allows you to remap a texture using arbitrary UV coordinates. That, along with a UV delta map, could be useful I guess? I have no ETA though.
As for building a library, the 4.1 release (probably tomorrow or on Monday) will bring a new alias system - thx Xoliul - and an option to pack all external dependencies before saving, which will make sharing of substance assets much easier.
@Xoliul: My understanding of Git (admittedly fuzzy) is that it works very much like that. You can create a fork of the original repository and make changes to it locally, then submit those changes back to the original repo - but they can require approval to get merged in. Then the original repo is kind of like this pure, clean version of approved changes. That's how a lot of open-source software gets made, I thought...
I don't know if it needs to be that serious though, Carlos. Do you imagine tons of people would submit to this? I imagine only like 4-5 people working on it a lot. Mainly you just want people to easily download it, and stay up to date without having to redownload.
This UV node sounds awesome, I've really missed something like this. Would be great if it does absolute positions as well as relative ones (like UV distorting with a normalmap)
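For reference, "UV distorting with a normalmap" is essentially a per-pixel gather at offset coordinates; a rough sketch with NumPy (a real implementation would interpolate rather than round to the nearest texel):

```python
import numpy as np

def remap(img, offset, strength=1.0):
    """Sample img at (x, y) + offset * strength, clamped to the image bounds.
    img: (H, W) array; offset: (H, W, 2) array of pixel offsets (dx, dy)."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    x = np.clip((xs + strength * offset[..., 0]).round().astype(int), 0, w - 1)
    y = np.clip((ys + strength * offset[..., 1]).round().astype(int), 0, h - 1)
    return img[y, x]

img = np.arange(16, dtype=float).reshape(4, 4)
shift_right = np.zeros((4, 4, 2)); shift_right[..., 0] = 1  # dx = +1 everywhere
out = remap(img, shift_right)
```

An absolute-position variant would just sample at `offset` directly instead of adding it to the pixel grid.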
Cman2k: I didn't know if I was just getting the workflow wrong or something, glad to hear someone else really wants the uv-projection stuff.
Oh, almost forgot what I came here to post:
Any ideas why this is happening? I've had resolution issues a few times, but reconnecting it to a node with the right resolution generally fixed it. Here, nothing I do fixes it. I've baked it at 2k (it's a vector though, so that's not entirely relevant), and everything it's connected to is at 2k as well.
edit: connecting a 2k Uniform Colour node to the background slot fixed it, but I'd still like to figure out why it was happening in the first place
That's because nodes have input priorities. You can notice that some inputs have a little dot in the middle of the yellow circle; it means that input will be used to define the resolution of the output.
In case you plug a texture smaller than what you want into that input, you can change the node's resolution settings (Output Size) to use "Relative to Parent" instead of "Relative to Input". Or you can use a specific size (with "Absolute").
Another suggestion I had to make the modular workflow nicer: a replace command. Say you want to replace a Levels node with an HSL one; normally you'd have to delete it and reconnect everything afterwards. This way you could just swap them out and all connections would remain the same.
You could use the Backspace key (which removes a node without breaking the links), then select the previous node and add your new node:
Both of my requests are implemented, I love you! :thumbup:
Substance crashes whenever I try to expand a specific folder within my package. The folder has about a dozen or so graphs for all of the objects for my scene. I can browse every other folder, like the base materials I've created and even all of my resources, just fine. I should note that the folder contains graphs only, no resources of any kind.
Anyone have any ideas?
Update: I ended up reverting to one of the autosave backups and dividing my package up from there. (Very easy.) In the end I only lost about 20 minutes worth of work. Yay autosave! If anyone else runs into this, I suggest limiting the number of graphs within a folder to about ten. It seems that Substance does a massive calculation when graphs in a folder become visible, either for computing outputs or just thumbnails within each graph, which can lead it to crash.
Indeed, I agree. I only have weekends, and usually those weekends are stuck doing housework, haha. I usually get about an hour per day at home for something PC-related, though. But yep, it might be nice to recreate some of those nodes we know we need and love before diving into a request board.
If you author your workflow with full specularity in mind to begin with, then create your metalness mask and use that mask to copy/paste from the specular map onto the diffuse map, that's usually quickest.
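As a quick illustration, that copy/paste amounts to a per-pixel lerp between the two maps. A minimal NumPy sketch (function and array names are my own, not anything from Designer):

```python
import numpy as np

def spec_to_metalness(diffuse, specular, metal_mask):
    """Where the metalness mask is white, take base color from the old
    specular map; elsewhere keep the diffuse. Inputs are float arrays
    in [0, 1]: diffuse/specular (H, W, 3), metal_mask (H, W, 1)."""
    return diffuse * (1.0 - metal_mask) + specular * metal_mask
```

In Designer itself the same thing is just a Blend node with the metalness mask in the opacity slot.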
1. Is there an easier way to blend multiple textures together than blending them in pairs, and then blending those pairs? It becomes a bit cumbersome and unwieldy. I suppose I could create my own node/substance that just does that behind the scenes, but is there a blend node in SD somewhere that already does this?
2. I know 3D procedurals have been discussed, but in the meantime: would blended box mapping be something you could add?
Here's a good read about them: http://www.neilblevins.com/cg_education/blended_box_mapping/blended_box_mapping.htm
And on that topic, would it be possible to get a node that overrides the UVs? This way you could calculate a specific procedural texture on a particular UV projection (simple ones like box mapping done in SD, or a pre-made UV channel from the modeling software) and rebake that to the regular UVs before the output.
Don't think so, though transform 2D shows tiling past borders, and you can paint with wrapping in bitmap mode.
1. You want to blend 3 or more textures together with one mask? This has to be a color mask then, no? Like random, contrasting RGB values? I think 'Multi Material Blend' is what you're after. You can blend as many channels as you want with that, with a multi-color mask. But maybe I'm misunderstanding you.
2. You could do that quite easily with a Worldspace normalmap. Should be fairly simple, just use the channels separately as masks to blend 3 procedurals.
You're still tied to UV's however. Maybe you want real 3D-procedural baking. I personally never felt the need for this.
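To make the world-space-normal idea above concrete, here's a rough NumPy sketch of using the baked normal's channels as blend weights for three projections (the classic tri-planar weighting; all names here are illustrative, not SD nodes):

```python
import numpy as np

def triplanar_weights(ws_normal):
    """ws_normal: (H, W, 3) world-space normal map in [-1, 1].
    Returns three (H, W, 1) masks for the X, Y and Z projections,
    normalized so they sum to 1 per pixel."""
    w = np.abs(ws_normal)
    w = w / np.maximum(w.sum(axis=-1, keepdims=True), 1e-8)
    return w[..., 0:1], w[..., 1:2], w[..., 2:3]

def blend_triplanar(tex_x, tex_y, tex_z, ws_normal):
    """Blend three (H, W, 3) textures by the projection weights."""
    wx, wy, wz = triplanar_weights(ws_normal)
    return tex_x * wx + tex_y * wy + tex_z * wz
```

In SD you'd get the same result by splitting the world-space normal into channels and feeding each one into a Blend node's opacity input.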
I think you're kind of forgetting that Substance works purely in 2D, independent of UVs even. There is currently no way for a 3D mesh to influence a substance at runtime, except by translating info to 2D by baking maps. They'd have to make some fundamental changes to enable "UV-agnostic" nodes.
If you've got a minute, could I pick your brain about how you'd use worldspace maps to simulate blended box mapping? I'm still wrapping my head around how things work, and while I love SD already, the seams do still bother me. Real 3D procedurals would be amazing, but I saw blended box mapping as a fairly decent stopgap that seemed more realistic to hope for. I see I misunderstood some parts, though. Rather than UV-agnostic nodes, wouldn't extra maps to bake (blended box to UV, UV to UV, etc.) work? The procedural maps then wouldn't strictly be non-destructive anymore, but since rebaking is so easy, you could tweak settings and rebake, right?
Of course I'd prefer a node that does this, but baking could work.
Maybe I just need to work with it some more to better understand things.
On another note, I miss a few usability things I'm used to from other modular (audio) software, but I might have overlooked them:
-dragging nodes over wires inserts them
-grouping nodes into a sub-container (I suppose starting a new substance and copy-pasting the nodes works, but that's no longer self-contained, and kind of a hassle)
-dragging from an input/output onto the canvas pops up the 'Add Node' dialog, filtered to only show the ones that can be connected to the i/o node you're dragging from. Selecting one from the dialog then automatically places it where you let go on the canvas, and connects it.
-a rightclick dialog on a wire. It could have 'delete' and 'add node here', mainly.
I'd also prefer to be able to connect multiple outputs to one input and just automatically get blending options on the receiving node, but that's mostly about having easier to read signal flow for me (and again: just something I'm used to.)
Thinking about it now, it seems like the best way to easily enable this would be a baking option that bakes any channel (diff, spec, etc.) or bitmap from one UV set to another (or to something like a box-projected auto-UV). That's one thing Max or Maya can do that Substance can't. Doesn't sound like a crazy feature request.
Your suggestions all sound useful, even though I already think Substance is one of the nicest node editors there is (I've used some pretty horrible ones).
The (proper) group container thing is something coming soon though, check one of the previous pages for it.
edit: and again, related to your gradient masking; if it gets unwieldy, just invest some time in creating a proper node that does it for you. Doesn't take long and goes a long way. This sort of mentality is really what's gonna make you much more proficient with the program.
I would be interested in helping build a library - but I think the logistics are hairy. I'm with you on needing to get people on the same page, we need standards if we are going to build anything useful that actually works together.
I wonder if we could go the open-source route, like using Git or something so that we could add substances, change each others, etc. and have the changes be curated/approved, free to everyone, etc. I don't know how to use open-source that well, and I think a lot of artists might be scared off by it - but it seems like an ideal philosophy/workflow if we could get past that.
One is just a working file and the other an export? If I move one of the images I'm using to another folder, does that make my SBS stop working? I'm confused as to how these two work differently.
ETA: Is there a way to only export the SBSAR when you publish?
SBS : source file, editable in Designer.
SBSAR : compiled + compressed version of your SBS. It also includes the resources needed by the graph (like bitmaps).
The SBS format is the raw format, like a PSD file for Photoshop, while an SBSAR is a compressed object, like a JPEG/PNG image.
The SBSAR is only one file that you can easily share, but you can't modify it beyond the parameters included with it, so your tweaking is sort of limited (it depends on the parameters exposed). Bitmap2Material, for example, is a very complex substance with a lot of parameters.
The SBS format uses relative paths, so you can move it where you want if you take care of moving the resources linked with it too. However, only Designer can work with the SBS format. For Unity, the Substance Player or even the UDK, you will have to use the compiled version, which is the SBSAR.
Allegorithmic reads this, so I hope they'll consider this UV-transfer baking.
Actually, what would be EVEN BETTER: baking UV deltas into a texture. Like a position map or world-space normal, you could bake a "UVW delta" map that contains, per pixel, the offsets between UV sets. Then a node could do the transfer in realtime based on this map. That would be much, much better and only involve a single bake instead of constant rebakes. You would, however, need very high accuracy; 8-bit wouldn't cut it at all...
And regarding deployment, yeah...
You want a select group to have write access to it, with perhaps even reviewing things when adding to it. But then you want to have easy deployment and updating to the public. No read access, but an easy process to get the latest version, preferably automatic.
I don't think a system like that exists yet. Closest I can think of is something like an installer that deploys SVN and some Python scripts. I guess I could create something like that...
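The "easy updating" half of that could be a tiny stdlib script along these lines: clone the shared library repo on first run, fast-forward pull afterwards. The repo URL and paths are placeholders, not a real project:

```python
import subprocess
from pathlib import Path

# Placeholder URL for a shared, curated substance library repo
LIBRARY_URL = "https://example.com/substance-library.git"

def update_command(library_dir: Path) -> list:
    """Return the git command to run: clone on first use, pull afterwards."""
    if (library_dir / ".git").is_dir():
        return ["git", "-C", str(library_dir), "pull", "--ff-only"]
    return ["git", "clone", LIBRARY_URL, str(library_dir)]

def update(library_dir: Path) -> None:
    """Fetch or refresh the local copy of the shared library."""
    subprocess.run(update_command(library_dir), check=True)
```

Read access to the repo stays public; write access (the curation part) is handled by whoever can push to it.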
Okay, I wanted to start working on compiling a materials library to easily apply things to models I make. But anything that is an SBS that I try to reopen comes up blank, even if I haven't moved the texture anywhere. This is kind of frustrating to have to bring the texture back into the window over and over every time I am trying to work with it.
ETA: Does this only happen when you publish things? SBS that i haven't published still contain the texture..?
As for building a library, the 4.1 release (probably tomorrow or on Monday) will bring a new Alias system -thx Xoliul - and an option to pack all external dependencies before saving that will make sharing of substance assets much easier.
@Jerc: UV NODE - HELLS YES.
This UV node sounds awesome, I've really missed something like this. Would be great if it does absolute positions as well as relative ones (like UV distorting with a normalmap)
Oh, almost forgot what I came here to post:
Any ideas why this is happening? I've had resolution issues a few times, but reconnecting it to a node with the right resolution generally fixed it. Here, nothing I do fixes it. I've baked it at 2k (it's a vector though, so that's not entirely relevant), and everything it's connected to is at 2k as well.
edit: connecting a 2k Uniform Colour node to the background slot fixed it, but I'd still like to figure out why it was happening in the first place
That's because node inputs have priorities. You'll notice that some inputs have a little dot in the middle of the yellow circle: that input is used to define the resolution of the output.
If you plug a texture smaller than what you want into that input, you can change the node's resolution settings (Output Size) to use "Relative to Parent" instead of "Relative to Input". Or you can set a specific size (with "Absolute").
Another suggestion I had to make the modular workflow nicer: a replace command. Say you want to replace a Levels node with an HSL one; normally you'd have to delete it and reconnect everything afterwards. This way you could just swap them out and all connections would remain the same.
You could use the Backspace key (which removes a node without breaking the links), then select the previous node and add your new one: