I have been converting all my Skyshop shaders over to versions made in Shader Forge. My testing shows that lightmaps are blended into the diffuse and specular light information without my needing to check the boxes on the Skyshop nodes.
It's definitely coming along; the speed curves and alpha fade-out are getting close to those of a real explosion. The draw-order jumps are disturbing, though, and I can't do anything about them until Unity comes up with a solution for sorting transparent objects.
It looks great. Is this shader applied to a particle system, or is all the movement done in the vertex shader? I think it will look a lot more natural with a fire system that kicks in after the brightest point of the explosion; the way it goes totally dark feels unnatural.
I'm trying to create a shader that ends in something that looks like this:
If you imagine an isometric cube like this, I'm trying to make it so each face has a different flat colour without using lights. So the left-hand side is always tinted blue, the right-hand side is always tinted red, etc.
I've managed to achieve this fairly easily using a Blend node and a View Position node, but I'm struggling to add a third colour to the mix for the top of the cube. Can anyone point me in the right direction?
Rather than basing it on view position, base it on world-space normals, since each face will then have an exact coordinate to use as a basis for your colouring.
Which node will give me the world-space normals? World Position?

The problem is that the reverse side of the cube looks like this:

Is there a way of reversing the effect so I don't get any black sides?
Do this:
Get the Normal Direction node and connect it to three Component Mask nodes, masking one channel in each (R, G, B).
Then create a One Minus node, plug the normal node into it, and again create another three Component Mask nodes as above.
You now have six mask channels, one for each normal direction.
Add six Color nodes, one for each direction, and multiply each Color node with a normal mask.
Finally, add all of the multiplied nodes together and plug the result into Emission. You should now be able to set a colour for each normal direction.
(You don't need a Normalize node for this, either!)
Thanks for this! The component mask node is pretty awesome.
I'm pretty close now. The one minus nodes mess things up, so I'm assuming I am missing something from this setup, probably the normal mask?
Mind that simply summing them together like that will give you values above 1 across the sphere. If you want to normalize it, so that the strength of the colour stays the same, you need to raise the normals to the power of two.
You can do that by simply multiplying each value with itself (a*a = a^2).
You also get a nice little side effect: since a negative number multiplied by another negative number is always positive, you don't need the Abs node in this case either.
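As a rough sketch of what that squared-normal blend ends up as in shader code (the colour property names here are invented, not part of the posted graph):

    // world-space normal, split into its positive and negative halves
    float3 n = normalize(i.normalDir);
    float3 p = saturate(n);   // masks for the +X / +Y / +Z faces
    float3 m = saturate(-n);  // masks for the -X / -Y / -Z faces
    // squaring keeps the summed weights at exactly 1 over the sphere
    // (x^2 + y^2 + z^2 = 1 for a unit normal), and since a*a is never
    // negative, no Abs node is needed
    float3 color = _XPos.rgb * p.x*p.x + _XNeg.rgb * m.x*m.x
                 + _YPos.rgb * p.y*p.y + _YNeg.rgb * m.y*m.y
                 + _ZPos.rgb * p.z*p.z + _ZNeg.rgb * m.z*m.z;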
You can also use the Channel Blend node to simplify this shader.
So, here's my version:
Is it possible to expose pan and rotate node values in the material properties? I would like to be able to animate them.
It's possible using the Value property node (the green value node) and hooking it up to the panner/rotator. If you want separate X/Y control, you can make a panner manually: to pan a texture, all you have to do is increase or decrease the UV coordinates over time using the Add/Subtract node.
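Sketched as shader code, with a hypothetical _PanSpeed Value property exposed so it can be animated from scripts:

    // add a time-scaled offset to the UVs; this is all a panner does
    float2 pannedUV = i.uv0 + float2(_PanSpeed * _Time.y, 0.0);
    float4 col = tex2D(_MainTex, pannedUV);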
Thanks to all of you who took the survey so far! I'll post stats later, along with my take on the results.
Also, per-node precision and the ability to rename the internal name of a property are now implemented in RC 0.37!
This extra panel under all variable-defining nodes is turned off by default and can be enabled in the SF settings window.
I've realized that the changes I'm making now are going to tear up pretty much every single aspect of the shader compiler, which will have to be rewritten completely. This is going to take quite a lot of time, but I will try to keep you all updated on the progress!
So, a little status update:
The idea with 1.0 has been to tackle some of the most time-consuming but important features, especially the future-proofing ones.
This means that nested nodes, and the ability to control vert/frag program splits, should be available to you in 1.0.
The new system requires a full rewrite of the main node, but it also made me realize that the main node should be a nested node itself.
But this has an interesting, though time-consuming, consequence: the main node is configured based on your shader settings, in which case you should be able to configure *any* nested node, the main node included. For instance, Blinn-Phong vs. Phong vs. PBR should then be a setting on the main node itself, not a setting of the whole shader in the shader settings.
Especially since a separate vert/frag split mode is coming, where the main node isn't the final/essential node.
If this is the case, nested nodes should support optional variables, and conditional pre-shader-write branching.
For example:
If you connect something into Diffuse, but not into specular, the part where it does: ( specular + diffuse )*lightColor*lightAtten
...should simply change into: diffuse*lightColor*lightAtten
...and so on. This was one of the more trivial examples, but there are more complex setups, especially with transmission/light wrapping in combination with energy conservation and diffuse power.
This used to be hardcoded into the shader compiler, but if it's a nested node, it should be built-in.
This requires a ton of rewriting and design work, but I believe a main node that's 100% based on the nested node system will open up a ton of opportunities and flexibility thanks to its future-proof underlying design. It would also "open-source" the main node, allowing you to edit it, in case you don't like it or want to add more features.
But again, that might require a whole system where you can place dropdown menus, checkboxes and values, and configure every aspect of a nested node itself, which will take time.
Another tricky aspect of that system is that the main node will not work in a vert/frag split interface, for several reasons.
While most inputs are in the fragment shader, this is not the case for all of them:
1. Vertex offset lies in the vertex shader
2. Outline width lies in the vertex shader of a completely different pass
3. Outline color lies in the pixel shader of a different pass
4. Tessellation/displacement writes multiple shader programs other than vert and frag
This needs to be handled somehow, while still remaining easy to use yet flexible enough for the future. It'll work out, but it might take quite a while.
One solution could be to make the main node contain nested nodes of its own, clearly separated into their respective passes/programs.
In the meantime, I hope your work using Shader Forge is going well.

// Joachim
I had some questions regarding the licensing and use of Shader Forge; we're currently looking into it at my job. We have two artists who would be actively making shaders with the toolset, but we also have other developers using Unity on the project (programmers, terrain, etc.). Do we only need two licenses for the shader artists using the toolset, or one license for each developer seat on the project?
It's one license per seat using the tool. So no, you don't need it for your entire team.
Is it possible to change the panner speed over time? I didn't manage to do that with the Lerp and Time nodes, but I can use the same nodes to change speed over time in UDK.
Hey Joachim. We're looking into porting some old assets so we can sell them on the Asset Store. Are there any restrictions on using Shader Forge shaders when including them in an asset bundle? What about including shaders you made as examples for Shader Forge?
This has probably been answered already but I couldn't find anything from searches.
Is there a way to utilise the alpha channel of a normal map? At the moment it appears that when you set a texture to normal-map compression, it kills the alpha channel. I would like to have the normal and roughness for a tiling detail map in one texture.
Does that make sense?
Really enjoying Shader Forge so far, though. Great stuff, Joachim.
Seth, there is, but it's a bit of a ball-ache. Unity does some behind-the-scenes compression on normal maps that swaps the channels around and removes the alpha channel.
So what you need to do is this:
Make sure your normal map is set to "Texture" rather than "Normal map" in the texture import settings, preferably "Truecolor".
Load it into your shader as a texture sample, output the RGB, and do some math to it:
Multiply your normal map's RGB output by 2, then subtract 1 from the result, and plug that into your normal-map channel.
It's very important that the normal map is loaded with a plain tex2D and not UnpackNormal.
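Put as shader code, the whole workaround boils down to roughly this (texture and variable names here are illustrative):

    // imported as a plain truecolour "Texture", not "Normal map",
    // so Unity keeps the alpha channel instead of swizzling it away
    float4 packed = tex2D(_NormalRoughTex, i.uv0);
    float3 normalTS = packed.rgb * 2.0 - 1.0; // manual *2-1 unpack, no UnpackNormal
    float roughness = packed.a;               // roughness stored in the alpha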
Guys, the latest Unity 4.5.5 broke my shaders...
Please update Shader Forge, I'm unable to use it.
Now I'm getting the same thing even in Unity 4.5.4.
Basically, the project is nuked by 4.5.5.
Hi there!
First of all, I must say I'm a complete newbie at shaders. I've just started using Shader Forge and it's making my work a bit easier. Still, it's quite difficult doing what I want while knowing so little about shaders!
Anyway. I'm making a low poly style game in Unity, and I need to do water. A sea, to be clear.
This is what I've obtained until now:
Which is kinda good, but still not great. Keep in mind that right now the water is animated via the shader, with a sine curve (more on that later).
First of all, the general look of the water and the reflection: the latter is made with a script that captures the camera, flips the pixels upside down, and generates a render texture. I then use that render texture in SF and multiply it by a diffuse colour. On top of that I have a simple specular and a gloss.
I would like a different reflection, though. First, I'd like the image to be refracted according to the inclination of the triangles composing the water surface. Second, I'd like to keep it very visible on the upper side (the one near the geometry) and have it fade out going down. No clue how to do this, though!
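One common approach, sketched here purely as an illustration (every name below is an assumption, not part of the poster's setup): offset the reflection lookup by the surface normal to fake refraction, and blend the reflection out with a gradient mask:

    // screen-space UVs of the fragment (for sampling the reflection RT)
    float2 screenUV = i.screenPos.xy / i.screenPos.w * 0.5 + 0.5;
    // fake refraction: nudge the reflection lookup by the surface normal
    float3 n = normalize(i.normalDir);
    float2 reflUV = screenUV + n.xz * _RefractionStrength;
    float3 refl = tex2D(_ReflectionTex, reflUV).rgb;
    // fade mask: strong near the top/geometry, fading further down; this
    // could come from a UV gradient, vertex colours, or a depth blend
    float fade = saturate(1.0 - i.uv0.y) * _ReflectionAmount;
    float3 col = lerp(_WaterColor.rgb, refl * _WaterColor.rgb, fade);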
Also, I would like some advice on how to make it look generally better.
The second part concerns the wave movement. I've put together something I find quite good, but the surface still moves all together, up and down. I would like some waves perturbing the surface, rather than the whole surface behaving like one huge wave (I don't know if that's clear). How can I achieve that result?
Here is my shader as of today. Keep in mind it's perfectly possible I've added unnecessary stuff. So, really, any advice helps. Thanks in advance!
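On the wave question: a common trick, sketched below with invented parameter names, is to vary the sine phase per vertex using world position, so crests travel across the surface instead of the whole mesh bobbing in sync:

    // vertex offset: two sine waves with world-position-dependent phases
    float3 wpos = mul(_Object2World, v.vertex).xyz;
    float wave = sin(wpos.x * _Frequency + _Time.y * _WaveSpeed)
               + 0.5 * sin(wpos.z * _Frequency * 1.7 + _Time.y * _WaveSpeed * 1.3);
    v.vertex.y += wave * _Amplitude;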
So, I've been giving a PBR shader a shot for the last few days... but I've got to admit, I still have some difficulty wrapping my head around the whole theory. That's why I wanted to share what I have so far.
Metals are fine, but I don't seem to be able to achieve the plastic look; it looks more like glass pearls (see the screenshots).
It also seems I don't really get the diffuse ambient-light part of PBR, and the Fresnel stuff is baffling me (the node documentation is unfortunately not really complete on that).
Anyway, here's the node tree... and if someone figures out the remaining problems, I would be glad to hear about them.
(Oh, and the editor screenshot should be 4K, so you can really zoom in to see the fx.)
Hello everybody,
I'm having trouble implementing the Schlick-Fresnel approximation in my Blinn-Phong uber-shader.
I've tried to wire up this formula with nodes:
R = R0 + (1 - R0) * (1 - dot(H, V))^5
R0 = ((n1 - n2) / (n1 + n2))^2
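For reference, the same formula written out for a Code node (function and parameter names are mine); note that dot(H, V) doesn't care which space H and V are in, as long as both are in the same one, so SF's world-space vectors are fine here:

    // Schlick's Fresnel approximation
    float SchlickFresnel(float3 H, float3 V, float n1, float n2)
    {
        float R0 = (n1 - n2) / (n1 + n2);
        R0 *= R0;                                  // R0 = ((n1-n2)/(n1+n2))^2
        float c = saturate(dot(normalize(H), normalize(V)));
        return R0 + (1.0 - R0) * pow(1.0 - c, 5.0);
    }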
Instead of a rim effect, I get a diffuse-only surface towards the light, and some specularity in the shadow.
After long research I've read that SF's Half and View Direction vectors are in world space, contrary to the tangent space presumed in the equations. The problem is, I have no coding experience whatsoever, and no idea how to convert from one space to another or what to write into a Code node, even with ample examples online.
I've had success using the built-in Fresnel node with its Exponent and Bias parameters, but I really need something with a physical IOR as a benchmark for those simplified settings. Other shaders, like water and glass, need correct values just to start.
I also multiply the Fresnel by a specular colour (white) and a Specular Power slider, and plug that into the Specular slot. Maybe that is also not the right way to do it?
Thanks for the assistance, and thank you, Joachim; what a great tool. Without it I wouldn't know left from right in real-time rendering. :thumbup:
I made a simple electricity shader. It uses a Photoshop clouds texture and a panner to offset the UVs and make the line jaggy:
I also just started using the hotkeys to insert nodes. That is a huge time saver.
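For anyone curious, the jagged-line idea boils down to something like this (all texture and property names here are invented):

    // scroll a clouds/noise texture, then use it to push the UVs of a thin
    // glow-line gradient up and down, which makes the line jagged over time
    float noise = tex2D(_CloudsTex, i.uv0 + float2(_Time.y * _ScrollSpeed, 0)).r;
    float2 lineUV = i.uv0 + float2(0, (noise - 0.5) * _JaggedAmount);
    float3 bolt = tex2D(_GlowLineTex, lineUV).rgb * _BoltColor.rgb;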
So I am trying to figure this out on my own, and I can't work out how you got this to work. When I attempt something similar, the horizontal bar does not show up in my Multiply node, or anywhere for that matter. Is it coming from your glow-line texture? And if so, what does that texture look like?
I've asked this in the Unity forums, but maybe someone here will know the answer:
I'm making an ocean shader with the water colour in the diffuse or emission component: a colour gradient between two values, driven by Depth Blend. Opacity works the same way.
The problem I have is that Depth Blend is not affected by refraction. To distort the Depth Blend mask I need it in texture form; my guess is that this would be the z-buffer texture. My camera is set to deferred, all shaders except the water are deferred too, and the water doesn't write to the z-buffer. I put in a texture node named _CameraDepthTexture, but it doesn't pick anything up. I'd be glad for any ideas on how to accomplish this.
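For what it's worth, _CameraDepthTexture is only filled in if the camera is explicitly told to generate a depth texture; declaring a sampler with that name isn't enough on its own. A sketch of the usual shader-side setup (the helper name is mine):

    // requires camera.depthTextureMode |= DepthTextureMode.Depth; from a script
    sampler2D _CameraDepthTexture;

    float SceneEyeDepth(float4 screenPos)
    {
        float raw = tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(screenPos)).r;
        return LinearEyeDepth(raw); // linear depth, usable for depth blending
    }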
Hi guys, hope you can help. I'm trying to fake SSS with Shader Forge for foliage, so a part of the mesh that comes between the view and the light source should get lighter. I know there will be a way; I'm just really stuck.
Replies
Skyshop shaders vs. versions made in Shader Forge:
And how the game looks with the updates:
Any feedback or critiques?
Please, please, give us a breakdown of how you did this!
http://stevencraeynest.wordpress.com/2013/03/29/easy-volumetric-explosion-in-unity3d/
http://vimeo.com/61959085
The previous suggestions are almost right, but to make it work on both sides you need to use an Abs node.
The spheres that expand and rise up are animations. The smoke rolling is done with texture offset. No particles.
@diamond3
I plan on putting it on the Asset Store once it's finished, so I can't really do that right now.
@ENODMI
Yeah, an interesting technique indeed; it looked very convincing in The Butterfly Effect movie, although it sacrifices transparent edges for a solid render.
Looks very nice.
You probably saw it already, but it looks better now.
Shader Forge 1.0 is closing in!
It would be great if you all could help steer SF in the correct direction by answering a few questions
Click here to go to the survey
For controlling UV rotation and panning, see: http://acegikmo.com/shaderforge/wiki/index.php?title=UV_Rotation_%26_Panning
So I just downloaded a new version of Unity, and when I try to open or start a new shader in Shader Forge, this happens:
Any ideas?
==edit==
It turns out this is just the latest version of SF on my system; it works fine with an older version.
This might help: http://acegikmo.com/shaderforge/faq/?Q=erroronnewversion#erroronnewversion
Problem solved: don't use the | character in property names.