Loving Shader Forge; it's helping me learn more about shaders (sloooowly, ha) as well as making shader creation much faster.
Question about the Vertex Animation shader example: is that based on the mesh's UVs? I'm not too familiar with how it works under the hood, but with some tweaking I was able to get the bulge in the direction I wanted.
I was wondering if it is possible for Shader Forge materials to use their emissive as lights for lightmapping (in the same way you can tell Beast to use the default materials' emissive as lights).
This would save my project a lot of time with manually placing area lights, and realtime lights for light probes, later on.
This is sort of the result we're going for:
As you might be able to see, the lighting above the sign isn't perfect, due to not being able to use shaped area lights.
As a final note, some of my professors are -so- impressed with how good Shader Forge is that they're planning on using it in their lessons as a substitute for UDK material editing.
Beast's lightmapping isn't as expansive as Lightmass's in how you influence its range and intensity, so I'll take a nice middle-of-the-road approach and combine the intense lights from manual placement with emissive textures. That sign bit really bothers me, though.
I'll see if I can apply that fancy mesh light to it, from the information I can gather on the forum, so I can replicate the effect through the emissive.
Such a simple thing though... _Illum. I broke my head over it.
It should work if you name the emissive map Texture 2D node "Illum", so that the internal name becomes "_Illum".
Glad you guys like Shader Forge
Allow me to correct my previous statement: I just sent it over to one of my fellow artists, who did in fact turn off his ambient light, and it appears that it does not accept _Illum as sufficient instruction.
I did some looking around in the Self-Illum shader section of Unity itself, and it seems the compiled versions (as I cannot open the originals) have something called:
_EmissionLM ("Emission (Lightmapper)", Float) = 0
which isn't an internal name that exists in any way, shape or form in shaders compiled by your beautiful plugin. Not sure if you just hardcoded that in somewhere.
I'll add this:
This is my current shader, which includes a normal correction mask to make mirrored normal-mapped materials appear correctly; a problem displayed here and solved here.
And I've put the models side by side now, one with baked area lights, and the other with just its emissive texture "lighting up" the area.
Kettun, how are you getting the reflections in the floor?
It's a bit of script in combination with that lowest bit on the shader nodes.
I have a sphere hanging underneath the renderable surface, at a relative height from the camera (if you jump, it moves accordingly), that uses a
'Render to Cubemap' script, whose cubemap is then applied to all materials using the internal name _Cube. It's fairly taxing on a machine, but it's beautiful for flat metallic surfaces.
I could even theoretically hook it up to the Diffuse ambient light to give the reflected light more "Oomph" on the surfaces, but that's not really what we're going for.
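For reference, here is a minimal sketch of what such a realtime 'Render to Cubemap' script could look like. This is just my reading of the description above, not Kettun's actual script; the _Cube property name and the height offset are taken from the post, and Camera.RenderToCubemap requires Unity Pro.

    using UnityEngine;

    // Hypothetical realtime reflection probe: re-renders the world into a cubemap
    // every frame from a point that follows the player camera, then hands the
    // cubemap to the materials. Expensive: six extra renders per frame.
    public class RealtimeCubemapProbe : MonoBehaviour
    {
        public Camera probeCamera;         // disabled camera used only for the capture
        public Cubemap cubemap;            // e.g. a 128x128 cubemap asset
        public Transform playerCamera;     // the camera the probe follows
        public float heightOffset = -2f;   // keep the probe below the reflective floor

        void LateUpdate()
        {
            // Follow the player camera at a fixed relative height
            probeCamera.transform.position = playerCamera.position + Vector3.up * heightOffset;

            // Unity Pro only: render all six faces into the cubemap
            probeCamera.RenderToCubemap(cubemap);

            // If _Cube is read as a global property this reaches every material at once;
            // otherwise assign it per material with material.SetTexture("_Cube", cubemap).
            Shader.SetGlobalTexture("_Cube", cubemap);
        }
    }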
Wouldn't that cause the cubemap to be re-rendered and re-sampled every frame? That's got to be ridiculously inefficient... I can't imagine the difference is all that perceivable in actual gameplay, compared to just having a static baked cubemap that doesn't move with the camera?
It makes quite a big difference when the reflections are very clear, although, yes, it's expensive; you're rendering the world six additional times per frame (at low resolution, but still).
Unity's self-illumination shader functionality is pretty badly documented (no mention inside the shader docs, but you can find some pieces and hints in the lightmapping section).
You can however find some info about it on the Unity forums.
Try naming your shader/shader path something like Self-Illumin... from the depths of my mind I recall having read something about this requirement.
Basically, the Beast lightmapper (as implemented by Unity) expects some specific names and properties in a shader (as you already found out: _EmissionLM and _Illum) before it actually acknowledges it as an emissive shader.
We also got Shader Forge here at work and it's incredible; the shaders I have been able to create so far easily outclass anything we previously had access to.
One question, though, that I couldn't seem to find an answer to anywhere: is there a node/memory limit on the shaders? I was in the midst of creating a terrain blend-layer texture with the capability of having snow accumulate on it based on a mask and vertex painting, and once I got too far in, I was getting the pink error shader.
I checked the naming conventions of the nodes, as well as trying to troubleshoot the other nodes, but it seemed that everything worked fine until I added a new node of any type to the tree. Any ideas?
Shader Forge 0.26 is now released
• You can now use global variables!
- Currently supported types: Value, Vector4, Color and Texture
- Right click on a property and select "Make global"
- Global variables can only be set from code. For example: Shader.SetGlobalFloat("_MyGlobalValue", 5.0f) (see the sketch after this changelog)
- Currently, there's no way of making the nodes display their current global value
• You can now pick and use Render Textures, as well as Procedural Textures (from Substances), in the Texture2D nodes
• Added an experimental Shader Model 2.0 force feature
• Added the d3d11_9x platform (Direct3D 11 for Windows RT)
• Fixed a bug where box selecting while zoomed out didn't work properly
• Fixed a bug where you could only see the first mesh in meshes containing submeshes
• Fixed a bug where the Channel Blend node broke when the mask component count didn't equal the color component count
• Fixed a bug where arithmetic nodes sometimes caused null reference exceptions when loading shaders
• Fixed a bug where drag-creating a new node on top of an existing one caused it to lock up until you clicked again
• Fixed a bug where some coordinates were offset when the SF window was docked
• Ninjafixed a bug where the left separator reset its position when using hotkeys on OSX
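To illustrate the new global variables, here is a small sketch of driving them from a script, one call per supported type. The property names are hypothetical; they have to match the internal names of the properties you marked as "Make global".

    using UnityEngine;

    // Sketch: feeding Shader Forge global properties from code.
    public class GlobalShaderParams : MonoBehaviour
    {
        public Texture2D noiseTexture;

        void Update()
        {
            Shader.SetGlobalFloat("_MyGlobalValue", Mathf.Sin(Time.time));         // Value
            Shader.SetGlobalVector("_MyGlobalVector", new Vector4(1f, 0f, 0f, 1f)); // Vector4
            Shader.SetGlobalColor("_MyGlobalColor", Color.cyan);                    // Color
            Shader.SetGlobalTexture("_MyGlobalTex", noiseTexture);                  // Texture
        }
    }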
Hey there, I would like to map a range of a color channel to a specific section of a texture sheet (atlas texture).
Let's say I've got a 2x2 square atlas texture. The color channel outputs a range of 0 - 1.
Everything from 0 to 0.25 should map to the first subtexture (top left). Anything from 0.25 to 0.5 should map to the second subtexture (top right).
Anything above should create UVs for bottom row (left and right) textures respectively.
I am stuck... I tried remapping the channel to a range of 1-4, rounding the result and doing some more stuff... but to be honest, I've lost the thread somewhere along the line. Maybe someone in here has got a clever answer for me.
I wouldn't put the colors in a 2x2 grid; it makes the shader harder to create and more expensive to render. I recommend putting them in a row instead; just make sure each dimension is still a power of two!
I think this node tree should help you. It's not exactly the same thing, but it's along the same line:
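This isn't Acegikmo's exact node tree, but the math it needs to compute is roughly the following, written as C# for clarity; in Shader Forge it would be Multiply, Floor, Min, Add and Divide nodes acting on the UVs.

    using UnityEngine;

    // Sketch: map a 0-1 channel value onto one of four sub-textures laid out in a single row.
    public static class AtlasUv
    {
        public static Vector2 RemapToRowAtlas(Vector2 uv, float channelValue, int columns = 4)
        {
            // 0-0.25 -> tile 0, 0.25-0.5 -> tile 1, ... (clamped so 1.0 stays in the last tile)
            float index = Mathf.Min(Mathf.Floor(channelValue * columns), columns - 1);

            // Squeeze the original 0-1 U range into one tile, then shift it to the chosen tile
            return new Vector2((uv.x + index) / columns, uv.y);
        }
    }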
Thanks a ton for this plugin, it's fantastic how easy it is to create shaders.
In Photoshop, you can apply a linear burn on a layer at whatever opacity percentage you want that layer to be. How can I replicate that in this shader, where the linear burn should only be applied on the shaded part of the model, with the color as the "layer" below the shading?
Appending the RGB and a color alpha doesn't seem to achieve that...
You can (keep in mind that you are forward rendering and each additional light adds its contribution with the same calculations as the first light). On page 20 or 21 (somewhere around there) I posted a shot of a shader graph that does what you are asking for with a custom lighting shader.
Will post the graph later in here...
Edit:
Hey Ace, thanks for the hint. I think I got it working - not sure about the Remap and Round nodes... I have to check that when I've got some more time.
Due to Shuriken not having dynamic parameters, I squeezed all the data into the vertex colors for the particle shader. The result so far can be seen in the GIF.
And Bug, here's the colored shadows stuff, too. Hope it helps to get you started.
Has anybody made a good tutorial for screen-space stuff? I'm attempting to do a shader that acts as a mask into a forward-facing tiling texture. ...does that even make sense?
I'm getting an issue on Android where, when I use alpha with NGUI, no matter what I do they either don't show or show black! Any ideas? It works on the PC.
Just a thought... do you have OpenGL ES checked for compiling?
I once made the mistake of only having DirectX checked, so the shader wasn't working on mobile.
Yep, I do, thanks for checking. And it only happens when I have alpha; any other shaders work just fine. I can put in one texture and its alpha, and it doesn't show or it shows black.
As soon as I change the lighting from anything to custom, I lose the shadows on my models. It looks like the shadows still affect the specular, but the diffuse (N.L)*albedo doesn't receive any shadows.
I thought I used light attenuation in my complex shader, but I will check. Maybe I removed it because I thought: what would I need it for if I only use directional lights?
Ha, thank you! I only used it in the specular term, not in the diffuse part of the shader.
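For anyone hitting the same thing: the fix is simply to multiply the attenuation into the diffuse term as well. Here is a sketch of the corrected math, written in C# for clarity; in Shader Forge this is the Light Attenuation node multiplied into the diffuse chain.

    using UnityEngine;

    // Sketch of a custom-lighting diffuse term that respects shadows/attenuation.
    public static class CustomDiffuse
    {
        public static Color Diffuse(Vector3 normal, Vector3 lightDir, Color albedo, Color lightColor, float attenuation)
        {
            float nDotL = Mathf.Max(0f, Vector3.Dot(normal.normalized, lightDir.normalized));

            // Without 'attenuation' here, shadows (and point-light falloff) never darken the diffuse term.
            return albedo * lightColor * nDotL * attenuation;
        }
    }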
Awesome, thank you very much.
Now I just need to find out how to turn on shadows on transparent objects; be it cutout or alpha blend, custom shader or default Blinn-Phong, it seems like they do not receive any shadows.
And another question: how exactly does the direction input of a cubemap work? What's the input for it? Or, in other words, what would I need to do with the reflection direction to rotate a cubemap?
The rotation matrix itself gets constructed via C# script from a Quaternion and passed into the shader as a 4x4 matrix.
I don't know if it is advisable, performance-wise, to construct the rotation matrix every frame inside the fragment shader (as that is probably where SF will generate the code).
Thanks for that! I assume I would not need it to be that complicated; I guess a tool to rotate a cubemap would be enough, no need to do it at runtime. Sadly I couldn't find such a tool yet; my only idea would be to exchange the faces in clockwise or counterclockwise order and rotate the top and bottom accordingly, so I at least have the chance to rotate it by 90°.
It's just that the cubemap doesn't follow the lighting of my scene, so I would like to make it resemble the scene light a bit more.
Hi Acegikmo!
This looks like such a fantastic tool and I'll definitely be buying it. I did notice that you were talking about an integration with Skyshop shaders. Have you got any hint as to a time frame of when that might happen?
Thanks!
You can already combine them with Skyshop as it is, but it's a bit finicky.
Skyshop cubemaps are generally in HDR, so you need to multiply them by their alpha channel, and then by a value of 6 (I think).
The diffuse cubemap should have the normal direction as its DIR input, and needs to be plugged into Diffuse Ambient Light.
The specular cubemap is a bit more complex:
Cubemap[Mip] <- Multiplied by 7 <- One Minus <- Your gloss value (Between 0 and 1)
And needs to be plugged into Specular Ambient Light.
If you're then using the PBL lighting mode, make sure you set your specular below 1 for the diffuse cubemap to show
There is a more straightforward and more feature-rich approach coming pretty soon! I'm still working out a few remaining details in order for it to work.
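As a rough C# illustration of the manual hookup described above (the constants 6 and 7 are taken from the post and are approximate; this is the math the nodes would compute, not Skyshop's actual decode):

    using UnityEngine;

    // Sketch: RGBM-style decode of an HDR Skyshop cubemap sample, and a gloss-to-mip
    // remap for the specular cubemap lookup.
    public static class SkyshopHookup
    {
        public static Color DecodeHdrSample(Color cubeSample)
        {
            float scale = cubeSample.a * 6f;   // multiply RGB by alpha, then by ~6
            return new Color(cubeSample.r * scale, cubeSample.g * scale, cubeSample.b * scale, 1f);
        }

        public static float GlossToMip(float gloss01)
        {
            // Glossier surfaces sample sharper (lower) mips: MIP = (1 - gloss) * 7
            return (1f - Mathf.Clamp01(gloss01)) * 7f;
        }
    }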
Thanks for that! I assume I would not need it to be that complicated; I guess a tool to rotate a cubemap would be enough, no need to do it at runtime. Sadly I couldn't find such a tool yet; my only idea would be to exchange the faces in clockwise or counterclockwise order and rotate the top and bottom accordingly, so I at least have the chance to rotate it by 90°.
It's just that the cubemap doesn't follow the lighting of my scene, so I would like to make it resemble the scene light a bit more.
OK, now that I've understood what you want, you can use this shader and the C# code below. Note though that the C# code is not optimized; it keeps updating the rotation matrix every update cycle (i.e. every frame). You probably want to include some conditionals to narrow that down to scene startup.
Edit: Of course using a tool to rotate and "rebake" the cubemap would eliminate any need for realtime calculations. But the overhead with this approach should be relatively small. Plus you gain the ability to link the skybox to your realtime lighting.
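Here is a minimal sketch of that approach as I understand it (this is not the poster's actual code; the "_CubeRotation" property name is an assumption and should be whatever matrix name your shader expects):

    using UnityEngine;

    // Sketch: build a rotation matrix from a Quaternion in C# and pass it to the
    // shader as a 4x4 matrix. Done once in Start(), per the optimization note above.
    public class CubemapRotation : MonoBehaviour
    {
        public Material targetMaterial;
        [Range(0f, 360f)] public float yawDegrees = 90f;

        void Start()
        {
            Matrix4x4 rotation = Matrix4x4.TRS(Vector3.zero, Quaternion.Euler(0f, yawDegrees, 0f), Vector3.one);
            targetMaterial.SetMatrix("_CubeRotation", rotation);
        }
    }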
Does anyone know how to create an effect where pixels can "spill out" into screen space? I'm trying to do something along the lines of UDK's "emissive" node, where anything plugged into that works in tandem with a nice bloom effect.
The "Glow 11" plugin attempts to do this and looks good, but the performance on it is not really acceptable for the game I'm working on.
1. Buy Unity Pro if you don't have it already, or activate a trial
2. Set your camera to HDR
3. Import the image effects standard assets
4. Attach a bloom script to your camera, and make sure the bloom threshold is set to 1.0
5. Make a shader that has emissive going beyond 1.0 (see the sketch after this list)
6. Look through the game camera and be happy!
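As a small sketch of step 5, here is one way to push the emission above the 1.0 bloom threshold from script. The "_Emission" property name is hypothetical; use the internal name of the emissive color property in your own Shader Forge shader.

    using UnityEngine;

    // Sketch: overdrive a material's emissive color so HDR bloom picks it up.
    public class OverdriveEmission : MonoBehaviour
    {
        public Material glowMaterial;
        public Color glowColor = Color.cyan;
        public float intensity = 3f;   // > 1.0 so it exceeds the bloom threshold

        void Start()
        {
            glowMaterial.SetColor("_Emission", glowColor * intensity);
        }
    }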
Replies
Got it, works now! Thanks!
Working right now on a snow shader
That was a -really- quick reply by the way.
If you use a Value node in SF called "EmissionLM" in your node tree, you should get the line you mentioned having to add.
Whereas if I switch over to the standard self-illuminated shaders in Unity, it just works, instantly.
Swapping the material back over of course breaks it, but yeah.
The line is very much added; it just doesn't seem to do anything.
I must just be missing something obvious.
See this thread for details:
http://forum.unity3d.com/threads/194124-Custom-self-Illumination-cg-shaders-doesn-t-work-with-lightmappers?p=1319359&viewfull=1#post1319359
Sure, just mail me
Awesome!
Changelogs for previous versions
Enjoy
Awesome! And yes, 876 copies sold so far
In Photoshop, you can apply a linear burn on a layer at whatever opacity percentage you want that layer to be. How can I replicate that in this shader, where the linear burn should only be applied on the shaded part of the model, with the color as the "layer" below the shading?
Appending the RGB and a color alpha doesn't seem to achieve that...
Here is the shader:
Lerp
[A] <- Whatever you have in Dst in the blend node
[B] <- The result of the blend node
[T] <- Opacity (Could be a slider, value, etc.)
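Put together, the effect is essentially this (a C# illustration, assuming the blend node is set to a linear-burn style blend; in the graph it is the Lerp described above):

    using UnityEngine;

    // Sketch: linear burn (base + blend - 1, clamped at 0) faded in by an opacity value.
    public static class LinearBurnFade
    {
        public static Color Blend(Color baseColor, Color blendColor, float opacity)
        {
            Color burned = new Color(
                Mathf.Max(0f, baseColor.r + blendColor.r - 1f),
                Mathf.Max(0f, baseColor.g + blendColor.g - 1f),
                Mathf.Max(0f, baseColor.b + blendColor.b - 1f),
                baseColor.a);

            // Opacity behaves like the Photoshop layer-opacity slider:
            // 0 = untouched base color, 1 = full linear burn.
            return Color.Lerp(baseColor, burned, opacity);
        }
    }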
You're forgetting Light Attenuation
This should help: Shader Forge - Custom Blinn-Phong (YouTube): http://www.youtube.com/watch?v=EjCXwV0YYdU
Does it work if you disable your GUI?
OK, now that I've understood what you want, you can use this shader and the C# code below. Note though that the C# code is not optimized; it keeps updating the rotation matrix every update cycle (i.e. every frame). You probably want to include some conditionals to narrow that down to scene startup.
Shader: http://pastebin.com/vR032jME
MonoBehaviour: http://pastebin.com/U0Ce88SS
Edit: Of course using a tool to rotate and "rebake" the cubemap would eliminate any need for realtime calculations. But the overhead with this approach should be relatively small. Plus you gain the ability to link the skybox to your realtime lighting.
The "Glow 11" plugin attempts to do this and looks good, but the performance on it is not really acceptable for the game I'm working on.
Hi, if you are not afraid to code a little, here is a way of adding glow effects to simple geometry without using render targets.