Also: Alpha/Beta signups are now closed!
This is due to having a ton of testers, and the fact that SF is hitting beta very soon!
I will put SF up on the asset store when it's in beta, during which you'll be able to get a license for a cheaper price than the full $100 at release. Exciting times!
Also, I have a bunch of signups pending, but I'll batch-add you all soon.
Edit: Totally and accidentally nailed that Windows 95 default desktop background color
(rubs hands together)
Another pass. It's not possible to do it in the same pass, as far as I know.
Do you think it's crucial to have it?
I might have to skip supporting vertex lit for now; it's turning out to be super tricky to get sorted without spending a lot of time rewriting large parts of the system.
Oooh, I would have liked to have seen vertex lit. I assume it would come sometime between beta and final? If so, I suppose I could wait.
Either there or after the 1.0 release. I might put up a button for enabling experimental/buggy features, because vertex lighting is working, it's just super unreliable and naive
0.15 is coming soon! I'm considering whether or not to have Beta 0.15 as the pre-release or wait until Beta 0.16. I'm leaning towards having 0.15 as the internal beta, and then go public on the asset store with 0.16
I would guess 0.15 is ready in perhaps three days, in that case?
Pre-release, we will test the poop out of the outline shading
https://www.youtube.com/watch?v=3tHI2J9_c9k
Some great news: I've been in talks with the guy at Marmoset who created Skyshop, and he is going to help out with making sure Shader Forge will have Skyshop / IBL integration down the line.
This is stunning! I do all my professional work in Unity now, and I miss the power of a node-based shader editor like UDK's. Strumpy just doesn't cut it, and I really hope this is as good as it looks. Looking forward to seeing more.
hi Acegikmo
You are a super programmer, man. That's amazing. I'm having a hard time waiting for SF.
I have some suggestions:
1. I'd like SF to provide realtime emission, like real-time spherical area lights.
2. I'd like SF to provide tools for realtime cubemap and planar reflections (a scene capture reflection actor that updates from the scene, like in UDK):
http://www.youtube.com/watch?v=BMxwJsGlOsA (UDK Realtime Cubemap Reflection and Environment Map)
http://www.youtube.com/watch?v=uWYpyNkF26o (Simple Reflections in UDK - 3dmotive)
http://www.youtube.com/watch?v=2XoheOgAfkg (UDK Realtime Reflective Floor Material)
3. Something in Unity that has been bothering me: Unity has great plugins that ease the work, but the problem is that most of the time the plugins have problems with each other. For example, if I use Hard Surface Shader Pro and Voxel Cone Traced Lighting in Unity together, the voxel cone traced lighting does not work. This is a big problem in Unity. It would be good if plugin developers for Unity interacted with each other. (I'll be happy if I can use SF and Skyshop together.) I think SF must interact not only with Skyshop but also with the other great plugins, like Voxel Cone Traced Lighting in Unity and so on. (Of course, I know that's very hard and sometimes impossible.) With this, Unity will be as good an engine as CryEngine and UnrealEngine. I definitely see a good future for Unity.
Sorry about my bad English. I don't know whether these are possible or not, but if they are, all my wishes for Unity will be fulfilled.
And this is just stunning. On the floor making prayer-esque gestures your way.
I've spent some time with the alpha and I am very pleased with the results so far.
While digging into the process of applying a normal map to a shader, I found two differing results.
In the shader I created, the line that reads the normal information from the texture looks like this:
In the parallax shader that ships with the examples package, that line reads like this:
So that shader is using the UnpackNormal() method that comes with Unity. However, I could not see how you managed to tell ShaderForge to do so. Not using that method will usually result in wrong shading due to the compression/packing of regular normal map textures in Unity (although one can change that manually).
1. How does ShaderForge know when to use UnpackNormal()?
2. Can I specify my own UnpackNormalFunction()?
The reason for question #2 is that I'd like to store a blend modulation mask/heightmap inside the alpha channel of my texture.
When importing that texture into Unity, it gets processed in a way that swizzles the red channel and the alpha channel, for better normal map compression.
The shader needs to "de-swizzle" that format.
Sure, I might implement this swizzling myself inside the shader graph. But having a reusable custom node for this kind of stuff would be awesome.
Glad you like it so far!
1. Shader Forge will unpack normals on Tex2D nodes when you assign a texture marked as a normal map while making the shader. It's a bit of an undocumented behaviour, and it can cause quite a lot of confusion if you didn't assign a normal map while making it.
2. You can't make your own function in that sense, but creating, saving and reusing nested nodes is something I've been wanting to do for a long time.
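(Side note for readers: in plain Cg/HLSL the difference looks roughly like this. This is a sketch with illustrative names, _BumpMap and uv, not Shader Forge's generated code, and it assumes Unity's default DXT5nm normal map compression.)

```hlsl
// Without unpacking: only correct for uncompressed RGB normal maps
float3 n1 = tex2D(_BumpMap, uv).rgb * 2.0 - 1.0;

// With Unity's helper from UnityCG.cginc, which handles the swizzle:
float3 n2 = UnpackNormal(tex2D(_BumpMap, uv));

// Roughly what UnpackNormal() does for a DXT5nm map, where the importer
// moves x into the alpha channel and keeps y in green:
float4 packed = tex2D(_BumpMap, uv);
float3 n3;
n3.xy = packed.wy * 2.0 - 1.0;                   // x from A, y from G
n3.z  = sqrt(1.0 - saturate(dot(n3.xy, n3.xy))); // reconstruct z
```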
1. I'd like SF to provide realtime emission, like real-time spherical area lights.
2. I'd like SF to provide tools for realtime cubemap and planar reflections (a scene capture reflection actor that updates from the scene, like in UDK).
3. Something in Unity that has been bothering me: Unity has great plugins that ease the work, but most of the time the plugins have problems with each other.
1. You can do spherical area lights with the custom lighting functionality, but it isn't built-in, so it's hard to switch lighting modes or make it energy conserving, and so on. There might be a proper implementation of this later on
2. This is already possible, but you'll have to implement or download the real-time cubemap scripting part yourself
3. This shouldn't be much of an issue, as Shader Forge is quite independent from Unity's other functions. It's primarily doing isolated work in the Shader Forge window, and outputs a shader for the engine to use
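(A minimal sketch of what that scripting part could look like, using Unity's built-in Camera.RenderToCubemap. The class, the per-frame update, and the "_Cube" property name are illustrative assumptions, not something Shader Forge ships.)

```csharp
using UnityEngine;

// Minimal sketch: render the scene into a cubemap every frame and hand it
// to this object's material so a cubemap node in the shader can sample it.
// Doing this per frame is expensive; a real tool would update less often.
public class RealtimeCubemap : MonoBehaviour
{
    public int faceSize = 128;  // resolution of each cubemap face
    private Cubemap cubemap;
    private Camera cubeCam;

    void Start()
    {
        cubemap = new Cubemap(faceSize, TextureFormat.ARGB32, true);
        // Disabled helper camera, used only for the cubemap render
        cubeCam = new GameObject("CubemapCamera").AddComponent<Camera>();
        cubeCam.enabled = false;
    }

    void LateUpdate()
    {
        cubeCam.transform.position = transform.position;
        cubeCam.RenderToCubemap(cubemap);
        // "_Cube" is an assumed property name; match your shader's cubemap slot
        GetComponent<Renderer>().material.SetTexture("_Cube", cubemap);
    }
}
```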
1. Excellent! It would be good if you add this in the future :)
2. I think it's impossible for me to create a real-time cubemap tool, then, because I would have to create a custom node (TextureTargetRender2D) to control it from within SF, and SF is not open source. Is that true? I'd be happy if you created this tool and put it on the asset store, or shipped it with SF in the future! :poly142:
3. Oh god, that's very, very bad. Does this mean I can't use Skyshop image-based lighting with SF? (If I change or move a light source in Skyshop, it won't affect shaders created in SF?)
Is that area-light effect DX11-only, or will it work on OSX? Even OSX 10.9 is fine.
It's a regular shader; it shouldn't be much of an issue.
I created it in OSX 10.9, and it runs smoothly! It's not using anything DX-specific; it works just fine in OpenGL too.
Your work is ACE, ace! Please, I'm so anxious for 0.15 I can't wait! (It's already been 3 days, hahaha, jk.)
In other words, this is now a necessity for Unity; for me, it's something that, if I didn't have it, I wouldn't use Unity at all. You're making Unity pretty happy indeed.
I've recently added some features that have caused some really tricky bugs. I need to get rid of these nasty bugs first, before going live with beta 0.15. Sorry for the delays!
I'm also rather sick today, so my function doesn't want to brain properly
Sorry for the noobish question, but I would like to create a kind of color-replacement shader. I got stuck on combining the original texture and the tint color layer. I would like to create a mask layer for the tint color, so it only affects part of the original texture, and then combine them without any additive or multiply blending.
@4s4:
What you need then is a lerp node.
Plug the tint-mask into the T input and your original color into the A input. Your tint color/texture goes into the B input.
Make sure your tint mask is white (or 1 so to speak) where you want your tint color to be and black where your original color goes.
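(In shader-code terms, that node setup boils down to a single lerp; the names below are illustrative.)

```hlsl
// mask = 0 keeps the original color, mask = 1 gives the tint, in-between blends
float3 result = lerp(originalColor.rgb, tintColor.rgb, mask);
```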
Hmm, I tried something like this, but the end result is quite different from the Lerp node's preview. I also tried using a Texture 2D as the mask and everything worked nicely. Am I doing something wrong?
Why are you dividing by the slider? You're probably overshooting the 0-1 range in the t value. And why are you using the light attenuation icon as a texture? :0
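(For reference: lerp extrapolates outside the 0-1 range, which is what makes overshooting t visible. If the division stays, clamping t is one way to guard against it; _Slider is an illustrative name.)

```hlsl
// lerp(a, b, t) = a + t * (b - a), so t = 2 shoots well past b
float t = saturate(mask / _Slider);  // clamp into the 0-1 range
```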
I have experienced a somewhat strange bug or feature with the multiply node while testing the lerp node:
Please note how the outcome differs when a one-lane output (like Dot's) is connected to A versus when a three-lane output (like Desaturate's) is connected to A.
It looks like the multiply node truncates the second operand to a scalar when the other operand is a scalar type, even though the second operand clearly is a vector3.
The irritating thing about this behaviour is the fact that the multiply node itself still shoots a three-lane value out of its... output.
And multiplying a scalar with a vector2, 3 or 4 never results in a scalar. Mathematically that's just false... if my school maths don't fail me here.
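(That matches plain Cg/HLSL, where the scalar operand is promoted to a vector, not the other way around. A quick sketch:)

```hlsl
float  s = 2.0;
float3 v = float3(1.0, 2.0, 3.0);
float3 r = s * v;  // s is splatted to float3(2, 2, 2); result is float3(2, 4, 6)
```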
Edit:
Could you add an "Assemble" node to the Vector Operations, please?
So we can manually plug values into its XYZW components and receive a fully functional vector?
1. Desaturate outputs a Vector1 but looks like a Vector3
2. Lerp needs to typecast Vector1 inputs in A or B when the other input has more than one component
Thanks for the bug report!
Also, you can use the append node to put together vectors
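(In the generated shader code, appending is just vector construction, which is also what a hypothetical "Assemble" node could emit. A sketch:)

```hlsl
float2 ab  = float2(a, b);    // Append(a, b)
float3 abc = float3(ab, c);   // Append(ab, c); two Append nodes for a vector3
float3 xyz = float3(x, y, z); // what a single "Assemble" node could output
```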
Edit:
Seems like:
Lerp(scalar, vector, t) typecasts the vector to a scalar
Lerp(vector, scalar, t) typecasts the scalar to a vector
Which makes sense but is somewhat confusing. I think I'll make it auto-typecast all scalars to vectors in Lerps.
@Append node: works like a charm, though it requires at least 2 nodes to assemble a vector3. Nothing too bad and I can still edit the final shader code by hand to optimize some of the lines.
@Lerp typecasting:
Wow, I didn't know that lerp was doing the first typecast internally. Sounds more like a bug to me than a feature... if it is a feature, it sounds like a lazy programmer's way of saving a few more characters of typing.
Lerp should always cast the smaller operand to the bigger one, imo. That makes more sense to me.
Thx for posting your findings. Appreciate learning new things.
Got it working. I guess I was doing something wrong with the divide node and the slider earlier. I don't know if this new shader is done correctly, but it works.
I really like the editor Acegikmo, It's so much easier to use than Strumpy's shader editor. :thumbup: