In other news, Alpha clip / cutout / Alpha test / 1-bit alpha is now implemented!
This type of transparency doesn't have any sorting issues, which makes it great for stuff like foliage
(The lighting is wrong on the backfaces; fixing this is on the TODO list!)
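For anyone curious what alpha testing boils down to in the generated Cg, it's essentially a clip() on the sampled alpha, along the lines of this sketch (the names are just placeholders, not necessarily what Shader Forge emits):

// Minimal alpha-test sketch: discard any fragment whose alpha falls below a threshold.
// tex, uv and cutoff are illustrative names, not tool-specific ones.
fixed4 AlphaTestedSample (sampler2D tex, float2 uv, fixed cutoff)
{
    fixed4 col = tex2D(tex, uv);
    clip(col.a - cutoff); // discards the fragment when col.a < cutoff (1-bit alpha)
    return col;
}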
I appreciate you already have a huge list of beta applicants, but if you're happy to have another, I'll break the shit out of it for you, haha!
Sure! Send me a PM with:
1. Your first & last name
2. Your email address
3. Your general field of work (2D Art / 3D Art / Level Design / Coding / etc.)
4. Which Company are you working at? (Or Freelance / Indie / School / etc.)
I think for SSS, you can include Wrap Lighting, TSD, and Pre-Integrated, with the ability to use IBL and Cubemaps for the ultimate realism. Also, for alpha, we need to make certain that we can use the alpha channel's full range for stuff like hair. And tying this in with Skyshop will be a bonus.
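(For reference, since wrap lighting came up: it's just a small tweak to the diffuse term, roughly the sketch below. This is the generic technique, not a claim about how Shader Forge would implement SSS.)

// Wrap lighting sketch: lets the diffuse term "wrap" past the terminator for a soft, skin-like falloff.
// wrap is an artist-tweakable value in [0,1]; 0 gives standard Lambert shading.
half WrapDiffuse (half3 normal, half3 lightDir, half wrap)
{
    return saturate((dot(normal, lightDir) + wrap) / (1.0 + wrap));
}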
Hmm, this is probably a later feature request, but a way to populate the editor with premade shaders would be great. Like a preset dropdown list, probably looking in a subfolder for existing ones to dupe from. Obviously one can use it for examples too.
That way someone like a tech or lead artist can define the library before the project starts, to avoid too many custom, unique shaders that duplicate similar effects.
I've planned to support saving node groups, which is pretty much exactly that
A set of nodes combined into one, which will be in its own file. Modifying it will affect all other shaders using it.
slider = lerp between min/max? or is it simply a clamped variable you can plug into a lerp node?
Slider = Real-time tweakable variable with a user-specified min and max.
You can plug it into whatever you want
It *can* be used as a t value in a lerp, but here's another example, where you can easily tweak the speed of a panning texture, with separate values for X and Y.
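(In plain Cg, that slider-driven panning setup boils down to something like this; speedX/speedY stand in for the two slider values and aren't actual Shader Forge names. Assumes UnityCG.cginc is included for the built-in _Time uniform.)

// Panning UVs driven by two tweakable slider values; _Time.y is Unity's built-in time in seconds.
float2 PanUV (float2 uv, float speedX, float speedY)
{
    return uv + _Time.y * float2(speedX, speedY);
}
// usage inside a fragment shader: fixed4 col = tex2D(_MainTex, PanUV(i.uv, _SpeedX, _SpeedY));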
So you can have fragments that are linked to a master template, not merely duplicating the tree into a new, separated one? Nifty!
Would it be possible to make a node that can expose the mip levels of another node? For example:
Mipnode1 > LerpInput1
Mipnode2 > LerpInput2
Roughness map > LerpAlpha
LerpOutput > Tex2D
The function above describes a situation where you could, for example, load in a cubemap as your Tex2D and lerp between two of its mip levels based on a roughness map.
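(In raw Cg, the graph described above corresponds to roughly this sketch, with a cubemap and two explicit LOD samples blended by the roughness value; all names are illustrative.)

// Blend two explicit mip levels of a cubemap using a roughness value as the lerp factor.
// texCUBElod takes the LOD level in the w component of its second argument.
half4 SampleBlurredEnv (samplerCUBE envCube, half3 dir, half mipA, half mipB, half roughness)
{
    half4 a = texCUBElod(envCube, half4(dir, mipA));
    half4 b = texCUBElod(envCube, half4(dir, mipB));
    return lerp(a, b, roughness);
}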
I've been working on exactly that today, and yes, that would work
Cubemaps specifically have mipping issues though; I'm getting seams on the more low-res mips. I'm working on it anyhow. Screenshots coming later tonight/tomorrow!
But would it work to have the lerp plugged into the mip input on the Tex2D? From what I understand, doing it as you've done in your screenshot would mean two calls of the texture.
So instead of getting two texture mips and lerping those outputs together, you lerp the two vectors, with a texture as the lerp alpha, and plug that result into the mip input.
You can, but you'll get seams. The MIP levels are still selected with integers, so there's no blending, in which case you would need two texture samplers. However, you can make a seamless Lerp across all MIP levels with a Frac node, a few offsets, and two samplers
But yeah, seams
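(For what it's worth, the Frac-node trick described above can be sketched like this: a continuous blur value is split into an integer mip plus a fractional blend between the two adjacent mips. Just an illustration of the idea, not Shader Forge's generated code.)

// Seamless blend across mip levels: sample the two mips adjacent to a fractional LOD
// and lerp by the fractional part. Two samplers, but no visible stepping between levels.
half4 SampleEnvSmooth (samplerCUBE envCube, half3 dir, half lod) // lod may be fractional
{
    half  lodLow = floor(lod);
    half  blend  = frac(lod); // the Frac-node part
    half4 a = texCUBElod(envCube, half4(dir, lodLow));
    half4 b = texCUBElod(envCube, half4(dir, lodLow + 1.0));
    return lerp(a, b, blend);
}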
Thanks a bunch!
Yeah, I figured that would be how it works. I guess it just depends on how heavy you're willing to have your shader? If you can afford the cost of multiple cubemap samples, it'll give a smoother result; if not, you'll have to live with the stepping. Personally I don't mind either way; it's a fantastic tool to have, something UDK should have had years ago.
Also, I don't think of those as seams :P, they're just clearly defined boundaries between blur levels, that's all. It's not like there's a black line separating them all.
Cubemaps! The node preview for them is currently broken, but I'm working on it.
You also have access to the Normal vector, View vector and the View Reflection vector now!
Sweet progress on the cubemaps. Just spoke to one of the programmers at work; he's really interested in it and wanted to know about the possibility of implementing vertex shaders and vertex texture fetches.
Transmission is now implemented! Transmission allows you to say how much light should pass through the mesh, and with which color.
Double-sided lighting now works properly as well, so here are some spinning wet leaves and a transmission pic:
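(A common way to fake a transmission term like this in a fragment shader is sketched below; it's a generic approach, not necessarily how Shader Forge wires it up internally.)

// Simple transmission / back-lighting sketch: light shining toward the viewer from behind the mesh
// adds a tinted contribution. transmissionColor is the user-specified transmission color.
half3 Transmission (half3 viewDir, half3 lightDir, half3 lightColor, half3 transmissionColor)
{
    // strongest when the light sits directly behind the surface relative to the camera
    half backLight = saturate(dot(viewDir, -lightDir));
    return pow(backLight, 4.0) * lightColor * transmissionColor;
}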
I've planned to try and make sure you can specify which things should be done per-vertex or per-fragment, but I think it can only go to a certain limit. I might end up making an advanced mode, that basically gives you full control, with separate vert/frag areas, and a final color output.
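(To make the per-vertex vs. per-fragment split concrete, here's the bare skeleton of the CGPROGRAM body of a Unity vert/frag shader: anything computed in vert runs once per vertex and is interpolated, anything in frag runs per pixel. Generic Unity code, not tool output.)

#pragma vertex vert
#pragma fragment frag
#include "UnityCG.cginc"

sampler2D _MainTex;

struct v2f
{
    float4 pos : SV_POSITION;
    float2 uv  : TEXCOORD0;
};

v2f vert (appdata_base v)
{
    v2f o;
    o.pos = mul(UNITY_MATRIX_MVP, v.vertex); // per-vertex work: transform to clip space
    o.uv  = v.texcoord.xy;
    return o;
}

fixed4 frag (v2f i) : SV_Target
{
    return tex2D(_MainTex, i.uv);            // per-fragment work: texture sampling, lighting, etc.
}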
Awesome, Unity is progressing into a real powerhouse engine, and a node-based shader editor will allow us artists much more power and control over our artwork! Keep up the good work!
Also, no update today; I've been digging around in the shader generation code, which is now more optimized and prettier, for those that want to look at it
Questions:
- Does it output the code, or does it create only compiled shaders?
- Does it allow writing vertex/fragment shaders from scratch for optimal performance, even for mobile platforms, or does it create a lot of redundant code?
- Would it be possible to optimize variable types inside the visual editor for parameters connected to nodes, while maintaining the preview features, so that we can use fixed/half/float without even touching the code?
Special Question:
- Would it be possible to create shaders with this editor that take advantage of Skyshop features? From what I've read, Skyshop is one of the most positively reviewed additions to Unity at the moment.
Feature Requests:
- Being optimization-oriented
- Buttons to optimize the shader automatically, using a range of options for optimizing variables hierarchically from fixed to float, depending on the precision required for specific variables (with the possibility to add a precision tag to nodes we want to keep at a specific precision, even across multiple LOD levels)
- Shader LOD support in the editor.
- Custom Code Node
- Other requests when I think of them
- Does it output the code, or does it create only compiled shaders?
It writes fragment shaders, which are then passed through Unity's compiler and optimizer.
- Does it allow writing vertex/fragment shaders from scratch for optimal performance, even for mobile platforms, or does it create a lot of redundant code?
The goal is to prevent redundant code as much as possible, but it's hard to predict all cases. This is why beta testing will help a lot.
Being able to work with vert/frag separately is not the plan at the moment, as it's supposed to be as user-friendly as possible, at least for the very first version released. I do have a pretty good idea for a multi-pass vert/frag layout though, which might make it in as an "Advanced mode" down the line.
- Would it be possible to optimize variable types inside the visual editor for parameters connected to nodes, while maintaining the preview features, so that we can use fixed/half/float without even touching the code?
Variable precision settings are planned.
- Would it be possible to create shaders with this editor that take advantage of Skyshop features? From what I've read, Skyshop is one of the most positively reviewed additions to Unity at the moment.
Maybe! I haven't looked into it very much, but as far as I know, Skyshop is essentially a neat package for generating HDR cubemaps and utilizing them with IBL shaders? It should be possible to simply plug the Skyshop cubemaps into Shader Forge as IBL samples.
- Being optimization-oriented
As much as possible, yes! It's targeted at both high-end shaders as well as low-end mobile shaders.
- Shader LOD support in the editor.
Planned
- Custom Code Node
Also planned!
Nice work! Shaping up real nice.
Hmm, this "component mask" node.... I is there anything you can do besides just picking either r,g, b or a? The one I had made for a shader editor before is simply called "swizzle". You can type in a text box "x, y, z, w" or "r, g, b, a". You can also type "x x x x" if you want to make your variable a float4, using only x, y or z... etc.
Just a thought!
Sounds pretty much exactly like what it currently is
(Alpha previewing inside the nodes is planned, not implemented yet though)
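(For context, the component mask / swizzle idea maps directly onto Cg's swizzle syntax; inside any shader function you can write things like the following.)

// Cg swizzling: pick, reorder or broadcast components of a vector.
float4 v    = float4(1, 2, 3, 4);
float3 rgb  = v.rgb;  // first three components
float4 bgra = v.bgra; // reordered
float4 xxxx = v.xxxx; // a single component broadcast into a float4
float2 zy   = v.zy;   // any combination of up to four components works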
Sorry for the lack of updates these last few days! I've been dealing with the underlying code, primarily to ensure stability. For example: when you press play or recompile scripts in Unity, it does an assembly reload. What that means is that all the data in editor panels (like Shader Forge) is lost, unless the data is made to serialize/deserialize properly. Making the data serializable puts lots of restrictions on what you can and cannot do, and which data structures you can and cannot use.
I just finished up ensuring that it survives soft assembly reloads (Scripts recompiling), but I still have some work left to do to make sure it keeps all data when you press play in the editor, which also has a tendency to mess up the interface as well!
I've been following this thread for a while, nodding quietly, but now I feel the need to express how excited I am: I am very excited.
Regarding the tangled link lines: consider making straight lines an option. Back in the day the Unreal engine's material editor had straight lines, but then they switched to splines. That is fancy, yes, but not exactly a usability improvement. The most apparent problem (besides becoming visually "busy") is that curved lines can partially overlap even if their end positions are spread apart, which makes following them difficult. Straight lines (or mostly straight lines, where only the very ends bend) seem to work much better on complex networks. It is a reasonably simple thing to implement but could really define the user experience. (One of the few things Mental Mill got horribly wrong.)
So yeah, I can't wait for an alpha/beta/gold release.
Glad you're excited!
Added an option for that now.
It can of course be polished later to be a bit more visually appealing, especially at extreme angles.
Bonus gif! Rectilinear included
Can we create terrain shaders? Can't wait, can't wait!
Of course!
Today I've been fiddling with serialization all day long, and it's starting to near a complete stage
You can soon save/load the node data inside the shader file!
Replies
So you can have fragments that are linked to a master template, not merely duplicating the tree into a new, separated one? Nifty!
OK now you're just showing off.
Cubemaps specifically have mipping issues though; I'm getting seams on the more low-res mips. I'm working on it anyhow. Screenshots coming later tonight/tomorrow!
Also, it makes sense that you would have seams.
It makes sense, but it looks bad! There should be a fairly straightforward way of dealing with it.
(Thanks for the cookies!)
Every time I get wowed more and more.
Amazing work!
Quick question regarding this screenshot:
But would it work to have the lerp plugged into the mip input on the Tex2D? From what I understand, doing it as you've done in your screenshot would mean two calls of the texture.
So instead of getting two texture mips and lerping those outputs together, you lerp the two vectors, with a texture as the lerp alpha, and plug that result into the mip input.
You can, but you'll get seams. The MIP levels are still selected with integers, so there's no blending, in which case you would need two texture samplers. However, you can make a seamless Lerp across all MIP levels with a Frac node, a few offsets, and two samplers
But yeah, seams:
Haha, you were not the only one, so don't worry
Here's an example of a Matcap shader:
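(For anyone curious, the core of a matcap shader is just sampling a "material capture" texture with the view-space normal; a rough sketch below, not necessarily how the screenshot's graph is wired.)

// Matcap sketch: remap the view-space normal's XY from [-1,1] to [0,1] and use it as UVs
// into a sphere-shaded matcap texture.
half4 SampleMatcap (sampler2D matcapTex, half3 viewSpaceNormal)
{
    half2 uv = viewSpaceNormal.xy * 0.5 + 0.5;
    return tex2D(matcapTex, uv);
}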
That's planned! They don't animate at the moment
Hard to say! In about a week or two perhaps?
So, here's what I need to untangle next:
Take my money.