Hello, I'm looking for a way to make a vertex-color-based shader, but with more than 4 textures.
I mean, how can I tell the shader to apply a texture to some exact RGB color value instead of just a single channel? Ultimately I want a shader that can hold up to 10-20 tiled textures, and I'm looking for a way to implement that.
I was also thinking that it might be possible to interpret the per-vertex (or per-polygon, I'm not sure) material ID info that comes out of 3D packages (it's called MatID in 3ds Max) at the shader level, so that instead of assigning different materials to different material IDs, it would assign different textures within the same material.
I could really use help with this.
It's not always good to cram everything into a single shader. It makes every pixel being rendered more expensive, even where you aren't using the expensive feature.
There are also texture lookup limits; you'll hit the limit somewhere around 8-11 textures, depending on whether you want lightmaps and/or lighting on. In other words, you won't be able to hold 20 textures in a single shader, and even if you could, it would be too expensive.
You may be able to get it to work if you atlas multiple textures into one, but it would require even more instructions for UV manipulation.
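If you do go the atlas route, the per-tile UV math looks roughly like this. A minimal Cg sketch, assuming a 4x4 atlas bound as _Atlas with a per-material _TileIndex property (both names are made up for illustration):

```hlsl
// Pick one cell of a 4x4 atlas and tile the UVs inside it.
float2 atlasUV (float2 uv, float tileIndex)
{
    const float tilesPerRow = 4.0;
    // bottom-left corner of the chosen cell, in 0-1 atlas space
    float2 cell = float2(fmod(tileIndex, tilesPerRow), floor(tileIndex / tilesPerRow));
    // frac() makes the texture repeat inside its cell
    return (cell + frac(uv)) / tilesPerRow;
}

// half4 col = tex2D(_Atlas, atlasUV(i.uv * 8.0, _TileIndex));
```

Note that frac() breaks the UV derivatives at every tile seam, so you get mip artifacts there unless you also pass explicit derivatives (tex2Dgrad); that's part of the extra instruction cost of the atlas approach.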
Splitting it up into multiple shaders/materials can sometimes be a good thing
Hm, that's bad. Is there a way to "render the texture", so that in the editor we have a tiled multi-material version of the object, and then you press a button and it creates one material that bakes all those tiled textures into a single texture (using the UVW layout)? I'm trying to come up with a way to work non-destructively with thousands of objects, so we can swap textures and change tiling without needing to go back to 3D apps for everything. If I decide to change a part of a building, I currently need to change the model, the UVW layout, and the PSD with the texture just to change one thing, and that's not effective at all.
I'm talking thousands of objects, like 10k. We already use mesh-combine optimizations all over the place, but I can't afford to keep 50k materials if I want to use tiling, so I'm forced to create one texture per object, or an atlas per themed group. But again, if I need to change something, I'm going down the pipeline hunting for raw files, and so on: two hours of work.
That situation forces me to look for ways to improve the workflow. I want to use multiple tiled textures per object, but I don't want to end up with zero performance because of it, and I thought that combining those into one material would lessen the impact.
If you have any ideas on how to improve this situation, can you elaborate? Some direction, advice, anything.
Being a noob to all this visual shader editing stuff, and after watching some UE4 videos, I was wondering if anyone can point me in the right direction for creating shaders that tile textures based on world coordinates, and a shader that displays a different texture based on height in Y.
Looking for hints on node order, so I can hack my way to a solution with trial and error and learn the process.
Cheers.
Use the world position node, and append the R and B channels, and plug it into textures as UV coordinates. You can also use the Y channel as t-value in the lerp node, after passing through a clamp01 node, in order to blend between two colors based on height.
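In shader code, the node setup described above boils down to something like this sketch (the property names _Tiling, _BlendHeight and _BlendRange are placeholders, and i.posWorld is assumed to come from the vertex shader):

```hlsl
// World-space planar mapping: append world X and Z as the UV coordinates
float2 uv = i.posWorld.xz * _Tiling;
half4 texA = tex2D(_TexA, uv);
half4 texB = tex2D(_TexB, uv);

// Height blend: remap world Y to 0-1 and clamp (the clamp01 node)
float t = saturate((i.posWorld.y - _BlendHeight) / _BlendRange);
half4 col = lerp(texA, texB, t);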
Has anyone figured out yet how to integrate the Skyshop sky and its rotation with a Shader Forge-created shader, automatically?
There should be a manual way of achieving this.
You may be able to do it with a global variable with a specific name, but I'm not sure. In any case, the Skyshop update with SF integration should be coming out pretty soon
I have one problem though: I currently reduce specular ambient light with a fresnel, but the masked area comes through as black instead of as the diffuse base. I'm using an older version though (0.27) and we cannot upgrade yet. Is there anything to fix that, or a workaround? The fresnel options are ultra strong in the older versions, as you said yourself, and I need a control slider anyway.
Oh, I totally forgot to ask: is there any way to make Shader Forge work with Candela SSR? Do I have to request it from you, or should I flame the Candela guys? ; )
I think you can hack it in somehow, but there's no official support for it yet.
There has been some discussion on this in the Unity Shader Forge thread as well as on the SF UserEcho page. You may find some helpful info there
Is it possible to use a texture with moving UVs as a vertex offset? I get an error with this configuration:
The error is "Shader error in 'FX/Clouds': undeclared identifier 'node_3860_piv' at line 84"
It shows me the same error with the same configuration even with a Vector2 linked to the Rotator's PIV input, so the error doesn't seem to come from that.
Edit: the only thing which produces that error is the Rotator node; the Panner works correctly. Is there any way I could control the direction it pans towards at runtime?
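For a runtime-controllable pan direction: rather than the Panner node's fixed speed, you can multiply a Vector2 property by Time and add it to the UVs, then set that property from script. A sketch, with _PanDir and _CloudTex as assumed property names:

```hlsl
// float2 _PanDir;  // exposed Vector2 property, settable at runtime
float2 uv = i.uv + _Time.y * _PanDir;  // _Time.y is Unity's built-in time in seconds
half4 cloud = tex2D(_CloudTex, uv);
```

From C# you would then drive it with something like material.SetVector("_PanDir", new Vector4(x, y, 0, 0)).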
Shader Forge Beta 0.35 is now released!
Fixed a bug where shaders, especially larger ones, took a very long time to open
The camera will now pan when dragging a connection to the edge of the node view
You can now switch tessellation mode to edge length based tessellation
You can now switch mode on the quickpicker to use either the scrollwheel, or the cursor position
The slider node looks and works a bit better (added a margin to the slider, and gave more space for dragging the node)
Fixed a bug where sliders with a larger min value than max value caused massive performance drops
Fixed a bug where you got serialization depth errors when running Shader Forge in Unity 4.5 and above
Fixed a bug where you couldn't open compiled shaders in Unity 4.5
Fixed a bug where you couldn't use depth testing in the normal input
It's not currently planned, but it's possible to make it using custom lighting. You essentially make your specular component in two parts, one for the binormal, and one for the tangent, based on the anisotropy of the surface, and then simply add them together
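A rough sketch of that two-lobe idea, in the Kajiya-Kay style (all names here are placeholders, not an official SF setup; h is the half vector between light and view directions):

```hlsl
// One stretched highlight lobe along a given surface direction
float strandSpec (float3 dir, float3 h, float exponent)
{
    float dotTH = dot(dir, h);
    float sinTH = sqrt(saturate(1.0 - dotTH * dotTH));
    return pow(sinTH, exponent);
}

float3 h = normalize(lightDirection + viewDirection);
float specT = strandSpec(i.tangentDir, h, _Gloss);   // lobe along the tangent
float specB = strandSpec(i.binormalDir, h, _Gloss);  // lobe along the binormal
float spec = lerp(specT, specB, _Anisotropy);        // or simply specT + specB
```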
Shader Forge Beta 0.36 is now out!
Nodes such as the Normal Direction node and the View Direction node, now have a visual representation instead of being black
The internal assets of Shader Forge are now no longer in a Resources folder
I get no dynamic shadows on shaders that have lightmap support ticked. I'm using single lightmaps (it's for mobile) but still want to receive shadows from a few dynamic objects. Am I missing something fundamental? If I change to a simple Unity Diffuse shader all is good, but anything from SF seems to lose the shadows. Cheers in advance
Hello! I'm making a shader at the moment that includes scrolling textures. So far so good but I have hit a bit of a wall as I'm not entirely sure how I'd go about something. Basically in the image below, I'm using one normal map twice. One of them being rotated at an angle, then blending them together afterwards.
Ringed in red, I have so far made that half of the normal map offset based on the X and Y speed values. I'm trying to do the same to the other half (ringed in white), but I've already rotated it 90 degrees and am not sure how to do both (rotating it by 90 as well as offsetting).
Would anyone know how to do both rotating and then offsetting the UV's? I'd be much appreciative of anyone's help!
*facepalm* I was just being dumb, I figured it out now! Never mind.
Simple question I'm sure. Why is it that if I have lightmaps on my SF shader, I get no shadows cast by dynamic objects? Using simple single lightmap settings with just the simplest SF shader, but I get no dynamic shadows.
Sloppy coding on my end. The lightmap system is a mess, and things like this always seem to slip through the cracks. It's because the dependency system for shader writing is really naive, and almost any change I make, or any bug I fix, will cause another bug to pop up, such as this one.
I can have an extra look at it today and see if I can get it sorted, but I can't promise anything
I really hope lightmaps in general can get a bit more attention... I work for probably one of your biggest customers (in terms of company size) working on some very big IPs and we need lightmap and performance way before advanced lighting features, since we need to support a wide range of hardware and lightmaps/probes are a key feature for us.
Sure we can edit the shaders by hand, but then the shaders we use in-game are no longer in sync with our ShaderForge produced files, which when you need to iterate rapidly and work with dozens of developers, is not an ideal workflow.
Love all the work done on the tool, of course. I've been really happy using it. My team is just concerned over the short-to-long term focus being more on features that are above and beyond what we need, and not getting more love in some of the more foundational areas. It becomes harder for me to justify keeping the tool in our pipeline.
But we'd love to come out of our projects with IPs that almost everyone on earth has heard of and say "and it was made possible by ShaderForge!"
You should do it by setting the texture's wrapping mode to Clamp rather than Wrap, and make sure the borders aren't bleeding unwanted pixels out into the outer areas.
This may not always be possible (due to mipping etc), so in that case you can make a mask that replaces whatever is outside the 0 to 1 UV range
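That out-of-range mask can be built from a few step() nodes. A minimal Cg sketch (_OutsideColor is an assumed placeholder):

```hlsl
float2 uv = i.uv;
// 1 inside the 0-1 UV range, 0 outside
float inside = step(0.0, uv.x) * step(uv.x, 1.0)
             * step(0.0, uv.y) * step(uv.y, 1.0);
// show the texture inside the range, a flat color outside it
half4 col = lerp(_OutsideColor, tex2D(_MainTex, uv), inside);
```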
Question: I don't understand the part about border bleeding. Here is my shader so far, and it sort of works.
Textures:
_MainTex
_Mask
_Mask2
If you look at the _Mask and _Mask2 textures, you will see that the last pixel row and column are transparent.
Problem: if I fill the transparent last pixel row and column with the same color, the sliders don't work anymore?
Mask(s) import settings:
Q: Is this how it's supposed to work? Did I mess up somewhere?
Helps me, pulezz!
Hi guys, this plugin is awesome! I'm quite new to shaders, but this plugin has made a lot of things easier to learn.
Just a question if anybody knows...
How can I lerp vertically from one texture to another texture? Like if I want a water texture to cover a sand texture vertically in real time. I have no clue what to do :poly141:
Anyway, acegikmo, good job on this plugin!
Glad you like SF!
Essentially, you'll want to have one texture and the other in the A and B inputs of a lerp node, and then animate a black and white mask that plugs into the T input of the lerp node. What you need to do is to define "vertically". Vertically in what space?
UV space? World space? Local space? View space?
I presume you mean UV space, in which case you can use the V output of the UV coordinate node and add/subtract to make it "move vertically".
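As a sketch of that in shader code (with _Level as an assumed animated property going from 0 to 1, and _SandTex/_WaterTex as placeholder texture names):

```hlsl
// In Unity's UV space, V runs from the bottom (0) to the top (1)
float t = step(i.uv.y, _Level);  // hard waterline: 1 below the level, 0 above
// or, for a soft transition: t = saturate((_Level - i.uv.y) / _Blend);
half4 col = lerp(tex2D(_SandTex, i.uv), tex2D(_WaterTex, i.uv), t);
```

Animate _Level from script or with a Time-based node, and the water rises over the sand.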
I'm trying to make a vertex animation shader. In UE, when I multiply my red-painted vertices by the sin * Time nodes, the red-painted vertices stay still and the other colors move, but in SF it only works if I connect the blue or green color.
How does the vertex offset work, exactly?
Ty
Vertex offset is essentially; "Move the current vertex this far". I think the difference between SF and UE, is that UE automatically presumes tangent space offset, whereas SF expects a world space vector. You should be able to use the red channel of the vertex colors, multiply it by sin( time ), then multiply the result by the normal direction, and plug it into the vertex offset input of the main node.
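Written out as a vertex-shader sketch, that chain is roughly (assuming uniform scale; _Amount is an assumed amplitude property):

```hlsl
// world-space offset = vertexColor.r * sin(time) * world-space normal * amplitude
float3 worldNormal = normalize(mul((float3x3)_Object2World, v.normal));
float3 offset = v.color.r * sin(_Time.y) * worldNormal * _Amount;
// SF's Vertex Offset input adds this world-space vector to the vertex position
```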
Hi, I bought Shader Forge a while ago, and it ROCKS. I was wondering if it is possible to get access to all three vertices of a triangle, because apparently that is possible in a standard shader (correct me if I'm wrong). The following standard shader code seems to be able to do that:
Shader "Custom/Wireframe"
{
    Properties
    {
        _WireColor ("WireColor", Color) = (1,0,0,1)
        _Color ("Color", Color) = (1,1,1,1)
    }
    SubShader
    {
        Pass
        {
            CGPROGRAM
            #include "UnityCG.cginc"
            #pragma target 5.0
            #pragma vertex vert
            #pragma geometry geom
            #pragma fragment frag

            half4 _WireColor, _Color;

            struct v2g
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
            };

            struct g2f
            {
                float4 pos : SV_POSITION;
                float2 uv : TEXCOORD0;
                float3 dist : TEXCOORD1;
            };

            v2g vert (appdata_base v)
            {
                v2g OUT;
                OUT.pos = mul(UNITY_MATRIX_MVP, v.vertex);
                OUT.uv = v.texcoord; // the UVs aren't used in this shader, but are included in case you want them
                return OUT;
            }

            [maxvertexcount(3)]
            void geom (triangle v2g IN[3], inout TriangleStream<g2f> triStream)
            {
                float2 WIN_SCALE = float2(_ScreenParams.x / 2.0, _ScreenParams.y / 2.0);

                // vertex positions in screen space
                float2 p0 = WIN_SCALE * IN[0].pos.xy / IN[0].pos.w;
                float2 p1 = WIN_SCALE * IN[1].pos.xy / IN[1].pos.w;
                float2 p2 = WIN_SCALE * IN[2].pos.xy / IN[2].pos.w;

                // edge vectors
                float2 v0 = p2 - p1;
                float2 v1 = p2 - p0;
                float2 v2 = p1 - p0;

                // twice the triangle's area (cross product magnitude)
                float area = abs(v1.x * v2.y - v1.y * v2.x);

                // area / length(edge) = each vertex's distance to its opposite edge
                g2f OUT;
                OUT.pos = IN[0].pos;
                OUT.uv = IN[0].uv;
                OUT.dist = float3(area / length(v0), 0, 0);
                triStream.Append(OUT);

                OUT.pos = IN[1].pos;
                OUT.uv = IN[1].uv;
                OUT.dist = float3(0, area / length(v1), 0);
                triStream.Append(OUT);

                OUT.pos = IN[2].pos;
                OUT.uv = IN[2].uv;
                OUT.dist = float3(0, 0, area / length(v2));
                triStream.Append(OUT);
            }

            half4 frag (g2f IN) : COLOR
            {
                // distance of the fragment from the nearest triangle edge
                float d = min(IN.dist.x, min(IN.dist.y, IN.dist.z));
                // fade the wire color based on that distance
                float I = exp2(-4.0 * d * d);
                return lerp(_Color, _WireColor, I);
            }
            ENDCG
        }
    }
}
On the line that says "void geom(triangle v2g IN[3], inout TriangleStream<g2f> triStream)", apparently "IN" contains the three vertices of the triangle. Is it possible to do this in Shader Forge? Thanks
It's not possible in SF at the moment, as you need explicit access to the geo/domain/hull shaders, which has 3 issues:
1. It will take a lot of time to make, as I have to update the dependency system within SF
2. It will be Windows/DX11 only, until Unity updates their OpenGL implementation
3. It will require a lot of redesigning of the interface
Also, Shader Forge is nominated at the Unity awards 2014, for the technical achievement award
If you enjoy using SF, more than the other products listed, a vote would be heavily appreciated! https://unity3d.com/awards/2014
Hey guys. Spent a few hours on Shader Forge with lots of experimentation and excitement. Finally achieved something close to what I wanted to make.
It's a fully 3D nuke explosion mushroom cloud with lots of adjustable sliders for stuff like speed, flame intensity, parallax depth, smoke transmission, alpha, and separate cloud and flame layers. There are some issues with the alpha masks right now, but they are easily fixable by changing the mesh.
It only uses a single static mesh for the mushroom cloud and the animated texture offset is done inside SF using Time node. The gif is way too fast but you can try the exe to get the full effect. You can orbit around the nuke with the mouse.
I have just started using Shader Forge and it has been a lot of fun. I was wondering if anyone has been able to successfully recreate the node tree that Bill Kladis creates in his tutorial on dynamic smoke dissipation with flow maps? I have been working on recreating it in SF, but I run into an issue where my flow map causes the distorted texture to behave like it is scaling along one axis, instead of being confined within the perimeter of my polygonal plane.
I have rechecked all of my settings and it should work. Unfortunately I cannot find any documentation of anyone encountering the same issue, nor have I seen anyone successfully create particles that behave in a similar fashion using SF.
So after consulting with a friend of mine I was able to solve the issue, and it was a very simple fix. To start off, the assets I was using were taken directly from Bill Kladis' flow map tutorial, which is tailored for the Unreal engine. It turns out that Unity and Unreal interpret UV space differently: in Unity (or at least in Shader Forge), (0,0) is at the bottom left of UV space and (1,1) at the top right, while Unreal puts V = 0 at the top, so the vertical axis is flipped. To fix the problem I simply had to flip the flow map vertically. Very easy to fix, and to overlook. I hope this helps someone else who runs into this issue.
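Equivalently, instead of editing the texture asset, you can flip the V coordinate in the shader when sampling UE-authored maps. A sketch (_FlowMap is a placeholder name):

```hlsl
float2 uv = i.uv;
uv.y = 1.0 - uv.y;  // Unity's V runs bottom-up, Unreal's top-down
// unpack the 0-1 texture values into a -1 to 1 flow vector
float2 flow = tex2D(_FlowMap, uv).rg * 2.0 - 1.0;
```

Depending on how the flow vectors were authored, the green channel may also need negating (flow.y = -flow.y).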
Does anyone have a good idea for how I can do a shader that has a secondary texture that only renders "on top" of a mesh, like grass would for instance? Ideally I want to have a mask texture between the two, and a slider for determining at which inclination the change should occur, but I'm fairly sure I can figure those out on my own once I have that crucial beginning down - but I seriously cannot figure it out.
Take a normal direction node, plug into a component mask node (use G channel I think) plug that into a lerp node's alpha, then plug grass tex in A, whatever tex in B
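In shader terms, that suggestion is roughly (texture and property names here are placeholders):

```hlsl
// world-space normal; the G (Y) component is how much the surface faces up
float up = saturate(i.normalDir.y);
// optional: threshold + blend sliders to control the inclination cutoff
up = saturate((up - _Threshold) / _Blend);
// grass on up-facing surfaces, the base texture everywhere else
half4 col = lerp(tex2D(_BaseTex, i.uv), tex2D(_GrassTex, i.uv), up);
```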
Oho! Thanks, that gives me something to work with. I shall experiment.
The shader needs some work, but it's definitely going in the right direction. The smoke rolling speed especially needs tweaking, so it doesn't roll backwards due to Unity's animation curves.
Now added soft edge blend so it creates a smooth transition where it touches other meshes instead of hard edged clipping.
I was able to pick up the Skyshop cubemaps by setting the variables in my shader to _DiffCubeIBL and _SpecCubeIBL, but the rotation is still a no-go.
I just saw the Skyshop 1.07 update with the Shader Forge extension. Exactly what I am looking for.
I will have to wait for that.
https://hostr.co/file/989X5ut6mBlz/UZI2.PNG
https://hostr.co/file/eJ774kxLALok/spectre.PNG
https://hostr.co/file/3Lialglwl535/sc2.png
Could you try to reduce the shader to minimal parts? It would be nice to isolate the issue as much as possible
Also, the wiki may help you some on this, especially the gradient related entries
http://acegikmo.com/shaderforge/wiki/index.php?title=Shader_Forge_Wiki
Thx
my bad
Download link for the nuke demo: http://www.mediafire.com/download/v2pe7zd6r73i5it/NUKE+FULL.rar
Is it possible to do "alpha to coverage" in shader forge? and if it is can anyone please point me in the right direction.
Nope, it's not possible in Unity yet.
Also, my talk from Unite 2014 is now live
Shader Forge at Unite 2014: http://www.youtube.com/watch?v=WMHpBpjWUlY