
Texture as texture coordinates?

naked_chicken
So in Unreal you can use a texture to warp the UVs of another texture, like so:

[Image: warpIt.jpg]

I'm starting to figure out Unity shaders, but I'm at a loss as to how to achieve this. Any input would be much appreciated.

Replies

  • bugo
    Strumpy Shader Editor has some of the nodes you're using there. It works a bit differently, but you CAN get the same effect. I'm trying to think which node you'd swap in for the Mask (RG); I know there's a splat node in there that separates the channels.

    I would look at the Unity forums to see how they apply their nodes; I also remember seeing a documentation PDF there.
  • LoTekK
    If you're talking about shader code (as opposed to Strumpy):
    half4 warp = tex2D(_WarpTexture, IN.uv_MainTex);
      // this is your warp texture sample
    warp *= 0.1;
      // multiply by 0.1
    half4 c = tex2D(_MainTex, IN.uv_MainTex + warp.rg);
      // "warp.rg" is the equivalent of your Channel Mask node:
      // all it's doing here is taking the R & G channels of the warp texture
      // "+ warp.rg" adds the value of the warp texture to the UVs of _MainTex
    

    You could, if you wanted, cut down the number of lines of code, as well:
    half2 warp = (tex2D(_WarpTexture, IN.uv_MainTex)).rg * 0.1;  //if you absolutely won't be using the B and A channels
    half4 c = tex2D(_MainTex, IN.uv_MainTex + warp);
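
    For context, a minimal sketch of a complete surface shader those lines could sit in; the shader name, the property layout and the Lambert lighting model here are assumptions for illustration, not something from this thread:
    Shader "Custom/UVWarp" {
    	Properties {
    		_MainTex ("Base (RGB)", 2D) = "white" {}
    		_WarpTexture ("Warp (RG)", 2D) = "gray" {}
    	}
    	SubShader {
    		Tags { "RenderType" = "Opaque" }
    		CGPROGRAM
    		#pragma surface surf Lambert

    		sampler2D _MainTex;
    		sampler2D _WarpTexture;

    		struct Input {
    			float2 uv_MainTex;
    		};

    		void surf (Input IN, inout SurfaceOutput o) {
    			// sample the warp texture and scale it down
    			half2 warp = tex2D(_WarpTexture, IN.uv_MainTex).rg * 0.1;
    			// offset the base texture's UVs by the warp value
    			half4 c = tex2D(_MainTex, IN.uv_MainTex + warp);
    			o.Albedo = c.rgb;
    			o.Alpha = c.a;
    		}
    		ENDCG
    	}
    	FallBack "Diffuse"
    }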
    
  • naked_chicken
    Awesome guys, thanks.

    Now what if I want to use this on particles? Would that work with a vertex shader, and if so, how?
  • Brendan
    From memory, Shuriken alters colours and whatnot per vertex, and if you're doing additional vertex stuff it would obviously be affected by that. That means any further changes to vertex colour and position will still work, but the warping will only affect whatever is sampled with those UVs (per-pixel).

    In Strumpy, the example would be just like the UDK picture, with the following changes:

    TextureSample changes to Sampled2D.
    Mask (RG) may work in Strumpy; if not, try Split then Assemble: feed the Sampled2D into Split, then R and G into Assemble.
    Multiply works the same; the 0.1 can be a FloatConst (not exposed) or a Float (create an input, then it's exposed).

    For the top part, you'll need a Sampler2D and a Tex2D. Add the UV from the Sampler2D to the result of the multiply, then feed that into the Tex2D's UV input. Connect the top node from the Sampler2D to the Tex2D.
  • Farfarer
    naked_chicken wrote:
    "Awesome guys, thanks. Now what if I want to use this on particles? Would that work with a vertex shader, and if so, how?"
    It doesn't work with a vertex shader - all you'd get is the UVs at each vertex being warped, rather than per-pixel.

    But the particle systems use fragment shaders anyway, so it works exactly as it does with the code that Tekk gave you (and that I gave you on the Unity forums).

    That said, if you've got atlased textures on your particles (for animation or variation), bear in mind that your warp texture will have to be equally atlased; otherwise you'll only sample a small part of it at any given time.
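
    To illustrate, a hedged sketch of one way around that, assuming a simple column/row flipbook atlas; _Cols, _Rows and atlasLocalUV are names made up for illustration, not anything from this thread:
    float _Cols;	// atlas columns
    float _Rows;	// atlas rows

    // recover 0-1 UVs within the current atlas tile;
    // frac() wraps the scaled UVs back into the 0-1 range
    float2 atlasLocalUV (float2 uv)
    {
    	return frac(uv * float2(_Cols, _Rows));
    }

    // in the fragment shader, uv being the particle's atlased UVs:
    // sample the (un-atlased) warp texture across the whole tile...
    half2 warp = tex2D(_WarpTexture, atlasLocalUV(uv)).rg * 0.1;
    // ...then shrink the offset back into tile space so the warped
    // sample can't bleed into neighbouring frames
    half4 c = tex2D(_MainTex, uv + warp / float2(_Cols, _Rows));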
  • naked_chicken
    Thanks guys. I was able to get a pretty good grasp with your help. Talon, since you brought up atlased textures, I have another question.
    Most of the particle examples I've seen use atlased, animated textures rather than animating the texture in the shader. Is there a specific reason for that? Perhaps the hardware Unity targets handles larger atlased textures better than it handles more expensive shaders?

    Say I wanted to create a flickering flame in Unity. In Unreal I would use the above method to warp the UVs of a single small image and put that animated material on particle sprites. Does that method cross over well?

    Again, thanks guys. Switching platforms is always a pain.
  • LoTekK
    Warping UVs definitely translates over. I've done up a number of particle shaders that do exactly that, but I've also done some atlased ones. In the end it depends on what works better for you.

    (Also, I'd imagine part of the reason you're seeing that bias is that a tutorial with atlased textures is a bit easier to both prep and grok than one involving shaders that warp UVs, especially given Unity's historically casual target audience.)
  • naked_chicken
    Cool, thanks LoTekK. On with the shader creation I go then :P
  • naked_chicken
    So I've just about got what I want in my shader, but I've run into a snag.
    SubShader {
    		Pass {
    		
    			CGPROGRAM
    			#pragma vertex vert
    			#pragma fragment frag
    			#pragma fragmentoption ARB_precision_hint_fastest
    			#pragma multi_compile_particles
    
    			#include "UnityCG.cginc"
    
    			sampler2D _MainTex;
    			sampler2D _WarpTex;
    			float _WarpInt;
    			float _WarpSpeedX;
    			float _WarpSpeedY;
    			fixed4 _TintColor;
    			
    			struct appdata_t {
    				float4 vertex : POSITION;
    				fixed4 color : COLOR;
    				float2 texcoord : TEXCOORD0;
    			};
    
    			struct v2f {
    				float4 vertex : POSITION;
    				fixed4 color : COLOR;
    				float2 uv : TEXCOORD0;
    
    				#ifdef SOFTPARTICLES_ON
    				float4 projPos : TEXCOORD1;
    				#endif
    			};
    			
    			float4 _MainTex_ST;
    			float4 _WarpTex_ST;
    
    			v2f vert (appdata_t v)
    			{
    				v2f o;
    				o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    				#ifdef SOFTPARTICLES_ON
    				o.projPos = ComputeScreenPos (o.vertex);
    				COMPUTE_EYEDEPTH(o.projPos.z);
    				#endif
    				o.color = v.color;
    				o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
    				return o;
    			}
    
    			sampler2D _CameraDepthTexture;
    			float _InvFade;
    			
    			fixed4 frag (v2f i) : COLOR
    			{
    				#ifdef SOFTPARTICLES_ON
    				float sceneZ = LinearEyeDepth (UNITY_SAMPLE_DEPTH(tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(i.projPos))));
    				float partZ = i.projPos.z;
    				float fade = saturate (_InvFade * (sceneZ-partZ));
    				i.color.a *= fade;
    				#endif
    				// pan the warp texture over time
    				float2 uvPan = float2(_Time.x * _WarpSpeedX, _Time.x * _WarpSpeedY);
    				// sample the warp texture and scale by intensity
    				float2 warp = tex2D(_WarpTex, i.uv + uvPan) * _WarpInt;
    				// offset the main texture's UVs by the warp value
    				return 2.0f * i.color * _TintColor * tex2D(_MainTex, i.uv + warp);
    			}
    			ENDCG 
    		}
    	}
    

    I want to tile the _WarpTex separately, but I'm not sure where to add the UVs for that.

    Right now I'm giving it the "i.uv" coordinates, but those are the UVs for the main texture.

    I tried plugging an i.uv2 into the shader, but it always kicks back an error.

    Any ideas where I would plug in another UV set that comes from my second input?
  • naked_chicken
    Balls... I thought of something right after I hit post and, lo and behold, that was it.
    SubShader {
    		Pass {
    		
    			CGPROGRAM
    			#pragma vertex vert
    			#pragma fragment frag
    			#pragma fragmentoption ARB_precision_hint_fastest
    			#pragma multi_compile_particles
    
    			#include "UnityCG.cginc"
    
    			sampler2D _MainTex;
    			sampler2D _WarpTex;
    			float _WarpInt;
    			float _WarpSpeedX;
    			float _WarpSpeedY;
    			fixed4 _TintColor;
    			
    			struct appdata_t {
    				float4 vertex : POSITION;
    				fixed4 color : COLOR;
    				float2 texcoord : TEXCOORD0;
    			};
    
    			struct v2f {
    				float4 vertex : POSITION;
    				fixed4 color : COLOR;
    				float2 uv : TEXCOORD0;
    				float2 uv2 : TEXCOORD2;	// second UV set, for the warp texture
    				#ifdef SOFTPARTICLES_ON
    				float4 projPos : TEXCOORD1;
    				#endif
    			};
    			
    			float4 _MainTex_ST;
    			float4 _WarpTex_ST;
    
    			v2f vert (appdata_t v)
    			{
    				v2f o;
    				o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    				#ifdef SOFTPARTICLES_ON
    				o.projPos = ComputeScreenPos (o.vertex);
    				COMPUTE_EYEDEPTH(o.projPos.z);
    				#endif
    				o.color = v.color;
    				o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
    				o.uv2 = TRANSFORM_TEX(v.texcoord,_WarpTex);	// apply _WarpTex's own tiling/offset
    				return o;
    			}
    
    			sampler2D _CameraDepthTexture;
    			float _InvFade;
    			
    			fixed4 frag (v2f i) : COLOR
    			{
    				#ifdef SOFTPARTICLES_ON
    				float sceneZ = LinearEyeDepth (UNITY_SAMPLE_DEPTH(tex2Dproj(_CameraDepthTexture, UNITY_PROJ_COORD(i.projPos))));
    				float partZ = i.projPos.z;
    				float fade = saturate (_InvFade * (sceneZ-partZ));
    				i.color.a *= fade;
    				#endif
    				// pan the warp texture over time
    				float2 uvPan = float2(_Time.x * _WarpSpeedX, _Time.x * _WarpSpeedY);
    				// sample the warp texture with its own UV set and scale by intensity
    				float2 warp = tex2D(_WarpTex, i.uv2 + uvPan) * _WarpInt;
    				// offset the main texture's UVs by the warp value
    				return 2.0f * i.color * _TintColor * tex2D(_MainTex, i.uv + warp);
    			}
    			ENDCG 
    		}
    	}
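
    (For completeness: the snippet above is just the SubShader, so it still needs the surrounding Shader block and Properties to compile. A sketch of what those might look like, assuming this was edited from the built-in additive particle shader; the warp property names match the code, but the shader name, display strings and defaults are guesses:)
    Shader "Custom/WarpedAdditiveParticles" {
    	Properties {
    		_TintColor ("Tint Color", Color) = (0.5,0.5,0.5,0.5)
    		_MainTex ("Particle Texture", 2D) = "white" {}
    		_WarpTex ("Warp Texture", 2D) = "gray" {}
    		_WarpInt ("Warp Intensity", Float) = 0.1
    		_WarpSpeedX ("Warp Speed X", Float) = 1
    		_WarpSpeedY ("Warp Speed Y", Float) = 1
    		_InvFade ("Soft Particles Factor", Range(0.01,3.0)) = 1.0
    	}
    	// ...SubShader as posted above...
    }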
    

    Since you're here though, I'd love to get the opinion of some more experienced shader peeps. How does this look?

    I just edited the default additive particle shader to include a warp texture, and used that to warp the UVs of the main texture.

    Anything look superfluous or strange?
  • Farfarer
    Looks ok to me.

    You might be better off doing the UV panning in the vertex shader rather than the fragment shader, though.
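
    A minimal sketch of what that change could look like, reusing the names from the shader above; this is one reading of the suggestion rather than Farfarer's own code, and only the relevant parts are shown:
    v2f vert (appdata_t v)
    {
    	v2f o;
    	o.vertex = mul(UNITY_MATRIX_MVP, v.vertex);
    	o.color = v.color;
    	o.uv = TRANSFORM_TEX(v.texcoord,_MainTex);
    	// add the time-based pan here, once per vertex, instead of per pixel
    	o.uv2 = TRANSFORM_TEX(v.texcoord,_WarpTex) + float2(_Time.x * _WarpSpeedX, _Time.x * _WarpSpeedY);
    	return o;
    }

    fixed4 frag (v2f i) : COLOR
    {
    	// i.uv2 already carries the pan, so the per-pixel time math drops out
    	float2 warp = tex2D(_WarpTex, i.uv2) * _WarpInt;
    	return 2.0f * i.color * _TintColor * tex2D(_MainTex, i.uv + warp);
    }

    Since _Time is constant across a draw call, the pan offset is identical at every vertex and interpolates exactly, so the result should look the same while saving a little per-pixel work.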