Hey folks,
I found a really neat trick when working with custom HLSL nodes in the UDK material editor and wanted to share.
As of right now, to pull a texture in as an input (assuming you want to mess with the texture itself, not just a float3 or float4 value), you need to use a custom texture node.
HOWEVER
If you want to pull in a different type of texture, for example a texCUBE or something like that, you can't, since you can't plug those into a custom texture node.
Solution:
Create a normal texture sample like you would when using a cube map in a normal material, multiply it by zero, then add it to the results of your custom node before moving on in your chain.
Now that texture is part of the pixel shader, and if you go and look at the source, you can get the name of the texture.
After you have the name, you can go back and call that texture sample manually in the HLSL just like if you had plugged it in as a custom texture.
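Here's a minimal sketch of what that manual call could look like inside the Custom node, assuming the generated source shows your cube sampler under a name like Texture_3 (that name is just a placeholder; copy the exact one from the material's compiled shader source) and that Normal and CameraVector are plugged in as node inputs:

// Texture_3 is a placeholder for the sampler name found in the compiled source
float3 reflVec = reflect(-CameraVector, Normal);
return texCUBE(Texture_3, reflVec).rgb;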
Hope that helps folks. I'm currently working on some custom lighting models and figuring that out was a must.
Sorry no pics. Will add them later.
Alex
Replies
I am looking into ways to import file formats into the UDK that allow for custom mips. Haven't found much, but I haven't looked hard either.
The alternative option is to use a static image for reflections, but that would create other problems, with the image not being accurate. You would need to implement this:
http://en.wikipedia.org/wiki/Lambert_cylindrical_equal-area_projection
in order to form more accurate reflections.
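A hedged sketch of what that lookup could look like in a Custom node (ReflectionVector and ReflectionImage are assumed names): map the reflection direction to the UVs of a Lambert cylindrical equal-area image, where U is the longitude and V is the sine of the latitude (with Z up, that's just the Z component of the direction).

float3 d = normalize(ReflectionVector);
float u = atan2(d.y, d.x) / (2.0 * 3.14159265) + 0.5; // longitude wrapped to 0-1
float v = d.z * 0.5 + 0.5;                            // sin(latitude) remapped to 0-1
return tex2D(ReflectionImage, float2(u, v)).rgb;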
More to come if I find more....
If it's possible to set it up the way I think you're saying, then you could use a simple Luminance mask to sample the colors from the un-LOD'ed feed of the Cube and feed them from there to your LOD'ed Cube.
I would like to test this idea out, it would be so awesome!
It would be possible to do that. What did you have in mind for using a rough Luminance value of the cube map?
Also for those who wish to see.
I was hoping to drop LODs so that only the seams would appear and, from there, isolate them via Lum or pad them (with a simple offset in 4 directions) with the colors of the original Cube (maybe even use Mip 1 or 2, since it doesn't break too badly), and with the help of a slightly noisy Normal Map (something really small) hide any color break.
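A very rough sketch of the padding part of that idea (every name here is assumed, none of it is from an actual setup): average a few slightly offset cube lookups and blend them in over the seam areas with a luminance-derived mask.

float3 R   = ReflectionVector;
float  o   = 0.02;       // offset size, tweak per cube resolution
float  mip = MipLevel;   // the dropped LOD you want to read
float3 padded = ( texCUBElod(EnvCube, float4(R + float3( o, 0, 0), mip)).rgb
                + texCUBElod(EnvCube, float4(R + float3(-o, 0, 0), mip)).rgb
                + texCUBElod(EnvCube, float4(R + float3( 0, o, 0), mip)).rgb
                + texCUBElod(EnvCube, float4(R + float3( 0,-o, 0), mip)).rgb ) * 0.25;
float3 base = texCUBElod(EnvCube, float4(R, mip)).rgb;
return lerp(base, padded, saturate(SeamMask)); // SeamMask derived from a Lum threshold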
But running some tests, UE really screws up the Mips (I'm getting some horrible blotching all over the place; even at Mip 3 or 4, it looks like Unreal is using two different Mips on the same model, especially around UV seams).
I also tried going to each texture and the CubemapTex items and setting all of their Mip options to Blur 5, but nothing changed. Even enabling/disabling SRGB doesn't seem to work. I'm guessing these are broken in the latest version of UDK?
So I'm unable to simply isolate or pad anything.
Also, about your quote in another thread:
I think UDK should import TGAs with mips, or create them as 'new' textures once imported using ATiCubeGen, although I'm not sure if the extra time spent on that would be worth it.
Also, it seems like Epic has their own implementation of LODs for the real-time reflection Cubemap; there's a small Mip button on the CaptureActor item, but it's causing the same issues for me (i.e. seams once you drop it too far). Sad news is, the Material method doesn't work for RT reflections.
I may check out other solutions in the coming weeks, but for now I'll just deal with it. The only other idea is super expensive and requires lerping between different versions of the same cube map with weighting. I'll investigate, and if I see anything cool I'll post it here.
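For reference, a minimal sketch of that expensive idea, assuming two cube samplers that are sharp and pre-blurred copies of the same map (all names are placeholders):

float3 R       = ReflectionVector;
float3 sharp   = texCUBE(EnvCubeSharp,   R).rgb;
float3 blurred = texCUBE(EnvCubeBlurred, R).rgb;
return lerp(sharp, blurred, saturate(BlurWeight)); // BlurWeight = roughness-style 0-1 control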
Alex
The closest thing to using this effect (in the case of Metal) I found was simply sticking in a detail Normal with Spec/Diffuse information in it, just strong enough to break up the seams, and with enough information to seem like it's following the other maps in the Normal Map. Good way to fake it.
My Diffuse and Spec terms aren't using the detail Normal in this case; I might need the standard Normals of the model from a direct bake instead.
It's a good way to fake a rough effect, especially at around Mip 7 or 8, where you mainly have color information only, especially for Skin, but it will require a lot of re-learning from the artist to set up the correct information in all the Maps, know when to use details, etc.
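A minimal sketch of the detail-Normal trick, assuming the usual base/detail normal map setup (every sampler, UV and strength name here is made up):

float3 baseN   = tex2D(NormalMap, UV).xyz * 2 - 1;
float3 detailN = tex2D(DetailNormalMap, UV * DetailTiling).xyz * 2 - 1;
// only perturb the XY of the base normal, just enough to break up cube seams
float3 N = normalize(baseN + float3(detailN.xy * DetailStrength, 0));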
Hope you come up with an awesome solution! So close, yet so far!
The seam issue can be fixed by using DX11 in UDK.
Most DX11 hardware blurs mip maps correctly across cube map face edges.
No worries man. That shot was taken using DX11 UDK. Was there something specific you had to do? I tried all the built-in mipping options, but none of them removed the seam as I traveled down the mip chain. Any suggestions would be awesome!
Alex
What I do:
I capture my cubemap in DX9 (it's still buggy in DX11).
I go on each face texture of the cubemap and change NoMipmaps to FromTextureGroup. Do it on all faces!
I go to the cubemap texture and check that FromTextureGroup is used.
In the material I use the setup you posted above, but I changed texCUBEbias to texCUBElod.
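A minimal sketch of that swap (the sampler and input names here are assumptions, not the exact setup from the shots): texCUBElod takes the mip level explicitly in the w component instead of a bias.

float3 reflVec = reflect(-CameraVector, Normal);
return texCUBElod(EnvCube, float4(reflVec, MipLevel)).rgb; // MipLevel = explicit LOD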
Some examples:
Hope that will help you!
Tell me if it's working.
Exactly!
I will test it out tonight when I get home! Thanks so much if this is really the fix. I think what may have been the problem is that the cube map I was using was not my own. It was one that's been around since Unreal 3, so it may not have correctly blurred mips. I will make my own and update my version (I am a couple behind) just to be sure!
Alex
Neox: Yeah, agreed, but this isn't affecting the mips at all (in my test). UDK sets it to wrap by default for some reason. Maybe that could help.
This is on the July 2012 version, by the way.
Cheers!
Now that I was able to correctly judge the blurring, some of you may find that the first mips look almost identical. If you're looking to offset a roughness input so that the mips change texture fidelity correctly as you change the value (versus 30% of your roughness not actually affecting the blurriness of your cube maps), you should check out what I did. It's some simple fraction stuff, but I created an offset value on a scale so that you can tell the mips that will be read where to start and where to finish, while still scaling correctly between the two.
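A sketch of that remap, with all names assumed: the 0-1 roughness range is rescaled so it skips the near-identical top mips and spans only the mips that visibly change.

float MipStart = 2.0;  // first mip that looks different from mip 0
float MipEnd   = 8.0;  // blurriest mip you want at roughness = 1
float MipLevel = lerp(MipStart, MipEnd, saturate(Roughness));
return texCUBElod(EnvCube, float4(ReflectionVector, MipLevel)).rgb;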
Shots are a comparison of the default Phong and my shader with approximate material matching. The shader supports cube-map-based ambient diffuse and specularity as well as regular lights. No metallic spec just yet. It also has full roughness support with appropriately blurring environment maps.
Unfortunately it does not like UDK lightmaps at all. Probably only good for showing off models or characters.
The far-left shot is without any lights, just getting lit from the environment.
My setup:
If you are using Phong or Blinn, you might want to multiply your glossiness by 128 so you can work in a 0-1 range and avoid values like 180 or 200.
But with a Cook-Torrance or a spherical Gaussian approximation there's no need to multiply by 128.
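For what it's worth, a minimal sketch of that remap (function and parameter names are made up): author Glossiness in 0-1 and expand it to a Blinn/Phong exponent in the shader, instead of typing raw exponents like 180 or 200 into the material.

half BlinnSpecTerm(half3 N, half3 H, half Glossiness)
{
    // 0-1 glossiness mapped to a 1-128 specular exponent
    half SpecularPower = max(Glossiness * 128.0, 1.0);
    return pow(saturate(dot(N, H)), SpecularPower);
}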
Good node for doing IBL (Valve approach), and no need to blur the cube:
And yeah, UDK (UE3) doesn't support lightmaps with custom lighting. To get around that, what you can do is go back to the 2011-02 version and change the shader code in Common.usf.
Check the function to change your spec model:
half3 PointLightPhong(half3 DiffuseColor, half DiffusePower, half3 TransmissionMask, half3 SpecularColor, half SpecularPower, half3 L, float3 E, half3 N, float3 R)
{
    // whatever you want!
}
This will work perfectly :
half3 PointLightPhong(half3 DiffuseColor, half DiffusePower, half3 TransmissionMask, half3 SpecularColor, half SpecularPower, half3 L, float3 E, half3 N, float3 R)
{
    // Oren-Nayar diffuse term, using DiffusePower as roughness
    half gamma = dot( E - N * dot( E, N ), L - N * dot( L, N ) );
    half rough_sq = DiffusePower;
    rough_sq *= rough_sq;
    half A = 1.0f - 0.5f * (rough_sq / (rough_sq + 0.57f));
    half B = 0.45f * (rough_sq / (rough_sq + 0.09f));
    half alpha = max( acos( dot( E, N ) ), acos( dot( L, N ) ) );
    half beta = min( acos( dot( E, N ) ), acos( dot( L, N ) ) );
    half C = sin(alpha) * tan(beta);
    half OrenNayar = A + B * max( 0.0f, gamma ) * C;
    half DiffuseLighting = saturate(dot(N, L)) * OrenNayar;

#ifndef DISABLE_DYNAMIC_SPECULAR
    half3 H = normalize( L + E );
    half NdotL = saturate( dot( N, L ) );
    half NdotH = saturate( dot( N, H ) );
    half NdotV = saturate( dot( N, E ) );
    half VdotH = saturate( dot( E, H ) );

    // Fresnel range: remap a squared Fresnel term into the 1-3 range
    half3 fRange = half3( 1, 2, 3 );
    half Fresnel = 1 - saturate( NdotV );
    Fresnel *= Fresnel;
    float FresnelRange;
    float f = Fresnel;
    if ( f > 0.5f )
        FresnelRange = lerp( fRange.y, fRange.z, (2 * f) - 1 );
    else
        FresnelRange = lerp( fRange.x, fRange.y, 2 * f );

    // Specular distribution (approximate instruction counts in comments)
    half Dot = saturate( dot( N, H ) );                  // 2 alu
    half Threshold = 0.04;                               // 1 alu
    half CosAngle = pow( Threshold, 1 / SpecularPower ); // 2 alu
    half NormAngle = ( Dot - 1 ) / ( CosAngle - 1 );     // 3 alu
    half D = exp( -NormAngle * NormAngle );              // 2 alu
    D *= FresnelRange;
    half SpecularLighting = NdotL * D;
    //half SpecularLighting = PhongShadingPow(saturate(dot(N, H)), SpecularPower);
#else
    half SpecularLighting = 0.0f;
#endif

    // Only apply transmission to diffuse, not specular
    return DiffuseColor * lerp(DiffuseLighting, TransmissionMask, TransmissionMask) + SpecularLighting * SpecularColor;
}
(Cook-Torrance and Oren-Nayar taken from D3DBook.)
Glad to see it's now working for everyone !
I was wondering how far back you'd have to go to access those. I don't get why they removed access to them from UDK.