Hi.
Is it possible to blend some textures together and then manipulate the UVs for the result?
As I understand it, you can only plug texture coordinates into a Texture2D, but the result of my blends would be a float...
Is it possible to turn that float into a Texture2D somehow?
Here's a screen of what I'm working on. I'm trying to create a normal map from a heightmap (very WIP still :P). Now, everywhere I have the texture samples, I'd like to be able to plug in the combined results of a few blends...
Replies
The network here is just me trying to generate a normal map from a greyscale heightmap (one channel in this case). I know it's probably not that optimized, or even good for that matter... The whole end part for the blue channel is messed up and should be done completely differently.
My initial thought was if it was possible to combine different heightmaps and then generate normals from that info.
I first tried using the HeightmapToNormal node, but it only accepts a Texture2D, which is why I decided to try to make my own, and now I know why it's like that :P
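For reference, the math a heightmap-to-normal node (or a hand-built network of samples and subtracts) performs can be sketched outside the material editor. This is a minimal plain-Python version using central differences; the function name and the `strength` parameter are my own, not anything from the engine:

```python
import math

def height_to_normal(height, strength=1.0):
    """Convert a 2D grayscale heightmap (values 0..1) to per-pixel
    tangent-space normals using central differences."""
    h, w = len(height), len(height[0])
    normals = [[None] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # Clamp at the borders instead of wrapping.
            left  = height[y][max(x - 1, 0)]
            right = height[y][min(x + 1, w - 1)]
            up    = height[max(y - 1, 0)][x]
            down  = height[min(y + 1, h - 1)][x]
            dx = (right - left) * strength
            dy = (down - up) * strength
            # Normal of the height field, then normalize it.
            nx, ny, nz = -dx, -dy, 1.0
            length = math.sqrt(nx * nx + ny * ny + nz * nz)
            # Remap from [-1, 1] to [0, 1], the range a normal map stores.
            normals[y][x] = ((nx / length + 1) / 2,
                             (ny / length + 1) / 2,
                             (nz / length + 1) / 2)
    return normals
```

A flat heightmap comes out as the familiar uniform blue-purple (0.5, 0.5, 1.0) normal map; slopes tilt the normal away from straight up. The key point for your question: the input is just a 2D grid of floats, so anything that produces a float per pixel (like your blends) can feed it.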
Like adding rivets to a surface or multiplying cracks onto another...
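Combining heightmaps before deriving normals really just means blending the float height values per pixel, which a material graph does with Add/Multiply nodes. A standalone sketch of the two blend modes you describe (the function name, `mode`, and `amount` are my own illustrative choices):

```python
def blend_heights(base, detail, mode="add", amount=1.0):
    """Blend two same-sized 2D heightmaps (values 0..1) per pixel.
    'add' raises the base by the detail (rivets on a surface);
    'multiply' carves the base down where the detail is dark (cracks)."""
    h, w = len(base), len(base[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            b, d = base[y][x], detail[y][x]
            if mode == "add":
                v = b + d * amount
            elif mode == "multiply":
                # Lerp from plain base toward base*detail by 'amount'.
                v = b * (1.0 - amount + d * amount)
            else:
                raise ValueError(mode)
            # Clamp back into the valid height range.
            out[y][x] = min(max(v, 0.0), 1.0)
    return out
```

The blended grid can then go straight into a normal-generation step, which is exactly the "blend first, then derive normals" order you were after.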
Why not use a program like CrazyBump to generate a normal map from a heightmap?