We're adding a new feature in our game to help us control the river-current in some of our game levels, so the physics solver can send floating things down-stream, into eddies and out, slowing, speeding, etc. Pretty cool.
We're going to use a normal map to control the current, per-pixel. Awesome. Red and green channels will control direction and speed of the current in the XY plane. Also we can use the length of the normals for additional speed control.
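To make the encoding concrete, here's a minimal sketch of how a solver might decode one texel of such a map into a current velocity. The 0..1 to -1..1 channel remap and the "length scales speed" rule follow standard normal-map packing; the function name and `max_speed` parameter are illustrative, not from any particular engine:

```python
# Decode one flow-map texel into a 2D current velocity.
# Assumes the usual normal-map convention: channel values in 0..1
# map to components in -1..1, and the vector's length (<= 1) scales speed.

def decode_flow(r, g, max_speed):
    """r, g: red/green channel values in 0..1; returns (vx, vy)."""
    x = r * 2.0 - 1.0          # remap 0..1 -> -1..1
    y = g * 2.0 - 1.0
    length = (x * x + y * y) ** 0.5
    if length < 1e-6:
        return (0.0, 0.0)      # mid-gray pixel = still water
    speed = min(length, 1.0) * max_speed   # vector length encodes speed
    return (x / length * speed, y / length * speed)

# full-on red, mid green = full-strength current in +X
print(decode_flow(1.0, 0.5, 4.0))   # -> (4.0, 0.0)
```

So a mid-gray (0.5, 0.5) pixel means no current at all, which is also what makes hand-painting eddies tricky.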
I'm wondering how to author these textures though. So far I'm thinking the best way is just to hand-paint each channel individually. Full-on red for full-strength left-wards current, black "red" for full-on right-wards current, etc. Eddies and swirls get weird though, hard to visualize.
And then there's the length of the normals... how do we paint that? Can't just sculpt a heightfield, how would you represent full-speed current going left down the middle of a river?
Anyone else make something like this, or have some ideas how they would create the map?
Replies
definitely think modelling is the way to go. It will look like a big slide; if you imagine how water would run down that slide it should give you a fair idea of how it's going to work. But I do think the speed channel will need some extra work in PS afterwards, as you want the centre of each stream to run faster than the edges, and I can't for the life of me figure out how to do that without funking up the direction channels
sounds like fun, and could look ace
And then bake that out to a texture if needed, or leave it as a per vertex thing.
Editing per vertex has several advantages, and you can easily bake the per-vertex info by projecting the object into texture space in the viewport and doing a hardware render (like I did for UV reprojection: http://boards.polycount.net/showthread.php?t=61438)
- you can visualize the info you're painting in realtime with a custom CgFX shader, like a sin() wave per axis moving over time, or any feedback more relevant than plain colors.
- you can generate custom info procedurally: for example, using a sphere's translate and scale transforms as input to your 'baking shader' (like you would with point lights), you can easily generate vectors 'pushed away' by those spheres in the texture space of your river, to simulate obstacles. I did a similar test recently: baking the light direction in texture space to create specular from self-illumination maps.
Having math control over the output of the shader also lets you normalize or apply whatever other useful operation, like transferring to world-space coordinates.
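The sphere-obstacle idea above can be sketched in plain code rather than a baking shader. Everything here is a made-up illustration (function name, obstacle format, linear falloff), just to show the "pushed away" vectors:

```python
# Push flow vectors away from sphere obstacles in the river's texture space.
# Each obstacle is (center_u, center_v, radius); base_flow is the unmodified
# current direction. The linear falloff is an arbitrary choice for the demo.
import math

def flow_at(u, v, base_flow, obstacles):
    fx, fy = base_flow
    for cx, cy, radius in obstacles:
        dx, dy = u - cx, v - cy
        dist = math.hypot(dx, dy)
        if 1e-6 < dist < radius:
            push = (radius - dist) / radius      # 1 at center, 0 at edge
            fx += dx / dist * push               # push away from the sphere
            fy += dy / dist * push
    length = math.hypot(fx, fy)
    return (fx / length, fy / length) if length > 1e-6 else (0.0, 0.0)

# current flowing +X, one rock at (0.5, 0.5) with radius 0.2:
# a sample just downstream of the rock gets deflected sideways
print(flow_at(0.5, 0.6, (1.0, 0.0), [(0.5, 0.5, 0.2)]))
```

Samples outside every obstacle's radius just return the normalized base flow unchanged.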
You don't really benefit from having a normalized vector though, since you use information only in the texture plane. Two axes already give you full control over the speed. It's not really relevant to store info in the length when you don't use the 3rd dimension of the vector.
I guess you could use it for spinning/rotation info or something else
I'd like to see what you come up with, it'll be really interesting. Good luck!
The game is using a non-realistic style, and the water won't be using a visually complicated shader for the rendermesh. In fact the normalmap for the flow will probably be super-low res, like 32x32. It's a late feature, so there's not much time to devote to tools. Vertex color might make sense for storage though. I thought about playing with Particle Flow, but I know that'll take some time to get right, and I have way too many other things that need attention too.
I'll post some pics in a few months when the project goes public. More ideas welcome anytime.
Vertex colour would have been useful if you had a complicated surface, so you could treat bends in the river differently without having to create more and more maps, though you'd probably need a fair few polys to get any definition.
I'd be interested to see how this turns out.
maya ftw, I'd need to learn a lot more about max to do it.
The X and Y channels can be created by rearranging the smoothing groups so that you get a sort of segmented surface. Now you have two normals per vertex, and so three points: this way you can use the vertex and the two normal destination points in a layer, rotate the average of the two normal vectors 90 degrees downwards, and you have the flow direction. This is then rendered out as a tangent normal map, and the blue channel is replaced by the grayscale image from earlier. Whether to normalize after that is a matter of trying.
But I think this is a simple lazy man's algorithm to generate your flow normal maps, without having to think about them, lol
here you can see what I mean with the normals
edit: I found spelling mistakes, you are free to keep them
i.e.

length(Normal[r,g]) = 1
B = 0..1 (scales MaxRiverFlowSpeed)

So the force or speed of the river at any given pixel is:

Normal[r,g,0] * (B * MaxRiverFlowSpeed)
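In code, that per-pixel force works out to something like the sketch below. The 0..1 to -1..1 remap of the direction channels is an assumption (standard normal-map packing); the function and parameter names are just illustration:

```python
def river_force(r, g, b, max_river_flow_speed):
    """r, g: unit direction packed into 0..1; b: speed scalar in 0..1."""
    # remap direction channels back to -1..1 (standard normal-map packing)
    direction = (r * 2.0 - 1.0, g * 2.0 - 1.0, 0.0)
    speed = b * max_river_flow_speed
    return tuple(c * speed for c in direction)

# direction fully +X, half speed, max speed 10
print(river_force(1.0, 0.5, 0.5, 10.0))   # -> (5.0, 0.0, 0.0)
```

Keeping direction and speed in separate channels like this sidesteps the "length of the normal" question entirely.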
Not sure how many textures you're going for, but you can get the info with a particle system like you were thinking. If you link the particle's color to its normalized velocity, you effectively get your normal map info, so long as you can capture the particle colors with a bake.
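Linking color to normalized velocity, as suggested, amounts to roughly this. The channel layout (direction in RG, speed fraction in B) and the mid-gray "no flow" convention are assumptions for the sketch:

```python
# Encode a particle's 2D velocity as an RGB flow-map color (all in 0..1).
import math

def velocity_to_color(vx, vy, max_speed):
    speed = math.hypot(vx, vy)
    if speed < 1e-6:
        return (0.5, 0.5, 0.0)          # no flow: mid-gray direction, zero speed
    nx, ny = vx / speed, vy / speed     # normalized direction
    r = nx * 0.5 + 0.5                  # remap -1..1 -> 0..1
    g = ny * 0.5 + 0.5
    b = min(speed / max_speed, 1.0)     # blue channel stores speed fraction
    return (r, g, b)

# particle drifting +X at half the max speed
print(velocity_to_color(3.0, 0.0, 6.0))   # -> (1.0, 0.5, 0.5)
```

Bake those colors to the river's UVs and you have a flow map generated from an actual simulation instead of hand-painting.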
Made a test for my version. If your features are already implemented, and this model + map produce results that act like they should, then my method works, and you only have to sculpt the river like you want it to look and follow the baking steps.
Also one of the coders on TCE posted a link to some work he did recently. Very interesting demo, could be an interesting way to create a flow map. Fun to play with too.
For expediency though I hand-painted a quickie test map this afternoon, just picking colors from a world-space-lit hemisphere, painting swaths and blending them together. Should be good enough for now.