hi everyone,
I've been working on a water shader with custom flows in it, and I'm stuck at the flow maps. I'd rather not paint them in Photoshop; instead I'd like something like what Valve has: a plugin for your 3D software (in my case 3ds Max) that lets you sculpt normals to generate the flow map.
The thing is, generating the normal map once you have the vectors is easy, but you need to have the vectors first.
Max has a modifier called "Edit Normals" that seems to solve that problem, but moving the normals around one by one is just not acceptable.
I want my script/rollout/plugin to be brush-based, using thePainterInterface in MaxScript or something like it, but that is precisely where I am stuck.
For one, I have never worked with thePainterInterface before, and from what I've been reading, it is only used for things like bitmap painting and general data getting and setting. I have no idea how to turn it into the 3ds Max equivalent of the Move brush in ZBrush, though.
Can anyone help me with this? It would be greatly appreciated.
Oh, and once this is done I will put it on ScriptSpot so everyone can download it.
Replies
Basically I want to combine a brush interface like the one in the Hair and Fur modifier with the ability to grab the normal handles of the Edit Normals modifier.
http://tech-artists.org/forum/showthread.php?t=429
I'm now going for the painting approach: paint directly onto your mesh, with a brush that changes direction depending on which direction you're moving in. It's really the same as the one Ben Cloward made, but made by me. He of course hasn't put his online, which I will do.
A few things I tried:
1) Wrote a tool for Max that would push the verts of a grid plane away from colliding verts. That gives you a base "flow" around geo (like rocks in a pond).
2) Wrote an external tool that takes a mask image, and generates a 2d flow offset map using a fluid dynamics solver. That was pretty good, but I didn't quite get what I wanted out of it.
3) Use a dense grid plane with 1:1 UVs as your base. Deform this mesh by hand using transform tools (move with soft select, etc.), then bake the UVs out to a map (or project the UVs onto the vertex colors of a sparser mesh).
4) Painting in photoshop - works OK, but isn't intuitive and is a pain for iteration.
All of these provided results of varying usefulness. In the end, just do whatever looks good. You need surprisingly little data to get a flow that is better than nothing, and a full simulation is overkill. What Valve did with Houdini was quite nice, but honestly the final in-game result didn't seem to warrant the R&D time - at least from what I could see in L4D2 and Portal.
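Option 3 above boils down to simple arithmetic: the flow at each vertex is just how far that vertex was dragged in the plane, remapped into color range. A minimal Python sketch of that idea, under my own assumptions (the function names are mine, not any 3ds Max API):

```python
def encode_flow(dx, dy, max_len=1.0):
    """Map a 2D flow vector from [-max_len, max_len] into [0, 1] RG colors."""
    clamp = lambda v: min(max(v, 0.0), 1.0)
    r = clamp(dx / (2.0 * max_len) + 0.5)
    g = clamp(dy / (2.0 * max_len) + 0.5)
    return r, g

def flow_from_deformation(original_verts, deformed_verts, max_len=1.0):
    """Per-vertex flow = how far each vertex was dragged in the plane,
    returned as a flow-map color per vertex."""
    colors = []
    for (ox, oy), (nx, ny) in zip(original_verts, deformed_verts):
        colors.append(encode_flow(nx - ox, ny - oy, max_len))
    return colors
```

With a 1:1 UV mapping, baking those per-vertex colors out to a texture gives the flow map directly.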
The two options at this point are:
- painting directly onto your mesh (or in 2D) with a brush that changes to the correct colors depending on which direction you are moving in
- drawing vectors in Max using dummies (just click twice, once for the start point and once for the end point), then automatically generating a flow map by sampling the mesh vertices nearest to each vector and setting their vertex colors depending on how close they are to it
The ups/downs of both:
The first method uses painting, which is pretty intensive if it's done inside Max, tends not to work reliably, and is quite demanding on your computer. The upside is that it's very user-friendly, of course.
The second method is not that intensive to get working. The massive downside is that you need lots and lots of vertices to make the map look sharp enough - at least in some cases; most of the time a flow map is only made worse by sharpness.
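To make the second option concrete, here is a rough Python sketch, under my own assumptions, of the sampling step: measure each vertex's distance to a drawn start->end segment and write a direction color with a linear falloff. Nothing here is 3ds Max API, it's just the math:

```python
import math

def point_segment_distance(p, a, b):
    """Distance from 2D point p to the segment a-b."""
    ax, ay = a; bx, by = b; px, py = p
    abx, aby = bx - ax, by - ay
    ab2 = abx * abx + aby * aby
    t = 0.0 if ab2 == 0 else max(0.0, min(1.0, ((px - ax) * abx + (py - ay) * aby) / ab2))
    cx, cy = ax + t * abx, ay + t * aby
    return math.hypot(px - cx, py - cy)

def paint_flow_from_vector(verts, start, end, radius):
    """For each vertex near the start->end segment, return a vertex color
    encoding the segment's direction, faded by distance (None = untouched)."""
    dx, dy = end[0] - start[0], end[1] - start[1]
    length = math.hypot(dx, dy) or 1.0
    dirx, diry = dx / length, dy / length
    colors = []
    for v in verts:
        d = point_segment_distance(v, start, end)
        if d > radius:
            colors.append(None)          # vertex is outside the vector's influence
            continue
        w = 1.0 - d / radius             # linear falloff toward the edge
        colors.append((dirx * w * 0.5 + 0.5, diry * w * 0.5 + 0.5))
    return colors
```

This also shows why density matters: the result can only ever be as sharp as the vertex spacing allows.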
Which would you guys prefer at this point, having heard the plans?
Anyone have any idea how to get the brush tool in Max to do this?
Especially if it worked on an object's volume, so I could, say, import level geometry and have it slightly repel the normals.
What Ben from tech-artists was talking about:
http://www.kxcad.net/autodesk/Autodesk_MAXScript_Reference_9/How_To_Introduction_to_the_Tutorials.htm [Scroll down to the Painting tutorial]
this is taken from this paper:
http://www.valvesoftware.com/publications/2010/siggraph2010_vlachos_waterflow.pdf
In any case, I'm still open to suggestions on how to do this in Max, if anyone's got anything good.
I remember some guy in the Tech Talk topic (the pinned topic up top) made a Fracture/Voronoi script for Max, which essentially lets you 'paint' the strokes of the fracture outline in a spline-like fashion from the screen position before breaking the mesh. That could be a good way to go about it; it might sound limited, but in practice it works.
Will post a link to the script once I find it.
Found it, here is what I mean: http://folk.ntnu.no/havardsc/site/wordpress/?page_id=15
thanks for sharing
I will definitely try that if it turns out my current approach doesn't work.
I've come across this function in the Painter Interface in MaxScript:
thePainterInterface.buildNormals
It sounds like something I could use, but I can't find its usage documented anywhere. Can anyone fill me in, please?
Right, so I managed to make the Painter Interface work the way I originally intended: you can now create actual flow maps with my script, if you render to texture manually.
The problem, though, is performance: at around 2000 vertices the script stops running smoothly, and soon stops working entirely.
That is simply not acceptable, since I aim to brush normals on terrain meshes, which can easily go over 100k vertices.
Now I have decided to convert my script to a full-fledged plugin, meaning I will have to use the 3ds Max SDK and program in C++.
This should not be a big problem, since I have a pretty solid grounding in C++.
However, I've been at it all day now and haven't gotten anything to work, not even the tutorial lessons. I always get error messages: the lessons don't compile and I get external linking errors. At first this was due to some Windows 7 permission issues, which I fixed, but since then the problems have only gotten worse...
Is there anyone here with experience with the 3ds Max 2012 (or earlier) SDK who can help me out? I realise my explanation isn't very specific, but it's the best I can do for now; the whole thing would be too confusing otherwise.
This would be awesome with a live viewport shader, although I don't know how you could extract the flow map in real-time.
About the live shader: I'm afraid that won't be possible - not even Valve has that, so I certainly won't be able to do it. It would require live GPU streaming and recompiling the shader in real time, which no hardware configuration I know of can currently do.
http://valvesoftware.com/publications/2011/gdc_2011_grimes_nonstandard_textures.pdf
Unfortunately it's not exactly what I need to complete my plugin. The combing itself is actually pretty simple; it's just that its performance in MaxScript is incredibly bad, so I'll have to do it in C++.
The problems I had with the SDK have been solved in the meantime, so development can now begin properly.
I'll keep you all updated on my progress and on whether or not I'll release it to the public.
The paint tool part of it is easy. You can just base it off of the tutorial in the MaxScript documentation on creating a paint tool here:
http://docs.autodesk.com/3DSMAX/14/ENU/MAXScript%20Help%202012/index.html?url=files/GUID-FA2D9E9C-90EA-4E95-926E-29BB2FA416F-329.htm,topicNumber=d28e110104
Once you've got that bit implemented, you can put in some code to replace the color of the brush strokes with the color-encoded vectors. I'll dig around a bit and see if I can find my script; if I do, I'll post it for you to grab.
One thing, though: the reason I went for normal vector "brushing" is that it's clearer for the user -> you can review and go over your level without having to think in color values in your head.
I just got approval to release my plugin open source, so you can expect it to be put online as soon as I finish it ^^
step 1: pick the object you wish to render your normals from:
step 2: enable the Comb Normals button. This disables the buttons that would mess up vertex IDs and other script functions.
step 3: comb your normals!
step 4: render a normal map to texture using the Projection modifier;
the script automatically copies your original mesh before you start sculpting your normals, so you'll always have a mesh to RTT from.
step 5: resulting render
The resulting render's R and G channels can now be used to offset the time factor of a panner that pans the UV coordinates of your textures. The result is that your textures will have "flows" in them corresponding to where your vectors were sculpted.
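In shader terms, that decode step is just the reverse remap (times 2 minus 1) with the result scaled into the pan. A tiny Python sketch of the math, with illustrative parameter names; the real shader code will obviously differ:

```python
def decode_flow(r, g):
    """Expand [0, 1] RG channels back to a [-1, 1] flow vector."""
    return r * 2.0 - 1.0, g * 2.0 - 1.0

def panned_uv(u, v, r, g, time, strength=0.1):
    """Offset the UV pan by the decoded flow, scaled by time and a
    hypothetical strength tuning factor."""
    fx, fy = decode_flow(r, g)
    return u + fx * time * strength, v + fy * time * strength
```

A neutral gray texel (0.5, 0.5) decodes to zero flow, so unpainted areas stay still while sculpted areas pan in the combed direction.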
PS: the resulting render looks weird, I know, but that's because I used my script on a box with flatten mapping applied. The goal was to use this script on a plane, but I was having trouble with convex/concave meshes - more precisely, the normal IDs didn't correspond to the vertex IDs, no matter how many ways I converted / transferred them. I won't have this problem with the SDK, though.
PPS: you will also note that I used a fairly untessellated box for this example. The reason is that my script stops working at around 1k vertices - another reason I'm switching to the SDK.
All there is to it, I assume, is re-expanding the R and G channels (times 2 minus 1, just like normal maps), then using that knowledge to paint directions into vertex colors (getting the direction from the delta between brush updates or something like that - you must already have code for this, I assume). You could even use the blue or alpha channel for speed.
And if necessary, you could transfer the vertex color directions into normals and bake them in some sort of final processing step.
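The delta-between-brush-updates idea reduces to normalizing the stroke movement and remapping it into color range, with the stroke speed optionally packed into blue. A hedged Python sketch of just that math (max_speed is an invented tuning parameter, not from any API):

```python
import math

def brush_delta_color(prev, curr, max_speed=10.0):
    """Turn the movement between two brush updates into an RGB vertex color:
    normalized direction in R/G (remapped [-1,1] -> [0,1]), speed in B."""
    dx, dy = curr[0] - prev[0], curr[1] - prev[1]
    length = math.hypot(dx, dy)
    if length == 0:
        return 0.5, 0.5, 0.0            # no movement -> neutral "no flow" color
    b = min(length / max_speed, 1.0)    # stroke speed packed into blue
    return dx / length * 0.5 + 0.5, dy / length * 0.5 + 0.5, b
```

Calling this once per brush tick and writing the result into the vertex colors under the brush would give the painted flow directly, with no RTT step.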
Yes, I thought about that; actually, that was my first intention. The problem is not that it isn't doable, because as you say I do indeed have the code for it (or at least most of it, since most of it is the same as the code needed for sculpting the normals).
The problem with that suggestion is that it's not very clear to the user what the final result will look like. With the normal sculpting method, you can instantly see the direction your normals will flow in from the direction they are pointing. It's much harder to "convert", so to speak, color values in your head than it is to foresee what the flow will look like when you can see the normals directly, like in my script.
I will, though, seriously consider that option for usage instead of rendering a normal map.
About the render to texture: I intended that to be automatic, but since the script wasn't finished when I changed from MaxScript to a plugin, that option hasn't been implemented yet.
About using the B channel for speed: I will certainly use that; great suggestion.
I will still create a plug-in instead of a MaxScript, though, since performance will be a problem no matter what kind of implementation you use; I'm assuming vertex color painting will become a problem on a mesh of around 10k vertices.
Also, there is still the very impractical problem with convex/concave meshes, which is easily solvable in C++ and insanely elusive in MaxScript.
The second, actually - that's exactly what I'm doing.
The performance problem, though, is caused by the fact that every normal in the mesh has to be tested for a hit against the brush on every tick of every stroke, and then moved if the hit is true. That is very calculation-intensive, which is why I've chosen to go from MaxScript to C++: it performs much better.
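For reference, the per-tick cost described above looks something like this brute-force loop. This is a Python sketch of the logic under my own assumptions, not the actual MaxScript or SDK code; the point is that every vertex is visited on every tick, which is what makes large meshes choke:

```python
import math

def brush_tick(normals, positions, brush_pos, radius, drag, strength=0.5):
    """One brush tick: test every normal against the brush (the O(N) cost)
    and drag the ones inside the radius toward the `drag` direction."""
    out = []
    for n, p in zip(normals, positions):
        d = math.dist(p, brush_pos)
        if d >= radius:
            out.append(n)               # outside the brush: untouched
            continue
        w = strength * (1.0 - d / radius)
        moved = tuple(c + w * g for c, g in zip(n, drag))
        length = math.sqrt(sum(c * c for c in moved)) or 1.0
        out.append(tuple(c / length for c in moved))   # re-normalize
    return out
```

A spatial structure (grid or octree) over the vertices would cut the per-tick test down to the brush's neighborhood, which is presumably also what makes the C++ version viable at 100k vertices.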
thanks for the compliment btw
thanks guys
good work though
gj m8
If it's not very clear to the user, what about having a preview shader? I can write you one quickly if you want. That seems like a very minor disadvantage compared to the advantages of vertex colors (no texture memory used, just lightweight vertex color data, no pre-processing RTT step, live viewable...).
Also, vertex color painting should be pretty fast. Access to the vertex colors is meant to be efficient; that's the whole idea: low-overhead data. I remember a tutorial from around 2004 that showed someone painting vertex colors on a SubD object, with the hardware from back then.