So, I understand how UVW mapping works and how the U and V axes relate to the X and Y axes and whatnot, but... what IS the W axis? Is there any cool stuff we can do by messing around with it? I assume it comes into play with '3D textures' (don't know if that's an actual term) like cubemaps. Anyway, just curious what the fuck this is and whether it can be turned into something cool and useful.
Like, I remember when using Cinema 4D a long time ago, if you applied C4D's noise texture and moved the object in 3D space, the texture changed.
Gets in the way sometimes if you're trying to weld UVs but your W distance is greater than your UV distance.
I vaguely remember someone had a cool W trick for something or other. Sorry, maybe they'll repost...
3D textures aren't used very much in games because they usually take a lot of memory. I did some work with them; we used one to store a custom noise pattern for a volumetric smoke effect coming from a spotlight. Really a pain in the ass to make though, since it needed to tile in all three dimensions. Blargh!
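The tiling-in-three-dimensions part is the fiddly bit. One standard way to get it for free is lattice value noise with wrap-around indexing: random values live on an N×N×N grid, lookups take indices mod N, and the pattern repeats seamlessly every N units on all three axes. A minimal sketch (pure Python, not how you'd bake a real volume texture, but the same idea):

```python
import random

def tileable_value_noise_3d(size=8, seed=0):
    """Random values on a size^3 lattice; mod-size indexing makes it tile."""
    rng = random.Random(seed)
    lattice = [[[rng.random() for _ in range(size)]
                for _ in range(size)] for _ in range(size)]

    def smooth(t):  # smoothstep, so the interpolation has no visible creases
        return t * t * (3 - 2 * t)

    def sample(x, y, z):
        # Integer cell and fractional position, wrapped so the pattern repeats
        xi, yi, zi = int(x) % size, int(y) % size, int(z) % size
        tx, ty, tz = smooth(x % 1), smooth(y % 1), smooth(z % 1)

        def lat(i, j, k):
            return lattice[i % size][j % size][k % size]

        # Trilinear interpolation of the 8 surrounding lattice values
        v = 0.0
        for dz in (0, 1):
            for dy in (0, 1):
                for dx in (0, 1):
                    w = ((tx if dx else 1 - tx) *
                         (ty if dy else 1 - ty) *
                         (tz if dz else 1 - tz))
                    v += w * lat(xi + dx, yi + dy, zi + dz)
        return v

    return sample

noise = tileable_value_noise_3d(size=8)
# Because indices wrap mod size, sampling at p and p + size is identical:
print(noise(1.25, 2.5, 0.5) == noise(1.25 + 8, 2.5, 0.5))  # -> True
```

For real smoke you'd sum a few octaves of this and bake it into a volume texture, but the wrap trick is what makes it tile.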
I guess you could store other data in the W channel, but most exporters and engines simply strip it out.
Maybe you could store something like vertex alpha there... monochrome per-vertex info. Physics info comes to mind (friction or bounciness or whatever), or sound data (dirt sound vs. rock sound).
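To make the idea concrete, here's one hypothetical packing scheme for that kind of per-vertex info: the integer part of W holds a surface-type index (for footstep sounds) and the fractional part holds friction. The names and the surface list are made up for illustration:

```python
# Hypothetical packing of per-vertex gameplay data into the W coordinate:
# integer part = surface type (dirt vs. rock sound), fraction = friction.
SURFACES = ["dirt", "rock", "metal"]

def pack_w(surface_id, friction):
    # friction must stay in [0, 1) so it doesn't bleed into the integer part
    assert 0.0 <= friction < 1.0
    return surface_id + friction

def unpack_w(w):
    surface_id = int(w)
    return SURFACES[surface_id], w - surface_id

w = pack_w(1, 0.75)      # a "rock" vertex with friction 0.75
print(unpack_w(w))       # -> ('rock', 0.75)
```

You'd want values that survive whatever float precision the pipeline keeps for texture coordinates, but the round trip itself is trivial.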
Hmm, now we're getting somewhere. Weights for animated vertex shaders too (rippling water or a waving flag come to mind). Still, is it worth the hassle? Do vertex colors add enough memory overhead to make it worth the trouble of authoring this stuff into the W channel?
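The rippling-water case would look something like this on the CPU (on hardware you'd do it in the vertex shader, but the math is the same). The W value acts as a per-vertex mask, so e.g. shoreline vertices authored with W = 0 stay put; all parameter names here are illustrative:

```python
import math

def ripple_height(w_weight, x, z, t, amplitude=0.2, freq=4.0, speed=2.0):
    """Vertical displacement for one vertex. w_weight (0..1, stored in the
    W channel) scales the ripple, so weight-0 verts never move."""
    return w_weight * amplitude * math.sin(freq * (x + z) + speed * t)

# A shoreline vertex (weight 0) is pinned; a mid-lake vertex (weight 1)
# gets the full-amplitude wave.
print(ripple_height(0.0, 1.0, 2.0, t=0.5))  # -> 0.0
print(ripple_height(1.0, 1.0, 2.0, t=0.5) != 0.0)  # -> True
```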
I posted this yonks ago and it's a very handy trick.
As for hassle, I guess it's just a matter of finding or making tools to let you manipulate W as paintable vertex data. Then you also need an exporter that stores W instead of tossing it. And an importer for the engine that reads the W data.
Exactly. Now, is it worth developing all this instead of just using vertex colors straight away? I've always assumed they don't add much in terms of memory, so as much as I like the idea of milking stuff to the very last drop, it all sounds a bit overkill.
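Quick back-of-envelope math on that, assuming the usual formats (RGBA8 vertex colors at 4 bytes, W stored as one extra 32-bit float on top of the UVs):

```python
# Rough per-vertex memory for a 10,000-vertex mesh.
# Sizes are typical assumptions, not universal.
verts = 10_000
rgba8_color = 4   # bytes: one 8-bit channel each for R, G, B, A
float32_w   = 4   # bytes: W as a 32-bit float added to the UV set

print(verts * rgba8_color)  # -> 40000 bytes for vertex colors
print(verts * float32_w)    # -> 40000 bytes for a float W channel
```

So under those assumptions the W channel doesn't even save memory over vertex colors, which rather supports the "is it worth the hassle?" skepticism.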
If the render engine supports it, you can have a lot of per-vertex attributes encoding whatever you want. Hardware typically supports 16 attributes of 128 bits each per vertex (you can add even more through custom buffer fetching in a shader), but of course it costs more time to fetch more data...