Someone mentioned here a while ago (I think) baking JIStyle's skin shader via render to texture. I was just wondering if anyone does this, and if so, how did the results come out? What hurdles did you run into?
I dunno... I'm messing around with it right now, but I read something a while back. Someone here posted about baking JIStyle's skin shader, kinda like you do when baking a shadow map. It was a while ago and I'm not sure how credible it was.
I guess you could do something similar to baking specular: get multiple bakes from different angles and comp them in Photoshop to remove the dodgy bits?
Real-time shaders can't (AFAIK) be rendered by software renderers.
However, you can render to texture via the actual FX file (like one would do in a post-process fx/shader). You can render the mesh to UV space (this is often done for all sorts of blurring/processing effects on the texture itself). How you'd get that onto disk I'm not sure; most likely a programmer would need to write something that saves the rendered texture out of the buffer.
I'm no expert, but I know you'd be able to "render to texture" from a real-time application; there's probably no way to do it via a software renderer.
That's what I meant by "send UVs as positions": if he changes the vertex shader that way, he'll see a render-to-texture-like effect in the viewport. Once that's running, just take a screenshot of the viewport.
To do so, edit line 398:
OUT.Position = IN.UV*2 -1;
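
For context, a minimal sketch of what the full edit might look like (struct and function names here are my own, not the ones in the actual FX file). The POSITION output has to be a float4, and V runs top-down in texture space while Y runs bottom-up in clip space, so the V coordinate needs flipping:

struct VS_IN
{
    float3 Position : POSITION;
    float2 UV       : TEXCOORD0;
};

struct VS_OUT
{
    float4 Position : POSITION;
    float2 UV       : TEXCOORD0;
};

VS_OUT vsUnwrap(VS_IN IN)
{
    VS_OUT OUT;
    // Place each vertex at its UV coordinate instead of projecting it:
    // map UV [0,1] to clip space [-1,1], flipping V as noted above.
    OUT.Position = float4(IN.UV.x * 2 - 1, 1 - IN.UV.y * 2, 0, 1);
    OUT.UV = IN.UV;
    return OUT;
}

With that in place the mesh rasterizes flat in its UV layout, and a viewport screenshot gives you the "baked" image.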
However, I analyzed the effect further, and this likely won't work, because the effect relies on Z and backfaces, and you cannot simulate that properly when going to UV space.
Basically, he renders the backfaces first to lay down the maximum distance from the camera to the object's back side, then the frontfaces. The distance between front and back is the object's thickness at that pixel, and thickness is what you need for SSS. Since thickness changes with every camera angle, you cannot bake it.
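
If it helps to see the idea, here's a rough sketch of those two passes (all names are mine, assumed for illustration; this is not the actual shader code). Pass 1 draws with culling flipped so only backfaces reach the pixel shader; pass 2 draws the frontfaces and subtracts:

texture BackDepthTex;   // render target filled by the backface pass
sampler BackDepthSamp = sampler_state { Texture = <BackDepthTex>; };

// Pass 1: store eye-space depth of the back surface.
float4 psBackDepth(float eyeZ : TEXCOORD0) : COLOR
{
    return float4(eyeZ, 0, 0, 1);
}

// Pass 2: thickness = backface depth minus frontface depth.
// screenUV is this pixel's position in the render target, computed
// from the projected position in the vertex shader.
float4 psThickness(float eyeZ : TEXCOORD0, float2 screenUV : TEXCOORD1) : COLOR
{
    float backZ = tex2D(BackDepthSamp, screenUV).r;
    float thickness = backZ - eyeZ;     // distance through the object
    return float4(thickness.xxx, 1);    // drives the SSS falloff
}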
And render-to-texture cannot work here, as "backfaces" in the 2D UV layout don't really make up the back side of the object. While you could pass along the depth of the original vertex, you can't find out which faces are currently the back side. Well, the more I think about it, you actually could, but you'd need to manually pass triangle normals (not vertex normals) and discard those pixels yourself, which would be quite complex and messy, and not easily doable in Max.
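
For what it's worth, a speculative sketch of that discard idea might look like this (names and semantics are assumed; and it uses interpolated vertex normals, which is exactly the approximation being warned about, since true triangle normals aren't available here):

float4 psUVBackDepth(float3 worldNormal : TEXCOORD1,
                     float3 toCamera    : TEXCOORD2,
                     float  eyeZ        : TEXCOORD3) : COLOR
{
    // clip() discards the pixel when its argument is negative, so this
    // keeps only surface points facing away from the camera.
    clip(-dot(normalize(worldNormal), normalize(toCamera)));
    return float4(eyeZ, 0, 0, 1);   // back-side depth, written in UV space
}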
But how would you bake something that relies heavily on light and camera position?