Started a new wiki page, inspired by the nice hair work from slipgatecentral, but also to hopefully showcase wires & techniques from various artists.
Thanks for posting this Eric. It's a great idea imo.
I developed a bunch of techniques to deal with the hair on my shepherd. I actually painted much of that by hand so that I could use the strokes in the normals and spec to get everything working properly.
Anyways, I'll post up my "notes" on what I did in this thread in the near future. I just have to get them a bit more organized and legible first.
About FF13: here's how its hair looks on PS3 @ 720p (I don't think the game runs any higher except for cutscenes, so I suspect it's been designed to look exactly like this).
Looks like they render one line out of two in order to get a nice fuzzy effect ... pretty awesome if you ask me. I guess this prevents any 'background showing through' artefacts, while still looking smooth and fuzzy enough...
pior, looks like that is faked soft alpha. A lot of games seem to go with that now, although I have never seen it done with horizontal lines like that.
shadow of the colossus hair: http://game.watch.impress.co.jp/docs/20051207/3dwa.htm
(Japanese but the pictures explain it enough) kinda dated but still gives a pretty good result, and if you animate uvs you could get a nice wind motion. For good results the alpha cutoff value needs to get higher as the hair surface gets further away from the original mesh (vertex color would be a good way to do that).
Also this and this is a nice idea but only for environments.
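The shell-hair trick from the SOTC article can be sketched in a few lines. This is a hypothetical illustration, not code from the game: the cutoff values and function names are made up, but it shows the idea that each shell layer further from the base mesh alpha-tests against a higher threshold (which is what you'd drive with vertex color), so only the densest strands survive on the outer shells and the hair tapers off softly.

```python
# Sketch of per-shell alpha cutoffs for shell-rendered hair (hypothetical values).

def shell_cutoff(layer, num_layers, base_cutoff=0.1, max_cutoff=0.9):
    """Alpha-test threshold for a given shell layer (0 = base mesh surface)."""
    t = layer / float(num_layers - 1) if num_layers > 1 else 0.0
    return base_cutoff + t * (max_cutoff - base_cutoff)

def strand_visible(texture_alpha, layer, num_layers):
    """A hair texel passes the alpha test only if it exceeds its shell's cutoff."""
    return texture_alpha > shell_cutoff(layer, num_layers)
```

A strand texel with alpha 0.5 would survive on the inner shells but get discarded on the outer ones, which is what thins the silhouette out.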
Yep, that FF looks like "alpha to coverage", where you use a screen-space low res tiling map to break up the outline of the 1-bit alpha. Most games that I've seen using it actually use a noise map for more fuzziness, quite interesting to see it done with a line-based mask (if that is indeed what it is).
Ah, I was wondering what the official term was :P I have tried this and found that noise doesn't really work that well. BFBC2 uses a small square pattern with alternating thresholds, but that requires having the screen size in pixels accessible in the shader.
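The alternating-threshold idea can be shown with a toy screen-space stipple test. This is a sketch in the spirit of the trick described above, not the actual BFBC2 pattern; the 2x2 threshold values here are made up. Each screen pixel compares the surface's alpha against its own threshold, so a half-transparent surface lights up roughly half the pixels in a fixed pattern.

```python
# Minimal screen-space stipple / alpha-to-coverage-style test (hypothetical pattern).

PATTERN = [[0.125, 0.625],
           [0.875, 0.375]]  # per-pixel alpha-test thresholds, tiled across the screen

def stipple_pass(px, py, alpha):
    """True if the fragment at screen pixel (px, py) survives the test."""
    threshold = PATTERN[py % 2][px % 2]
    return alpha > threshold

# A 50%-transparent surface covers half the pixels in each 2x2 tile:
covered = sum(stipple_pass(x, y, 0.5) for y in range(2) for x in range(2))
```

Because the pattern is anchored to screen pixels rather than texture space, the checkerboard stays pixel-exact no matter how the surface moves, which is what makes the fuzz read as translucency instead of texture detail.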
Unfortunately I don't have the disc anymore, hence I cannot try - but one way to find out about FF would be to simply force it to 480p output and check it out. It looks a lot like the description of screen-space stipple indeed. Shots of the 360 version could help too.
Come on guys be brave, someone pull out this crappy game and take SD screenshots! hehe
Well for ff13 I have these two shots if they help:
hehe I posted those images months ago somewhere, I'm surprised someone picked it up. I've got plenty of models from various games I can show pictures of.
How did you get these? They look like shots from 3ds Max.
I've been trying to use 3D Ripper DX, and RipperDX, to capture meshes from Stalker, but they don't seem to like Steam games. Something about how the EXE has to run thru Steam I think.
I get tools from people who reverse engineer games and such; their tools can mostly export to .obj or .dae, and yes, the screenshots are from 3ds Max.
I don't use 3D Ripper DX much (I don't have many pc games) so I don't really know the problem.
Prolly all seen this but I'll post it just in case; can't remember where I found it originally, but it's a hair techniques PDF from a GDC 07 presentation.
Hehe, very cool to see that the trick is slightly different on her! (the screen I took was of a secondary character) Is such a technique doable in any popular PC engine? I like the 50% blurry pass too - I would suspect it would break if rendered over another character with similar hair (the usual 'tree leaves showing through hair' problem)...
What's to stop you doing the dithering by hand with a DXT1 texture? Might be a good way to get *some* kind of softening effect in engines that don't support alpha to coverage or any other nice blending.
Nothing to stop you at all, I've used it in the past. In fact I think I used a filter to convert a soft edges alpha into a dithered alpha.
Different effect though. The hair above is dithered in screen pixel space, so the checkerboard pattern exactly matches the screen pixels.
If you do it in texture space, you won't get that correspondence, the dither is going to be drawn in perspective, mipped, filtered, etc. I mean, it's useful to dither in texture, but you won't get the above effect.
Personally I don't like the dithering at all. Pretty distracting. Crysis does it on their tree foliage and it shimmers like crazy. Better than just alpha test alone, but still not great.
But anyway, can anyone explain this some more so I can follow? How do you create and save out a texture to use this dithering trick? Would the 3Point Shader support something like this?
I don't know of any off-the-shelf shaders that support this, in fact I don't know the exact mechanism either.
But I bet it's basically that you feed it a small tiling checkerboard pattern, and the custom shader uses that as a mask for whatever passes it's set up to mask.
Yep, you just put a small tiling greyscale pattern in your shader and map it in screenspace so 1 pixel on the pattern == 1 screen pixel. If your pattern was 4x4 pixels you would do something like this:
float2 patternUVs;
// screenUVs run 0..1 across the viewport, so scale up by the screen size in
// pixels, then divide by the pattern size for a 1:1 texel-to-pixel mapping.
patternUVs.x = (screenUVs.x*screenWidth)/4.0;
patternUVs.y = (screenUVs.y*screenHeight)/4.0;
float patternValue = tex2D(patternTexture, patternUVs).r;
result.a = diffuse.a*patternValue; // Maybe there's a better way to combine it but this seems like it should work.
return result;
And pior, the reason for using this is to avoid sorting issues. It does look like there's a 50% pass rendered under the opaque hair though. What they probably did is take all the hair objects, sort them by depth, render them to a separate screen buffer (with masked-out areas where geometry occludes them), blur it, then composite that just before rendering the opaque hair. I bet they don't do that step during gameplay though, just in cinematics.
Dithering is pretty old school. Back in the CGA/EGA/VGA days colour palettes were extremely limited (you'd literally have 16 colours and that was it). To broaden the range of colours you dithered. For instance, if you wanted orange but only had red and yellow available, you put red and yellow pixels down in a checker pattern; the eye mixes them together and you get orange... or that was the idea.
If you're familiar with the painting term 'optical mixing', it's the same phenomenon.
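The red/yellow checker idea above can be demonstrated numerically. A toy sketch (a plain mean stands in for the eye's blending; nothing here is from a real renderer):

```python
# Optical mixing: a red/yellow checkerboard averages out to orange.

RED = (255, 0, 0)
YELLOW = (255, 255, 0)

def checkerboard(w, h):
    """Alternate red and yellow pixels in a checker pattern."""
    return [[RED if (x + y) % 2 == 0 else YELLOW for x in range(w)] for y in range(h)]

def average_color(img):
    """Per-channel integer mean over all pixels, i.e. what the eye roughly sees."""
    n = sum(len(row) for row in img)
    return tuple(sum(px[c] for row in img for px in row) // n for c in range(3))

# average_color(checkerboard(8, 8)) -> (255, 127, 0), i.e. orange
```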
If you want to bake it into the texture you can edit the alpha channel by hand. You could paste your alpha into a new image and set the image mode to indexed colour/bitmap(depending on the depth of your alpha).
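If you'd rather script the bake than convert by hand, a common stand-in for the indexed-colour conversion is an ordered dither with a Bayer matrix. A hedged sketch (function names and the one-bit output format are my own, not from any particular tool):

```python
# Bake a soft greyscale alpha down to a 1-bit dithered alpha with 4x4 ordered dithering.

BAYER4 = [[ 0,  8,  2, 10],
          [12,  4, 14,  6],
          [ 3, 11,  1,  9],
          [15,  7, 13,  5]]

def dither_alpha(soft_alpha):
    """soft_alpha: 2D list of floats in 0..1 -> 2D list of 0/1 ints."""
    out = []
    for y, row in enumerate(soft_alpha):
        out_row = []
        for x, a in enumerate(row):
            # Each texel tests against its own threshold from the tiled matrix.
            threshold = (BAYER4[y % 4][x % 4] + 0.5) / 16.0
            out_row.append(1 if a > threshold else 0)
        out.append(out_row)
    return out
```

Note this dithers in texture space, so as pointed out above it will be drawn in perspective, mipped and filtered - you get a softened edge, not the pixel-exact screen-space pattern.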
Ran across this while doing some research: Maya PaintFX Hair: Getting Started… His examples at the end are just for demonstration; with some smart placement I think it would look pretty sharp, and probably closer to the Final Fantasy hair.
Replies
I added some GOW2, Yuna, and Varga shots.
Added an English translation of the SOTC article though.
Cheap wikipedia link... too lazy to find anything more in-depth at the moment
http://en.wikipedia.org/wiki/Alpha_to_coverage
While looking for pics, found this great Wolfire blog post.
http://blog.wolfire.com/2009/02/rendering-plants-with-smooth-edges/
Moar pls kthxbye!
The models from this game are full of 'seams', it's not from the normal maps 'cause it's still there when no textures are applied.
https://www.cmpevents.com/sessions/GD/S4585i1.pdf
Valkyria Chronicles had some nice hair as well, Engineer Homer Pieroni in particular. Couldn't find any good images of him via Google though.
http://www.zbrushcentral.com/attachment.php?attachmentid=171653
http://www.zbrushcentral.com/attachment.php?attachmentid=171652
EDIT: Won't let me embed the images for some reason :/
(vBulletin won't embed PHP-generated image links)
Video comparison here, 360 vs. PS3. Hair looks like a combo of alpha blend and alpha test with dither...
http://ve3d.ign.com/videos/play/68727/Xbox-360/Final-Fantasy-XIII/Trailer/Playstation-3-vs-Xbox-360-Visual-Comparison/Flash-Video
Cheers!
~t