
Scanned data or hand painted...?

DustyShinigami polycounter lvl 5

Hi

Looking at the head texture for Arthur Morgan from RDR2, would you say scanned data was used, or do you think it's hand-painted in Photoshop...?

I mean, I'm inclined to say it's scanned data, due to how detailed it all is. Especially the stubble. But then, development for RDR2 probably started before scanned data became commonplace...? When did that start being used exactly? I know Rockstar did (and no doubt still does) use Photoshop for a lot of its texturing, but this seems a lot more realistic than if it were hand-painted.

I doubt the fine hair was from XGen. I tried mimicking it for a hair cap, but it looks crap in comparison. Even with a Normal map added. The root and ID maps will have to be used once it's all set up in UE4...

But I've also observed, with scanned data packs available online, that all the head textures from models are shaved bald. Or the shaved hair is much finer. Though I suppose Rockstar probably take their own scanned data from models, right?

Apart from the 3D Scan Store and Texturing.xyz, are there any other sites out there that provide scanned texture data, but that actually include hair similar to Arthur's above? Or does it normally have to be shaved completely due to scanners not being able to pick it up? Is the hair added separately? And if so, how exactly? Again, it doesn't look like the results I'd get from XGen. 😅

Thanks

Replies

  • pior grand marshal polycounter

    "But then, development for RDR2 probably started before using scanned data was more commonplace...?"

    You're getting tripped up by your own assumptions - that "realistic" = "scanned" = "high-quality 3D scanned". Digital cameras have existed for decades, and photo sourcing has been used in games and movies for just as long - with or without 3D scanned data.

    https://www.google.com/search?q=3dsk+head+textures&rlz=1C1CHBD_enFR996FR996&source=lnms&tbm=isch&sa=X&ved=2ahUKEwjIwci5zMj4AhUMExoKHY2CA6EQ_AUoAXoECAEQAw&biw=1413&bih=864&dpr=1

    https://www.renderhub.com/3d-sk/male-high-res-head-texture-009

    The following is from 2001. That's 21 years ago.

    https://pbs.twimg.com/tweet_video_thumb/DhMF2h-X4AE423V.jpg

  • DustyShinigami polycounter lvl 5

    Ahhhh. I totally forgot about those. I've been caught up in the world of manual painting (Photoshop/Substance) and 3D scans (Texturing.xyz and 3D Scan Store). I'll have to look into how they're set up...

  • poopipe grand marshal polycounter

    Rockstar have been using scanning and performance capture for a lot longer than most game studios. I believe LA Noire made heavy use of it, and it's entirely possible they were doing it before that.

  • pior grand marshal polycounter

    @poopipe Well, I can almost guarantee you that the model in the screenshots wasn't done in the way of LA Noire (which itself isn't using scanning as it's generally done anyways). This here absolutely looks like a mostly handcrafted model onto which photosourced textures were projected/mapped - perhaps with a rough scan as a likeness reference or as a starting point, but that's about it.

    This method is actually incredibly fast and efficient for creating PS3-ish era models. Load up a base mesh, shape it to match the desired character, project/map a detailed photosourced texture onto it, and keep refining the model according to the landmarks of the texture. It's extremely fast and easy to do.

    That's a pretty common old-school movie CG approach; there are a few old Gnomon DVDs showing assets made that way (before baking from a highres became the norm), around 2005/2010 or so.

    https://www.youtube.com/watch?v=X56aCO7v2uo

  • poopipe grand marshal polycounter

    I wasn't there... just pointing out they've been using rigs for capture etc. for a lot longer than many people think.

    Given it's RDR2, which is relatively recent, I'd expect them to have leveraged their capture setup, as it reduces the workload for animation etc.


    No captured data ever makes it into the game without extensive manual work, so I'd expect your interpretation of the process to be fairly close to reality.

  • Klunk ngon master

    We used some software from the mid-2000s that generated "life-like", "photorealistic" heads from a front and side image for a football game, though it required a lot of post-processing in Max afterwards to get a good result. Can't remember the name of it though :( but that top texture looks very much like the textures the software would spit out, though it seems to have better under-the-chin resolution than we got.

  • DustyShinigami polycounter lvl 5

    This is all useful and interesting info. I'm always intrigued by different approaches to things. It's great when you're unsure how to approach a certain element/situation and can borrow a trick/technique from a different workflow. :)

  • Neox godlike master sticky

    regarding the xgen scalp - i found it better to just use masks/ids for this and do the actual hair color, roughness and normals in substance painter or photoshop using those masks. the baked-down xgen always looked kinda shit, but the same hair baked down with random normal values softly blended into the texture looked very nice.


    you can see some of my dabbling with the topic here. and while it is stylized i am sure it could be used for realistic content

    i'd bake a root-to-tip gradient, some random values for greyscale and normal directions, and play with those in painter

    https://polycount.com/discussion/225558/xgen-eyelashes-eyebrows-best-way-to-convert-to-hair-cards
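
A minimal sketch of that root-to-tip gradient in plain Python (Pillow/NumPy), as a tiled ramp you could assign to the converted hair geo via its UV grid and bake down - the resolution, tile count and root/tip orientation here are assumptions, so flip or rescale to match your own layout:

    import numpy as np
    from PIL import Image

    SIZE, TILES = 512, 3                  # texture resolution and grid size (made-up values)

    tile_h = SIZE / TILES
    rows = np.arange(SIZE)
    t = (rows % tile_h) / tile_h          # 0..1 from the top to the bottom of each tile
    ramp = 1.0 - t                        # assumes white = tip at the top, black = root at the bottom
    gradient = np.repeat(ramp[:, None], SIZE, axis=1)

    Image.fromarray((gradient * 255).astype(np.uint8), mode="L").save("root_to_tip.png")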

  • DustyShinigami polycounter lvl 5

    Thanks for the link. Some of the stuff there is what I've been following, such as Manjiana's tutorial on baking a hair cap. :) Thanks to that, I have an ID and Root map. The only thing I don't have, which looks to be essential, is the flow map. I'm a little unsure about that one though.

    Regarding the masks/ids - I'm a little unsure how I'd generate the Normal map using those in SP. Do you mean to bake the Normal map first and then do something with a mask afterwards...? Like some final compositing...?

    These are all the maps I have so far, which were done in Marmoset Toolbag 4...

    [Images of the baked maps: Alpha, AO, Height, ID, Normal and Root]

    I've also been working on eyelashes/eyebrows recently. I did try baking everything to a single plane, but changed my mind for some reason. It escapes me why. It might have been the workflow I was using - I've been using Johan Lithvall's method of baking XGen hair clumps with xNormal. Either way, in the end, I decided to just make individual clumps/strands for the eyebrows/lashes. I suppose with those, I'd have more control over their thickness when placing them manually.

    EDIT: Oh. I also noticed someone in that thread mentioned using Maya's Arnold renderer for the baking. I tried that method too, but keep getting an error. I even posted about it on the Autodesk forum, but I've still not had any solution to it. :-\

    EDIT 2: Another thing... With Johan's method, he doesn't appear to generate a Normal map for his hair cards. Is one necessary or recommended for the cards themselves? Or is it better to stick to ID, Root, Flow maps etc.?

  • Neox godlike master sticky

    when you convert your hair to geo you can tell it to create a 2x2, 3x3 or 4x4 UV pattern. knowing that, all you need to do is apply random normal vectors and greyscale values to a texture which you apply before baking.

    the normals you baked look exactly like the "shit" i baked, which never gave me nice results. once i just used randomized normals, which i mix into the normals of the head itself, it became a lot more controllable.

  • DustyShinigami polycounter lvl 5

    Yep, that's what I did as part of Manjiana's tutorial. :) I think I created a 3x3 UV pattern and then assigned a gradient and greyscale texture to them and baked. The part I'm lost with is the Normal vectors. I've not done anything like that before, so I'm unsure what you mean or how to go about doing that. 😅

  • Neox godlike master sticky

    pick 9 random colors, use them in your grid.

    much like 9 random greyscale values you could use to break up the diffuse or roughness a bit.
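
As a rough illustration, a small Python (Pillow/NumPy) sketch that fills a 3x3 grid with nine random, slightly tilted tangent-space normal colours plus nine random greyscale values - the resolution, tilt range and value range are placeholders rather than anything from the thread:

    import numpy as np
    from PIL import Image

    SIZE, TILES = 512, 3
    rng = np.random.default_rng(1)                            # fixed seed so re-bakes match

    normal_img = np.zeros((SIZE, SIZE, 3), dtype=np.uint8)
    grey_img = np.zeros((SIZE, SIZE), dtype=np.uint8)

    tile = SIZE / TILES
    for row in range(TILES):
        for col in range(TILES):
            # random, slightly tilted tangent-space normal for this tile
            n = np.array([rng.uniform(-0.3, 0.3), rng.uniform(-0.3, 0.3), 1.0])
            n /= np.linalg.norm(n)
            rgb = ((n * 0.5 + 0.5) * 255).astype(np.uint8)    # encode [-1, 1] -> [0, 255]
            grey = rng.integers(80, 200)                      # random value for diffuse/roughness breakup

            y0, y1 = int(row * tile), int((row + 1) * tile)
            x0, x1 = int(col * tile), int((col + 1) * tile)
            normal_img[y0:y1, x0:x1] = rgb
            grey_img[y0:y1, x0:x1] = grey

    Image.fromarray(normal_img, mode="RGB").save("hair_random_normals.png")
    Image.fromarray(grey_img, mode="L").save("hair_random_grey.png")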

  • DustyShinigami polycounter lvl 5

    Ohhhh. Cool. I'll give that a go and post the results. Thanks. :)

  • DustyShinigami polycounter lvl 5

    Okay, this is what I've got...

    I've loaded it into the Normal map slot in MT, added my alpha mask and assigned a random colour and this is how it's looking:

    In your linked thread... when you say you can tweak how random you want the individual hairs to be, do you do this by using the same approach as if you were going to paint out issues on a Normal map, or...?

    Also, with the Normal map down-vector, how do you set up that colour wheel you provided so it's 128, 255, 128? I'm guessing that same wheel can be used and converted for the next set of 9 colours for the flow map? Thanks.

  • Neox godlike master sticky

    the flowmap is just one color, the normal down vector, which is applied to all hairs. but you will only need this if you will be using an anisotropic shader. from the previous posts i thought you wanted to texturize it onto the head texture like the RDR2 texture. but this can of course work on a separate scalp mesh where you can use a flowmap for the anisotropic shading.


    as for the random normals, don't use them at 100%. i would use them at very low values just to break up the shading a bit, and you can mask this as well and randomize some areas more than others etc.

  • DustyShinigami polycounter lvl 5

    Ideally, I would have liked to approach it all similarly to the RDR2 method, but it looks as though a head-wrapping texture was used, as mentioned above. I've gone about it differently and used Wrap4D along with a head scan from the 3D Scan Store. So for now, I'll just stick with what I have and use a separate hair cap. But I'd still like to learn this approach as it's all new and useful knowledge. :) Plus, I'm still learning hair, so I've yet to make and use flow maps etc. As to the anisotropic shader - I'll be using the hair shader in Unreal. Is that considered the same thing...?

    I'm still a bit lost. From one of your posts in that thread, your flow map looks to have a range from blue to green, yellow, and orange to red...? I'm still doing this baking in Marmoset at the moment, so do I need to be taking this new baked Albedo map with the different Vector colours into Substance Painter? Or can it be done in Marmoset, too?

    RE the random Normals - was I correct about how you go about adjusting them...? That you need to paint out/replace the Normals...? Like outlined here: https://substance3d.adobe.com/documentation/spdoc/normal-map-painting-109608964.html

  • Neox godlike master sticky

    I don't quite understand - if you can bake it onto the hair cap you made here, what stops you from baking it onto the head you will texturize?

    As for the flowmap, what you are seeing is the output of the bake. the input for the individual hair is just a normal down vector. baking it down into something else is what gives it the gradients.

    for the random normals, just load this map into substance as a fill layer, plug the random normalmap into the normalmap slot and lower the transparency inside painter. as said, you could even mask it if you want some areas with stronger randomized hair.
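
Roughly the same blend done outside Painter, as a minimal Python (Pillow/NumPy) sketch - a plain lerp toward the random normals followed by a renormalise, which is a simplification of Painter's own normal blending; the file names, strength value and the assumption that all three images share the same resolution are placeholders:

    import numpy as np
    from PIL import Image

    def load_normals(path):
        rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
        return rgb * 2.0 - 1.0                        # decode [0, 255] -> [-1, 1]

    base = load_normals("head_normal.png")
    rand_n = load_normals("hair_random_normals.png")
    # optional greyscale mask: white = more randomisation, black = none
    mask = np.asarray(Image.open("scalp_mask.png").convert("L"), dtype=np.float32)[..., None] / 255.0

    strength = 0.15                                   # keep it low, as suggested above
    w = strength * mask
    mixed = base * (1.0 - w) + rand_n * w
    mixed /= np.linalg.norm(mixed, axis=-1, keepdims=True)   # renormalise the vectors

    Image.fromarray(((mixed * 0.5 + 0.5) * 255).astype(np.uint8), mode="RGB").save("head_normal_mixed.png")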

  • DustyShinigami polycounter lvl 5

    Sorry. It's probably because of my Asperger's, but I can find it quite difficult to process and understand what someone means if it's not explained clearly. 😅 I feel like I'm getting more and more confused.

    Unless I'm overlooking something obvious, and I probably am as usual, I think it might be difficult to bake it straight to the head mesh...? As mentioned, I used Wrap4D with a 3D Scan Store texture. Those meshes have their own UVs set up that have been transferred over. I imagine if I tamper with the UV and separate the scalp portion, it's going to mess up the UV and the texture from 3D Scan Store's mesh will no longer project properly...? Otherwise, if I leave the UV of the head as it is, the high poly hair is going to bake all over the head...? If you can mask out where it'll bake, I'm not sure how.

    I'm still totally lost about generating the flowmap. Do you mean, I need to take the Vector colour bake I've made, or the original Normal bake, load them into Photoshop, and export out the Green channel...? And apply that to the high poly...?

    For the random Normals - when you say 'this' bake, do you mean the original Normal map I have that looks bad, or the new Vector colour one?

  • DustyShinigami polycounter lvl 5

    Okay, I think I got it... I extracted the green channel from the new Vector colour map, added it to the high poly in Marmoset, baked out a Normal map, and got this...

    I hope that's correct...
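
For reference, the Photoshop-free version of that channel extraction is only a couple of lines of Python (Pillow) - file names are placeholders:

    from PIL import Image

    vector_bake = Image.open("hair_vector_colours.png").convert("RGB")
    vector_bake.getchannel("G").save("hair_green_channel.png")   # same as copying out the green channel in Photoshop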

  • poopipe grand marshal polycounter

    if you're using xgen you should use arnold to bake things


    there are two reasons

    1: you don't have to convert to geometry so your bakes take seconds instead of minutes - you also get IPR and decent antialiasing

    2: you can generate a flow/direction map by using the derivative shader (dPdu / dPdv)


    I don't know if they've put a workable interface onto Arnold's RTT in Maya yet. it used to be worse than useless so I wrote my own.
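
For what it's worth, a rough, unverified sketch of driving Arnold's render-to-texture from script in Maya - it assumes MtoA's arnoldRenderToTexture command is available, and the flag and object names below are from memory/placeholders, so check the MtoA docs for your version before relying on them:

    import maya.cmds as cmds

    cmds.select("cap_low")                  # the low-poly cap/scalp mesh (placeholder name)
    cmds.arnoldRenderToTexture(             # bakes whatever is selected
        folder="C:/bakes/hair",             # output folder
        resolution=2048,
        aa_samples=4,                       # this is where the decent antialiasing comes from
    )
    # for the flow/direction map, you'd assign a shader that outputs the surface
    # derivatives (dPdu / dPdv) to the hair and bake that the same way - the exact
    # shader and option names depend on your Arnold version.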

  • DustyShinigami polycounter lvl 5

    See, I wanted to do the baking in Maya with Arnold. I'd read that the results are much better compared to xNormal due to the antialiasing, but sadly, I'm not able to get it working for some reason. :( Is it done via Transfer Maps? Where you bake a high poly to a low? I'm guessing it is. Or can you bake it all straight from the high poly?

    If it's done via the Transfer Maps, I'm kinda screwed there. Despite setting everything up correctly, I keep getting an error saying 'No object matches name: cap'. :-\

  • poopipe grand marshal polycounter

    Arnold has its own render-to-texture tool


    in 2018 it lived in the Arnold menu; not sure what they've done since
