
[WIP] - UE5 - Ryan Gosling: Drive

de_cbble polycounter lvl 7
Hi!
What started out as a school bust sculpt turned into a full character art project that blew up in scope.
The subject is Ryan Gosling, specifically his character from the movie Drive.
I'm still looking to break into the industry and figured I needed more quality portfolio pieces, and this is another step in that direction. The journey has been incredibly long, with hiatuses, other work and shifting goalposts, but I'm nearly at the end, I think!

As I learned a lot of things, this will be a journal for me to chronicle my process, for future me to reference when I inevitably forget. It will also be for me to talk about some of the issues I faced (and some solutions I found) that I hope other people can chime in on. I hope to learn from others!

Actual progress:


Thanks for reading.  :)

Replies

  • de_cbble polycounter lvl 7
    Some overarching goals: a quality character piece, learning Marvelous, using strand hair, and rendering in Unreal Engine. As you'll see, a lot more goals eventually got added on top of these.

    Chronologically, it's a mess, so I'll try doing sections to make it coherent.
    Starting from the beginning, I'll post the sculpt I submitted for school so we can all laugh at it:


    A year or two later I dusted it off and slapped on a new basemesh. The one I eventually used was by Tom Parker, who interestingly also based it on Ryan's appearance in Blade Runner 2049. I marveled at how he achieved the likeness; I projected my sculpt onto it, knuckled down, and spent many hours refining it. I also sought feedback from a local lecturer, which helped a ton.

    Among the many likeness tutorials on Youtube, one technique that worked for me was rapidly toggling Spotlight on and off so I could sculpt the 'impression' rather than trying to exactly match the shape in every photograph or screenshot (though I've seen some sculpting masters on Youtube do just that).

    It got a step closer than where I began, but it wasn't enough. I then bought a face pack from TexturingXYZ, VFace #89 Oleh, and started texture work. I did basic noise with Noisemaker in ZBrush first, then wrapped the VFace model onto my mesh, projected the detail across, took the textures into Substance and Photoshop, and painted, layering things up to get more variation. I knew Ryan's character in Drive was a very tired man, since he always worked night jobs, so I went a bit heavier on the eye circles.
    ZBrush evolution:
    I simultaneously brought the head into the Unreal Metahumans sample project to use its lighting for lookdev of the head sculpt. With real rendering power, I immediately saw how far off I was with the sculpt, and I think the progress from there was very rapid. I was surprised myself. What people say about getting it in engine ASAP must be true, and I recommend doing so. It looked correct in ZBrush but not in Unreal.
    Thanks to JHill's Youtube videos, I ripped the eyes out of Metahuman, roughly adjusted their eye shells to fit my character, and placed them in.
    Unreal evolution:

    The rightmost sculpt stage in the Unreal shot above is only about a third of the way through the changes I've made since.
    Also, you can already see it above, but I will circle back to hair at a later post.

    TL;DR: Sculpting likeness is very hard. Even after this process I can only hope I got better at it.
    I think that's it for now, I'll probably talk about clothing next.
  • de_cbble polycounter lvl 7
    I'm fairly close to the end, I think; I'm fixing some bugs and running renders through MRQ (Movie Render Queue) now, so I should really catch up on these posts.
    Clothing creation was really problematic and probably what I spent the most time on. There's a lot I could talk about, but I'll focus on two things I realized:

    1. Industry/institutional knowledge
    I used Marvelous Designer to generate the clothing, and I think it's really great for that. But since I'm not a pattern designer, I had to refer to Youtube tutorials to find the closest clothing pattern type to work from and modify. Basic cloth patterns aside, designers also deal with fitting considerations like darts and measurements, which require a high level of insight.

    Ryan's jacket was based on a bomber, which I began cutting into and modifying. Since this stage comes up in every character pipeline, I foresee having to do this constantly in future projects, so I'm very glad people are kind enough to upload pattern and clothing tutorials online.

    2. The retopology to low poly process
    This was the painful part. At the moment, I don't feel there is a reliable, established workflow for taking a Marvelous garment to a game model.
    For retopo there are a few options: Marvelous Designer's built-in retopo tools, or an external application. After giving Marvelous' tools a try, I have to say they are still at a very basic stage, and they don't seem great for complex shapes like the jacket I have.
    • Highlighting vertex connections at pattern edges is a very nice feature. Maybe the only good one.
    • You can easily overwrite your existing patterns with your retopo patches (I don't know why they allow this).
    • The patch quad tool has too many limitations to use reliably.
    • There's no way to 'relax' topology.
    • Cutting, subdividing, sliding edge loops, etc. don't exist; you have to manually click each point.
    I ended up just doing it in Maya with the conform workflow. But that also brought the next stage of problems.

    Because you don't have the 'vertex connections at pattern edge' feature, you end up with misaligned vertices at your pattern edges, despite having the correct edge vertex count after conforming. Manually target-welding them is tedious, and if you have to make any adjustments before this stage again, good luck: you get to target-weld everything all over again. It's a timesink.
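    That target-weld pass is essentially a proximity merge, the same idea as Maya's Merge Vertices with a distance threshold (polyMergeVertex when scripted). A minimal standalone sketch of the idea, with an illustrative tolerance and no DCC dependency:

```python
# Sketch of tolerance-based vertex welding: any point within `tol`
# of an already-kept point is discarded as a duplicate, which is
# what welding misaligned pattern-edge vertices boils down to.
import math

def weld(points, tol=1e-3):
    kept = []
    for p in points:
        if not any(math.dist(p, q) <= tol for q in kept):
            kept.append(p)
    return kept

# Two pattern edges whose border vertices drifted slightly apart:
edge_a = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
edge_b = [(0.0004, 0.0, 0.0), (1.0002, 0.0, 0.0)]
print(len(weld(edge_a + edge_b)))  # 2 unique border positions remain
```

    A real tool also has to re-link the faces that used the discarded vertices; the snippet only shows the position-matching half.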

    Next is adding thickness to this retopo model.
    There is the ZBrush method, but panel looping and then reconstructing subdivision levels is a coinflip.
    I also found a Geometry Nodes tool by Outgang for Blender that does this, but that's a coinflip too. No kidding: with what felt like the same mesh, no changes, just reopening the scene and re-importing, it would work maybe 4 times out of 10.
    You might understand why I felt like running my head through a wall  :)
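    For what it's worth, the offset half of the thickness step is conceptually simple; the fragile part in these tools is rebuilding clean topology and bridging the borders afterwards. A toy sketch of just the offset, assuming per-vertex normals are already available:

```python
# Sketch: build the inner shell of a "thickness" operation by pushing
# each vertex along its inverted vertex normal. Bridging the open
# borders into side walls is the part the DCC tools handle (or fail at).

def inner_shell(vertices, normals, thickness):
    return [(x - nx * thickness, y - ny * thickness, z - nz * thickness)
            for (x, y, z), (nx, ny, nz) in zip(vertices, normals)]

# A flat quad facing +Z, offset inward by 0.1 units:
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (1.0, 1.0, 0.0), (0.0, 1.0, 0.0)]
norms = [(0.0, 0.0, 1.0)] * 4
print(inner_shell(verts, norms, 0.1)[0])  # (0.0, 0.0, -0.1)
```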

    MD has been around a long time already; I really hope better tools come along for this process soon.

    UV and texturing was straightforward. Past the tediousness and frustration of the previous stage, it was nice to see things coming together.
  • de_cbble polycounter lvl 7
    For hair, I was looking at hair cards at the start. I studied some game models to see how their hair was broken down, mainly from the newer RE games, since it's fantastic stuff.

    Then I started trying to place everything by hand. I quickly realized that was probably neither the smartest nor the fastest way to work, given the shorter hairstyle, so I gave up on this and refocused on the hair strand goal.
    I went to XGen, saw some tutorials, had that one Gordon Freeman artwork as a reference and went to town.
    I think it's alright. XGen obviously has a very high skill cap, and I don't yet have an intuitive enough feel for the modifiers and guides to know how the artist created such natural hair for the Gordon piece.

    I can bring that into Unreal easily. The next step: if I hypothetically wanted to make hair cards, how do I get them to match the groom I already have?
    I think the solution is Unreal's Hair Card Generator.
    For this shot I actually used somewhat low parameter values; if you give it slightly higher numbers you can get a better result (including the hairline). I tested this but didn't take a screenshot, since I ended up not using hair cards for the project.

    I don't know if it was just me or Unreal, but I also had a lot of bugs getting the HCG to work, first with Python modules and then with groom bindings in UE5.6. I posted my solution for the Python issue in the link above, which I think works; as for the groom bindings, the bindings for my eyebrows and eyelashes just caused the project to crash on startup. I don't know why, or what the solution is, but somehow it works now too. Either way, it was a very buggy and tiresome process to troubleshoot.

    Also, along the way I blindly decided to convert my mesh to Metahuman, which of course brought its own set of quirks.
    • XGen depends on a scalp mesh that is not the shape of the converted Metahuman head.
    • I needed to recreate the texture to the new Metahuman head UVs.
    • Because I didn't start with Metahuman topology, there were some minor volume differences from the conform process.
    The solution for XGen is a blendshape for the scalp mesh. But the first two points also mean a lot of back-and-forth wrap projecting between the Metahuman head basemesh, my current head mesh, and the projected Metahuman head mesh, each pass transferring different data (the shape, or the texture detail) from one to another for the end result. Since I didn't start with Metahuman in mind, this was a lot of additional work.
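    That transfer loop boils down to moving per-vertex deltas between meshes that share vertex order: the difference between the old head and the converted Metahuman head, applied to the scalp mesh, gives the scalp blendshape that XGen needs. A toy sketch of that delta transfer (assuming a one-to-one vertex correspondence, which is what the wrap projection is there to provide):

```python
# Sketch of blendshape-style delta transfer between meshes with
# identical vertex order: delta = target - source, then apply the
# delta to any dependent mesh (here, the XGen scalp riding on the head).

def deltas(source, target):
    return [tuple(t - s for s, t in zip(sv, tv))
            for sv, tv in zip(source, target)]

def apply_deltas(mesh, ds):
    return [tuple(v + d for v, d in zip(mv, dv))
            for mv, dv in zip(mesh, ds)]

old_head = [(0.0, 1.0, 0.0)]    # original sculpt vertex
new_head = [(0.0, 1.25, 0.5)]   # same vertex after Metahuman conversion
scalp    = [(0.0, 1.5, 0.0)]    # XGen scalp vertex riding on that area

d = deltas(old_head, new_head)
print(apply_deltas(scalp, d))   # [(0.0, 1.75, 0.5)]
```

    Real wrap projection finds the correspondence for meshes with different topology first; this sketch only shows what happens once that correspondence exists.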

    In Unreal, the main thing is that Metahuman just has a lot of subsurface scattering, no clue why. Adjusting the subsurface profile has a limited effect; it still looks overly waxy.
    This is exacerbated when you load a custom head/skin texture, which I did, see below. Left is a normal material on my original mesh. Middle is the converted Metahuman.
    On the right is the Metahuman again, but this time I modified the material enough to control the roughness and the subsurface scattering amount. I had to drop the SSS to basically zero, and even then it still looks like some subsurface scattering is applied, as it's much brighter than my original mesh (discounting the lighting difference, which I was messing around with).

    In case anyone is interested, you just have to insert these nodes into this section of the Master material of the Metahuman skin shader. You can then control SSS and roughness in the Material Instance afterwards.


    Learning, discovering and fighting against all of this took tens of hours. I went back and forth on giving up on using Metahuman. But here we are.
    Next I might talk about rigging, if not then I'm just jumping to my lighting and artwork posts.
  • HarlequinWerewolf sublime tool
    Super cool to see all your work on this. I've also had to deal with getting an XGen file onto a Metahuman mesh before, and it's not fun :p But your results are looking great!

    For the SSS thing, I agree that the default metahuman material is a bit waxy, but you can edit it in the actual material:

    There's also edits for spec and rough if needed too
  • de_cbble polycounter lvl 7
    For the SSS thing, I agree that the default metahuman material is a bit waxy, but you can edit it in the actual material:
    There's also edits for spec and rough if needed too
    Thank you! I feel like I did tweak those sliders at one point but couldn't notice any change. It does work that way now though, so I'll use this in the future.

    Also, I really liked your Jedi artwork; I had a lot of questions when I saw it, but I didn't want to clog up your thread at the time 🫣