Wasn't sure which forum to post this in, since the presentation gets a little broad (especially the last 10 minutes), but figured this might spark a good discussion on any number of production principles. GameSpot was cool enough to livestream my GDC presentation and provide a YouTube link:
https://www.youtube.com/watch?v=N6pdeECc5d4
Edit: added the slides here for the no-video peeps:
http://www.slideshare.net/MichaelPavlovich/gdc15-michael-pavlovichupload
In the last few days since GDC we've streamlined the process even more (very close to a Designer > Painter polish, skipping Player entirely, for example); always pushing for speed and artist-friendliness when and where we can.
Anyway, if you have any questions or want to see any renders I can certainly provide those, and like I say in the video, I'll be working on my LOTR Extended Edition version of this presentation in a bit. At the very least I hope it's a somewhat entertaining look into the life of a development artist.
Replies
Slides uploaded in case video is not your thing!
http://www.slideshare.net/MichaelPavlovich/gdc15-michael-pavlovichupload
http://www.polycount.com/forum/showthread.php?t=149163
Do you have a link to the 3-hour video you mentioned? Has that been recorded and uploaded?
I'm hoping to get to that this weekend, but don't hold me to that...I'm wrapping up some last minute deadline stuff at work this week, minus a day since CA signed me up for SXSW on Friday (doing pretty much the same song and dance, come by and say hi!), plus I'll probably be in San Antonio for at least a day this weekend...
Long story short, I hope to get to it soon, with some newer info on the texture pipeline.
Hahaha, no you're not alone. I went off the deep end (kind of on purpose) just to show how far you could conceivably extrapolate the role proceduralism will have in the creation of the ever expanding worlds we're going to need to make...and then even more off the deep end. Then even more. But by then I was in the last minute of my hour just to sneak in some Star Trek references.
So well done for tying together practical development tips with big picture sci-fi dreaming.
Also looking forward to using Kuadro!
Don't. Use Houdini Engine and leave the rest to me.
As I said in this => http://www.polycount.com/forum/showthread.php?t=149831 <= topic, give it a spin, guys. If you want to see a particular tool, just say so. Preparing a tool like the one in the link, which will work with Unity/Maya and probably UE4 in April, will take maybe 1-2 weeks with all the crazy options you want. You don't have to learn Houdini; the tool will have 1-3 buttons which you only have to press to get the effect you want. Is that too complicated? Before you take it for a spin, give me a sign so I can prepare the tool for you to download, so you can use the full 2 weeks of trial time to test it.
PS.
Houdini is three times the beast Maya is. You're looking at 2-4 years of learning before you'll be comfortable enough with it to fully embrace its power. It's not Substance Designer. It's not an artist-friendly tool; it's a TD-friendly tool.
Interesting... So you're saying that preparing a Houdini prop (example: barrel, stairs, doorway) would take a week for someone experienced with it? Sounds crazy long, especially when Pavlovich states they have a button to do that.
As for the barrels, I think you have to stop thinking in terms of complete props there. It's a long story and I'm not sure which direction they took, but I suppose they use a mix of procedural stuff + manual work to prepare some things that are then procedurally merged together.
It's like when you have a button texture in Unity: you define which part of it can be scaled and which shouldn't, so it always looks good, even when the button has a non-standard size.
Or the TriMesh brush in ZBrush. Of course, with Houdini you can go a lot more crazy than this. Going purely procedural will be a little limiting.
Or, in more modeler-friendly terms, I think they're kitbashing procedurally.
But don't let that stop you if you're interested!!! One of our badass Houdini artists is an ex-environment guy! Pretty sure you can try out Houdini for free, so get to downloading and fire up some of their tutorials on Vimeo. Worst case scenario you get a little taste of procedural tools, nothing wrong with that!
You do jump around a bit fast and skip over some essential info, so I have some questions:
-You have this uber-tool that allows you to take a ZBrush high-res mesh and turn it into a rough in-game mesh, all automatically. Seems to be part Houdini for mesh crunching and so on, then some Substance batching for textures. Would love to get a bit more detailed overview of the steps it runs through. Wondering if it does things like collisions, lightmap UVs, rigging even. How long did it take to put this together, how adaptable is it, is it really locked to the pipeline of a certain game?
-You seem to suggest that you've solved the problem of having a limited palette of pre-set swatches for material IDs, that's super cool. It sounds like it analyzes/determines what color is most used, what color is second most used, and sets up the Substance parameters for that. Can you elaborate a bit more on how that works?
-It's smart to store the presets for Substance texture setups, so they can be reused and solve issues with bad defaults. How do you store/manage these presets, though? If you only store them locally per user, you don't get much reuse, but if you have some central place for them it could get cluttered with everybody's presets. What's the process here?
I'm super inspired here, would love to build something like this too! Also loved your little allegory with bringing water to the house, and having artists juggle it on their heads and all, haha!
--You're correct, most of the behind-the-scenes stuff is Houdini doing its thing for meshes and UVs, and Substance handling the textures. Things like collision, UV sets, and rigging / weights start getting into the assembly part of the pipeline. For example, Houdini could handle the collision, but the assembly code would determine how the scene would be set up: in what package, with what tags, group nodes, naming, nesting, export options, etc...same process for UV sets.
Rigging / weights gets into a whole other thing, but there's some SUPER interesting stuff to do there, too! I'm uber-simplifying here for brevity, but: have skeleton reference scenes, archetype scenes with good weights to transfer from (or a system in place to transfer weights, let you clean up the weights on one of your iterative meshes, and then continue to use that mesh to transfer weights from on later iterations), etc...then animation is a whole thing, physics, etc...That's an entire presentation in and of itself, maybe next year lol. We haven't tackled the character side fully, but with the experience at this studio I'm totally psyched for what we come up with on that front.
The automated iteration system on meshes / UVs and assembly didn't take that long...but Luiz is an amazing TA, so your time estimates may vary? There's always some back and forth working out process and bugs, and it's kind of hard to say since we're doing production at the same time...maybe a month? Plus, there are different needs between environments, props, characters, etc... that we're still working on, but the philosophy remains the same: quick iteration, non-destructive automation. In some instances we end up baking layered masks instead of a flattened texture (large-scale environment objects, for example), etc...there's always something beyond the base level of functionality that might be quick to implement but take a long time to fully flesh out.
And I didn't stress this as much as I wanted to in the presentation, but the "agnostic" part of the pipeline is the models and textures: that process never changes, regardless of the game or engine. It's on the artist to perform up to a certain standard; how they choose to do that is up to them. Materials, textures, and models can be used anywhere; the assembly part is where the engine-specific scene setup (texture packing, naming, etc...) happens. This allows us to point our agnostic pipeline in any direction and fire out the correct assets to any engine.
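Something like this toy Python sketch captures that split; the assemble functions, their steps, and the naming conventions in the comments are hypothetical illustrations, not the studio's actual code:

```python
# Toy sketch of the agnostic/assembly split: meshes, textures, and
# materials are produced once; only the assembly step knows about a
# specific engine. Function names and steps here are hypothetical.
from typing import Callable

def assemble_ue4(asset_dir: str) -> None:
    print(f"UE4 assembly for {asset_dir}: pack textures, engine naming, export options")

def assemble_unity(asset_dir: str) -> None:
    print(f"Unity assembly for {asset_dir}: pack textures for target shader, scene layout")

ASSEMBLERS: dict[str, Callable[[str], None]] = {
    "ue4": assemble_ue4,
    "unity": assemble_unity,
}

def export(asset_dir: str, engine: str) -> None:
    # The agnostic stages already wrote meshes/textures into asset_dir;
    # this dispatch is the only engine-specific moment.
    ASSEMBLERS[engine](asset_dir)
```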
--Yes, this was super cool! Having to keep track of exact color IDs for specific materials wasn't going to work over the course of an entire game, so giving the artists a limited set of 255/128/0-style swatches for the robots' eyes to read, and handling final material selection as a preset, seemed to be the best option.
Basically (in the case of a complex multi-material object that's going to be baked down), the artist assigns whatever color they want to each material section. The texture artist, when creating the custom graph, is able to arrange the materials from most commonly used to least commonly used. This of course will have varying degrees of success (especially when you start doing super generic custom graphs like military_crate.sbsar), but it at least provides a start for material assignment. The robots look at the material ID texture, evaluate the colors from most used to least used, then assign the corresponding materials accordingly.
Like I said, this might not work every time, so the user is always able to go in and (using a drop-down, even more elegant than what's in the slides) swap materials. Basically all it's doing is taking the original material ID and re-writing a new one based on pre-determined material ID colors. This and any wear slider changes get saved as a Player preset, which is auto-generated by the robots (where it's saved, what its name is, what baked maps are plugged in, default wear / what wear has been changed, the material ID re-write, etc). That preset can again be opened by the artist, changed, then saved, which allows us to continue to batch through iterations.
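For illustration, a robot-generated preset like the one described might boil down to something like this; every field name and value here is invented from the description above, not the real preset format:

```python
# Hypothetical shape of an auto-generated per-asset preset; the field
# names are sketched from the description above, not the actual format.
import json
from pathlib import Path

preset = {
    "graph": "military_crate.sbsar",
    "asset": "military_ammo_crate_medium_003",
    "baked_maps": {
        "normal": "baked/military_ammo_crate_medium_003_normal.png",
        "material_id": "baked/military_ammo_crate_medium_003_id.png",
    },
    # original ID color -> final material, including any manual drop-down swaps
    "material_id_rewrite": {"(12, 200, 34)": "rusted_steel",
                            "(200, 30, 30)": "olive_paint"},
    "wear_overrides": {"edge_wear_amount": 0.35},  # only the changed sliders
}

Path("presets").mkdir(exist_ok=True)
with open("presets/military_ammo_crate_medium_003.json", "w") as f:
    json.dump(preset, f, indent=2)
```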
--For sure, presets could be a mess if they're not able to be accessed by everyone, especially when a source file (material, mesh, etc) gets changed and needs to be batched through. Presets are auto-set-up and saved by the robots, and they're source-controlled on a per-asset basis, so they get checked in, and any particular preset (even one from a generic custom graph like military_crate.sbsar) will be specifically assigned and organized with a unique object (military_ammo_crate_medium_003.obj). The best thing about organization, naming, structure, etc is that it's all handled by the automation robots: they save out the lows, make folders for baked maps and exported maps, place and save the presets, etc. No more boring-yet-exhausting asset juggling, incorrect-naming, forgot-to-check-in human error stuff. Which allows for more time to do artist things!
Whew! I know that even with the more detailed write-up it STILL leaves a lot of unanswered specifics on the assembly, auto-modeling, Substance utility swapping, and preset side of things; Luiz might have to jump in here and field some questions. I'm actually hoping this becomes more mainstream and just gets integrated somehow with Substance on the texture side. Like I may have mentioned before, we're REALLY close to a Designer-to-Painter workflow with the same functionality, so having the ability to batch through Substance, fire up Painter, adjust some wear, then ALREADY begin to non-destructively polish...so awesome. Can't wait, so much exciting stuff to work on!!
Sorry for the wall of text!
Xoliul, are you Robert? If so, Hi!
Yeah, as Pav mentioned, the pipeline is designed to work with multiple engines; it already works with 2 different ones. You should check out my talk when it hits the Vault.
The oversimplification is:
High Res(Zbrush) > Game Res(Houdini) > Bake Maps(Substance Batch) > Render Substance (Substance Batch) > Assemble (Set up files for game specific needs) > Import
The "tool" is a bunch of glue code that makes all the software talk to each other and hand files from one command line to another. I basically wrote automation modules for every software in the pipeline, so now I can take a high res, run houdini get a game res, run maya and setup a scene for export. That kind of stuff, it is pretty holy grail, but the point of the whole pipeline is to be modular, and let you mix and match the pieces you want. If you want to use blender or simplygon to generate your gameres, swap out that module.
The material ID evaluation is pretty basic: I just look at every pixel in the image and index the colors together (within a threshold), then I can see which colors are used the most and do a simple color replace to match the colors the graph is expecting. We're kinda doing red, green, blue, yellow, magenta, etc., in order, so the artists can also have a base idea of how to paint the source object, and then the material palette is art-directed to have materials that work well together.
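As a rough illustration, a numpy version of that indexing could look like this (assuming the ID map is already loaded as an RGB array; the palette order and threshold value are illustrative, not the studio's):

```python
# Rough sketch of the material ID ranking described above, assuming the
# ID map is an (H, W, 3) uint8 array (e.g. loaded via imageio).
import numpy as np

PALETTE = np.array([          # graph's expected slots, most- to least-used
    [255, 0, 0], [0, 255, 0], [0, 0, 255], [255, 255, 0], [255, 0, 255],
], dtype=np.uint8)

def rank_id_colors(img, threshold=30):
    """Return the map's ID colors sorted by pixel count, folding colors
    that land within `threshold` of an already-seen color together."""
    pixels = img.reshape(-1, 3).astype(np.int16)
    colors, counts = np.unique(pixels, axis=0, return_counts=True)
    merged = []  # list of [representative_color, total_count]
    for color, count in sorted(zip(colors, counts), key=lambda c: -c[1]):
        for entry in merged:
            if np.abs(entry[0] - color).max() <= threshold:
                entry[1] += count        # fold into the nearby color
                break
        else:
            merged.append([color, count])
    merged.sort(key=lambda m: -m[1])
    return [m[0] for m in merged]

def remap_to_palette(img, ranked, threshold=30):
    """Rewrite the ID map so the most-used color becomes the graph's
    first expected color, the second-most-used the second, and so on."""
    out = img.copy()
    src_img = img.astype(np.int16)
    for src, dst in zip(ranked, PALETTE):
        mask = np.abs(src_img - src).max(axis=-1) <= threshold
        out[mask] = dst
    return out
```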
And then finally the presets: we do save them with the asset, because a preset is basically the formula to regenerate that asset. The textures and game res are less important because we can regenerate them, but the presets, material template, and high res are the "source". Substance Player has the concept of sbsprs files; we're bugging the Allegorithmic guys to add it to Painter as well.
Let me know if you have any more questions or if something I said didn't make sense.
Best!
Luiz
You spoke about having other videos online, any chance you could link those? (I'll check your YouTube page.)
I've been to a couple of GDCs now and this one is hands-down one of my favourite talks to date.
The question that I can't help but ask is: why don't you sell that glue code? :]
That would be a "shut up and take my money" thing, even if it's not "software" that runs out of the box.
Also a big thank you, guys, for this enlightening presentation and thread, and for motivating me to learn Houdini :]
Also, Orbolt could use some game-centric additions (hint hint).
Thanks for the kind words! Yeah, most of my more current stuff is on my YouTube channel...but of course things change so rapidly it gets difficult to stay on top of the latest and greatest tutorial stuff; I hope to get back into that sometime this year. I made a bunch of videos for www.eat3D.com, but like I said, it's been a while, so some of it might be a little dated.
Youtube playlists:
https://www.youtube.com/user/Pavlovich2005/playlists
It's funny how both you and Mike have done Eat3D vids; so have I, with the Dozer series. Wish I could get back to updating that and applying some of this stuff to it...
So a few more things I wonder:
Are these generated low-res meshes really shippable? I work in a studio where everybody is super skeptical about that stuff and hates any generated results; it always gets tossed in the bin as 'shit quality', so it would be interesting to hear what sort of ratio of generated stuff you shipped with.
I mean it's gotta have problems with thin stuff, or things that should get rigged (splitting)?
More technical even: what did you write this in, just curious? Python, C#?
Isn't going over every pixel in a 2-4K image on the slow side?
HOWEVER. I am certain that at some point we will have a shippable game mesh and UV solution that, even if not completely automatic, will require minimal input by the artist. Right now we're inching toward that goal (better edge detection, remesh algorithms that work for animatable meshes vs. hard surface models, UV algorithms that can auto detect where seams need to go based on any number of parameters), but if another company out there solves one or each of these problems, it's just a matter of adding it to our glue and running assets through it.
The iteration part on the tools side is adjusting parameters based on problem areas and refining as you go through test cases...it may not start out at shippable quality, but it's still useful for getting things in game to evaluate as you iterate. Eventually, if there's enough massaging of the tool based on test cases, balancing speed vs. quality, etc...you can start upping that percentage until you reach that holy grail of "automatic game res and UVs". Has it happened yet? Not even close! Is it easy? Hell no, if it was somebody would have done it already! BUT, it's in, the system is running, and there's a lot of clever people and companies out there...it'll happen!
Yeah, like Mike mentioned, the meshes are decimated meshes. They're fine for props and organic stuff like rocks; for characters and subD-style hard-edge modelling they're not as good. BUT the point of it, like Mike said, is to iterate, and do the beautiful game res once, during polish, after it has been in the game for months (if you want to). In his video he shows the auto-generated model rigged and animated, and it looks fine for 90% of the world. And honestly, better than some rushed asset that some poor soul had to crank out at 2am before a milestone.
That being said, we're looking at better algorithms; there are a lot of good papers out there on this stuff. It's just a matter of time before I roll my own out. UV Master and ZRemesher are at the top in terms of quality, and I bet there's a lot of smart people working on this problem right now. And yeah, it's easy for artists to dismiss the robots' work as garbage, but you gotta find the Pavs out there in your studio to help you make the robots smarter.
On the technical side, Python for everything!!11!!one!!! numpy is actually pretty fast at taking an image as an array and doing stuff with it; it doesn't take long at all to get the colors from a 4096 map.
Thanks! That's very kind of you, and yeah my talk will be up on the vault. I go a bit more in depth on the tech side of the pipeline and some procedural modelling stuff we did as well.
Also just wondering, Mike jumped over it without mentioning it, but I assume the reason you went from Quixel to Substance is the 'pipelineability'? Like you can't call it to batch (fast) and you can't build your own effects (easily)? No perceived 'bad quality' issues with texturing in Substance instead?
Thanks for pointing me in the direction of this thread. Yeah, ZBC threads accumulate so rapidly that if you blink you can miss a post. Also it's great to see lots of folks here asking for the extended breakdown you had mentioned. I was gonna ask in both of the other threads but didn't have the stones.
Looking forward to it. :thumbup:
Pipelineability played a good part in it, but speed and control are really the winners for us. I'm sure Quixel can be automated with the batch preset features they have, but being able to feed in our own materials, make custom graphs for non-realistic materials, and generate our own wear nodes was pretty huge. If anything, we're getting better quality out of Substance than we were with dDo.
If you're a single guy doing a model by yourself, Quixel is awesome, but if you're a company, you can afford to get a guy to make a material library to your liking and generate some good wear presets, leveraging your best texture guy to build something better suited to your needs.
That, and baking something in Substance is super fast compared to waiting for dDo to do its magic with Photoshop. We'll still get the Megascans whenever they come out, because they look amazing, but on the generation side, Substance is the current winner in my book.
Can't wait for the extended version if you are able to do that. Kudos to ikruel as well, of course, for coding it! I'm going to try out Kuadro too when I get home.
Thank you for talking about it. It's super inspiring!
This is why I've stayed making old-school art: more time being creative rather than juggling layers and masks in Photoshop, not really knowing what the end result will be until it's in engine. I'll gladly work on an AAA game if this automated pipeline becomes the norm!
(no self promotion intended here)
Just HEngine or Houdini?
BTW:
What the hell is C++ POO? Is it OOP in French? In that case I would change it to the English version to avoid confusion.
Haha, indeed, that part is kinda old. My website needs a good refresh.
Ehm... you need tools for the engine to use it. Since Orbolt is not too heavy on game-specific tools, or tools for other software, you just got yourself a plugin that doesn't do much. At the moment it kinda works best paired with Houdini.
Houdini was (still is?) a movie VFX tool, so there are not many people who use it for anything else. And those who do use it for other things are mostly old users who don't need Orbolt to get tools for the engine; they'll just create their own.
If there's more interest from you guys, the tools will come, and you'll only need Engine. That's why I'm promoting it in topics where I think some of you could take advantage of it, if an experienced person (me) prepared tools for you. One tool after another, and you'll end up with a library of cool tools that you can reuse in multiple apps.
But you have to start asking about those tools. You won't have to learn new software; the plugin will contain a couple of buttons/fields/whatever, and it will be executed inside your host app.
Fast iteration, procedural work and easy pipelines can make the process so much better for dev teams and I hope it becomes a standard!
I'd be really interested in seeing that extended talk also
Edit:
I might also quickly note that you mentioned in your talk that artists should spend less time on things that are not seen as much. While I agree this works for regular games, anything with a VR experience is dramatically different. A low-poly mug on a table in the corner of the room can very easily catch a person's eye. It's just uncanny when those kinds of items are not realistically detailed.
People even mention that normal mapping looks so obviously fake in VR that it's better to use actual geometry. I thought this was quite interesting, and I think we probably won't have the PC hardware to fully deliver the detail required for a fully immersive experience for a while. I would love to have a VR experience running at buttery-smooth performance with photogrammetry and real-to-life looking environments.
I can't stress enough how important what you say about automation is. And while we have some of it covered, we have a long way to go to catch up with what you seem to have already figured out.
Thanks for this!
Beautifully explained; definitely alleviated some woes about pursuing these work-smarter-not-harder workflows.
Like *raises pitchfork* texture in Photoshop 4lyf! hur hur. Clumsily propagating like 20 different layers across 10 documents, that stuff was like so last year.
The Substance equivalent is: Engine can load SBSARs and use them inside Unreal, Maya, etc. But you need Designer to architect your SBSARs... Maybe you can exchange your license? They're really good at customer support.
HEngine is a headless Houdini. It can execute tools created in Houdini, with their interfaces, but you cannot create new ones with it. The API allows you to integrate HEngine with an application of your choice that isn't supported by SESI.
Not everybody in the company needs to know Houdini. Learning the full application will take a lot more than 2 years, and it gets really technical with custom DOP solvers, or mind-bending with CHOPs, so it's not the most artist-friendly app.
But once the tool is finished, with an interface, an artist can use it easily. And this is where HEngine enters: the artist doesn't need Houdini, only the ability to run the tool in their application of choice, Maya/Unity/UE4/3ds Max/whatever, and that is what HEngine is for.
Contact SESI and tell them you made a mistake and that you wanted Houdini, not just HEngine. I hope you wanted the Indie version?