Home General Discussion

Resources for Beginning Computer Graphics

Good Day All!

Coming from a programming background, about a month ago I decided to take the plunge into (what seems like) the bottomless rabbit hole of 3D modeling and sculpting. I have no prior experience or training in the subject, and the closest I've come to anything artistic is Computer-Aided Drafting back in high school.

I've decided the tools I'll use are Autodesk Maya and Mudbox. So far, learning the interface and creating simple little models has been a breeze, with some touch-ups needed here and there on my part (I've included an example of my work, completed within the first week using various reference photos, below).

However, I seem to have a bit of a problem: tutorials commonly throw around jargon for things I do not fully understand, such as Normals, UV Mapping, Texturing, Transforms, NURBS, Workflows, Rendering, Dynamics, etc.

I've come to the realization that I don't have a foundational understanding of computer graphics theory. In the programming world there are many such resources and references; one that I use quite heavily is a book called Programming Language Pragmatics, a language-agnostic book that covers both the academic theory and practical use of the different types of programming languages, along with their history.

So my question is: are there any such resources for this field? Something software-agnostic that covers the higher-level theory of computer graphics and serves as a general reference? I've looked in quite a few places but have not found an academic reference that fits these needs.

Any help on this would be greatly appreciated!

[attached image: vnxACnJ.jpg]

Replies

  • ZacD
    ZacD ngon master
    The write-ups for game art aren't as good or as solid as programming resources, and there are a few reasons for that: art-related terms and descriptions always need a visual example, and game art is constantly changing, so things written even 5 years ago can be really outdated.

    The Polycount wiki is a decent place to start, with lots of links to tutorials and videos. Just getting started with modeling, you'll probably want to watch a few video tutorials, then take a small prop: model it, UV it, texture it, and get it into a game engine. There's a link at the top of every page on Polycount: http://wiki.polycount.com/

    Normals. Each vertex has a normal direction; it's basically the direction the surface is facing, and it determines how light will interact with the surface. The surface, or the triangles between verts, gets smooth shading by blending between the vertex normals. A vertex can also have multiple normals; you'll typically see this referred to as a hard edge or a seam. There are hard edges on your model above at the edges of the box-shaped wood beams. The edges around the sides of the poles are soft edges, where the normals between the vertices are being blended. You can turn on normals in the viewport to see the actual direction they are pointing.
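    The face-normal and soft-edge blending idea above can be sketched in plain Python. This is a rough illustration only; the mesh data and helper names are made up, not from Maya or any particular package:

```python
# Sketch: face normals via the cross product, and soft-edge vertex normals
# by averaging the normals of every face that touches a vertex.

def sub(a, b):
    return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1],
            a[2]*b[0] - a[0]*b[2],
            a[0]*b[1] - a[1]*b[0])

def normalize(v):
    length = (v[0]**2 + v[1]**2 + v[2]**2) ** 0.5
    return (v[0]/length, v[1]/length, v[2]/length)

def face_normal(p0, p1, p2):
    # The normal is perpendicular to two edge vectors of the triangle.
    return normalize(cross(sub(p1, p0), sub(p2, p0)))

def vertex_normals(positions, triangles):
    # Soft shading: each vertex normal is the normalized average of the
    # normals of all faces sharing that vertex (one normal per vertex).
    sums = [(0.0, 0.0, 0.0) for _ in positions]
    for i0, i1, i2 in triangles:
        n = face_normal(positions[i0], positions[i1], positions[i2])
        for i in (i0, i1, i2):
            sums[i] = (sums[i][0]+n[0], sums[i][1]+n[1], sums[i][2]+n[2])
    return [normalize(s) for s in sums]

# A flat quad in the XY plane made of two triangles: every normal is +Z.
quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
tris = [(0, 1, 2), (0, 2, 3)]
print(vertex_normals(quad, tris)[0])  # roughly (0.0, 0.0, 1.0)
```

    A hard edge, in these terms, would mean storing a separate normal per face corner instead of one blended normal per vertex.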

    Vertex normals become very important when you are talking about normal maps, UV seams, and using game engines to display an asset. A normal map basically does what surface normals do, but at a per-pixel level instead of the vertex level. Most assets in game engines use tangent-space normal maps, where the vertex normal influences the shading; this allows the object to deform or be animated, and allows the texture to compress better. The downside is you have to make sure the normal map and the vertex normals are working together properly. That's a whole other discussion, but just remember: normal maps add lighting information to the surface of a model by telling the renderer what direction a point on the model is facing.
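    The per-pixel direction storage can be made concrete with a tiny sketch: a normal map stores a direction as an RGB color, with each channel remapping [0, 255] back to [-1, 1]. The function below is illustrative, not from any library:

```python
# Sketch: decoding one tangent-space normal-map texel back into a vector.

def decode_normal(r, g, b):
    # Each 8-bit channel encodes one axis of the direction in [-1, 1].
    x = r / 255.0 * 2.0 - 1.0
    y = g / 255.0 * 2.0 - 1.0
    z = b / 255.0 * 2.0 - 1.0
    length = (x*x + y*y + z*z) ** 0.5
    return (x/length, y/length, z/length)

# The classic flat normal-map blue (128, 128, 255) decodes to a vector
# pointing almost exactly straight out of the surface (tangent-space +Z).
print(decode_normal(128, 128, 255))
```

    This is why untouched areas of a tangent-space normal map look uniformly light blue: they all encode "facing straight out".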

    UV Mapping. This basically tells the game engine how to apply a texture to a model, and you need to learn how to make a proper UV map. With a UV map you are taking your 3D model and flattening it out onto a 2D plane. You don't want to stretch the faces of your model more than you have to, and a smart UV layout can make texturing easier. You can think of UV mapping like those fold-up papercraft models where you glue tabs together; the main differences are that you can stretch faces, and you do not want too many seams. You should really plan out your seams and start from there. If you were going to UV map a can, you would make the top and bottom their own UV islands (pieces); the side would have one seam and unwrap like a paper label that wraps around the entire can.
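    The "paper label" unwrap of the can's side is just a cylindrical projection, which can be sketched in a few lines of Python (an illustration with made-up function names, not a real unwrapping tool):

```python
# Sketch: cylindrical UV projection, i.e. the one-seam "label" unwrap of a
# can's side. u is the angle around the cylinder, v is the height.
import math

def cylindrical_uv(x, y, z, height):
    # u: angle around the Y axis, remapped from [-pi, pi] to [0, 1].
    # The single seam sits where the angle wraps around, like the glued
    # edge of a paper label.
    u = (math.atan2(z, x) + math.pi) / (2.0 * math.pi)
    # v: height along the can, remapped to [0, 1].
    v = y / height
    return (u, v)

# A point halfway up a can of height 2, sitting right on the seam side.
print(cylindrical_uv(-1.0, 1.0, 0.0, 2.0))  # (1.0, 0.5)
```

    The top and bottom caps would get their own flat (planar) projections, which is why they become separate UV islands.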

    Texturing. Actually, before texturing we should talk about baking.

    Baking. Basically, baking allows you to take extra information from your model and apply it to a texture. Often this involves a high-poly model, but you can bake almost anything: lighting information, material IDs (which parts of the model are going to be which material), gradients, normal maps, displacement maps, ambient occlusion, cavity maps, etc. The maps you bake out will help you start your texturing process. You'll almost always want at least an ambient occlusion map; material IDs really help with PBR rendering as well, and quickly let you set up masks in Photoshop.
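    The "material IDs become masks" step is simple enough to sketch: each pixel of a baked ID map holds a flat color, and each color selects one material. The ID colors and the tiny 2x4 image below are made-up example data:

```python
# Sketch: turning a baked material-ID map into per-material selection
# masks, like the masks you'd load into Photoshop.

WOOD = (255, 0, 0)    # assumed ID color painted onto the wood parts
METAL = (0, 255, 0)   # assumed ID color painted onto the metal parts

def id_mask(id_map, id_color):
    # White (255) where the pixel matches the ID color, black (0) elsewhere.
    return [[255 if px == id_color else 0 for px in row] for row in id_map]

id_map = [
    [WOOD, WOOD, METAL, METAL],
    [WOOD, WOOD, METAL, METAL],
]
print(id_mask(id_map, WOOD))  # [[255, 255, 0, 0], [255, 255, 0, 0]]
```

    Real bakers anti-alias the edges, so in practice masks are thresholded or blended rather than matched exactly, but the idea is the same.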

    Texturing. This really depends on your game, your game engine, and target specs. The three main types you are going to see are hand-painted, physically based, and everything else (non-physically based). Games are moving to physically based lighting: Unreal Engine 4, CryEngine 3, Unity, and a ton of in-house engines are adopting it. There's too much to explain in just one paragraph, so I'll link you to more information in other threads if you need help with it. Basically, with PBL you define materials based on real-world values. There's a base color, metalness (is this a metal or not), a gloss value (how tight or glossy the reflections are), plus whatever other textures you want, like normal maps. It can be easier if you are just starting out, because it's very straightforward. It's still new, so there are a lot of different names, terms, and variations on the PBL workflow, but in the end it'll look much more realistic.

    Hand-painted texturing is what you see in games like World of Warcraft or League of Legends; Dota 2 is kind of a mix, and Darksiders 2 is kind of a mix as well. With hand-painted textures you are adding a lot of lighting information directly into the textures. You'll often be painting in a lot of details on a Wacom or similar tablet, and it requires some pretty good 2D art skills. Hand-painted work tends to target lower specs, so there will be a lot of tiling textures, and you might not use normal maps.
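    The metalness idea can be shown with a small sketch of the common metal/roughness convention: one base-color texture feeds both the diffuse color and the reflection tint, switched by the metalness value. The function name and the ~4% dielectric reflectance default are illustrative assumptions, not any engine's exact shader code:

```python
# Sketch: how a metalness workflow splits one base color into diffuse and
# specular (reflection) colors. Non-metals keep the base color as diffuse
# and get a neutral ~4% specular; metals lose their diffuse and tint their
# reflections with the base color.

def split_base_color(base_color, metalness):
    dielectric_specular = 0.04  # common default reflectance for non-metals
    diffuse = tuple(c * (1.0 - metalness) for c in base_color)
    specular = tuple(dielectric_specular * (1.0 - metalness) + c * metalness
                     for c in base_color)
    return diffuse, specular

# A gold-like metal: no diffuse, base color becomes the reflection tint.
print(split_base_color((1.0, 0.77, 0.34), 1.0))
# Blue plastic: base color stays diffuse, specular is a neutral 4%.
print(split_base_color((0.2, 0.4, 0.8), 0.0))
```

    This is why a metalness map is usually near-black or near-white: in-between values describe materials that rarely exist in the real world.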

    NURBS. You won't ever use them for games, at least for the foreseeable future. Basically, instead of working with vertices you are working with curves. The advantage is everything is perfectly smooth: you can zoom in as much as you want and never see edges. The cons are that the tools for modeling with NURBS aren't great, and they don't work for games.
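    The "curves instead of vertices" idea is easiest to see in the simplest curve family. Full NURBS add knot vectors and rational weights, but a plain cubic Bezier, evaluated here with de Casteljau's algorithm, already shows the key property: a handful of control points define a curve you can sample as finely as you like, with no facets. This is an illustrative sketch, not modeling-tool code:

```python
# Sketch: evaluating a cubic Bezier curve with de Casteljau's algorithm
# (repeated linear interpolation between control points).

def lerp(a, b, t):
    return tuple(ai + (bi - ai) * t for ai, bi in zip(a, b))

def bezier(p0, p1, p2, p3, t):
    # Collapse four control points down to one point on the curve.
    a, b, c = lerp(p0, p1, t), lerp(p1, p2, t), lerp(p2, p3, t)
    d, e = lerp(a, b, t), lerp(b, c, t)
    return lerp(d, e, t)

pts = [(0.0, 0.0), (0.0, 1.0), (1.0, 1.0), (1.0, 0.0)]
print(bezier(*pts, 0.5))  # the curve's midpoint: (0.5, 0.75)
```

    A game mesh, by contrast, has to pick a fixed set of vertices up front, which is why NURBS surfaces get converted to polygons before they ever reach an engine.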

    Workflows. Basically the route you take with a model; you can start and finish a model a dozen different ways. You could start a model in ZBrush, Maya, or a dozen other applications; you could sculpt once you have a high-poly model, or you could do pretty much everything in ZBrush. You can use a few different applications to help you texture, but a workflow is just the flowchart of what you used to start, work on, and finish a model.

    Rendering. For games, you really want to use a real-time engine to display a model, so you get an accurate representation of what it's going to look like in the context it's being made for. The main offline renderers are Mental Ray and V-Ray; they are used for commercials, movies, and still shots, not games. For games you will want to use a viewport shader (the Xoliul shader for Maya), a model viewer (Marmoset Toolbag 2), or a game engine (Unreal Engine 4, CryEngine 3, Unity).