If I create a new cube, inset all the faces, then add a projection modifier and push the cage out, it happens there too. It may be something to do with the fact that the new faces don't have smoothing groups. But I don't want to hijack your thread with this issue.
@kurt_hectic - I use Max 2013 and it does nothing for me. The mesh is an editable poly, but nothing happens when I run the script. You should check out the Syncview script, by the way. It does the same thing.
I assume for some workflows you could just use normal thief to grab normals from a pre-chamfered model that has been smoothed how you want it initially.
That pre-chamfered model may also serve well as an LOD version/base.
Joost - Yeah, silhouette is another question. It's also a lot faster to do than FWVN. I think it looks better because you get the "fat"-looking rounding, as your example also shows.
I decided to muck around with this as well (having first seen it used in AmsterdamHiltonHotel's Atchisson) and when reading this thread I figured bevels would be the clear winner. However after trying it I agree; loops give those fat bevels that just read better at distance as long as you're going for that look.
Here's a comparison of loops and bevels:
In simple examples like a cube it seems that bevels are cheaper and visually better (similar soft shading as well as silhouette changes) but in practice the shapes you have often limit how large your bevels can be.
Originally I was going for FWVN w/ bevels and vcols; what I might do is a comparison of FWVN w/ loops against a lowpoly baked from a sub-d highpoly. Currently this is 25k tris with a loaded mag, which isn't insane.
Great comparison Bek! Yeah, 25k seems fine to me nowadays, especially if it's worth it.
However, I think you could have used a lot wider bevels. That said, what you are saying about their size is true. Could you share wires please, so people can see how it works and looks topologically?
Eh, particularly with pieces that have long, thin segments or parts of greatly varied scale, you're often limited to the smallest bevel. You can only bevel so much before turning some parts into mush. Having said that, both techniques are totally worth using depending on the situation, but overall loops are more adaptable. And if you're going to be UV'ing or making LODs by hand, then loops are definitely friendlier to work with. As always, though, you should check your triangulation to make sure you don't have any super long/thin triangles. In taking these screenshots I noticed a few areas I'll tweak, but the model is still a WIP; I'll probably spend some extra geo here and reduce some other parts. Anyway, wires:
Yeah, loops are so much easier to LOD. Thanks for the images! I have one question: wouldn't it be worth it to simply normal-map those small holes on the left side, instead of modeling them? You know you can combine these techniques with normal-mapped details, and sometimes you can save a lot of geo by doing this. You're going to normal-map the star too, right?
Actually I was going to use only geometry and vertex colours; no UV's/textures for something different and a little bit stylised, but now I'm thinking it could be cool to compare various styles. Currently the only area that suffers from this choice is the grip; it'd cost a lot of geometry to get the diamond pattern on there — and that's something that's essentially free with a normal map.
I think if I were to use both loops and a normal map it would be a production / timesaving thing rather than a common approach. Less important pieces might just get some loops while the more important stuff gets a highpoly counterpart.
Since normals maps are so essential to a 'realistic' style (not just in terms of soft edges but also minute surface detail / material definition) I'm curious what can be done without one — I figured the bakelite grip of the makarov would be recognisable even without textures. At a glance you can't really tell the difference between the 'medpoly' and the sub-d mesh which is a good sign.
"At a glance you can't really tell the difference between the 'medpoly' and the sub-d mesh which is a good sign."
Medpoly. Yeah, that's what I call them too. And yeah, it's hard to tell the difference sometimes. Personally I can spot it now and then, but only in the case of loops. Bevels can still fool me.
So you are going for something like AmsterdamHiltonHotel's weapons? As I remember, they were also stylized and had no textures. You should definitely make a realistic version too, if you have time, so people can see it's an actually usable technique. What could show it nicely, if not a weapon?
How do you usually split lightmap UVs on such bevels?
I had some success with placing islands close to each other but with a small gap. Then interpolation kicks in and the bevel is lit properly. Still, it's not a true solution as I don't have the time to pack lightmaps manually every time.
The Order: 1886 used such bevels massively, so they definitely had to come up with a solution.
I don't understand the issue. I simply UV for lightmaps like I always do. The seam will be on one side of the chamfer. Also, lightmapping usually needs precise UVs (straight UVs if possible). Can you post an image of what you tried, what you get, and what you want to get?
I see the problem, but this is more of a lightmapping issue. What would be the size of this object in a game? Because if it's like 2 meters tall, then 16 or 17 is obviously not the way to go. But if it's only like 10 cm, then you probably won't notice it as a player. Does Unity use pixel padding on lightmap generation? Anyway, sometimes for this kind of shape it's better to keep that part together on the UVs. If you take a look at this doc (UE), they try to minimize the seams on the lightmap UVs, because otherwise they can get similar issues. You should always apply these rules when working with lightmaps. These artifacts can appear even when you use edge padding. One solution is to increase the resolution depending on importance and size, and another is to use almost uniquely made lightmap UVs while keeping the rules in mind.
If you take a look at your examples, you can see the shadowing doesn't properly line up with the geo, so one problem is obviously insufficient resolution. Texture res is one of the most expensive things nowadays, and one directional lightmap costs the same as one full-colour (RGB or RGBA) texture sample. You can save a lot by using smaller lightmaps, but you shouldn't go a lot smaller than the original map if it's a unique thing and not something tiling. So if it's something with a unique 2048 map (albedo, normal, roughness), then you can simply use 256, or even 512 if the lightmap requires it (with high geo detail, it easily can). It's just another texture sampler. Baking will take longer, but this is one way to reduce errors. Keeping the object's size, the distance from the player, and its importance in mind is still a requirement, though.
The other solution is to simply use as few seams as possible. The link shows how, but you will usually need a uniquely made UV.
Also... in full-motion gameplay, with a fully textured and shaded asset, it probably won't be this noticeable.
And also, your object could be UV'd from 1 or 2 islands, so you wouldn't get these issues.
Big thanks.
I hadn't even thought about using a single island before; I was worried too much about stretching. But it's a lightmap, so yeah, I tried it and it looks fine! It required a bigger resolution though (so pixels could hit the bevels precisely).
As for Unity: it applies padding (2 pixels), but you can't specify, as you would in Unreal, 16x16 px. Instead, you specify a global resolution in texels per meter. Then each object's lightmap UVs get scaled according to its volume (
It has. If you take a look at post #104 by Avvi then you can see how.
True, and I'm totally aware of that. But it is very impractical and has the potential of being inaccurate on complex or organic models. That's what I didn't like about it.
I should've worded my previous post better, though... so, sorry if I misled anybody.
Reading this tonight has been really interesting. Going to have to explore this. Also I found a script that might be useful for Blender users in addition to Blend4Web:
It's a weighted normals calculator; I found it to be a bit nicer than Blend4Web in some situations. That said, reading the Blend4Web blog, someone has already suggested weighted vertex normals to the developer.
EDIT:
Made a crude tutorial on how to achieve this in Blender:
Everyone who has tried existing FWN scripts probably knows this small problem:
To fix it in 3ds Max, you just add another Edit Normals modifier to the stack, select the offending faces and use the Edit Normals -> Average -> Selected command. But what if we automated that too, with a way to exclude chamfer face influence? The end result should look like this:
I'm not familiar with MaxScript types and syntax, unfortunately, but I think I know a very simple way to correct uneven normals on faces with matching planes after existing face-weighted-normal scripts do their job. It's a cleanup operation of sorts, which ignores all chamfer faces but averages all proper faces per plane. Here it is:
Averaging normals for faces with the same orientation
Input: one float value (vertex distance threshold).
1. Add Edit Normals modifier to the stack
2. Use Edit Normals -> Select By -> Face command
3. Fetch a collection of all faces in the mesh (named e.g. faces)
4. Create an array of bools (named e.g. facesChecked) with the length of faces array - they will mark whether a face was already checked
5. Create an array of vector3 values (named e.g. facesOrientations) with the length of faces array - they will store the orientation of a face
6. Loop through whole collection of all faces in the mesh. For every index i:
6.1. Check the distances between the 3 vertices making up the face - if any is smaller than the vertex distance threshold input, set facesChecked[i] to true and jump to the next i index; otherwise, continue at this i index
6.2. Calculate the orientation (plane) of a face, using only positions of 3 vertices making up a face (do not use vertex normals!)
6.3. Write the calculated orientation to facesOrientations[i]
7. Loop through whole collection of all faces again. For every index i:
7.1. Find whether facesChecked[i] is false, if it is, continue at this i index, if not, jump to next i index // if facesChecked[i] is true at this point, then we have either already averaged all faces with this orientation, or this is a tiny chamfer face that was discarded in 6.1
7.2. Set facesChecked[i] to true.
7.3. Fetch orientation of a face using facesOrientations[i]
7.4. Create an array of ints (named e.g. facesOrientationsMatching)
7.5. Loop through whole facesOrientations array. For every index j:
7.5.1. Find whether facesChecked[j] is false, if it is, continue at this j index, if not, jump to next j index
7.5.2. Find whether facesOrientations[j] is equal to facesOrientations[i]; if it is, add j to the facesOrientationsMatching int array
7.6. Select face faces[i]
7.7. Loop through whole facesOrientationsMatching array. For every index j:
7.7.1. Set facesChecked[j] to true
7.7.2. Add face faces[j] to selection
7.8. Use Edit Normals -> Average -> Selected command
7.9. Clear selection
Can someone attempt to translate this into MaxScript?
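In the meantime, here's the algorithm above as a plain-Python sketch rather than MaxScript (function and variable names are my own, and it assumes a simple triangle list rather than Max's mesh interface). It returns the groups of coplanar faces that each Average -> Selected call would act on, after discarding tiny chamfer faces:

```python
import math

def coplanar_face_groups(verts, faces, min_edge=0.05, angle_eps=1e-3):
    """Group faces that share one orientation, skipping chamfer faces.

    verts: (x, y, z) position tuples; faces: triangles as index triples.
    A face with any edge shorter than min_edge is treated as a chamfer
    face and discarded, mirroring step 6.1 above. Each returned list
    holds the indices of faces an Average -> Selected call would hit.
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])

    def length(v):
        return math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])

    checked = [False] * len(faces)       # step 4
    orientations = [None] * len(faces)   # step 5
    for i, (a, b, c) in enumerate(faces):            # step 6
        va, vb, vc = verts[a], verts[b], verts[c]
        if min(length(sub(va, vb)), length(sub(vb, vc)),
               length(sub(vc, va))) < min_edge:
            checked[i] = True            # 6.1: tiny chamfer face, skip it
            continue
        u, w = sub(vb, va), sub(vc, va)  # 6.2: orientation from positions
        n = (u[1] * w[2] - u[2] * w[1],
             u[2] * w[0] - u[0] * w[2],
             u[0] * w[1] - u[1] * w[0])
        l = length(n)
        if l == 0.0:                     # degenerate face, ignore it
            checked[i] = True
            continue
        orientations[i] = (n[0] / l, n[1] / l, n[2] / l)  # 6.3

    groups = []
    for i in range(len(faces)):          # step 7
        if checked[i]:
            continue
        checked[i] = True
        ni = orientations[i]
        group = [i]
        for j in range(i + 1, len(faces)):           # 7.5
            if checked[j] or orientations[j] is None:
                continue
            nj = orientations[j]
            dot = ni[0] * nj[0] + ni[1] * nj[1] + ni[2] * nj[2]
            if dot > 1.0 - angle_eps:    # 7.5.2: same orientation
                checked[j] = True
                group.append(j)
        groups.append(group)
    return groups
```

A MaxScript port would replace the return value with an actual face selection on the Edit Normals modifier and call Average on it per group.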
Both had the same issue I highlighted in the screenshots. Looks like I missed another script. I searched Google for "syncview 3ds" just now, but I haven't found a script with that name, unfortunately (only some weird AutoCAD utility).
Edit: Aha, found it on the wiki. Still, manually selecting faces isn't too nice on large meshes. I think the algorithm above can be easily modified to work with syncview or just give you useful selections.
Hmm. I wonder why nobody has posted about this yet. We are all attempting to repair the broken face shading left on our preexisting faces when we average-smooth the geometry after chamfering. But what if the chamfer process was never actually allowed to modify any normals of your preexisting hard-edged geometry, and instead used those preexisting normals for its own boundary vertices?
The stock 3ds Max 2015 Quad Chamfer modifier can't do that, but the standalone Quad Chamfer plugin everyone knows and loves can. Try this:
Looks like the chamfer options "smooth chamfers only" + "smooth to adjacent" do that (but I'm on 2016).
But if you collapse it, or add an Edit Normals or Edit Poly, it gets overridden, so... meh. Too bad it's not editable. Maybe an export/import would write the custom normals.
I mentioned it in the fourth post on the first page. It's a massive time saver for my workflow. For simple objects you usually don't need to collapse the modifier anyway. There are ways around it though. For example you can assign a separate material ID to all the chamfers, collapse the modifier, make your edits, select the inverse of the mat ID and then run Syncview's script. Obviously doesn't work if you're working with multiple mat IDs though.
Unfortunately that's my thread, so it doesn't reveal anything new at all (well, beyond confirmation from one of CIG's artists that I got the gist of it right). It's only what we all already knew from the GDC and Polycount threads on FWN/deferred decals, with slightly simplified explanations.
For Blender users using "data transfer", I created a simple setup that someone might find useful as well.
1. before starting actual modeling create a linked duplicate of your object
2. add a "split edge" modifier to the duplicated object, turn off "edge angle"
3. add a "bevel modifier" to the original object, switch limit method to "weight"
4. add a "data transfer" modifier to the original to transfer custom normals, and pick the duplicate as the source, of course. I usually switch to local space and place the objects next to each other
5. start editing your object; assign bevel weights to the edges you want to bevel, and set all beveled edges to sharp.
That's it, this way you can see the result of your custom normals while editing.
Of course, this might not work in every situation.
I've noticed some small problems with this technique when it comes to UV unwrapping, specifically the relax tools; they don't seem to perform as expected. I can only assume this is because they use vertex normal data to operate correctly?
Anyone else experience errors like this?
I haven't met this issue, but I think it's because I always unwrap first and add the custom normals at the very end. This can be a simple workaround.
I add them at the very end because Max likes to reset the normals on detach and when adding certain modifiers. Adding them after the model is done won't leave a chance for this to happen.
I've been trying to replicate this method within Maya and create a pipeline around it, but to no avail. Either Maya calculates it incorrectly or I'm misunderstanding the process quite badly and messing it up.
However, I have been able to replicate smooth custom normals akin to the chamfered-cube examples in the original post by simply setting the normal angle to 90 degrees myself. However, I'm fairly positive this is far from the optimal process, especially one that can be applied to a broader art pipeline.
- After more experimentation and rereading the thread, I've found a solution to my questions.
A huge thank you for everyone's documentation in this thread so far, though!
Excuse me if this is basic, but I get how FWVN is used on relatively hard edges to get a nice highlight; is it also used on softer edges? For example, in the screenshot below, would the curved wall panels on the top right be created via FWVN, or would they be support loops? Part and parcel of this question: what dictates when you use support loops and when you use FWVN? Is there a rule of thumb, or is it just what fits where?
This is great info and has been really helpful. I just wanted to share that I found Shawn Olson's Normal Tools http://dev.wallworm.com/topic/68/normal_tools.html really useful when editing normals, so for anyone doing a lot of this it may be worth looking into.
Sorry for the delay... @Artist_in_a_box - I would say they are done with FWVN. Use one or the other technique. Personally I prefer FWVN because it gives you a better silhouette (you are actually rounding the corners), plus the nice edge shading. Support loops give you only the nice edge shading; they don't improve the silhouette, and they cost more polygons.
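For anyone new to the thread, the core of FWVN can be sketched in a few lines of plain Python (names and the triangle-list representation are my own; DCC tools and the scripts linked in this thread do this internally):

```python
import math

def face_weighted_vertex_normals(verts, faces):
    """Face-weighted vertex normals: each vertex normal is the
    area-weighted average of the normals of the faces touching it,
    so large flat faces dominate and narrow chamfer faces barely
    tilt the result.

    verts: (x, y, z) tuples; faces: triangles as vertex-index triples.
    The unnormalized cross product of two triangle edges has a length
    of twice the triangle's area, so summing raw cross products gives
    the area weighting for free.
    """
    acc = [[0.0, 0.0, 0.0] for _ in verts]
    for a, b, c in faces:
        va, vb, vc = verts[a], verts[b], verts[c]
        u = (vb[0] - va[0], vb[1] - va[1], vb[2] - va[2])
        w = (vc[0] - va[0], vc[1] - va[1], vc[2] - va[2])
        n = (u[1] * w[2] - u[2] * w[1],   # cross(u, w): length = 2 * area
             u[2] * w[0] - u[0] * w[2],
             u[0] * w[1] - u[1] * w[0])
        for i in (a, b, c):
            acc[i][0] += n[0]
            acc[i][1] += n[1]
            acc[i][2] += n[2]
    out = []
    for v in acc:
        l = math.sqrt(v[0] ** 2 + v[1] ** 2 + v[2] ** 2)
        out.append((v[0] / l, v[1] / l, v[2] / l) if l else (0.0, 0.0, 0.0))
    return out
```

With a wide face meeting a thin chamfer strip, the shared vertices end up pointing almost straight along the wide face's normal, which is exactly the flat-with-soft-edges look being discussed.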
Thank you all for such a treasure trove of information! Only a month ago, I had no idea that editing vertex normals could be so beneficial.
Any Blender users following this thread? Check out my recent addon. You may find it useful for workflows that involve weighting vertex normals. https://github.com/fedackb/yavne
It seems that the ByteHazard script for 3ds Max doesn't work with 2014 and upwards. Does anyone have a working version or any sort of substitute? What's the best plugin for editing vertex normals in Max currently?
There is a convenient approach for this in Blender. I have a pretty old Max, so it would probably work in a modern Max too.
You make all the edge bevels in a modifier. Blender allows every beveled edge to have its own chamfer width via "weight" adjusting, in real time. Then make a copy of the object and turn the bevels/chamfers into multi-segment curved ones (one click). Rarely, a few extra subdivisions/edge loops might also be necessary. Then just transfer normals, or export both into Max and use the Normal Thief script there.
I had always done the transfer in Max before, but these days Blender seems to do a correct FBX export of normals, and its Data Transfer modifier works no worse than the Thief script. I'd say even better, since it transfers in real time, like Maya.
holy shit dude!!! thanks for that data transfer tip!!!!!!!!
Hi guys, this is an awesome technique and we're making extensive use of it for Horror City. I've written a script that accelerates the whole process and I'd love to share it with you guys. To use the script, simply copy it to your scripts folder and then create a shelf button that executes "MADE_forceVertexNormalsToFace". Now you can select all faces where you want the vertex normals forced to the face normal and press that shelf button. I've uploaded the script to GitHub and you can get it here https://github.com/MADEAPPS/Maya2016/blob/master/MADE_ForceVertexNormalsToFace.mel
Awesome thread. I've been seeing more of this topic being discussed lately. @mannyn I thought Maya did this automatically in every instance. I've only tried it with very basic shapes, so I am not 100% sure. Can you show any examples? Also, here is how it works in Maya. I went looking for scripts that do this, but I realized Maya does it automatically. Here's an example with a basic bevel and softened edges.
Initially I also thought that "Soft Edges" would do the same, but on closer inspection it doesn't hold up. The script also helps with different bevel angles, not only 90° angles. With Maya's soft-edges functionality there is still a gradient; even though it's not as strong as a regular "Average Normals" invocation, it's still there.
If you really want your models and lighting to pop, you have to be precise about the normals. Otherwise it will look kinda right, but not perfect, similar to a normal map that was baked wrong. Check out the image: the yellow normals are with the script I've posted; the green normals are regular Maya "smooth normals" or "set normal angle". The fact that the normals are not perfectly orthogonal results in the gradient everybody hates.
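A tiny numeric illustration of where that gradient comes from (plain Python; the two face normals are hypothetical, standing in for a hard 90° edge):

```python
import math

# Face normals at a hard 90-degree edge: a top face and a side face.
top = (0.0, 0.0, 1.0)
side = (1.0, 0.0, 0.0)

# "Average Normals" / soft edges: the shared corner normal becomes the
# normalized sum of the two face normals.
s = (top[0] + side[0], top[1] + side[1], top[2] + side[2])
l = math.sqrt(s[0] ** 2 + s[1] ** 2 + s[2] ** 2)
averaged = (s[0] / l, s[1] / l, s[2] / l)

# Angle between the averaged corner normal and the top face's true normal.
# The 45-degree error is interpolated across the face, which is the gradient.
dot = sum(a * b for a, b in zip(averaged, top))
angle = math.degrees(math.acos(dot))

# Forcing the corner normal to the face normal keeps the error at zero,
# which is what the posted script does.
forced = top
```

So soft edges leave every corner normal 45° off its face, while forcing to the face normal keeps each face perfectly flat-shaded.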
An even easier method; I just found it in Blender's manual. The Data Transfer modifier is SO powerful! Vertex colors, normals...
https://www.blender.org/manual/modifiers/modify/data_transfer.html
Here's a script (not mine; check out the whole topic) that could help you quickly fake rounded edges in 3ds Max:
http://www.polycount.com/forum/showpost.php?p=2337257&postcount=47
Yes. Just enable 'Tangent Space' in the FBX exporter. Then it works great in Unity and xNormal.
I spent a while until I hit this sharp-edges problem. It's obvious why it happens.
Here I increased the lightmap resolution by just 1 texel, from 16 to 17, and it looks nice.
Unity scene, if someone'd like to experiment: https://drive.google.com/file/d/0B3aZlreklH8wTW9sbXNOMXVEUFE/view?usp=sharing
Link: https://udn.epicgames.com/Three/LightMapUnwrapping.html
Oh, and another thing to keep in mind is the padding between the UVs. You can get bleeding very easily.
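As a rough rule of thumb for that bleeding issue (a sketch of the geometry involved, not an engine-specific formula; the exact requirement depends on the baker's dilation settings):

```python
def min_uv_gutter(lightmap_px, padding_px=2):
    """Minimum UV-space gap to leave between lightmap islands so that
    `padding_px` texels of edge padding/dilation cannot bleed from one
    island into its neighbour at a given lightmap resolution.

    Each island needs its own ring of padding texels, so the gap must
    hold two rings: 2 * padding_px texels, converted to UV units.
    """
    return 2 * padding_px / lightmap_px
```

Note how the required gap grows as the lightmap shrinks, which is why low-res lightmaps bleed so easily between tightly packed islands.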
Weighted Normals Calculator addon: http://blenderartists.org/forum/showthread.php?372785-Addon-Weighted-Normals-Calculator
Blender tutorial: https://www.youtube.com/watch?v=oAGEGBulzSU
https://www.youtube.com/watch?v=34giO1tu43M
Covers this technique pretty well, goes over the Star Citizen approach