Now that this is done, I can actually start critiquing Blender on most of its aspects, since I've touched most parts of the software...
What can I say...
I think everything holds up pretty well. Modeling is extremely fun, and so is UVing.
Rigging is alright but, from my point of view, lacks some flexibility if you don't know much Python coding (you are limited to constraints and drivers; no rigging nodes are available (yet?)).
Animation is... mm, there are good points and there are bad points.
My main concern is the time slider... there's absolutely no use to it except scrubbing your animation. I think it would be wise for the devs to take a cue from Maya on this one and let the user edit his keys directly inside it.
Instead, we are obliged to open another panel (the Dope Sheet) just to do some basic key editing. Not only does it take up space on the screen, it slows down the workflow and gets tiresome quickly...
I didn't touch the rendering and the Cycles engine too much because that's a whole other world to master...
If there's going to be any chance of resolving the startup time issue, you're going to have to head over to developer.blender.org and give the developers a reproducible test case and instructions that clearly illustrate the problem. It could be your scene with the character, or it could be some other file that takes a lot more time to open on 2.76. I can't seem to find any .blend file on my machine that loads less than instantly on 2.76, even using Windows and an HDD, so there's not really much I can do to help other than direct you to the proper place to report, isolate, and fix bugs. Edit: Also, if you want to learn Cycles, I'm here to help. It's my favorite part of Blender.
I have a question about custom normals, maybe someone can help me with that.
I can create custom normals using the Data Transfer modifier, that's no problem at all, but when applying the modifier, tons of edges get marked as sharp.
I wonder why that is. Does it have to do with the way the data is stored in Blender?
If the sharp edges are removed the custom normals are ruined.
So far I haven't tested what happens when exporting the model; it might not be an issue.
I just checked prime8, I haven't previously and don't in my latest test get any sharp edges. Do you want to upload a file and we can take a look?
I tested it on different systems actually.
Are you sure you switched "sharp" on in the display panel? Because it's off by default if you don't add sharp edges manually.
I see what you mean now Prime8. I never noticed that before.
It's kinda strange though: sometimes sharp edges don't show up when I apply the modifier, and sometimes all edges are marked sharp.
What makes Maya the top contender in terms of rigging is its ability to offer multiple ways and options of doing things, whether you're a professional Python rigger or just an amateur like me who likes to build his own solutions along the way.
In Blender, you have access to some "deformer" modifiers, to constraints, and of course drivers. If you don't know any coding, this is all you have.
Python coders have access to expressions and the command menu for more advanced work, but if you don't know much of it, you can be pretty limited in what you can do.
In Maya, not only do you have access to constraints, drivers, expressions, and set driven keys, but lots of riggers also use the node editor to find pre-programmed rigging nodes for more complex work without having to actually code.
For example, let's say you want to make an arm stretch when the IK controller goes beyond its limit. In Maya, you can use a "condition" node that, just like a normal shading node, you connect to the joints. Maya also has pre-programmed math operation nodes like "multiply divide", "get distance", etc.
However, in Blender, as there are no rigging nodes, all of these math operations have to be done using either scripts or expressions in the driver.
For example, to do conditional operations, I need to know the proper code to execute an "if true/else" operation (whereas in Maya, this is a pre-programmed node usable by non-coders).
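For what it's worth, the if/else a Blender driver needs is just a Python conditional expression. Here's a sketch of the stretch-factor math such a driver might evaluate — the names and the rest-length threshold are made up for illustration, not anything from an actual rig:

```python
def stretch_factor(ik_distance, rest_length):
    """The kind of expression a Blender driver evaluates: only start
    stretching once the IK target passes the chain's rest length."""
    # Python's conditional expression plays the role of Maya's condition node.
    return ik_distance / rest_length if ik_distance > rest_length else 1.0

# In Blender you'd type the one-liner straight into a driver's Expression
# field, with `dist` and `rest` defined as driver variables, e.g.:
#   dist / rest if dist > rest else 1.0
print(stretch_factor(1.5, 1.0))  # target pulled past the limit -> 1.5
print(stretch_factor(0.8, 1.0))  # within the limit -> no stretch, 1.0
```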
But I remember reading in an important thread on the Blender forum that, for the Blender 3 project, they were thinking of implementing rigging nodes (honestly, that would be awesome!).
Hey Jed, I do need some explanation of how Blender's shading nodes work.
I even watched a complete video by Andrew Price explaining how he works, but I had no luck getting an acceptable result (a friend of mine said it was way easier than Maya/Mental Ray)...
In the end, I rendered my animation with Blender's internal render.
In Maya, when I use physically based shaders (like the MIA materials), I'm more used to the traditional method of having all of my surface parameters (diffuse, glossy, etc.) already established, so I just scale them up or down or connect an image file.
In Blender, I understand, you have to add every parameter yourself to the scene, right? :)
I find this method a little bit confusing when I have to "mix" two shading parameters together.
For example, when I mix a Diffuse BSDF and a Glossy BSDF together, I'm kinda confused about how the "factor" thingy works, and things get even weirder when you start adding other shaders like SSS. It looks like the workflow is to "multiply" shader outputs, but I find it weird as I never get the expected result...
Also, as far as I can tell, there's no way to control the ambient occlusion properly.
Even more, there's no way to manage light bounces and scattering from indirect lighting.
Also, I couldn't find a way to control the "hardness" of my shadows.
And even worse, the "lamps" are quite limited in their parameters. How do I change the shadow's color? The shadow's softness? Light fog?
I got lost really quickly...
Edit: Also, why is everything so freaking noisy!!
Here's a (sad) attempt at putting AO on my character... I can never get the desired effect in which only the black parts of the AO are visible (a multiply function). Even tried controlling the threshold with a ramp, but no luck...
That's because it's an unbiased renderer: you need more samples so it can calculate the output picture in finer detail, and more samples = more render time. If you use your GPU you can speed it up a lot compared to the CPU; this obviously depends on how many CUDA cores you have. It should work on AMD cards too, by the way.
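The "more samples = less noise" relationship is just Monte Carlo averaging: the noise (standard error) falls off as 1/√N, so halving the grain costs roughly 4× the samples. Here's a tiny standalone sketch of that falloff — nothing Blender-specific, just averaging random samples the way a path tracer averages light paths:

```python
import random

rng = random.Random(42)

def noisy_estimate(n_samples):
    # Stand-in for one pixel: average n random samples of a fixed signal.
    return sum(rng.random() for _ in range(n_samples)) / n_samples

def spread(n_samples, trials=2000):
    # Standard deviation across many independent estimates — this is the
    # "grain" you see in the render at a given sample count.
    estimates = [noisy_estimate(n_samples) for _ in range(trials)]
    mean = sum(estimates) / trials
    return (sum((e - mean) ** 2 for e in estimates) / trials) ** 0.5

# Quadrupling the samples should roughly halve the noise (1/sqrt(N)).
print(spread(16), spread(64))
```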
OK, the first thing you should know about Cycles is that you should never, ever use the ambient occlusion node. It basically just adds ambient diffuse light to your render. Here's the basic Cycles shader.
It gives you Lambertian diffuse and specularity with no Fresnel response. The Mix factor mixes between the first shader and the second shader. It's the probability that a ray that gets shot at the surface will use the second shader instead of the first.
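Put another way: averaged over many rays, a constant Mix factor behaves like a linear blend, with fac = 0.2 meaning roughly "20% of rays see the second shader". A plain-Python sketch of that blend (the per-pixel contribution values are made up for illustration, this isn't the Blender API):

```python
def mix_shader(shader_a, shader_b, fac):
    """Blend two shader contributions the way Cycles' Mix Shader does on
    average: fac = 0 -> all A, fac = 1 -> all B, in between -> linear blend.
    (Per ray, fac is the probability of sampling B instead of A.)"""
    return (1.0 - fac) * shader_a + fac * shader_b

diffuse_contribution = 0.8   # hypothetical per-pixel values
glossy_contribution = 0.3
print(mix_shader(diffuse_contribution, glossy_contribution, 0.2))  # mostly diffuse
```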
It's important to note that the GGX that Blender has requires you to set the roughness texture to be read as Non-Color Data and to be squared, and if you made a glossiness map for Toolbag's Blinn-Phong you would have to put it through a curve and then square it to get a good result with GGX.
The curve inverts and adjusts the glossiness to the square of GGX roughness (roughly.) For a plain old UE4 roughness map you just need to square it with the Convert->Math node set to Multiply. Plugging in the diffuse texture is easy enough:
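The invert-and-square conversion described above can be sketched as a formula. Treat this as a rough approximation of the gloss-to-roughness mapping, not an exact match for any particular shader model — the real curve depends on the Blinn-Phong exponent convention:

```python
def gloss_to_ggx_roughness(gloss):
    """Approximate a Blinn-Phong glossiness value as GGX roughness:
    invert it (gloss 1.0 = mirror -> roughness 0.0), then square,
    matching the 'squared roughness' convention mentioned above."""
    return (1.0 - gloss) ** 2

def ue4_roughness_to_cycles(roughness):
    # For a UE4-style roughness map you only need the squaring step
    # (the Math node set to Multiply, with the map in both inputs).
    return roughness * roughness

print(gloss_to_ggx_roughness(0.9))   # very glossy -> very low roughness
print(ue4_roughness_to_cycles(0.5))  # 0.25
```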
From here you can add a normal map:
And maybe some Fresnel:
I've got to go now but that's how you put together a basic Cycles shader that will render 80% of things. Will post more later.
I can't provide the screenshots right now, but I figured out why some people didn't seem to have experienced that. It happens only when the source is set to smooth shading and has an Edge Split modifier, which is always my setup.
Finally found a few minutes to bash this together.
As described before, the source object has an Edge Split modifier active and is set to smooth shading. Of course, the object in the example wouldn't need that and could be used with flat shading and without the modifier.
By the way, I think it's time they implement a feature that allows numeric input of absolute values/coordinates when moving objects/components...
Use the Tab key. To move, for example, a cube to position x=3, y=7, z=9 you can press:
g / 3 / TAB / 7 / TAB / 9 and finally confirm with enter or a left click.
Obviously, please ignore the forward slashes... I put them only to separate the symbols.
Yeah, I guess, but I'm sorry, I meant in the Graph Editor haha... the "copy to selected" option is greyed out...
Edit:
By the way, thanks Jed, I've managed to get an acceptable result for my char following your previous post.:D
But gosh..Cycles is sloooooooooooow!!!
Seriously, why is that? I mean, for a quick noisy render it's alright, but when you want to start removing that grain, oh god, even for simple stuff it takes ages...
And I have an i7 3770K, which is not that bad... I heard we can use the GPU (I have an R9 270X), but can we actually share the render processing between the CPU and GPU, or is it one or the other?
It's one or the other. Your GPU is probably going to render faster than your CPU. LuxRender can use GPU and CPU together, but it's even slower than Cycles, plus IIRC it doesn't do motion blur, which you will probably want in an offline render.
Even a mobile graphics card like the 960M in my notebook has way more cores than the processor.
i7 4720 HQ = 4 cores
NVidia GTX 960M = 640 cores
Always enable the CUDA feature of Cycles, if it's supported.
Alternatively, there are online "render farm" services that charge pennies for the GHz/hour and can render your Blender scenes. http://rentrender.com/cycles-render-farms/
GPU compute has some limitations like texture memory and number of textures, and the AMD implementation can't use HDR textures or subsurface scattering, but given your simple aesthetic you shouldn't run into problems with it.
I'd also suggest checking CGCookie's "Using Cycles Non-Progressive Integrator" to help you get better performance out of Cycles. There are also other tutorials at CGCookie to help with that (the Light Path node is your friend).
There are a lot of factors that will affect render times. I find Cycles to be pretty good overall, and I've used it for some freelance work. I found that bucket size can affect render times a bit as well, but the speckled live preview is a boon for working fast.
You can also network render with other computers you may have, for a possible on-location render farm.
One thing to keep in mind about doing normals with the Data Transfer modifier is that it can split your normals where you previously had smooth faces. To counteract this you can set the blend mode of the data transfer modifier to Mix instead of Replace. This will keep normals that were together before together.
When the modifier is applied it sometimes splits the normals at face corners which is probably why you see those hard edges. However, UE4's import process welds any face corners that have the same normals, so it's not something you need to worry about as long as you set the Data Transfer modifier blending mode to Mix.
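The difference between Replace and Mix comes down to simple vector math on the per-corner normals. Here's my own simplification of what Mix does at a split corner — not Blender's actual code, just the blend-and-renormalize idea:

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v))
    return tuple(c / length for c in v)

def mix_normals(original, transferred, fac=0.5):
    """Roughly what the Mix blend mode does: blend the mesh's own normal
    with the transferred one, then renormalize. Replace would just return
    `transferred`, which can split corners that used to share one normal."""
    blended = tuple((1 - fac) * o + fac * t for o, t in zip(original, transferred))
    return normalize(blended)

n_original = normalize((0.0, 0.0, 1.0))      # the corner's existing normal
n_transferred = normalize((1.0, 0.0, 1.0))   # normal projected from the highpoly
print(mix_normals(n_original, n_transferred))
```

Since every corner that shared `n_original` gets blended toward the same target by the same factor, corners that matched before tend to stay matched, which is why Mix avoids the hard splits.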
Thanks JedTheKrampus!
I tried the Mix blend mode, but the result is actually different: the normals get slightly averaged.
But if UE4 import welds them back together it shouldn't be an issue, with UE4 at least.
Still wondering why the edges are marked as sharp.
Yes, the result should be a little different. UE4 won't merge the face corners together unless they have the same normals, which is why you probably don't want to use Replace on the Data Transfer modifier if you use it like I do (i.e. to smooth out bevels.) Also it's probably a good idea to mask out any sharp edges with a vertex group.
Data Transfer set to Replace is a good idea if you're using Nearest Corner and Best Matching Face Normal for some reason, but is a bad idea if you're using Projected Face Interpolated which is better for getting nice bevels on the lowpoly model.
I completed the Codecademy Python course with the aim of being able to do Blender scripting. Has anyone got any recommendations for more Blender-specific tutorials?
I used Replace with "Nearest Corner of Nearest Face"; that gave the best result in the tests I've made so far.
Mix doesn't deliver the result I'm looking for. Even on that beveled cube you can see triangulated shading in an otherwise planar quad.
As a quick test I exported and re-imported as FBX with different smoothing methods.
When using "face" or "normals only", the model doesn't have sharp edges after reimporting, but it keeps the custom normals and doesn't show any difference.
I've had some success with Projected Face Interpolated and Mix, but I see that after toggling the modifier on and off a couple of times to refresh it you can also get a good result with Nearest Corner of Nearest Face for smoothing out bevels if your source mesh has enough geometry.
Hello guys. Does anyone here use 3D Coat for texture painting? I used Blender for some texture painting (see below), but it doesn't have layers, so the result didn't come out nice. Any tips for me? I'm only using the default brush (TexDraw). Thanks.
[SKETCHFAB]31bb9be9f3694aef8a68e93531b6f3e4[/SKETCHFAB]
[SKETCHFAB]8a76bbc52b0d4af6827a1ec92734ba11[/SKETCHFAB]
3D Coat, Mudbox and Mari Indie are all better choices for 3d painting textures IMO. You can also use them in conjunction with a 2d editor like Krita which can work very nicely. (You can edit projections with Blender, but it's more likely to mess up in my experience.)
Just FYI, if that isn't you, you probably shouldn't be posting the art as your own. At best, you might put a description showing that you made it, by following a tutorial *LINK* by *AUTHOR*, just like he's crediting the concept.
Also, 'the results come out not so great', but... you were following a tutorial made in Blender (which brings to mind the question: why ask us for advice? there's a tutorial!), and that one gets pretty nice results.
If you want feedback on your work, just post it in its own thread, or the WAYWO thread. It's been mentioned before that self-promotion in here feels a bit odd, and we'd like to keep this on topic.
Oops. I was following the tutorial you just posted. Sorry about that; I'll be more careful when posting now. Thanks.
Thanks Defunct!
I've been poking at it; it's a bit confusing. I've managed to half-make a context-sensitive modal delete script (so shortcuts change when you activate it), a mesh-to-object name unifier, and a hotkey+LMB-based 3D viewport timeline scrubber. None of them are very good, but I can share them if you or anyone else would like them.
I'm struggling with the concept of how to manipulate actions and the data inside them via Python, so I can make a keyframe dragger and a bone renamer that doesn't break actions.
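For the keyframe-dragger idea, the data model is Action → FCurves → keyframe_points, where each keyframe's `co` is an (x=frame, y=value) pair. Here's the offset logic as a pure function, with the `bpy` wiring sketched in comments (the action name there is hypothetical, and the bpy part is an untested sketch):

```python
def shift_keyframes(frames, offset):
    """Pure core of a keyframe dragger: move every key by `offset` frames.
    Keeping the math separate from bpy makes it testable outside Blender."""
    return [frame + offset for frame in frames]

print(shift_keyframes([1, 10, 25], 5))  # [6, 15, 30]

# Inside Blender this would drive something like (untested sketch):
#
# import bpy
# action = bpy.data.actions["MyAction"]      # hypothetical action name
# for fcurve in action.fcurves:
#     for key in fcurve.keyframe_points:
#         key.co.x += 5                      # co is (frame, value)
#         key.handle_left.x += 5             # move Bezier handles along too
#         key.handle_right.x += 5
#     fcurve.update()
```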
X-( OK OK, I know I shouldn't post this here, but it's still kinda my first lil achievement inside of Blender.
In my free time after school and work, I've managed to finish my first animation in Blender using my own personal rig.
[ame]https://www.youtube.com/watch?v=fghfKQNzdPI[/ame]
I'm curious about the sharp edges.
Sure, otherwise the shading is not correct.
Can you post a screenshot? Is the result before and after the modifier different?
What do you mean by "rigging nodes"?
[ame]http://www.youtube.com/watch?v=UTwXG3K4l2g[/ame]
Even if it's old, this is a nice introduction to Cycles.
I'm still curious 8)
You mean like this?
This might also be of interest:
http://www.blenderguru.com/tutorials/using-portals-accelerate-render-times/
Frankie, goddamn Blender and its hidden features
https://cgcookie.com/archive/cycles-non-progressive-integrator/
Try comparing Cycles and V-Ray at 100% brute force; you won't notice much difference.
And YES, use the GPU.
I should do a bit of testing with UE4 as well.
Actually, I've not used it, but it seems to be a much better implementation of 3D painting layers inside Blender, with effect layers and such.