Hi! I'm testing the demo of the new Knald 1.1.0, but I always run into problems when importing high poly meshes for the baker.
If I try to import the HP as OBJ, it hangs at 33% and doesn't continue; I have to press Esc to regain control of the application.
If I try to import the HP as FBX, there are no crashes or hangs, but only one object in the HP file is actually imported, and it doesn't always align with the low poly either. I've done a lot of testing to ensure I'm exporting the FBX correctly, and to the best of my knowledge I am. I've tried moving all pivots to the world origin, resetting xforms before exporting, and making sure the HP & LP import correctly in other software, like Toolbag. The meshes align perfectly there, so I'm not sure what else to try.
I'm using 3ds Max 2015 with the FBX 2015.1 plugin and the FBX 2014 file format. Not sure if I'm missing some newer version.
By the way the other tools in the program look great so far!
Edit: Managed a good bake, but I had to retry several times to get the high poly in place, really strange. I also had to convert it to a single mesh to make sure all of it loaded.
Hi Yogensya,
Sorry for the belated reply.
Thanks for the kind words!
Unfortunately we don't currently support sub-meshes for FBX, but it's on the list. At the moment each sub-mesh must either be merged into one master mesh or exported separately as multiple mesh files if you want things like MatID for separate parts.
It would be really great if you could send us the meshes that you were having problems with so we can do some debugging.
Please include:
The OBJ that hangs
The meshes where the FBX and OBJ were not lining up correctly
Any Lowpoly and cage files that you used
Anything else you think may be relevant
support@knaldtech.com
ASCII FBX rather than binary would be perfect where the FBX files are concerned.
Here's a feature request:
If you have a high poly that has UVs and a base texture, it would be great to have an input for that in Knald and an output map for the base texture. That way you could transfer the albedo from high to low. This would be helpful for processing photogrammetry objects.
On the baker tab under Bake Meshes:
Under High Poly 1 - add a dialogue to add a base texture.
Under Bake Targets:
add a Map Type for Base texture
Thanks!
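The requested transfer boils down to: for each low-poly texel, cast a ray at the high poly, take the hit's barycentric coordinates, interpolate the high poly's UVs there, and sample its base texture. Here is a minimal sketch of that idea (my own illustration, not Knald's code; the triangle, UVs and 2x2 "texture" are made-up test data):

```python
# Sketch of high-to-low albedo transfer for one ray and one high-poly triangle.

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def ray_hit(orig, direction, v0, v1, v2):
    """Moller-Trumbore ray/triangle test; returns (t, u, v) or None."""
    e1, e2 = sub(v1, v0), sub(v2, v0)
    p = cross(direction, e2)
    det = dot(e1, p)
    if abs(det) < 1e-9:
        return None
    inv = 1.0 / det
    tv = sub(orig, v0)
    u = dot(tv, p) * inv
    if u < 0 or u > 1:
        return None
    q = cross(tv, e1)
    v = dot(direction, q) * inv
    if v < 0 or u + v > 1:
        return None
    return (dot(e2, q) * inv, u, v)

def sample_albedo(hit, hp_uvs, texture):
    """Interpolate the high poly's UVs at the hit, then nearest-sample its base texture."""
    _, u, v = hit
    w = 1.0 - u - v
    s = w * hp_uvs[0][0] + u * hp_uvs[1][0] + v * hp_uvs[2][0]
    t = w * hp_uvs[0][1] + u * hp_uvs[1][1] + v * hp_uvs[2][1]
    rows, cols = len(texture), len(texture[0])
    return texture[min(int(t * rows), rows - 1)][min(int(s * cols), cols - 1)]

# One high-poly triangle in the XY plane, with UVs matching position.
tri = ((0, 0, 0), (1, 0, 0), (0, 1, 0))
uvs = ((0, 0), (1, 0), (0, 1))
tex = [["red", "green"], ["blue", "white"]]   # tiny 2x2 "base texture"

hit = ray_hit((0.25, 0.25, -1.0), (0, 0, 1), *tri)
print(sample_albedo(hit, uvs, tex))  # prints "red"
```

A real baker would repeat this per output texel with the rays the cage defines, and average multiple hits per texel for anti-aliasing, but the core lookup is just this.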
I would really like to see this added as well!! Or is the base texture baking already in the beta??
The beta ended when Knald 1.1 was released a few weeks ago, but this is on our list of things to add in the future.
Good job on Knald, lightning fast and awesome results!
I came across two issues with the baker so far though.
It seems that even if you supply it with a cage file, it doesn't show it in the 3D viewer; it just shows the default (generated) one when checking "View Cage".
It looks like Knald uses a search distance (like Maya's envelope) even when a cage is present. The problem is, I tried to bake a normal map for a mesh that by design has a hole in it, and the range "mesh" doesn't fit the model properly, which causes the ray projector to miss the surfaces, I guess. Take a look at this pic:
That part of the range mesh is outside the model while the rest of it is in.
Am I doing something wrong?
Thanks
Thanks for the kind words!
Any cage mesh that you loaded won't show up until you hit the Pre-Process/Bake button. Is that the issue you are having? Yes, Knald always uses a cage for search distance purposes, be it the internal one or one that is loaded by the user. Can you send us the mesh that you are having issues with, please?
Thanks!
Ok, the imported cage now displays after hitting the Pre-Process/Bake button, so that's sorted! I PMed you the archive with the high poly, the cage and the low poly. Knald doesn't seem to import my cage correctly... in Max it covers the entire high-poly object, but in Knald it just seems to stay in place for that particular face.
I reset Xform and converted to mesh then poly, exported it as FBX and OBJ... nothing helps.
Has anyone come across this problem? The 3D Preview window shows some parts of the model from the outside and some as if the mesh were flipped. I haven't found anything wrong with my meshes, and the primitives in Knald do the same thing.
As far as I can tell this problem is only with the preview, and baking works without any problems.
Did a clean install of the newest Nvidia drivers and reinstalled Knald, but no help. The weirdest part is that Knald worked fine yesterday and I have no idea what could have changed. Any help and ideas would be great.
Edit: Found the culprit. I didn't realize I had RivaTuner Statistics Server running in the background. Silly me, it should have been the first thing to check.
The guys at work really loved switching to Knald recently, great work on the app!
A few quick questions/requests, sorry if they've been asked already...
- Any chance of getting drag-and-drop support for files when adding high and low poly meshes?
- The Curvature Type drop-down isn't saved when you save the baking settings (it'll always default back to Dual Channel).
- Can you share exactly what the "self-occlusion" check in the 3D Preview is doing?
- I love using copy-paste to get maps from Knald to Photoshop; is there any way to get it to copy 16-bit info to the clipboard for normal maps? Not sure this is even possible with Windows, I guess...
- +1 to a transfer map function (with an option to recalculate the tangent basis when transferring normal maps); that would be another solid step in completely replacing xN.
Thanks again.
Hey Bal,
Thanks for the kind words!
1. Yeah, I'm sure we can add that in a future update. I will add it to the list.
2. Thanks for the report. I will make sure it gets fixed.
3. Self Occlusion toggles whether the 3D preview mesh occludes itself. It's purely a cosmetic feature for the viewer and has no impact on the generated results.
4. Windows doesn't support anything more than 8-bit in the clipboard, unfortunately. You can open all the images in Photoshop (or whatever other application you want) automatically on export by using the Post Export Actions, which are found in the Export tab. The 'Open With Default Handler' preset will open the exported images with whichever program is set to open them by default; for example, if TGA is set to open in Photoshop, then on export Photoshop will start and load the exported TGAs. https://www.knaldtech.com/docs/doku.php?id=post_export_actions_1.1
5. Thanks for the suggestion. I will add it to the list.
For the self-occlusion, I was more wondering if you could describe what it's doing in the viewer, or if there's any documentation on the technique somewhere. We have some self-occlusion type tech in our current engine but it seems quite different and I'm not always satisfied with it. I was wondering if maybe I could ask our engine guys to test some other methods.
Hey Bal, thanks for the kind words regarding our viewer. It's always great to hear when people like what they see. Unfortunately I'm not really at liberty to discuss undocumented/non-public information regarding our technology. Sorry!
Having said that, in the future we are planning on releasing a standalone shader providing an IBL implementation with a Lys-generated cube map, which should cover all the relevant bases. I can't give any definitive timescales for that, though.
Knald's baking astounds me. I've been getting near-perfect bakes in Knald when Substance and xNormal fail me (and those fail me a lot). I'd love to make it a permanent part of my workflow; there's just one thing that holds me back. FBX seems to be a bit unstable for me when I use it on the high poly (it causes Knald to hang and never complete preprocessing). I'm fine using OBJ instead, but I wish I could bake an ID map based on material ID and not mesh ID from an OBJ. It's a little tedious exporting my HP out in many pieces, sometimes 20-30, to be able to utilize the mesh ID bake. Is there a detail I'm missing in Knald that allows me to bake from a single OBJ with several material IDs rather than 20 or so separate high poly meshes? Configuration only allows me to use mesh ID, and I have groups and materials checked on export. Maybe it's my export settings?
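One workaround for the "20-30 separate exports" pain (my own sketch, not a Knald feature) is to split a single multi-material OBJ into one file per `usemtl` group after the fact. OBJ face indices are global to the file, so copying all `v`/`vt`/`vn` lines into every output keeps each material's faces valid without re-indexing:

```python
# Split a multi-material OBJ into one mesh per material, so each part can be
# loaded separately for mesh-ID baking. Shared vertex data is duplicated into
# every output, which keeps the global face indices valid.

def split_obj_by_material(obj_text):
    """Return {material_name: obj_text} with one usemtl group per output."""
    shared, faces = [], {}          # vertex-data lines; face lines per material
    current = "default"
    for line in obj_text.splitlines():
        if line.startswith(("v ", "vt ", "vn ")):
            shared.append(line)
        elif line.startswith("usemtl "):
            current = line.split(None, 1)[1]
        elif line.startswith("f "):
            faces.setdefault(current, []).append(line)
    return {mat: "\n".join(shared + ["usemtl " + mat] + fs) + "\n"
            for mat, fs in faces.items()}

sample = """\
v 0 0 0
v 1 0 0
v 0 1 0
v 1 1 0
usemtl metal
f 1 2 3
usemtl rubber
f 2 4 3
"""
for mat, text in split_obj_by_material(sample).items():
    print(mat, "->", len([l for l in text.splitlines() if l.startswith("f ")]), "face(s)")
```

This ignores object/group (`o`/`g`) lines and smoothing groups for brevity; a production version would carry those through too.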
Hi Makkon!
Thanks so much for the very kind words! It's something we want to add in a future update for sure. I will make sure it's high on the list. Knald 1.1.1 should fix the issue you are having with FBX. Please let us know if it works for you.
Since the 1.1.0 release of Knald a couple of months ago we have been hard at work fixing a few bugs and adding new improvements to Knald. Time to enjoy Knald 1.1.1!
What’s changed since Knald 1.1.0?
New Features, Improvements and Optimizations
Added a choice of ray distribution methods for the baker.
Added an option to toggle the filtering of reflections within the 3d viewport.
Updated to the latest FBX SDK (2016.1.2).
Updated hotkeys and top menus.
Updated tooltips.
Miscellaneous improvements and optimizations.
Bug fixes
Fix for issue where some FBX meshes caused mesh importing to hang.
Fix for issue where High Poly planar faces (axis aligned) baked incorrectly in rare cases.
Fix for issue where Concavity settings were not saving correctly.
Fix for issue where Convexity settings were not saving correctly.
Fix for issue where Curvature settings were not saving correctly.
We really want to thank everyone for all the hard work and feedback during Knald's development. Please do continue sending us suggestions & posting everything you can!
I'm just on the trial but am continually impressed with this app. I spent the morning screwing around with a normal bake in Blender/xNormal and this thing pops it out in one shot. Great features and very clean results.
Thanks for the nice comments. It's really awesome when we hear that people like what we are doing. Thanks for taking the time to post.
Same. I tried 1.1.1 and my high poly meshes never finish loading. It just gets stuck on 1%.
Hey!
Can you send me the meshes that are sticking for you, please? PM or email them to me at support@knaldtech.com and I would be happy to take a look.
Sent an email. Let me know if there's any problem with the files.
Is there no way to frame the object in the 3D view? Sometimes I get lost and can't get back! :P
You can reset the mesh position by double-clicking the MMB or an ALT+RMB double-click. Generally, any rotation or translation you do in the 3D preview regarding meshes, UVs and lights is reset by double-clicking the mouse button used to originally perform the action. https://www.knaldtech.com/docs/doku.php?id=interface_knald_1.1#d_preview1
Hello there! I am looking for a baking program; I currently use Substance Painter for baking, but I actually use 3D Coat for texturing. Knald has always produced astonishing results, but I haven't found a feature to bake things properly without exploding the mesh. In Substance Painter I can match the different mesh parts between high and low poly by name and suffix - is there any feature like that planned in Knald? It would be a major slowdown to explode things or do separate bakes and combine the maps... I would very much like to switch to Knald.
Thanks for the nice comments! Currently we don't have the ability to explode/name match meshes within Knald itself, but it's on the list of things to add for a future update.
Feature requests:
1. Could you make right-clicking a numeric input field set it back to its default?
2. Could you make kernel type its own setting? I often find I like to use Sobel as my default (it gives nice antialiased edges for photo-to-normal), but each time I press Reset to Defaults to zero all my settings out, I then have to set the kernel again.
3. I still really want albedo/Vcols to be remembered between pastes from Photoshop; I would love to turn this off and never see it turned on again.
Something strange is happening with my custom cages in the latest Knald demo. In short — the cage displayed in Knald doesn't match the actual geometry exported from the 3D app. I've tried to figure out whether this was a specific scenario with my weird mesh, but it looks like that was not the case. The same thing happens with just a cube — quite a simple one, beveled and friendly at the same time.
Picture one [Maya]: lowpoly mesh in gray, cage in orange. The cage is basically the same mesh with two corners lifted straight up.
Picture two [Knald]: lowpoly mesh in gray, cage in orange. The cube got a new haircut, and I was expecting him to keep the old one. He was like Duke Nukem back then, just orange instead of blonde, and now he's got some kind of afro.
It looks like the vertices on top got pushed out, averaged and kind of inflated, as if the Push slider were used. To be specific, there weren't any warnings in Knald, and the Push slider was greyed out as it should be when using a custom cage mesh.
The question is — am I heavily missing something and this is normal, or is this indeed something strange with the way my cages are handled by Knald? As I stated before, I was expecting Knald to keep my cage exactly the same, but that's what happens instead.
Just an update to the issue I was having: the reason why Knald was getting stuck during baking was that my high poly had morph map information (to toggle between the default position and the exploded offset for baking). As soon as I removed the blend shape it went fine, so it's probably best to avoid baking dense meshes with morph information.
Hi Malcolm,
Thanks for the suggestions as always.
1. Sure, that sounds like a good idea. I will see what I can do.
2. You can already do this, actually. The Reset button is specifically for reloading the shipped defaults, but if you save the settings you want as your personal defaults with the down arrow and then hit the 'cancel' button (the middle one), it will reload your last saved settings rather than the defaults.
3. This is still on the list of stuff to add for a future version, for sure. I promise we will try to get this done as soon as we can, mate.
We actually added a different method of ray distribution in 1.1.1 that fixes this issue, though the projection you will get from that cage won't be awesome. In the Bake Settings, change the 'Ray Distribution' to 'Unadjusted' and you should get the result you are looking for.
Thanks for updating the thread, MDiamond. For reference, this isn't a Knald-specific bug, as the files also failed to load in Autodesk FBX Review. It's likely that the bug is something to do with Modo or the FBX SDK they are using. For now it's probably best not to use meshes from Modo with morph information until they update the SDK / fix the issue.
METALLIANDY, thank you for the clarification! Changing Ray Distribution to Unadjusted did the trick. By the way, my pics above were just a generalized example that I put together to simplify demonstration of the issue. Please correct me if I'm wrong, but using a proper cage mesh with Unadjusted ray distribution and a real-world model wouldn't hurt the quality of bake results compared to Aligned ray distribution.
To explain why I'm asking: I was trying to get a good bake result from an unexploded mesh using a fine-tuned cage mesh. With ray distribution set to Aligned, the cage was partially skewed in some specific geometry areas, affecting the bake in a negative way and making such a cage unusable. I was looking for a way to prevent this, and as you pointed out, Unadjusted ray distribution will allow me to use an unmodified cage mesh. Are you implying that using Unadjusted ray distribution will always produce subpar results compared to Aligned, or were you referring only to the specific configuration shown in my pics above? Just trying to understand it correctly. Thanks!
Hi Treidge,
Yes, that is correct. Using a proper cage mesh with Unadjusted distribution wouldn't hurt bake quality. I was only talking about the specific cage used in the tests above when I said the results wouldn't be awesome.
For clarification: the Unadjusted setting for cages is just a different way of controlling ray distribution and will not adversely reduce the quality of the bake vs. Aligned.
Aligned: Forces the cage vertices to align with the soft normal direction of the Low Poly vertices.
Unadjusted: Forces the ray direction to be controlled by the unmodified cage vertices.
In general, soft normals (Aligned) result in a better distribution of rays than points placed by hand, as they are a mathematically "perfect" angular distribution for the geometry, but for some meshes you may get better results using points placed by hand (Unadjusted), as how the projection aligns greatly depends on the topology of the cage.
Essentially, 'Aligned' is a way of using the cage to indicate ray distance but not the distribution of rays, whereas with 'Unadjusted' we allow the cage to dictate ray distance and distribution simultaneously.
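The two modes can be paraphrased in a small sketch (my own illustration of the behaviour described above, not Knald's actual code): in both modes the cage vertex sets the ray's length, but Aligned takes the ray direction from the low poly's soft (averaged) vertex normal, while Unadjusted takes it from the low-poly-to-cage-vertex offset itself.

```python
import math

def normalize(v):
    length = math.sqrt(sum(x * x for x in v))
    return tuple(x / length for x in v)

def ray_for_vertex(lp_vertex, cage_vertex, soft_normal, mode):
    """Per-vertex bake ray as (origin, direction, max_distance)."""
    offset = tuple(c - l for c, l in zip(cage_vertex, lp_vertex))
    distance = math.sqrt(sum(x * x for x in offset))
    if mode == "aligned":
        # Cage controls distance only; direction comes from the soft normal.
        direction = normalize(soft_normal)
    else:  # "unadjusted": cage controls distance *and* direction.
        direction = normalize(offset)
    return lp_vertex, direction, distance

lp = (0.0, 0.0, 0.0)
cage = (0.5, 0.0, 1.0)   # cage vertex pushed out and slightly sideways
n = (0.0, 0.0, 1.0)      # soft (averaged) vertex normal

print(ray_for_vertex(lp, cage, n, "aligned"))     # direction follows the normal
print(ray_for_vertex(lp, cage, n, "unadjusted"))  # direction follows the cage
```

With a cage vertex that is offset sideways (as in the afro-cube pictures above), the two modes give visibly different ray directions, which is exactly the difference being discussed.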
Hey Andy, I'm currently deciding on which baker to use for a new project, and I wanted to try this out. I searched for the 'tangent basis' keywords in this thread but didn't get suitable answers, nor did Google...
Is Knald able to produce TS normal maps calculated by different algorithms, or are we forced to use MikkTSpace and then convert via Handplane? If so, what should we put in the "Baked In" field in Handplane for syncing correctly? xNormal?
Hey Justo,
The advice I can give really depends on what your target engine is. Knald uses MikkTSpace, which is supported natively by Unreal 4.7+, Unity 5.3+ and Lumberyard, to name a few.
If you are looking for another tangent basis, then Knald does have the option to use the exported FBX tangent space (binormal and tangent vectors) if you desire, which can be toggled in the preferences, but you don't specifically need to use this unless your workflow requires it.
Regarding Handplane, it again depends on the target engine. Knald & xNormal both use MikkTSpace, but Knald calculates the bitangent on the fly (which is compatible with UE4) and xNormal doesn't by default (it's an option in the preferences), so it depends on whether Handplane recognises the differences between the two methods when it does the conversion. Bitangent on the fly is used by default within Knald because it is more efficient on modern hardware (sending data from the vertex to the pixel shader is more costly than executing a minor calculation in the pixel shader), but this is a trivial change within the shader of your engine, so in practice the difference might not be a problem.
Hi Andy! Thanks for replying so soon. A bit of what you said flew over my head, but I understood most of it. The game engine we're using uses the same tangent basis as Max, so I guess I'd have to export the FBX from Max, check "Use Exported Tangent Basis" in the preferences, and I'd be good to go. I'll start making some tests soon. Thanks!
Thanks.
Hi!
Are you triangulating the mesh before baking? The errors you are seeing usually occur because the triangulation of the mesh when baked is different from the triangulation in the viewer, and normal maps are locked to a specific triangulation once baked. If you use the same triangulation, that should fix the issue.
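A quick way to see why triangulation matters: a (slightly non-planar) quad can be split along either diagonal, and the two splits produce different face normals, so a map baked against one triangulation won't shade correctly in a viewer that picks the other. A tiny Python illustration (helper names are mine):

```python
def tri_normal(a, b, c):
    """Unnormalized face normal of triangle (a, b, c)."""
    u = tuple(b[i] - a[i] for i in range(3))
    v = tuple(c[i] - a[i] for i in range(3))
    return (u[1]*v[2] - u[2]*v[1],
            u[2]*v[0] - u[0]*v[2],
            u[0]*v[1] - u[1]*v[0])

# A slightly non-planar quad: each diagonal choice yields
# different triangle normals, hence different baked shading.
v0, v1, v2, v3 = (0, 0, 0), (1, 0, 0), (1, 1, 0.2), (0, 1, 0)
split_a = [tri_normal(v0, v1, v2), tri_normal(v0, v2, v3)]
split_b = [tri_normal(v0, v1, v3), tri_normal(v1, v2, v3)]
assert split_a != split_b  # the shading data genuinely differs
```

Triangulating the mesh yourself before export pins the split down, so the baker and every viewer see identical geometry.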
Thanks for the reply. Yeah, I forgot the triangulation; very stupid mistake :*
Hi there! There is one thing I'm wondering about since the 1.1.0 beta, and that is whether it's still possible to use the old "Mesh AO" baking implementation from 1.0.0 in 1.1.0. The Mesh AO bakes from 1.0.0 had some minor issues, like the lack of proper AA, but they were absolutely, insanely perfect projection-wise, without a single clipping issue, without a single noticeable pixel missing a ray. I routinely use 1.0.0 to this day and I have not been able to replace it with anything so far, because it's the only software I know of that does perfect cageless bakes.
And now, with 1.1.0, that option is gone, and we have the traditional approach to baking with low poly, high poly and cages. It's no doubt great for traditional content like sculpted high poly stuff, but I'm working on strategy and mobile games where a high poly version usually doesn't exist at all, and cage creation is really complicated by the use of hard edges in low poly models. Nothing from xNormal to the SD baker could handle cageless lowpoly-to-lowpoly bakes as well as Knald 1.0.0 did, so it was a lifesaver to me.
As far as I can see, you can, of course, insert the same mesh into the high and low poly slots, and Knald can generate its own cage from scratch, but that doesn't give the flawless results the cageless baker from 1.0.0 gave me. To give one example where pushing fails - if there is a thin polygon like this one, push gives an incorrect cage.
You can manually fix a small case like that by changing the topology, of course:
But that's not always viable if you work with meshes with hundreds of areas like those. Old cageless baker handled that case flawlessly, and handled other complicated cases (like thin heavily occluded gaps smaller than typical push radius) very gracefully, hence my interest in continuing to use it.
Edit: More examples of geometry with very problematic cages:
And here are some comparisons of AO bake of that model in 1.0 and 1.1:
Hopefully that illustrates the reason behind my interest in cageless bakes well.
Thanks for posting and for the very detailed information.
You will be glad to hear that we haven't removed any of the AO baking functionality that was present in 1.0. The only change we made when adding the new baker was to remove the Mesh AO option if the user has generated a high poly mesh cache via the high to low baker. You can use exactly the same workflow as you did in 1.0 (Load blank Normal map > Load mesh > AO Tab > Hit Activate).
Can you send me the meshes for both of these cases please? For the second mesh example you might want to install 1.1.1, as we fixed an issue where High Poly planar faces (axis aligned) baked incorrectly in some cases.
Hopefully that illustrates the reason behind my interest in cageless bakes well.
High to low cageless baking is possible but the term 'cageless' is a bit of a misnomer. When baking high to low you always need to use a 'cage' or you will get seams in things such as normal maps. Whenever you see the term cageless it is simply referring to a search distance with averaged normals (averaged envelope) & is essentially the same thing as using default projection cage within Knald.
Max Frontal & Rear ray distances correspond to the Push & Range within Knald & generally both methods still require adjustment (as in all baking programs) to avoid issues & to ensure that people are getting the results they desire.
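As a rough sketch of what an "averaged envelope" amounts to (an illustration of the concept, not Knald's actual implementation): coincident vertices are welded, their normals averaged, and every position is pushed outward along the averaged direction by the Push distance:

```python
from collections import defaultdict

def pushed_envelope(positions, normals, push):
    """Sketch of an averaged-envelope 'cage': vertices sharing a position
    (e.g. splits across hard edges or UV seams) get their normals averaged,
    then each vertex is pushed along that averaged direction by `push`."""
    groups = defaultdict(list)
    for i, p in enumerate(positions):
        groups[tuple(round(c, 6) for c in p)].append(i)  # weld by position
    out = list(positions)
    for idxs in groups.values():
        # average the normals of all split vertices at this position
        ax = sum(normals[i][0] for i in idxs)
        ay = sum(normals[i][1] for i in idxs)
        az = sum(normals[i][2] for i in idxs)
        length = (ax*ax + ay*ay + az*az) ** 0.5 or 1.0
        d = (ax/length, ay/length, az/length)
        for i in idxs:
            p = positions[i]
            out[i] = (p[0] + d[0]*push, p[1] + d[1]*push, p[2] + d[2]*push)
    return out
```

This is why "cageless" baking with a search distance and an explicit default cage end up equivalent: both reduce to casting rays inward from this pushed, averaged envelope.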
Within Knald you can visualise the projection if you wish, though this is by no means a requirement. It's quite possible to just use the default values and never adjust the cage or set the Push/Range values to whatever you wish without ever visualising the projection.
We also never require a user to import a cage mesh into Knald when baking as one is generated automatically, but of course, we also allow the user to import a cage mesh too if they wish.
So, if I understand you correctly, "Mesh AO" mode in Knald 1.0 worked in a very similar way to the new "Baker" mode, except 1.0 used exactly the same mesh as source and destination (low/high slots in Baker) and used non-configurable, welded, averaged and pushed version of that same mesh as a cage. If that's the case, what was different about 1.0 settings that made it possible to bake objects with very thin faces and small cavities without missed (incorrectly caged) areas?
As for 1.1.1, I'll give it a try, but I think that the second model has exactly the same root issue as the first one - examining it closely, all areas where the cage was pushed incorrectly seem to have long thin faces produced by triangulation.
P.S.: As an off-topic stab at working around this - does anyone know any scripts or handy workflows with existing tools for 3ds Max, that allow you to retriangulate a mesh in a way biased to generate shortest edge lengths? Every time I fix those kinds of topologies manually, I catch myself thinking that I do exactly the same repeated checks and actions, which can probably be scripted. Stuff along the lines of:
Iterate through all edges
Check if it's neighbouring two triangles, proceed if it is
Fetch the two remaining vertices that are not involved in our edge from those two triangles
Check if distance between those two edge-unrelated vertices is smaller than current edge length, proceed if it is
Rebuild those two triangles, moving the edge to those two previously edge-unrelated vertices
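The steps above could be sketched in Python roughly like this (a hypothetical single-pass illustration; a real tool would rebuild adjacency after each flip and preserve winding order):

```python
from itertools import combinations

def flip_to_shortest_diagonals(verts, tris):
    """One pass of the edge-flipping idea: for every edge shared by
    exactly two triangles, rebuild both triangles across the opposite
    (edge-unrelated) vertex pair whenever that diagonal is shorter."""
    def dist2(a, b):
        return sum((verts[a][k] - verts[b][k]) ** 2 for k in range(3))

    tris = [list(t) for t in tris]
    # map each undirected edge to the triangles that use it
    edge_tris = {}
    for ti, t in enumerate(tris):
        for e in combinations(sorted(t), 2):
            edge_tris.setdefault(e, []).append(ti)

    for (a, b), owners in edge_tris.items():
        if len(owners) != 2:           # boundary or non-manifold edge
            continue
        t0, t1 = owners
        c = next(v for v in tris[t0] if v not in (a, b))  # opposite verts
        d = next(v for v in tris[t1] if v not in (a, b))
        if dist2(c, d) < dist2(a, b):  # flip edge a-b into c-d
            tris[t0] = [a, c, d]
            tris[t1] = [b, c, d]
    return [tuple(t) for t in tris]
```

For example, a long thin quad triangulated along its long diagonal gets flipped to the shorter one, removing exactly the sliver faces that skew the pushed cage.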
Or maybe, if that's not too much to suggest, that can be an operation in the cage generator used by Knald. Every case of thin faces I can think of can be killed by this kind of edge flipping. Simple illustration of the idea:
So, if I understand you correctly, "Mesh AO" mode in Knald 1.0 worked in a very similar way to the new "Baker" mode, except 1.0 used exactly the same mesh as source and destination (low/high slots in Baker) and used non-configurable, welded, averaged and pushed version of that same mesh as a cage. If that's the case, what was different about 1.0 settings that made it possible to bake objects with very thin faces and small cavities without missed (incorrectly caged) areas?
Actually, no. Knald has 3 ways to bake AO, and they are all different methods. You can bake the Mesh AO in exactly the same way as you used to in 1.0 without ever loading the new baker. Just load an empty normal map (8080ff) and a mesh in the 3D preview window, then go to the AO tab and hit Activate.
As for 1.1.1, I'll give it a try, but I think that the second model has exactly the same root issue as the first one - examining it closely, all areas where the cage was pushed incorrectly seem to have long thin faces produced by triangulation.
P.S.: As an off-topic stab at working around this - does anyone know any scripts or handy workflows with existing tools for 3ds Max, that allow you to retriangulate a mesh in a way biased to generate shortest edge lengths? Every time I fix those kinds of topologies manually, I catch myself thinking that I do exactly the same repeated checks and actions, which can probably be scripted. Stuff along the lines of:
Iterate through all edges
Check if it's neighbouring two triangles, proceed if it is
Fetch the two remaining vertices that are not involved in our edge from those two triangles
Check if distance between those two edge-unrelated vertices is smaller than current edge length, proceed if it is
Rebuild those two triangles, moving the edge to those two previously edge-unrelated vertices
Or maybe, if that's not too much to suggest, that can be an operation in the cage generator used by Knald. Every case of thin faces I can think of can be killed by this kind of edge flipping. Simple illustration of the idea:
Yea, that was what I found too. The reason the cage isn't behaving well is that there is a hidden face (marked as face 1) that has the thickness of an edge & is causing the cage to skew. If you remove that face and then connect the stray verts to the large triangle (marked as face 2) the cage then works as expected (essentially what you are doing in the last image you posted).
(Please note that I moved the verts around in the above image to reveal the hidden face. It didn't import like this.)
As it happens Knald already triangulates everything to shortest diagonal before baking, but the reason you are not seeing this on your mesh is that face 2 is already a triangle.
Knald looks interesting, but I think the pricing model is wrong, guys. $100 for a freelancer license and no commercial use? How come? I also think it's a bit too much. Honestly, something like $30-50 for a freelancer license that includes commercial use would be much closer to reality, counting as a freelancer a person who doesn't make more than $100k per year (or something like that); then you can think about a studio license. I would buy it in that case, but $100 and no commercial -> no/no.
Just my 2 cents.
Divide the commercial license price by your work hour rate and ask yourself if Knald can save you that number of work hours. Looking at the work hours Knald has saved me in the long run, I'll have to disagree, the price is more than fair. It's like UVLayout in that regard, at first you have your doubts about the price, and then you realize how much more productive you are when you add those new tools.
BAC9-FLCL, what you say is true, no doubt. However, I'm pretty sure that there are plenty of artists out there that are just starting their careers and don't have any earnings yet, because they're learning and working on their portfolios to get their first job. An additional thing to keep in mind is that in developing countries even $100 is a decent amount of money, and Knald isn't the only tool one needs to afford. Though there is a free alternative like xNormal, everybody would still like to use the most progressive and up-to-date tools, no? Can't blame them for it.
I believe that if Knald was to go to the Steam, it would be beneficial both to the company and its customers. Steam has automatic regional prices that can be considerably lower in some regions, and it will allow more people like students, hobbyists and aspiring artists alike to actually afford to buy Knald license.
Allegorithmic did that, and I can see how many people jumped on their train just because there were some great sales and Steam prices were lower than purchasing through their website. A great portion of those people weren't professionals, so they don't have any "hour rate" to them, but they still bought the software because it was cool and affordable. I believe that "how much time and thus money is this tool going to save me?" is mostly professional thinking. Non-professionals think differently, and something like just $50 will have a huge effect on them when deciding whether to make a purchase or not.
I can agree with some points, but aren't you contradicting yourself a bit? If a non-professional, a hobbyist, does no commercial work with the tool and therefore can't use work hour rates etc. to justify a purchase, then the cheaper non-commercial license Knald offers would be a fine choice. And if you profit from your work, therefore requiring a commercial license, then you can evaluate the usefulness of the tool vs. your rate or return-on-time figures.
As for regional prices, AFAIK Steam only suggests them, not enforces them - it's up to the discretion of the developers. I don't doubt that regional prices can help with sales in places like CIS countries, but on the other hand, I can understand why some developers aren't interested in them - in contrast with games, you can't actually limit your regional pricing to the originally intended region with localization or network limitations; the tool is the same for everyone.
Not that I'm the one to decide the pricing of Knald, just my view of the situation here.
Just to clear things up...The Freelancer license absolutely allows for commercial use (regardless of revenue).
The EULA wording is a little more complex in terms of legal wording but essentially the only stipulation for the Freelancer license is that to be eligible for it you have to be an individual, or a single person company and that person has to be the licensee. Other than that you can use it for whatever you want pretty much.
Replies
Glad you found the solution
If I try to import the HP as OBJ, it hangs at 33% and doesn't continue, I have to press esc to regain control of the application.
If I try to import the HP as FBX, there are no crashes or hangs, but only one object in the HP file is actually imported, and it doesn't always align with the low poly either. I've done a lot of testing to ensure I'm exporting the FBX correctly, and to the best of my knowledge I am. I've tried moving all pivots to world origin, resetting xforms before exporting, and made sure the HP & LP import correctly in other software, like Toolbag. The meshes align perfectly there, so I'm not sure what else to try.
I'm using 3ds max 2015, and FBX 2015.1 plugin version, FBX2014 file format. Not sure if I'm missing some newer version.
By the way the other tools in the program look great so far!
Edit: Managed a good bake, but I had to retry several times to get the high poly in place, really strange. I also had to convert it to a single mesh to make sure all of it is loaded.
Sorry for the belated reply.
Thanks for the kind words
Unfortunately we don't currently support sub-meshes for FBX, but it's on the list. At the moment each sub-mesh must either be merged into one master mesh or exported separately as multiple mesh files if you want things like MatID for separate parts.
It would be really great if you could send us the meshes that you were having problems with so we can do some debugging.
Please include:
support@knaldtech.com
ASCII FBX rather than binary would be perfect where the FBX files are concerned.
Cheers!
I would really like to see this added as well!! Or is the base texture baking already in the beta??
Good job on Knald, lightning fast and awesome results!
I came across two issues with the baker so far though.
That part of the range mesh is outside the model while the rest of it is in.
Am I doing something wrong?
Thanks
Sorry for the late reply!
The beta was ended when Knald 1.1 was released a few weeks ago, but this is on our list of things to add in the future.
Thanks for the kind words!
Any cage mesh that you loaded won't show up until you hit the pre-process/bake button. Is that the issue you are having?
Yes, Knald always uses a cage for search distance purposes, be it the internal one or one that is loaded by the user. Can you send us the mesh that you are having issues with please?
Thanks!
I PM-ed you the archive with the high-poly, the cage and the low-poly. Knald doesn't seem to import my cage correctly... in Max it covers the entire high-poly object, but in Knald it just seems to stay in place for that particular face.
I reset Xform and converted to mesh then poly, exported it as FBX and OBJ... nothing helps
As far as I can tell this problem is only with the preview and baking works without any problems.
Did a clean install of newest Nvidia drivers and reinstalled Knald, but no help. Weirdest part is that Knald worked fine yesterday and I have no idea what could have changed. Any help and ideas would be great.
Edit: Found the culprit. I did not realize I had RivaTuner Statistics Server running in the background. Silly me, it should have been the first thing to check.
A few quick questions/requests, sorry if they've been asked already...
- Any chance of getting drag and drop support for files when adding high and low poly meshes?
- The Curvature Type drop-down isn't saved when you save the baking settings (it'll always default back to Dual Channel).
- Can you share exactly what the "self-occlusion" check in the 3D Preview is doing?
- I love using copy-paste to get maps from Knald to Photoshop, is there any way to get it to copy 16bit info to the clipboard for normal maps? Not sure this is even possible with Windows I guess...
- +1 to a transfer map function (with an option to recalculate the tangent basis when transferring normal maps), that would be another solid step in completely replacing xN.
Thanks again.
Thanks for the kind words!
1. Yea, I'm sure we can add that in a future update. I will add it to the list.
2. Thanks for the report. I will make sure it gets fixed.
3. Self Occlusion is a toggle for if you want the 3d preview mesh to self occlude itself or not. It's purely a cosmetic feature for the viewer and has no impact on the generated results.
4. Windows doesn't support anything more than 8-bit in the clipboard, unfortunately. You can open all the images in Photoshop (or whatever other application you want) automatically on export by using the Post Export Actions, which are found in the Export tab. The 'Open With Default Handler' preset will open the exported images with whichever program is set to open them by default. For example, if TGA is set to open in Photoshop, then on export Photoshop will start and load the exported TGAs.
https://www.knaldtech.com/docs/doku.php?id=post_export_actions_1.1
5. Thanks for the suggestion. I will add it to the list
Cheers!
For the self-occlusion, I was more wondering if you could describe what it's doing in the viewer, or if there's any documentation on the technique somewhere. We have some self-occlusion type tech in our current engine but it seems quite different and I'm not always satisfied with it. I was wondering if maybe I could ask our engine guys to test some other methods.
Thanks for the kind words regarding our viewer. It's always great to hear when people like what they see
Unfortunately I'm not really at liberty to discuss undocumented/non-public information regarding our technology. Sorry
Having said that, in the future we are planning on releasing a standalone shader providing an IBL implementation with a Lys generated cube map, which should cover all the relevant bases. I can't give any definitive time scales for that though.
FBX seems to be a bit unstable for me when I use it for the highpoly (it causes Knald to hang and never complete preprocessing). I'm fine using OBJ instead, but I wish I could bake an ID map based on material ID and not mesh ID from an OBJ. It's a little tedious exporting my HP in many pieces, sometimes 20-30, to be able to utilize the mesh ID bake. Is there a detail I'm missing in Knald that allows me to bake from a single OBJ with several material IDs rather than 20 or so separate high poly meshes? Configuration only allows me to use mesh ID, and I have groups and materials checked on export. Maybe it's my export settings?
Thanks so much for the very kind words!
It's something we want to add in a future update for sure. I will make sure it's high on the list.
Knald 1.1.1 should fix the issue you are having with FBX. Please let us know if it works for you.
Since the 1.1.0 release of Knald a couple of months ago we have been hard at work fixing a few bugs and adding new improvements to Knald. Time to enjoy Knald 1.1.1!
What’s changed since Knald 1.1.0?
New Features, Improvements and Optimizations
Bug fixes
You can grab the new version from your account.
You can find build notes here: https://www.knaldtech.com/knald-update-build20160313001/
We really want to thank everyone for all the hard work and feedback during Knald's development. Please do continue sending us suggestions & posting everything you can!
Same. I tried 1.1.1 and my high poly meshes never finish loading. It just gets stuck at 1%.
Can you send me the meshes that are sticking for you please? PM or email them to me at support@knaldtech.com and I would be happy to take a look.
Sent an email. Let me know if there's any problem with the files.
https://www.knaldtech.com/docs/doku.php?id=interface_knald_1.1#d_preview1
Hope that helps!
Thanks! I will investigate.
Currently we don't have the ability to explode/name match meshes within Knald itself, but it's on the list of things to add for a future update.
metalliandy
Feature requests:
1. Could you make right-clicking a numeric input field set it back to its default?
2. Could you make kernel type its own setting? I often find I like to use Sobel as my default (it gives nice antialiased edges for photo-to-normal), but each time I press reset-to-defaults to zero all my settings out, I then have to set the kernel again.
3. I still really want albedo/Vcols to be remembered between pastes from Photoshop; I would love to turn this off and never see it turned on again.
Picture one [Maya]: lowpoly mesh in gray, cage in orange. The cage is basically the same mesh that has two corners lifted straight up.
Picture two [Knald]: lowpoly mesh in gray, cage in orange. The cube got a new haircut, and I was expecting him to keep the old one. He was like a Duke Nukem minus the orange instead of blonde back then, and now he's got some kind of afro.
Looks like the vertices on top got pushed out, averaged and kind of inflated, as if the Push slider were used. To clarify, there weren't any warnings in Knald, and the Push slider was greyed out, as it should be when using a custom cage mesh.
The question is — am I heavily missing something and this is normal, or is this indeed something strange in the way my cages were handled by Knald? As I stated before, I was expecting Knald to keep my cage exactly the same, but that's what happens instead.
Just an update on the issue I was having: the reason Knald was getting stuck during baking was that my high poly had morph map information (to toggle between the default position and the exploded offset for baking). As soon as I removed the blend shape it went fine, so it's probably best to avoid baking dense meshes with morph information.
Thanks for the suggestions as always.
1.Sure, that sounds like a good idea. I will see what I can do.
2. You can already do this actually. The reset button is specifically to reload the shipped defaults, but if you save the settings you want as your personal defaults with the down arrow and then hit the 'cancel' button (the middle one) it will reload to your last saved settings rather than the defaults.
3. This is still on the list of stuff to add for a future version, for sure. I promise we will try to get this done as soon as we can, mate.
We actually added a different method of ray distribution in 1.1.1 that fixes this issue, though the projection you will get from that cage won't be awesome.
In the Bake Settings change the 'Ray Distribution' to 'Unadjusted' and you should get the result you are looking for.
Hope that helps
Thanks for updating the thread, MDiamond. For reference, this isn't a Knald-specific bug, as the files also failed to load in Autodesk FBX Review.
It's likely that the bug is something to do with Modo or the FBX SDK they are using. For now it's probably best not to use meshes from Modo with morph information until they update the SDK / fix the issue.
Cheers!
To clarify the reason I'm asking this — I was trying to get a good bake result from an unexploded mesh using a fine-tuned cage mesh. With ray distribution set to Aligned, the cage was partially skewed in some specific geometry areas, affecting the bake in a negative way and making such a cage unusable. I was looking for a way to prevent this, and as you pointed out, Unadjusted ray distribution will allow me to use an unmodified cage mesh. Are you implying that using Unadjusted ray distribution will always produce subpar results compared to Aligned, or were you referring only to the specific configuration shown in my pics above? Just trying to understand it correctly. Thanks!
Yes, that is correct. Using a proper cage mesh with Unadjusted distribution wouldn't hurt bake quality. I was only talking about the specific cage used in the tests above when I said the results wouldn't be awesome.
For clarification: the Unadjusted setting for cages is just a different way of controlling ray distribution and will not adversely reduce the quality of the bake vs Aligned.
In general, soft normals (Adjusted) result in a better distribution of rays than points placed by hand, as they are a mathematically "perfect" angular distribution for the geometry, but for some meshes you may get better results using points placed by hand (Unadjusted), as it greatly depends on the topology of the cage as to how the projection will align.
Essentially 'Adjusted' is a way of using the cage to indicate ray distance but not distribution of rays whereas with unadjusted we allow the cage to dictate ray distance and distribution simultaneously.
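As an illustration of that distinction (my own sketch, not Knald's code): in "adjusted" mode the cage vertex only supplies a distance while the averaged vertex normal supplies the direction; in "unadjusted" mode the cage vertex supplies both.

```python
def ray_for_vertex(low_pos, cage_pos, smooth_normal, mode):
    """Sketch of the adjusted/unadjusted distinction described above.
    'adjusted':   direction = averaged (soft) vertex normal,
                  distance  = how far the cage vertex sits from the surface.
    'unadjusted': the cage vertex dictates direction AND distance
                  (the ray simply runs between low poly and cage)."""
    off = tuple(c - l for c, l in zip(cage_pos, low_pos))
    dist = sum(d * d for d in off) ** 0.5
    if mode == "unadjusted":
        direction = tuple(d / dist for d in off) if dist else smooth_normal
    else:  # "adjusted"
        direction = smooth_normal
    # rays are cast inward, from the cage toward the surface
    origin = tuple(l + d * dist for l, d in zip(low_pos, direction))
    return origin, tuple(-d for d in direction)
```

So hand-placing a cage vertex off to one side only changes the projection angle in unadjusted mode; in adjusted mode it merely lengthens or shortens the ray.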
Hope that helps!
And thanks for this awesome application.
Thanks for the kind words!
Hi there! There is one thing I'm wondering about since 1.1.0 beta, and that is whether it's still possible to use the old "Mesh AO" baking implementation from 1.0.0. in 1.1.0. The mesh AO bakes from 1.0.0 had some minor issues, like lack of proper AA, but they were absolutely, insanely perfect projection-wise, without a single clipping issue, without a single noticeable pixel missing a ray. I routinely use 1.0.0 to this day and I was not able to replace it with anything so far, because it's the only software I know of that does perfect cageless bakes.
And now, with 1.1.0, that option is gone, and we have the traditional approach to baking with low poly, high poly and cages. It's no doubt great for traditional content like sculpted high poly stuff, but I'm working on strategy and mobile games where highpoly version usually doesn't exist at all and cage creation is really complicated by use of hard edges in low poly models. Nothing from xNormal to SD baker could handle the cageless lowpoly to lowpoly bakes as well as Knald 1.0.0 did, so it was a lifesaver to me.
As far as I see, you can, of course, insert the same mesh into high and low poly slots, and Knald can generate it's own cage from scratch, but they don't give flawless results cageless baker from 1.0.0 gave me. To give one example where pushing fails - if there is a thin polygon like this one, push gives an incorrect cage.
You can manually fix a small case like that by changing the topology, of course:
But that's not always viable if you work with meshes with hundreds of areas like those. Old cageless baker handled that case flawlessly, and handled other complicated cases (like thin heavily occluded gaps smaller than typical push radius) very gracefully, hence my interest in continuing to use it.
Edit: More examples of geometry with very problematic cages:
And here are some comparisons of AO bake of that model in 1.0 and 1.1:
Hopefully that illustrates the reason behind my interest in cageless bakes well.
Thanks for posting and for the very detailed information.
You will be glad to hear that we haven't removed any of the AO baking functionality that was present in 1.0. The only change we made when adding the new baker was to remove the Mesh AO option if the user has generated high poly mesh cache via the high to low baker. You can use exactly the same workflow as you did in 1.0 (Load blank Normal map> Load mesh> AO Tab> Hit Activate).
Can you send me the meshes for both of these cases please? For the second mesh example you might want to install 1.1.1, as we fixed an issue where high poly planar faces (axis aligned) baked incorrectly in some cases.
High to low cageless baking is possible, but the term 'cageless' is a bit of a misnomer. When baking high to low you always need to use a 'cage' or you will get seams in things such as normal maps. Whenever you see the term 'cageless' it is simply referring to a search distance with averaged normals (an averaged envelope) & is essentially the same thing as using the default projection cage within Knald.
Max Frontal & Rear ray distances correspond to the Push & Range within Knald, & generally both methods still require adjustment (as in all baking programs) to avoid issues & to ensure that people get the results they desire.
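To make the averaged-envelope idea above concrete, here is a minimal sketch of how such a projection cage is typically built: each low poly vertex is offset outward along its averaged (smoothed) vertex normal by the Push distance. This is only an illustration of the general technique - the function names and the tiny test mesh are assumptions for the example, not Knald's actual internals.

```python
import math

def normalize(v):
    length = math.sqrt(sum(c * c for c in v)) or 1.0
    return [c / length for c in v]

def averaged_normals(vertices, faces):
    """Accumulate each face's normal onto its vertices, then normalize
    (i.e. smoothed/averaged vertex normals)."""
    normals = [[0.0, 0.0, 0.0] for _ in vertices]
    for a, b, c in faces:
        u = [vertices[b][i] - vertices[a][i] for i in range(3)]
        w = [vertices[c][i] - vertices[a][i] for i in range(3)]
        # Cross product = area-weighted face normal.
        n = [u[1] * w[2] - u[2] * w[1],
             u[2] * w[0] - u[0] * w[2],
             u[0] * w[1] - u[1] * w[0]]
        for idx in (a, b, c):
            for i in range(3):
                normals[idx][i] += n[i]
    return [normalize(n) for n in normals]

def push_cage(vertices, faces, push):
    """Offset every vertex along its averaged normal by `push`."""
    normals = averaged_normals(vertices, faces)
    return [[v[i] + push * n[i] for i in range(3)]
            for v, n in zip(vertices, normals)]

# A unit quad in the XY plane: every averaged normal is +Z,
# so the cage is the same quad lifted straight up by the push distance.
verts = [[0, 0, 0], [1, 0, 0], [1, 1, 0], [0, 1, 0]]
faces = [(0, 1, 2), (0, 2, 3)]
cage = push_cage(verts, faces, 0.5)
print(cage[0])  # → [0.0, 0.0, 0.5]
```

Because the normals are averaged across faces before pushing, the cage stays watertight at hard edges - which is also why a degenerate sliver face can skew the average and push a region of the cage the wrong way, as discussed earlier in the thread.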
Within Knald you can visualise the projection if you wish, though this is by no means a requirement. It's quite possible to just use the default values and never adjust the cage, or to set the Push/Range values to whatever you wish without ever visualising the projection.
We also never require a user to import a cage mesh into Knald when baking as one is generated automatically, but of course, we also allow the user to import a cage mesh too if they wish.
Hope that helps!
Thanks for a quick response! For now, here are the meshes I've used:
https://www.dropbox.com/s/zd93478qb3mhmpo/knald_test_01.fbx?dl=0
https://www.dropbox.com/s/5d0u8qbt9zz9zay/knald_test_05.fbx?dl=0
So, if I understand you correctly, the "Mesh AO" mode in Knald 1.0 worked in a very similar way to the new "Baker" mode, except that 1.0 used exactly the same mesh as both source and destination (the low/high slots in the Baker) and used a non-configurable, welded, averaged and pushed version of that same mesh as a cage. If that's the case, what was different about the 1.0 settings that made it possible to bake objects with very thin faces and small cavities without missed (incorrectly caged) areas?
As for 1.1.1, I'll give it a try, but I think the second model has exactly the same root issue as the first one - examining it closely, all the areas where the cage was pushed incorrectly seem to have long, thin faces produced by triangulation.
P.S.: As an off-topic stab at working around this - does anyone know of any scripts or handy workflows with existing tools for 3ds Max that allow you to retriangulate a mesh in a way biased toward generating the shortest edge lengths? Every time I fix those kinds of topologies manually, I catch myself doing exactly the same repeated checks and actions, which could probably be scripted. Something along the lines of:
Or maybe, if that's not too much to suggest, that can be an operation in the cage generator used by Knald. Every case of thin faces I can think of can be killed by this kind of edge flipping. Simple illustration of the idea:
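The edge-flip the post describes could be sketched roughly like this: for a pair of triangles sharing an edge, flip that shared edge to the opposite diagonal whenever the flip makes it shorter, which kills long slivers. This is a hypothetical sketch of the idea only (names and the test quad are made up for the example; winding consistency is ignored for brevity).

```python
import math

def dist(a, b):
    return math.sqrt(sum((a[i] - b[i]) ** 2 for i in range(3)))

def maybe_flip(verts, tri1, tri2):
    """Given two triangles sharing exactly one edge, return the pair
    retriangulated across the shorter diagonal."""
    shared = set(tri1) & set(tri2)
    if len(shared) != 2:
        return tri1, tri2          # not an adjacent pair, leave alone
    e0, e1 = sorted(shared)
    apex1 = next(v for v in tri1 if v not in shared)
    apex2 = next(v for v in tri2 if v not in shared)
    # Flip only when the opposite diagonal is strictly shorter.
    if dist(verts[apex1], verts[apex2]) < dist(verts[e0], verts[e1]):
        return (apex1, e0, apex2), (apex1, apex2, e1)
    return tri1, tri2

# A long trapezoid currently split across its long diagonal (0-2);
# the flip swaps it for the shorter diagonal (1-3).
verts = [[0, 0, 0], [10, 0, 0], [10, 1, 0], [1, 1, 0]]
t1, t2 = maybe_flip(verts, (0, 1, 2), (0, 2, 3))
print(t1, t2)
```

A full mesh version would iterate this over every interior edge until no flip improves anything (essentially a Delaunay-style edge-flip pass).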
Thanks for the meshes.
Actually, no
Knald has 3 ways to bake AO and they are all different methods. You can bake the Mesh AO in exactly the same way as you used to in 1.0 without ever loading the new baker. Just load an empty normal map (8080ff) and a mesh in the 3d preview window, then go to the AO tab and hit activate.
Yea, that's what I found too. The reason the cage isn't behaving well is that there is a hidden face (marked as face 1) that has the thickness of an edge & is causing the cage to skew. If you remove that face and then connect the stray verts to the large triangle (marked as face 2), the cage then works as expected (essentially what you are doing in the last image you posted).
(Please note that I moved the verts around in the above image to reveal the hidden face. It didn't import like this.)
As it happens, Knald already triangulates everything to the shortest diagonal before baking, but the reason you are not seeing this on your mesh is that face 2 is already a triangle.
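For anyone curious what "triangulates to the shortest diagonal" means in practice: when splitting a quad into two triangles, you measure both diagonals and cut across the shorter one. A minimal sketch (the function name and test quad are illustrative, not anything from Knald):

```python
import math

def triangulate_quad(verts, quad):
    """Split quad (a, b, c, d) into two triangles across its shorter diagonal."""
    a, b, c, d = quad
    if math.dist(verts[a], verts[c]) <= math.dist(verts[b], verts[d]):
        return (a, b, c), (a, c, d)   # cut across diagonal a-c
    return (a, b, d), (b, c, d)       # cut across diagonal b-d

# Skewed quad: diagonal 0-2 is long, diagonal 1-3 is short,
# so the split goes across 1-3 and avoids a long sliver triangle.
verts = [[0, 0, 0], [1, 0, 0], [3, 1, 0], [0, 1, 0]]
print(triangulate_quad(verts, (0, 1, 2, 3)))  # → ((0, 1, 3), (1, 2, 3))
```

As noted above, this only helps at quad-splitting time; it can't improve a face that arrives already triangulated.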
Hope that helps!
Just my 2 cents.
Divide the commercial license price by your hourly rate and ask yourself whether Knald can save you that number of work hours. Looking at the hours Knald has saved me in the long run, I'll have to disagree - the price is more than fair. It's like UVLayout in that regard: at first you have your doubts about the price, and then you realize how much more productive you are once you add those new tools.
I believe that if Knald were to go to Steam, it would benefit both the company and its customers. Steam has automatic regional prices that can be considerably lower in some regions, which would allow more people - students, hobbyists and aspiring artists alike - to actually afford a Knald license.
Allegorithmic did that, and I can see how many people jumped on their train just because there were some great sales and the Steam prices were lower than purchasing through their website. A great portion of those people weren't professionals, so there is no "hourly rate" for them, but they still bought the software because it was cool and affordable. I believe that "how much time, and thus money, is this tool going to save me?" is mostly professional thinking. Non-professionals think differently, and something like just $50 off will have a huge effect on whether they decide to make a purchase or not.
I can agree with some points, but aren't you contradicting yourself a bit? If a non-professional, a hobbyist, does no commercial work with the tool and therefore can't use hourly rates etc. to justify and support a purchase, then the cheaper non-commercial license Knald offers would be a fine choice. And if you profit from your work, therefore requiring a commercial license, then you can evaluate the usefulness of the tool against your rate or return-on-time figures.
As for regional prices, afaik Steam only suggests them rather than enforcing them - it's up to the discretion of the developers. I don't doubt that regional prices can help with sales in places like the CIS countries, but on the other hand, I can understand why some developers aren't interested in them - in contrast with games, you can't actually limit your regional pricing to the originally intended region via localization or network restrictions; the tool is the same for everyone.
Not that I'm the one to decide the pricing of Knald, just my view of the situation here.
Just to clear things up... The Freelancer license absolutely allows for commercial use (regardless of revenue).
The EULA wording is a little more complex in legal terms, but essentially the only stipulation for the Freelancer license is that, to be eligible for it, you have to be an individual or a single-person company, and that person has to be the licensee. Other than that, you can use it for pretty much whatever you want.
@metalliandy
Thanks for clarifying that!