Thank you for the answer; it's good to have such speed from doing a real bake, while Substance Painter fakes the AO and other maps by default, if I remember correctly (to match your speed?).
Also, I thought that the cavity map used only the normal map. Anyway, in this image I have created we can see why maps generated from a normal map must be avoided. The Concavity and Convexity from Knald are pretty good.
Hey Andy, it seems I didn't complete all the tests with the new version. I get incorrect AO with mirrored UVs, also with 16xAA in this case. So I have to build two low polys (like in the old days).
Ahh ok. Mirrored UVs should work fine, but if you confirm they don't, or anything else doesn't work as expected, please let me know and I will take a look.
Thank you for the answer; it's good to have such speed from doing a real bake, while Substance Painter fakes the AO and other maps by default, if I remember correctly (to match your speed?).
Also, I thought that the cavity map used only the normal map. Anyway, in this image I have created we can see why maps generated from a normal map must be avoided. The Concavity and Convexity from Knald are pretty good.
Thanks! We also export a single channel curvature map, so there isn't much need to compile them in Blender unless you really want to. Just select 'Single Channel' in the 'Curvature Type' dropdown in the curvature group and export. Once the initial bake is completed you can swap between Single and Dual channel without the need to rebake.
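If you do want to compile the two channels by hand anyway, the operation is tiny. A minimal sketch, assuming the dual-channel bake ends up as separate convexity/concavity grayscale maps; the file names and the mid-grey encoding here are assumptions, not Knald's exact format:

```python
# Combine separate convexity/concavity bakes into one mid-grey-centred
# curvature map. 0.5 = flat, >0.5 = convex edges, <0.5 = concave cavities.
import numpy as np
from PIL import Image

convex = np.asarray(Image.open("convexity.png").convert("L"), dtype=np.float32) / 255.0
concave = np.asarray(Image.open("concavity.png").convert("L"), dtype=np.float32) / 255.0

curvature = np.clip(0.5 + 0.5 * (convex - concave), 0.0, 1.0)
Image.fromarray((curvature * 255).astype(np.uint8)).save("curvature_single.png")
```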
I'm sharing the node setup to composite the textures baked with Knald into a good grayscale base, ready to colorize with gradients for a stylized effect. The AO is too contrasted by default, so I had to mix it with a white color.
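For anyone who wants to script the AO-softening part of that setup, here is a minimal Blender (bpy) sketch of just that one step; it must be run inside Blender, the node and socket names follow the 2.7x-era ShaderNodeMixRGB, and the image path is only an example:

```python
# Mix the baked AO with white to reduce its contrast, as described above.
import bpy

mat = bpy.data.materials.new("KnaldComposite")
mat.use_nodes = True
nodes, links = mat.node_tree.nodes, mat.node_tree.links

ao_tex = nodes.new("ShaderNodeTexImage")
ao_tex.image = bpy.data.images.load("//ao.png")  # example path, relative to the .blend

mix = nodes.new("ShaderNodeMixRGB")
mix.blend_type = 'MIX'
mix.inputs["Color2"].default_value = (1.0, 1.0, 1.0, 1.0)  # white
mix.inputs["Fac"].default_value = 0.5  # raise towards 1.0 to soften the AO further

links.new(ao_tex.outputs["Color"], mix.inputs["Color1"])
```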
Hi @Metalliandy, here are some notes based on the 1.2.1 demo:
- When starting up the program, the user lands on the "Main" panel, which is not self-explanatory. This tab should be called something more obvious, like "Texture converter".
- "Integrator" is also not self-explanatory and there is no need for this sub palette to be collapsable. "General" can also go (and overall throughout the app all these collapsible headers can probably be taken out, since scrolling is faster and less visually confusing). Mockup follows :
- Baker should probably appear as the first tab since this is likely what your users want to use first. Other tabs might need re-ordering too, so they are shown in the order the user needs to click them.
- When starting up the app, why is the Baker tab greyed out? The "Load Baker" button is not necessary; the Baker tab should just be live right off the bat.
- When starting up the app, why is the Height texture tab in focus? It should be "off" like all the others.
- The Baker tab needs some re-ordering to be more intuitive, see mockup:
- Some more mat ID baking options would be appreciated, like the ability to pull the baked mat ID information directly from the material IDs of the highpoly parts (thus relying on random colors, overriding the manually assigned colors per part), and the ability to pull local color information directly from .mtl and .fbx files. Another option could be to assign random colors based on internal mesh continuity (not shown here); that's of course a bit of a stretch but could be an interesting fallback option. And lastly (also not shown here), pulling mat IDs from OBJ sub-models/groups. This is all a bit tricky since the OBJ exporters of various apps seem to all do different things, but it's worth exploring. As far as I am concerned, the option I'd need the most is the ability to bake from OBJ material IDs, because I always apply materials to my highpoly models when working on them. I'll pass you an example file soon.
I hope this helps! Needless to say, the app is fantastic; it is just in need of a little bit of decluttering. The 3D previewer is what sets it above the rest, so thanks again for that. There are probably a few more things to be done to accommodate more workflows (like the ability to prepare multiple bakes at the same time) but that's a whole other topic altogether.
@pior Thanks so much for taking the time to put together mockups and write up such a detailed overview of your thoughts. It's super helpful to get feedback like this. I can't currently give your post the detailed answer it deserves, but I will post a longer reply when I'm back at my main PC later today or tomorrow. Cheers!
Hey guys! Once again, thanks for the in-depth post.
I want to start with our reasoning as to why some things are laid out as they are, and then move over to some more specific areas of the functionality and GUI within Knald.
We have spent a very long time thinking about and implementing the GUI within Knald & Lys and have generally tried to make it as modular as possible, which is why the controls are grouped as they are. We found that it was much more flexible this way: when controls are grouped, we can save, restore and reset them with a very fine level of granularity. If a user wants to save only the position of a few sliders within a specific group, they can do just that and not have to touch controls in other groups, though with our save system they can also go wider if they so desire (users can save 99% of all controls on a group, tab or application-wide basis).
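To illustrate the kind of granularity being described, here is a purely illustrative sketch (not Knald's actual save system; all names are invented) of how per-group, per-tab and application-wide saves can fall out of grouped settings:

```python
# Settings stored per tab and per group, so a save can target one group,
# one tab, or everything, without touching anything else.
import json

settings = {
    "Main": {"Integrator": {"Noise Reduction": 0.2, "Smooth": 1},
             "General": {"Preview Gamma": 2.2}},
    "Baker": {"Bake Targets": {"AA": "16x"}},
}

def save(scope=None, path="knald_user.json"):
    """scope=None saves everything; 'Main' saves a tab; ('Main', 'Integrator') a group."""
    data = settings
    if isinstance(scope, tuple):
        data = {scope[0]: {scope[1]: settings[scope[0]][scope[1]]}}
    elif scope:
        data = {scope: settings[scope]}
    with open(path, "w") as f:
        json.dump(data, f, indent=2)

save(("Main", "Integrator"))  # save just one group's sliders
```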
Also, having items logically grouped provides us with the ability to only show the user controls specific to the tab or current function in question, which substantially reduces clutter and confusion within the interface, which in turn makes everything easier to use. A good example of this is The Integrator group being hidden when the user loads a height map or uses the baker. The Noise Reduction, Smooth and Input Map Type controls are only applicable to The Integrator and not when loading a height map or using Color To Normal (though The Integrator can be updated from Color To Normal), which is why they are placed in that specific group.
Something that may not be obvious is that Knald has multiple modes that are generally fenced off from each other in order to save memory, as VRAM is at a premium and needs to be conserved as much as possible. If we had all the modes active all the time, users would be more limited as to what they can process. For example, if the user loads an 8k normal map, The Integrator would require around 3.5GB of VRAM, which could then no longer be used if they decided to bake something. Forcing the user to manually load the baker is an affirmative step confirming that they wish to leave any other mode they are currently using and to clear that memory from the system.
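As a rough sanity check of that figure, assuming full-resolution 32-bit float RGBA working buffers (the buffer count and format are assumptions, not a statement of Knald's internals):

```python
# One 8k RGBA32F buffer is exactly 1 GiB, so ~3.5GB is on the order of
# three to four such full-resolution working buffers in flight.
w = h = 8192
bytes_per_texel = 4 * 4                       # RGBA channels x 4-byte floats
one_buffer_gib = w * h * bytes_per_texel / 2**30
print(one_buffer_gib)                         # 1.0 GiB per buffer
print(3.5 / one_buffer_gib)                   # ~3.5 buffers
```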
The Main Tab is so called because it is the main interface within the program and is where users can interact with every part of Knald. When any texture is loaded for processing, either via The Integrator, Color To Normal, The Baker or just loading a height map, the controls are shown here. "Converter" doesn't really do this justice, as the tab also shows the controls for maps that are not converted.
Some general notes:
Groups are collapsible only for user convenience, so if users are short on vertical screen space they can hide controls they don't need at that specific point in time.
The Export controls are within the General Tab because they are a global control for export in all modes, with the export file path being mirrored in the Export Tab too.
All the controls within Knald are pretty well documented via tooltips and in the online documentation (you can view the tooltips within Knald by hovering over a GUI element for a few seconds), but if there is anything that you feel needs improving specifically, please let me know and I will work on it for sure.
Regarding The Baker, we grouped items for the same reasons as I've mentioned previously, but doing so is actually much more important within The Baker, as the potential for using much more vertical space is much greater than in other modes. If a user decides to bake an asset with 30 HP meshes then not only would they be increasing the vertical height of the Bake Meshes group but of Material ID too, which would probably force the user to scroll the height of 2 or 3 screens. It's important to us that users are free to conserve as much space as possible, and with the system we have now they can load the meshes and then hide them at will. Another important consideration for us is that the user is able to see as much information as possible at once without the need to scroll, which again is where collapsible groups come in.
The Bake Targets group is also localised in order to give the user a convenient way to modify which settings they view within the Baker by pressing the Configure button. Not having this control would force the user into a huge scrolling liability which gets larger with each target they add.
Thanks for the comments regarding MatID. I'm not 100% sure what you mean by "pull the baked mat ID information directly from the material IDs of the highpoly parts". Knald currently assigns a single value per loaded HP mesh, and you can pick between RGBCYMW Hue Shift, Max Euclidean RYB & Random, in addition to users having the ability to load their own palette by assigning colours horizontally in a 1px-high image. While RGBCYMW Hue Shift & Max Euclidean RYB are both preset palettes, they are logical in their values and are designed to give maximum separation between subsequent values.
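For anyone building their own palette image, here is a sketch of generating such a 1px-high strip; the evenly spaced hue scheme is an assumption for maximum separation, not the exact RGBCYMW Hue Shift algorithm:

```python
# Write an N-pixel-wide, 1px-high palette strip with evenly spaced hues.
import colorsys
from PIL import Image

def palette_strip(n, path="matid_palette.png"):
    img = Image.new("RGB", (n, 1))
    for i in range(n):
        r, g, b = colorsys.hsv_to_rgb(i / n, 1.0, 1.0)  # evenly spaced hues
        img.putpixel((i, 0), (int(r * 255), int(g * 255), int(b * 255)))
    img.save(path)

palette_strip(16)
```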
Having said all this, however, there are always areas in which we can improve, and your post has given us some great feedback and ideas with which we can improve Knald & Lys. Thanks again for taking the time to write it up.
I suggest three new features:
_ An infinite demo license, but with a "Knald Demo" watermark on the textures
_ A Cage Inflate option if we want to generate a cage
_ Maybe painting tools and color gradients to texture parts of the model quickly: https://www.youtube.com/watch?v=wUeQsUJTz8I
Thanks for the suggestions! Do you mean when a cage is imported, or the generated cage within Knald? I will add the gradient map suggestion to the user wishlist.
Heya @metalliandy - thank you for taking the time to check these suggestions/mockups.
I do understand most of the reasoning behind the design choices, but I still believe that in quite a few cases the end result remains problematic. Of course everything can be argued for or against, but overall my one main suggestion could be summed up as follows:
"The interface layout should reflect a linear script of user actions."
What I mean by that is that even though the tool can do multiple things, there is still an overall generic path that a user will almost always follow, and the UI should be organised in a way that reflects this order. In the case of a plain page layout, the UI elements would have to be organised from top left to bottom right (i.e. the diagonal that the eye naturally follows), but here in the case of Knald the ideal layout organisation would have to be from the first tab to the last, and from top to bottom within each tab.
Now overall the "linear user script" for a baking tool is:
1 - Loading of the highpoly model(s)
2 - Loading of the lowpoly model(s)
3 - Adjusting global settings (file paths, textures to bake, and so on)
4 - Adjusting map-specific settings (ID setup, various overrides if/when needed, and so on)
5 - Pressing the bake button
6 - Reviewing the result
7 - Eventually, processing the output for cavity detection, and so on.
One good rule of thumb is to try and make sure that the user never has to "backtrack" at any point during the script. That is to say, there should be no need to go left, or up, or to a previous tab.
Even though Xnormal is a bit antiquated, one can tell that Jogshy strictly follows this "linear script" principle and did his best to logically lead the user through all the steps/tabs. Of course not everything is perfect in Xnormal (the main mesh loading action being hidden behind a non-discoverable right-click menu is a UX no-no) but besides that it is still really good. I hope this makes sense!
Besides that I second the suggestion of a watermarked demo.
Thanks for the comment, Pior. I appreciate the time you have taken to write up your thoughts.
I understand what you mean with regard to linear flow, and I agree that there are things which could be improved within the interface. It's a constantly evolving entity, of course.
We generally tried to keep global settings first and foremost (things like map size, AA, the bake button, ray projection etc.) because they are the things people are most likely to change the most, outside of bake-target-specific controls, which is why they are grouped right at the top. Once the user loads the meshes they are unlikely to change them again, so we felt that having them right at the top would cause distraction, and if the user has to look away from the screen they would have to scan the interface more often to find the stuff they need to change. This is also the reason why the bake target controls are at the bottom of the interface: it is much faster to look right at the bottom of the screen to find them than to scan down through the GUI. Also, once the user has chosen the bake meshes and targets they can collapse/hide those groups, and only the bake-specific targets will be visible.
I'm not sure if it is obvious, but there is no real need to update The Integrator once a bake has been completed. If you bake cavity or curvature etc. from a mesh then you don't need to update The Integrator in order to reprocess it, as the results from the baker will be taken directly from the mesh, and there are many more map types available within the baker. We added the button as an easy way to allow users to send data to The Integrator if they want to, but it isn't mandatory at all. If this isn't obvious we probably need to think about another way to allow users to do this.
The demo is actually already watermarked btw. I forgot to mention this when @Linko posted above.
metalliandy: I mean a masking tool associated with color ramps to texture easily, with the color ramps saved like modifiers so they are easy to edit, and the ability to save color ramp presets. Also, being able to import an ID map and then quickly assign the color ramps by picking the color.
But you probably mean a gradient map from bottom to top; it's a good idea too, something like this: http://www.blendswap.com/blends/view/77269 With a slider to adjust the black to dark grey and the white to light grey.
Also, I suggest a tab called "Grayscale" showing all the maps composited together, with a slider to control the opacity of each map and the color ramps applied on top, with an eye icon to toggle each On and Off.
And for the cage I suggest an option like Blender Cycles baking: when we check the "cage" button we can pick a mesh, or use the extrusion setting to duplicate and inflate the low poly to use it as a cage.
I agree with what Pior says; there is a "linear" way to bake textures and those options should be directly accessible without going tab to tab.
metalliandy: I mean a masking tool associated with color ramps to texture easily, with the color ramps saved like modifiers so they are easy to edit, and the ability to save color ramp presets. Also, being able to import an ID map and then quickly assign the color ramps by picking the color.
But you probably mean a gradient map from bottom to top; it's a good idea too, something like this: http://www.blendswap.com/blends/view/77269 With a slider to adjust the black to dark grey and the white to light grey.
Yea, I know what you meant. Gradient maps have been around for ages and were used in Left 4 Dead 2, for example. The other map you mentioned is called a Position or Volumetric gradient map when all 3 channels (RGB) are used. They are on our TODO list. Thanks for the suggestion!
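For the curious, a Position/Volumetric gradient map boils down to writing the normalised object-space position into RGB. A minimal sketch (per-vertex here; a real baker would rasterise this into UV space):

```python
# R = X, G = Y, B = Z, each normalised to 0-1 over the mesh's bounding box.
import numpy as np

def position_gradient(vertices):
    """vertices: (N, 3) float array of object-space positions."""
    lo, hi = vertices.min(axis=0), vertices.max(axis=0)
    rgb = (vertices - lo) / np.maximum(hi - lo, 1e-8)
    return rgb  # the single-channel bottom-to-top gradient is just rgb[:, 1]
```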
Also, I suggest a tab called "Grayscale" showing all the maps composited together, with a slider to control the opacity of each map and the color ramps applied on top, with an eye icon to toggle each On and Off.
And for the cage I suggest an option like Blender Cycles baking: when we check the "cage" button we can pick a mesh, or use the extrusion setting to duplicate and inflate the low poly to use it as a cage.
You can already do this in Knald. You just load your mesh into the Cage Mesh slot in the Bake Meshes group. When a cage mesh isn't loaded we generate one internally based on the low poly geometry. You don't even need to use similar topology for the cage within Knald.
I agree with what Pior says; there is a "linear" way to bake textures and those options should be directly accessible without going tab to tab.
You can already access all the baker functions bar export in the Baker tab, so you shouldn't really need to be swapping tabs too much. Basically, you should only need to set the export location/name; everything else can be done within a single tab.
First off, I ABSOLUTELY LOVE Knald. I use it in every single project that I work on and it's become one of the most essential tools in my workflow.
Recently, I've started making more use of skew meshes to get rid of skewed details in my bakes. The method for using these meshes is to bake out an Object Space normal map, import that map + the low poly model into something like xNormal, HandPlane 3D, or Substance Designer, and then render out a tangent space normal map for whatever my target engine is (usually Substance Painter 2, Unreal 4, or Marmoset Toolbag 3). It's a terrific way to get amazing tangent space normal maps without skewed details.
But I would love to keep my workflow as streamlined as possible. The fewer the apps I have to use, the faster I can work.
So my question is this: Does Knald currently support a way to either use the Object Space Normal Map + low poly to generate a tangent space normal map? Or is there a setting in Knald that I can toggle to prevent skewing? If neither feature exists, would there possibly be a way to include that kind of functionality in Knald in a future update?
Thanks for the kind words, @bmobius6! It's always nice to hear that people love using Knald!
We don't currently have any sort of normal map conversion or anti-skew features in Knald. We very much appreciate the suggestions and will look into them for sure.
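For reference, since Knald doesn't do this, the core of the conversion that tools like Handplane perform is rotating each sampled object-space normal into the texel's interpolated tangent basis. A minimal per-texel sketch (not any particular tool's implementation):

```python
# Project an object-space normal onto the T/B/N basis to get the
# tangent-space normal, then encode -1..1 into the usual 0..1 RGB range.
import numpy as np

def os_to_ts(n_os, tangent, bitangent, normal):
    """All inputs are unit 3-vectors for one texel; returns a 0-1 RGB value."""
    tbn = np.stack([tangent, bitangent, normal])  # rows = T, B, N
    n_ts = tbn @ n_os                             # components = dot(T,n), dot(B,n), dot(N,n)
    n_ts /= np.linalg.norm(n_ts)
    return n_ts * 0.5 + 0.5
```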
Thanks!
I really appreciate any time your team takes to implement the suggestions of the artists using Knald. The fast feedback we've received from the Knald team has been a massive boost to the confidence we have in using Knald.
Thanks! We always aim to add as much as we can from user suggestions. I'm glad you are happy with the service! We can always be reached at support@knaldtech.com too!
Sometimes when I bake, the curvature map is empty, completely black. Why? The other maps bake properly.
Also, I suggest:
_ a specular map that is the result of multiple maps
_ a UV Snapshot map to help with the texturing work
_ not baking, and showing a warning, if the low poly has no UVs
_ an option to generate a low poly with decimation, to prototype the game asset, see what it looks like in game, and create static mesh props with auto unwrapping.
I have created a node setup that Knald users can use to generate a game texture just by connecting their textures. It gives control over the opacity, the amount of dust and blood, and the AO contrast.
Download: http://linko.projects.free.fr/gametexture.zip
I have made a comparison of the curvature I get from Knald (left) and Substance Painter 2.5 (right). I used Blender's Decimate modifier and Smart UVs, so Substance struggles to generate the curvature from the normal map, while the mesh-based curvature in Knald gives good results with any UV unwrap and normals:
That's a good thing, because I can prototype my game assets with a Blender Decimate + Smart UV pass, but also generate my final static meshes in a few clicks without worrying about the UV quality for the curvature.
Another test, generating maps just by importing this flat image color: https://upload.wikimedia.org/wikipedia/commons/thumb/0/0c/Blender_logo_no_text.svg/2000px-Blender_logo_no_text.svg.png
I am impressed by the quality of the normal map; I then imported the generated normal to get all my other maps and tweaked the logo:
Just got myself a copy of Knald and so far it works nicely, though I have a few questions:
- After inputting my license key, why does Knald say my copy is still an Evaluation Copy that expires in the year 1970 (0_0?)
- Is the transmission map similar to the thickness map in Substance Painter? Knald's tends to be lighter than the SP bake.
- Why does a cage mesh exported from 3ds Max go crazy, both for cage and range, in Knald?
- Is there a way to add additional tangent normal map data and tell Knald to take it into consideration when baking the other maps? For example, when I generate a diamond pattern for my gun's hand grip in SP, I'd like Knald to take this into consideration when baking curvature/AO etc.
And +1 for the feature request to bake meshes by name-matching pattern à la Substance Painter.
Another suggestion is to be able to manually edit a cage mesh; for example, there's no way I can get a good cage for this particular situation. But this could easily be worked around if we were able to load an FBX file that contains multiple meshes and have LP/HP/CAGE combos assigned automatically based on matching mesh names plus a user-defined suffix (_low, _high, _cage for example), like I mentioned above, so we can build a perfect lowpoly, highpoly and cage mesh in 3D software and let Knald bake it all out.
Sometimes when I bake, the curvature map is empty, completely black. Why? The other maps bake properly.
Hi! Sorry for the late reply.
The issue you are having with curvature is most likely down to you not having smooth normals in the mesh.
Also, I suggest:
_ a specular map that is the result of multiple maps
_ a UV Snapshot map to help with the texturing work
_ not baking, and showing a warning, if the low poly has no UVs
_ an option to generate a low poly with decimation, to prototype the game asset, see what it looks like in game, and create static mesh props with auto unwrapping.
Thanks again for all the suggestions. I will add them to the user wishlist.
Very nice work with all the Blender stuff, btw. Nice to see you using Knald so much!
Just got myself a copy of Knald and so far it works nicely, though I have a few questions:
- After inputting my license key, why does Knald say my copy is still an Evaluation Copy that expires in the year 1970 (0_0?)
- Is the transmission map similar to the thickness map in Substance Painter? Knald's tends to be lighter than the SP bake.
- Why does a cage mesh exported from 3ds Max go crazy, both for cage and range, in Knald?
- Is there a way to add additional tangent normal map data and tell Knald to take it into consideration when baking the other maps? For example, when I generate a diamond pattern for my gun's hand grip in SP, I'd like Knald to take this into consideration when baking curvature/AO etc.
And +1 for the feature request to bake meshes by name-matching pattern à la Substance Painter.
Another suggestion is to be able to manually edit a cage mesh; for example, there's no way I can get a good cage for this particular situation. But this could easily be worked around if we were able to load an FBX file that contains multiple meshes and have LP/HP/CAGE combos assigned automatically based on matching mesh names plus a user-defined suffix (_low, _high, _cage for example), like I mentioned above, so we can build a perfect lowpoly, highpoly and cage mesh in 3D software and let Knald bake it all out.
Hey!
The date issue is cosmetic only. If you email support@knaldtech.com with your purchase details we will be able to fix that manually for you.
I can't give you any insight as to what the thickness map in Painter does, but the Transmission in Knald is indeed designed to be used as an omni-directional thickness map for semi-translucent materials. Unlike some other solutions for thickness, we didn't directly emulate the DICE paper from a few years back, and developed all the algorithms used in-house.
Our solution actually takes relative light transmission (energy) passing through the volume of the entire mesh into account (treating it as a coherent volume) when generating the texture. In the image below (art courtesy of Warren Boonzaaier) you can see that the hood is light on the inside and a darker grey on the outside. This is because the light has to travel through both the head and the hood, resulting in a lower transmission on the outside of the hood. Conversely, the hood is comparatively thin when isolated, so light passing from the outside through to the inside is relatively unobstructed (resulting in a higher transmission value). When using the inverted normals AO trick that many people use, you cannot do this.
It's super fast to bake, extremely versatile, and you can get lots of different results by playing with the settings for a little while. You can read more about it here: https://docs.knaldtech.com/doku.php?id=transmission_maps_knald_1.2
Thanks for the other suggestions. Also added to the user wishlist!
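For contrast, here is a minimal sketch of the simple single-ray thickness approximation that a volume-aware solution improves upon; this is not Knald's algorithm, it uses trimesh for the ray casting, the file name is an example, and it assumes the file contains a single closed mesh:

```python
# Per-vertex thickness: cast one ray inward along the inverted normal and
# measure the distance to the first surface hit. Rays that miss leave 0.
import numpy as np
import trimesh

mesh = trimesh.load("lowpoly.obj")
origins = mesh.vertices - mesh.vertex_normals * 1e-4   # nudge just inside the surface
directions = -mesh.vertex_normals                      # cast inward

locations, index_ray, _ = mesh.ray.intersects_location(origins, directions, multiple_hits=False)
thickness = np.zeros(len(mesh.vertices))
thickness[index_ray] = np.linalg.norm(locations - origins[index_ray], axis=1)
```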
@metalliandy About the date, I see... I was just kinda shocked at first; I thought the license was a subscription-based model with a due date. And the thickness map, yeah, it kinda makes sense if you put it that way, thanks!
I tried to experiment more with the workflow for Knald, but now I'm getting some strange results. What I'm trying to achieve here is the 4th question I asked above: bake the normal in Knald > add normal details in SP > re-bake the rest of the maps in Knald.
So after I add the details in SP and export, it looks good, but when I import that same map into Knald's Integrator things get a little weird; the padding bleeds into the UV area. Did I miss a setting to tell Knald not to process the normal map that I imported and just use it as-is to bake the rest of the maps?
Left is the exported map from SP; right is what it looks like when I import it into Knald's Integrator. It happens in lots of other areas too, not just this one. Also notice the top left area: there's a slight bright red/pink color there that appears less saturated in Knald. I haven't really gone through all the documentation though; maybe it's just me not really understanding what The Integrator is used for.
@Revel I also get that issue with the padding bleed. I can't seem to find any setting to keep Knald from re-processing the normal map, and this does affect the other maps being generated.
Also, do your Knald bakes look correct in SP's viewport? I'm seeing an issue there as well, see image. There's a very subtle difference in the baked maps, but I thought both used Mikktspace, or did I miss a setting somewhere?
@Mastahuka On the supposedly flat surface, yeah, the padding bleed kinda messed up the normal. In Knald's viewport itself you can already see the difference.
@metalliandy About the date, I see... I was just kinda shocked at first; I thought the license was a subscription-based model with a due date. And the thickness map, yeah, it kinda makes sense if you put it that way, thanks!
I tried to experiment more with the workflow for Knald, but now I'm getting some strange results. What I'm trying to achieve here is the 4th question I asked above: bake the normal in Knald > add normal details in SP > re-bake the rest of the maps in Knald.
So after I add the details in SP and export, it looks good, but when I import that same map into Knald's Integrator things get a little weird; the padding bleeds into the UV area. Did I miss a setting to tell Knald not to process the normal map that I imported and just use it as-is to bake the rest of the maps?
Left is the exported map from SP; right is what it looks like when I import it into Knald's Integrator. It happens in lots of other areas too, not just this one. Also notice the top left area: there's a slight bright red/pink color there that appears less saturated in Knald. I haven't really gone through all the documentation though; maybe it's just me not really understanding what The Integrator is used for.
When loading a non-tiling texture you will sometimes get these issues, as this is a side-effect of Knald applying the integrated height map to your mesh. Sadly, it is not currently possible for Knald to generate perfectly correct values around the UV islands (which are typical if the normal map was baked) after processing a normal map through The Integrator, resulting in the possibility of minor visual aberrations when applying the exported height maps to UV'd meshes. One thing you can do is ensure that you are importing the normal map with an alpha channel, which tells Knald to pin the black areas in the alpha to 100% flat. This often fixes issues like this, but of course isn't always perfect.
Please note that the primary purpose of The Integrator is reversing a normal map into a height map and other maps for when you don't have the high poly geometry to work with. If you have HP geo then the majority of the time you are best off using the baker only, unless it's for a specific purpose, such as needing an unfaceted height map for a tiling texture etc.
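A minimal sketch of preparing a normal map for that alpha trick, assuming you already have a mask image that is white inside the UV islands; the file names are examples:

```python
# Attach an alpha channel that is black outside the UV islands, so the
# padding texels get pinned to 100% flat on import. Both images must be
# the same resolution.
import numpy as np
from PIL import Image

normal = Image.open("normal_from_sp.png").convert("RGB")
mask = Image.open("uv_island_mask.png").convert("L")   # white inside islands

rgba = np.dstack([np.asarray(normal), np.asarray(mask)])
Image.fromarray(rgba, mode="RGBA").save("normal_pinned.png")
```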
@Revel I also get that issue with the padding bleed. I can't seem to find any setting to keep Knald from re-processing the normal map, and this does affect the other maps being generated.
Also, do your Knald bakes look correct in SP's viewport? I'm seeing an issue there as well, see image. There's a very subtle difference in the baked maps, but I thought both used Mikktspace, or did I miss a setting somewhere?
Thanks
Knald uses Mikktspace and we should be 100% synced between the baker and the viewport, as well as UE4 etc. Is your mesh triangulated before baking? How does the baked map look in Knald's viewport? If you want to upload the meshes I would be happy to try and debug this from here.
Yes, in Knald's viewport Knald's bake looks correct (thanks for offering to debug), so the problem is Substance Painter's viewport, as I also checked in Toolbag and Knald's bake is the one that looks correct there as well. Good job!
As for the other issue, I agree it's best to use the baker when you have the HP. Although I do use the maps generated from The Integrator as well, I just have to clean them up because of the distortions. Or maybe add a way to load the normal map in the 3D viewport without any processing, just for viewing purposes?
One more thing: is there a way to do, or do you have any plans for, a "point light" bake? I usually grab the green channel from the object space normal and it works just fine, but it would be nice to have one with soft shadows like the AO.
Yes, in Knald's viewport Knald's bake looks correct (thanks for offering to debug), so the problem is Substance Painter's viewport, as I also checked in Toolbag and Knald's bake is the one that looks correct there as well. Good job!
As for the other issue, I agree it's best to use the baker when you have the HP. Although I do use the maps generated from The Integrator as well, I just have to clean them up because of the distortions. Or maybe add a way to load the normal map in the 3D viewport without any processing, just for viewing purposes?
There isn't a way to just load normals currently, but I can add it to the user wish list. The only thing I can suggest is to use the alpha channel trick, which can limit the severity of such distortions.
One more thing: is there a way to do, or do you have any plans for, a "point light" bake? I usually grab the green channel from the object space normal and it works just fine, but it would be nice to have one with soft shadows like the AO.
Thanks and keep up the good work!
I will add it to the user wish list. Thanks for the suggestion. The green channel from the object space bent normal might be useful to you; generally it's somewhat softer than the green channel from regular OS normal maps.
Here is an example of the various OS Bent Normal green channel flavours.
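Extracting that "light from above" map is a one-liner once you have the OS (bent) normal map exported; a small sketch with assumed file names:

```python
# The green channel of an object-space normal map encodes how much each
# texel faces +Y, which reads as a soft top-down light in grayscale.
from PIL import Image

os_bent = Image.open("os_bent_normal.png")
_, g, _ = os_bent.convert("RGB").split()
g.save("top_light.png")  # 0 = facing down, 255 = facing straight up
```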
This seems like a stupid question, but is Knald still being developed? I mean, I see people and a developer posting here quite regularly, true. I purchased Knald at the beginning of January, and had been following it for several weeks before actually purchasing it. It was, and still is, version 1.2.1. How often is a new version released?
The last update was released just over 3 months ago and we are always working on new features and improvements. Generally we prefer to post a release when it's ready, rather than promising x releases within a time period, as it guarantees a more stable product and quality doesn't slip.
"Knald can currently bake vertex colours from FBX or from OBJ (polypaint)."
Yeah, so I just tried baking vertex colors and they did not work. Your documentation says that if the OBJ file has them in it then it will automatically bake the vertex colors. I exported a sculpt from ZBrush with polypaint and tried baking, and there was no vertex color tab; or rather, it was greyed out.
We do indeed support vertex colors from ZBrush (Polypaint) and regular vertex colors from FBX & PLY formats.
To bake and view the vertex colors you must do the following:
Load the baker
Add the Vertex Color Map Type in the Bake Targets group
Bake
Once the bake is finished you should be able to see the vertex colors in the Vertex Color Tab.
If you can't, there may be an issue with the .obj you are using. If you want to upload the files (high, low & cage) and send them to support@knaldtech.com we would be happy to take a look and see if we can find the problem.
Hi @metalliandy, a few days ago Windows 10 had a major update (I guess, since it displayed the screen as if it were the first time I installed Win10), and now Knald is broken and unable to start. There is an error pop-up saying "The code execution cannot proceed because OpenCL.dll was not found. Reinstalling the program may fix this problem." I tried to reinstall the program with a freshly downloaded .exe from the site, but the issue still pops up.
Hi @metalliandy, a few days ago Windows 10 had a major update (I guess, since it displayed the screen as if it were the first time I installed Win10), and now Knald is broken and unable to start.
Must be something else. I have the Win10 update installed and Knald starts without issues.
Hi @metalliandy, a few days ago Windows 10 had a major update (I guess, since it displayed the screen as if it were the first time I installed Win10), and now Knald is broken and unable to start. There is an error pop-up saying "The code execution cannot proceed because OpenCL.dll was not found. Reinstalling the program may fix this problem." I tried to reinstall the program with a freshly downloaded .exe from the site, but the issue still pops up.
Are you guys aware of this issue?
Hey!
The OpenCL.dll error is usually down to the NVIDIA driver either becoming corrupted or not being installed correctly. This can often happen when large changes are made to Windows, such as upgrading Windows to a new version or installing a service pack, so we would recommend that you download and install the latest GPU driver for your card to fix the error.
Whenever you install a new driver we always recommend that you perform a clean installation and delete any cached shaders if they are present on your system. Please ensure that it's a clean install by following the steps in the link below: https://docs.knaldtech.com/doku.php?id=clean_driver_installation_1.2
Strange, as I always do a clean installation... But today there was a new driver ready to install too, so after doing the steps above it all works now, thanks!
Btw, did you guys manage to find out the cause of the exploded cage mesh that I sent a few weeks ago?
I've got a question about Knald in comparison to XNormal. I know Knald has a pile of added features, but could someone tell me if there are any features that both Knald and XNormal have, but that Knald just does flat-out better? I'm using Knald over XNormal at home, but my new place of work is using XNormal. I'm wondering if there are any "no brainer" reasons for switching to Knald. I know XNormal is a great piece of freeware, but I'm just wondering if there are any major points.
We try our best not to go down the road of criticizing other people's software, but here are some of the things we are generally proud of within our baker:
Our AO, Transmission & Bent Normals baking is incredibly fast.
Our AO, Transmission & Bent Normals are fully interactive meaning you can adjust the settings as they bake. Outside of the baker our AO from height map is extremely accurate, instant and adjustable in real-time too.
We support multiple weighting methods for both AO and Bent Normals.
Our novel Transmission solution takes relative light transmission (energy) passing through the volume of the entire mesh into account (treating it as a coherent volume) when generating the texture rather than the more common solution which is generally based on the 2011 DICE paper. When using the inverted normals AO trick this isn’t possible.
Our high speed 16x anti-aliasing solution is extremely good quality and has a very low footprint.
We have a unique mesh caching system where all meshes are processed and cached during the initial bake, to be then reused in all bakes thereafter (until a mesh is modified and reloaded).
External cage files don't have to match the low poly mesh topologically. This means you don't have to worry about differing triangulation or varying vertex order.
The projection and range cages can be visualised and adjusted in real-time.
We support NGons.
There is no need to re-bake if you want to export as a different file type or bit depth.
We support 8, 16 and 32bit float for export.
We allow the user to balance execution time between high poly processing (import) and baking time with the Bake Balancing functionality.
Our curvature within the baker is based on the real curvature found within the high poly mesh. No tricks or workarounds.
We have a fully physically-based viewport with image-based lighting.
Knald has perfect viewport synchronisation with MikkTSpace.
We support custom FBX tangent space.
Knald supports OBJ, FBX and PLY with tri counts of up to 350m.
You can change the strength of our normal maps after they are baked without breaking the shading (when using a single smoothing group); see the sketch after this list.
99% of user settings can be saved and restored at a later date.
Of course there are plenty of other things we feel we do very well too, and this list contains just a few highlights.
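The normal map strength point mentioned in the list above rests on standard maths rather than anything Knald-specific; a sketch of how such a rescale works (not Knald's implementation):

```python
# Scale the tangent-plane (X/Y) components and re-derive Z so every
# texel stays a unit vector, which is what keeps the shading intact.
import numpy as np

def adjust_strength(normal_rgb01, strength):
    """normal_rgb01: (H, W, 3) floats in 0-1; strength 0 = flat, 1 = unchanged."""
    n = normal_rgb01 * 2.0 - 1.0          # decode to -1..1
    n[..., 0:2] *= strength               # scale the tangent-plane components
    xy2 = np.clip(n[..., 0] ** 2 + n[..., 1] ** 2, 0.0, 1.0)
    n[..., 2] = np.sqrt(1.0 - xy2)        # rebuild Z from the remainder
    return n * 0.5 + 0.5                  # re-encode to 0-1
```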
Knot sure if this was covered already; I did knotice you intend to add kname matching for split objects. But when importing an FBX made up of multiple meshes, it seems that only one single mesh is being imported. Is this a bug I'm getting, or does Knald not support FBX files with multiple objects?
Yeah, this was confirmed a while back by Andy: the current version of Knald doesn't support multiple-mesh import (from a single file). Hopefully they will add this feature in a future release.
For the most part, specular maps are just a 4% flat grey colour (#3b3b3b) for dielectric materials, plus the colour of the metal you want in the metallic areas. If you open the 3D Preview (P) we have a selection of materials that you can get a number of values from for metal (you can see the values for metal by clicking the colour pickers in the surface group).
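As a quick check that #3b3b3b really corresponds to roughly 4% linear reflectance once the standard sRGB encoding is undone (plain sRGB maths, nothing Knald-specific):

```python
# Standard sRGB-to-linear decode applied to one 8-bit channel value.
def srgb_to_linear(c8):
    c = c8 / 255.0
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

print(srgb_to_linear(0x3b))  # ~0.044, i.e. roughly the 4% dielectric value
```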
Hope that helps!
Thank you
I think so. I'm mainly doing organic stuff at the moment, like skin and leathers for dark-age armours etc. It's my understanding, rightly or wrongly, that it's the spec map that dictates how light reflects off the substance, therefore setting to some degree how much a substance shines and whether it is metallic or not.
Oh, BTW, would it be a nice feature to have a wee check box, say just below the load mesh button, to keep the mesh for this session? I doubt I'm alone in having my textures in one dir and my models in another; I have to move between them and reload my mesh each time if I'm testing to see if I have the look I'm after. I end up at times doing a lot of reloading to finalize the model's finished look. Maybe it's just a novice thing, not sure.
Hey, I got Knald 1.2.1. I just installed it but it doesn't start; it just stays on the "checking license server" loading screen, and when I click on it Knald just quits. I had the demo before and I didn't have this problem at all. I only have this problem when I try to install it on my C drive. But that's my SSD; will the baking take longer if I install it on my HDD? Also, why does it say that it expires in 1970?
Replies
Also i thought that the cavity map was using the normal map only. But anyway in this image i have created we can see why map generated from normal map must be avoided:. The Concavity and Convexity from Knald are pretty good.
Mirrored UVs should work ok, but if you confirm they don't, or anything else doesnt work as expected, please let me know and I will take a look
Thanks!
We also export a single channel curvature map too so there isnt much need to compile them in Blender unless you really want to
Just select 'Single Channel' in the 'Curvature Type' dropdown in the curvature group and export. Once the initial bake is completed you can swap between Single and Dual channel without the need to rebake.
Thanks so much for taking the time to put together mockups and writing up such a detailed overview of your thoughts. It's super helpful to get feedback like this. I can't currently give your post the detailed answer it deserves but I will post a longer reply when I'm back at my main PC later today or tomorrow.
Cheers!
Once again thanks for the in-depth post.
I want to start with our reasoning as to why some things are laid out as they are and then move over to some more specific areas of the functionality and gui within Knald.
We have spent a very long time thinking about and implementing the GUI within Knald & Lys & generally have tried to make it as modular as possible, which is why the controls are grouped as they are. We found that it was much more flexible this way as when grouped we can save, restore and reset user controls with a very fine level of granularity, so if a user wants to only save the position of a few slider within a specific group, they can do just that and not have to save other controls in other groups if they don't want to, though with our save system they can do just that if they so desire (user can save 99% of all controls on a group, tab or application wide basis).
Also having items logically grouped provides us with the ability to only show the user controls specific to the tab or current function in question which substantially reduces clutter and confusion within the interface, which in turn makes everything easier to use. A good example for this is The Integrator group being hidden when the user loads a height map or uses the baker. The Noise reduction, Smooth, Input Map Type are only applicable to The Integrator and not when loading a height map or using Color To Normal (though The Integrator can be updated from Color To Normal) which is why they are placed in that specific group.
Something that may not be obvious is that Knald has multiple modes that are generally fenced off from each other in order to save memory as VRAM is at a premium and needs to be conserved as much as possible. If we had all the modes active all the time users would be more limited as to what they can process. For example, if the user loads an 8k normal map The Integrator would require around 3.5GB of VRAM, which could then no longer be used if they decided to bake something. Forcing the user to manually load the baker is an affirmative step that they wish to leave any other mode they are currently using and to clear that memory from the system.
The Main Tab is so called because it is the main interface within the program & is where users can interact with every part of Knald. When any texture is loaded for processing, either via The Integrator, Color To Normal, The Baker or just loading a height map, the controls are shown here. Converter doesn't really do this justice as the tab also shows the controls for maps that are not converted.
Some general notes:
Groups are collapsible only for user convenience so if they are short on vertical screen space they can free up controls they don't need at that specific point in time.
The Export controls are within the General Tab because they are a global control for export in all modes with the export file path being mirrored in the Export Tab too.
All the controls within Knald are pretty well documented via tooltips and in the online documentation (you can view them within Knald by hovering over a GUI element for a few seconds), but if there is anything that you feels needs improving specifically please let me know and I will work on it for sure.
Regarding The Baker, we grouped items for the same reasons as I've mentioned previously, but doing so is actually much more important within The Baker as the potential for using much more vertical space is much greater than in other modes. If a user decides to bake an asset with 30 HP meshes then not only would they be increasing the vertical width of the Bake Meshes group but also Material ID too, which would probably force the user to scroll the height of 2 or 3 screens. It's important to us that users are free to conserve as much space as possible and with the system we have now they can load the meshes and then hide them at will. Another important consideration for us is that the user is able to see as much information as possible at once without the need to scroll, which again is where collapsible groups comes in.
The Bake Targets group also is localised in order to give the user a convenient way to modify which settings they view within the Baker by pressing the Configure button. Not having this control would also force the user to have a huge scrolling liability which gets larger with each target they add.
Thank for the comments regarding MatID. I'm not 100% sure what you mean by "pull the baked mat ID information directly from the material IDs of the highpoly parts". Knald currently assigns a single value per loaded HP mesh and you can pick between RGBCYMW Hue Shift, Max Euclidean RYB & Random, in addition to users having the ability to load their own palette by assigning colours horizontally to a 1px high image.
While RGBCYMW Hue Shift & Max Euclidian RYB are both preset palettes, they are logical in their values and are designed to give maximum separation between subsequent values.
Having said all this however, there are always areas in which we can improve and your post has given us some great feedback and ideas in which we can improve Knald & Lys. Thanks again for taking the time to write it up.
_ Infinite demo license but watermark Knald Demo on the textures
_ Add a Cage Inflate option if we want to generate a cage
_ Maybe painting tools and color gradient to texture quickly parts of the model: https://www.youtube.com/watch?v=wUeQsUJTz8I
Thanks for the suggestions!
Do you mean when a cage is imported or the generated cage within Knald?
I will add the gradient map suggestion to the user wishlist.
I do understand most of the reasoning behind the design choices, but I still believe that in quite a few cases the end result remains problematic. Now of course everything can be argued for or against, but overall my one main suggestion could be summed up as follows :
"The interface layout should reflect a linear script of user actions."
What I mean by that is that even though the tool can do multiple things, there is still an overall generic path that a user will almost always follow - and the UI should be organised in way that reflects this order. In the case of a plain page layout the UI elements would have to be organized from top left to bottom right (ie the diagonal that the eye naturally follows) but here in the case of Knald the ideal layout organisation would have to be from the first tab to the last, and from top to bottom within each tab.
Now overall the "linear user script" for a baking tool is :
1 - Loading of the highpoly model(s)
2 - Loading of the lowpoly model(s)
3 - Adjusting global settings (files paths, textures to bake, and so on)
4 - Adjusting map-specific settings (ID setup, various overrides if/when needed, and so on)
5 - Pressing the bake button
6 - Reviewing the result
7 - Eventually, processing the output for cavity detection, and so on.
One good rule of thumb is to try and make sure that the user never has to "backtrack" at any point during the script. That is to say, there should be no need to go left, or up, or to a previous tab.
Even though Xnormal is a bit antiquated, one can tell that Jogshy strictly follows this "linear script" principle and thus did his best to logically lead the user through all the steps / tabs. Of course not everything is perfect in Xnormal (the main mesh loading action being hidden behind a non-discoverable right click menu is a UX no-no) but besides that it is still really good. I hope this makes sense !
Besides that I second the suggestion of a watermarked demo.
I understand what you mean in regards to linear flow and I agree that there are things which could be improved within the interface. It's a constantly evolving entity of course.
We generally tried to keep global settings first and foremost (things like map size, AA, bake button, ray projection etc.) because they are the actions people are most likely to change the most outsize of bake target specific controls, which is why they are grouped right at the top.
Once the user loads the meshes they are unlikely to change them again so we felt that having them right at the top would cause distraction to the user and if the user has to look away from the screen then they would have to scan the interface to find the stuff they need to change more often. This is also the reason why the bake target controls are at the bottom of the interface too as it is much faster to look right at the bottom of the screen to find them than scanning down through the GUI. Also, once the user had chosen the bake meshes and targets they can collapse/hide those groups and only bake specific targets will be visible.
I'm not sure if it is obvious, but there is no real need to update the integrator once a bake has been completed. If you bake cavity or curvature etc from a mesh then you dont need to update the integrator in order to reprocess it as the results from the baker will be taken directly from the mesh and there are many more map types available within the baker. We added the button as an easy way to allow users to send data to the integrator if they wanted to, but it isn't mandatory at all. If this isn't obvious we probably need to think about another way to allow users to do this.
The demo is actually already watermarked btw. I forgot to mention this when @Linko posted above.
But you probably mean a gradient map from bot to top it's a good idea too, something like that: http://www.blendswap.com/blends/view/77269 With a slider to adjust the black to dark grey and white to light grey.
Also i suggest a tab called "Grayscale" that are all the map composited together with slider to control the opacity of each map and the color ramps applied on top of it with an eye to toggle it On and Off.
And for the cage i suggest an option like Blender Cycles baking when we check the "cage" button we can pick a mesh or use the extrusion setting to duplicate an inflate the low poly to use it as a cage.
I agree with what Pior says, there is a "linear" way to bake texture and those options should be directly accessible without going tab to tab.
Yea, I know what you meant. Gradient maps have been around for ages and were used in Left 4 Dead 2 for example.
The other map you mentioned is called a Position or Volumetric gradient map when all 3 channels (RGB) are used. They are on our TODO list
Thanks for the suggestion!
You can already do this in Knald. You just load your mesh into the Cage Mesh slot in the Bake Meshes group. When a cage mesh isnt loaded we generate one internally based on the low poly geometry. You don't even need to use similar topology for cage within Knald
You can already access all the baker functions bar export in the Baker tab, so you should really need to be swapping tabs too much. Basically you should only need to set the export location/name and everything else can be done within a single tab.
Recently, I've started making more use of skew meshes to get rid of skewed details in my bakes. The method for using these meshes is to bake out an Object Space normal map, import that map + the low poly model into something like xNormal, HandPlane 3D, or Substance Designer, and then render out a tangent space normal map for whatever my target engine is (usually Substance Painter 2, Unreal 4, or Marmoset Toolbag 3). It's a terrific way to get amazing tangent space normal maps without skewed details.
But I would love to keep my workflow as streamlined as possible. The fewer the apps I have to use, the faster I can work.
So my question is this: Does Knald currently support a way to either use the Object Space Normal Map + low poly to generate a tangent space normal map? Or is there a setting in Knald that I can toggle to prevent skewing? If neither feature exists, would there possibly be a way to include that kind of functionality in Knald in a future update?
It's always nice to hear that people love using Knald!
We don't currently have any sort of normal map conversion or anti-skew features in Knald currently. We very much appreciate the suggestions and will look into them for sure
Thanks!
I'm glad you are happy with the service! We can always be reached at support@knaldtech.com too!
Sometimes when i bake the curvature map is empty, completely black, why? The other maps bake properly.
Also i suggest:
_ a specular map that is the result of multiple maps
_ an UV Snapshot map to help for the texturing work
_ to not bake and add a warning if the low poly has no UVs.
_ an option to generate a low poly with a decimation to prototype the game asset, see what it looks like in game and create static meshes props with auto unwrapping.
I have created a node setup that Knald users can use to generate a game texture just by connecting their texture. It gives the control for the opacity, amount of dust, blood and the AO contrast.
Download: http://linko.projects.free.fr/gametexture.zip
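The AO-softening part of that setup boils down to a linear mix toward white. A minimal numpy equivalent of that node (function and parameter names are mine):

```python
import numpy as np

def soften_ao(ao, amount=0.5):
    """Reduce AO contrast by blending toward white, like a MixRGB
    node set to Mix with a white second input.

    ao:     array of AO values in [0, 1]
    amount: 0 keeps the raw AO, 1 gives pure white
    """
    return np.asarray(ao) * (1.0 - amount) + amount
```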
I have made a comparison of the curvature I get from Knald on the left and Substance Painter 2.5 on the right. I used Blender's Decimate modifier and Smart UV Project, so Substance struggles to generate the curvature from the normal map, while Knald's mesh-based curvature gives a good result with any UV unwrap and normal map:
That's a good thing, because I can prototype my game assets with a Blender decimation and Smart UVs, but also generate my final static meshes in a few clicks without worrying about UV quality for the curvature.
Another test: generating maps just by importing this flat color image: https://upload.wikimedia.org/wikipedia/commons/thumb/0/0c/Blender_logo_no_text.svg/2000px-Blender_logo_no_text.svg.png
I am impressed by the quality of the normal map. I then imported the generated normal map to produce all my other maps and tweaked the logo:
- after inputting my license key, why does Knald still say my copy is an Evaluation Copy that expires in the year 1970? (0_0)
- is the transmission map similar to the thickness map in Substance Painter? Knald's bakes tend to be lighter than SP's.
- why does a cage mesh exported from 3ds Max go crazy, both for the cage and the range, in Knald?
- is there a way to add additional tangent normal map data and have Knald take it into account when baking the other maps? For example, when I generate a diamond pattern for my gun's hand grip in SP, could I tell Knald to factor that in when baking curvature/AO etc.?
And +1 for the feature request to bake meshes by name-matching pattern, à la Substance Painter.
Another suggestion is the ability to manually edit a cage mesh; for example, there's no way I can get a good cage for this particular situation. But this could easily be worked around if we were able to load an FBX file containing multiple meshes and have the LP/HP/cage combo assigned automatically based on matching mesh names plus a user-defined suffix (_low, _high, _cage, for example), like I mentioned above. Then we could build a perfect low poly, high poly, and cage mesh in a 3D package and let Knald bake it all out.
The issue you are having with the curvature is most likely down to the mesh not having smooth normals.
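If the mesh comes from Blender, smooth normals are quick to set before export; a minimal sketch using the stock bpy operator:

```python
import bpy

# Shade the active mesh smooth so the exported file carries smoothed
# vertex normals (flat, faceted normals can produce an empty
# curvature bake).
obj = bpy.context.active_object
if obj and obj.type == 'MESH':
    bpy.ops.object.shade_smooth()
```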
Thanks again for all the suggestions. I will add them to the user wishlist.
Very nice work with all the Blender stuff, btw. Nice to see you using Knald so much!
Hey!
Our solution actually takes the relative light transmission (energy) passing through the volume of the entire mesh into account, treating it as a coherent volume, when generating the texture. In the image below (art courtesy of Warren Boonzaaier) you can see that the hood is light on the inside and a darker grey on the outside. This is because light reaching the outside of the hood has to travel through both the head and the hood, resulting in a lower transmission there. Conversely, the hood is comparatively thin in isolation, so light passing from the outside through to the inside is relatively unobstructed, resulting in a higher transmission value. You can't do this with the inverted-normals AO trick that many people use.
It's super fast to bake, extremely versatile and you can get lots of different results by playing with the settings for a little while. You can read more about it here: https://docs.knaldtech.com/doku.php?id=transmission_maps_knald_1.2
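The intuition matches the standard Beer-Lambert relation, where transmitted energy falls off exponentially with the distance travelled through an absorbing volume. Knald's actual formula isn't documented here, so treat this as illustration only:

```python
import numpy as np

def transmitted_energy(path_length, density=1.0):
    """Beer-Lambert falloff: the fraction of light surviving a path
    through an absorbing volume. A long path (head + hood) yields a
    darker value than a short one (the hood wall alone)."""
    return np.exp(-density * np.asarray(path_length))
```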
Thanks for the other suggestions. Also added to the user wishlist!
And the thickness map, yeah, it kind of makes sense if you put it that way, thanks!
I tried to experiment more with the Knald workflow, but now I'm getting some strange results.
What I'm trying to achieve here is the 4th question I asked above: bake the normal map in Knald > add normal details in SP > re-bake the rest of the maps in Knald.
After I add the details in SP and export, everything looks good, but when I import that same map into Knald's integrator things get a little weird: the padding bleeds into the UV area. Did I miss a setting that tells Knald not to process the imported normal map and just use it as-is to bake the rest of the maps?
Left is the map exported from SP; right is what it looks like once imported into Knald's integrator. This happens in lots of other areas too, not just this one. Also notice the slight bright red/pink color in the top-left area, which appears less saturated in Knald. I haven't really gone through all the documentation yet, though; maybe I just don't understand what the integrator is for.
Also, do your Knald bakes look correct in SP's viewport? I'm seeing an issue there as well, see image. There's a very subtle difference in the baked maps, but I thought both used mikktspace. Or did I miss a setting somewhere?
Thanks
When loading a non-tiling texture you will sometimes get these issues, as this is a side-effect of Knald applying the integrated height map to your mesh. Sadly, it is not currently possible for Knald to generate perfectly correct values around the UV islands (which are typical if the normal map was baked) after processing a normal map through the integrator, so minor visual aberrations are possible when applying the exported height maps to meshes with UVs.
One thing you can do is to ensure that you are importing the normal map with an alpha channel, which tells Knald to pin the black areas in the alpha to 100% flat. This often fixes issues like this, but of course isn't always perfect.
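If your export doesn't already include such an alpha, it's easy to add one; a minimal Pillow/numpy sketch (filenames are placeholders, and it assumes the unbaked padding is pure black):

```python
import numpy as np
from PIL import Image

# Mark background (pure black) texels with alpha 0 and everything
# else with alpha 255, so the black areas of the alpha can be
# pinned to 100% flat by the integrator.
rgb = np.array(Image.open("normal.png").convert("RGB"))
alpha = np.where(rgb.any(axis=-1), 255, 0).astype(np.uint8)
Image.fromarray(np.dstack([rgb, alpha])).save("normal_rgba.png")
```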
Please note that the primary purpose of the integrator is reversing a normal map into a height map and other maps for when you don't have the high poly geometry to work with. If you have HP geo, then the majority of the time you are best off using the baker only, unless it's for a specific purpose, such as needing an unfaceted height map for a tiling texture etc.
Knald uses Mikktspace and we should be 100% synced between the baker and the viewport, as well as UE4 etc. Is your mesh triangulated before baking? How does the baked map look in Knald's viewport?
If you want to upload the meshes I would be happy to try and debug this from here.
Yes, Knald's bake looks correct in Knald's viewport (thanks for offering to debug), so the problem is Substance Painter's viewport; I also checked in Toolbag, and Knald's bake is the one that looks correct there as well. Good job!
As for the other issue, I agree it's best to use the baker when you have the HP, although I do use the maps generated from the integrator as well; I just have to clean them up because of the distortions. Or could there be a way to load the normal map in the 3D viewport without any processing, just for viewing purposes?
One more thing: is there a way to do, or do you have any plans for, a "point light" bake? I usually grab the green channel from the object-space normal and it works just fine, but it would be nice to have one with soft shadows like the AO.
Thanks and keep up the good work!
There isn't a way to load just the normals currently, but I can add it to the user wish list. The only thing I can suggest is the alpha channel trick, which can limit the severity of such distortions.
I will add it to the user wish list. Thanks for the suggestion.
The green channel from the object space bent normal might be useful to you. Generally it's somewhat softer than the green channel from regular OS normal maps.
Here is an example of the various OS Bent Normal green channel flavours.
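If you want to try that outside Knald, extracting the channel is a one-liner; a minimal Pillow sketch (the filenames are placeholders):

```python
from PIL import Image

# Pull the green channel from an object-space (bent) normal map
# to use as a simple soft top-light mask.
green = Image.open("os_bent_normal.png").convert("RGB").split()[1]
green.save("top_light.png")
```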
Awesome! Glad to hear it
Yes, name matching support is on the list of things to do for a future release.
The last update was released just over 3 months ago, and we are always working on new features and improvements. Generally we prefer to ship a release when it's ready, rather than promising X releases within a time period, as that guarantees a more stable product and keeps quality from slipping.
Hope that helps!
We do indeed support vertex colors from ZBrush (Polypaint) and regular vertex colors from FBX & PLY formats. To bake and view the vertex colors you must do the following:
Once the bake is finished you should be able to see the vertex colors in the Vertex Color Tab.
If you can't, there may be an issue with the .obj you are using. If you want, upload the files (high, low & cage) and send them to support@knaldtech.com; we would be happy to take a look and see if we can find the problem.
Hope that helps!
Are you guys aware of this issue?
Must be something else. I have the Win10 update installed and Knald starts without issues.
Love the models in your portfolio btw.
The OpenCL.dll error is usually down to the NVIDIA driver either becoming corrupted or not being installed correctly. This can often happen when large changes are made to Windows such as upgrading Windows to a new version or installing a service pack for example, so we would recommend that you download and install the latest GPU driver for your card to fix the error.
Whenever you install a new driver we always recommend that you perform a clean installation and delete any cached shaders if they are present on your system. Please ensure that it's a clean install by following the steps in the link below:
https://docs.knaldtech.com/doku.php?id=clean_driver_installation_1.2
Please let me know how you get on.
Btw, did you guys manage to find out the cause of the exploded cage mesh that I sent a few weeks ago?
We try our best not to go down the road of criticizing other people's software, but here are some of the things we are generally proud of within our baker:
Of course there are plenty of other things we feel we do very well too, and this list contains just a few highlights.
Hope that helps!
But when importing an FBX made up of multiple meshes, it seems that only a single mesh is imported. Is this a bug on my end, or does Knald not support FBX files with multiple objects?
Hopefully they will add this feature in a future release.
Which brings up another thing: I want to set up and bake multiple materials (texture sets) at the same time as well.
Yes, we will support multiple low poly meshes in a future build for sure. It's a very high priority for us currently.
We are always working on new stuff!
I don't have a date for a new release yet though. Stay tuned
Real novice question, sorry. I see there's no Specular map output in Knald, is there a best option for creating one from the outputs that are there?
For the most part, specular maps are just a flat 4% grey colour (#3b3b3b) for dielectric materials, plus the colour of the metal you want in the metallic areas.
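So a baseline dielectric specular map can be generated trivially; a minimal Pillow sketch (the resolution is just an example):

```python
from PIL import Image

# Flat dielectric specular map: every pixel #3b3b3b, which decodes
# to roughly 4% linear reflectance once the sRGB value is linearized.
Image.new("RGB", (2048, 2048), "#3b3b3b").save("specular.png")
```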
If you open the 3D Preview (P), we have a selection of materials that you can pull metal values from (you can see the values by clicking the colour pickers in the Surface group).
Hope that helps!
Thank you
I think so. I'm mainly doing organic stuff at the moment, like skin and leather for dark-age armours etc. It's my understanding, rightly or wrongly, that it's the spec map that dictates how light reflects off the surface, and therefore sets to some degree how much a surface shines and whether it reads as metallic or not.
Oh BTW
Would it be a nice feature to have a wee checkbox, say just below the Load Mesh button, to keep the mesh for the session? I doubt I'm alone in having my textures in one directory and my models in another; I have to move between them and reload my mesh each time I test to see if I have the look I'm after. I end up doing a lot of reloading to finalize the model's finished look. Maybe it's just a novice thing, not sure.
Also why does it say that it expires in 1970?
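For context, 1970-01-01 is the Unix epoch, i.e. timestamp zero, so an expiry year of 1970 usually just means the expiry field was read as 0 rather than as a real date:

```python
from datetime import datetime, timezone

# A zero Unix timestamp decodes to the epoch, which is where
# spurious "expires in 1970" dates come from.
print(datetime.fromtimestamp(0, tz=timezone.utc))  # 1970-01-01 00:00:00+00:00
```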