Trying the demo at the moment. Awesome so far, except for the UDIM vertical limitation. Could you please let me know if and when you're planning on removing this limitation? We are interested in buying MightyBake at Framestore, but with the huge environment I am currently tackling, it's pretty tedious to move everything back into the 10x10 UDIM space for groups of objects in every single scene, split it into multiple low polys, and then import them back into Mari (as the UDIM numbers change).
Also, I tried it with a very big asset (less than 10 UDIMs). The OBJ was a whopping 6.6 GB, while the FBX was around 660 MB, and it crashes while loading the high poly. Unfortunately I can't provide the asset so you could have a look. It doesn't seem to be RAM related, as I took a peek at the Windows system monitor and everything looked fine until it crashed. So if you have any idea how to handle these kinds of huge assets, or any idea on how you could possibly make everything load... just wondering, really.
As a note, it would be nice to be able to provide only a high poly and have MightyBake know to use that same mesh to calculate the maps.
Also vouching for a sky dome effect, as the results from TopoGun's AO with it are pretty awesome. *wink*
If you want to bake the high without a low poly, just untick the high poly tickbox, plug your high mesh into the low slot, and bake. For AO, for example, it will calculate it the way a light mapper would in something like Mental Ray.
The MightyBake license server is borked today. Everyone lost their licenses, and when I tried to log in to the account, the account page is missing and just prints this text: [woocommerce_my_account]
I'm terribly sorry about the license server. Our web service was updated, which caused some configuration issues. The server is back up. Please let me know if you have any other issues.
Alembic support would be great; it's now our main interchange between apps in the VFX pipeline. And I second the vertical UDIM issue raised here; it's essential for our massive assets, which may have hundreds of UDIMs.
For those interested, the vertical UDIM limit has been removed in my local build. The 'automatic' setting will do the whole space. It assumes a maximum of 10 tiles wide, though.
I will be providing options to set the range of the UDIM bake, i.e. min & max in both U and V in the 1.5.0 release.
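For anyone following along, the 10-wide assumption comes from the standard Mari-style UDIM numbering, where tiles count up ten across in U before stepping up a row in V. A minimal sketch of the convention in plain Python:

```python
# Standard Mari-style UDIM numbering: tiles count up in U first,
# ten across, then step up one row in V (the 'vertical' direction).
def udim_for_uv(u, v):
    """Return the UDIM tile number containing UV point (u, v)."""
    tile_u = int(u)  # 0-based column, valid range 0..9
    tile_v = int(v)  # 0-based row, unbounded upward
    if not (0 <= tile_u <= 9) or tile_v < 0:
        raise ValueError("UV outside the UDIM-addressable space")
    return 1001 + tile_u + 10 * tile_v

print(udim_for_uv(0.5, 0.5))   # 1001 (first tile)
print(udim_for_uv(3.2, 0.1))   # 1004
print(udim_for_uv(0.5, 1.5))   # 1011 (second row)
```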
Regarding the OBJ: which package were you exporting from? How many objects does it have in it? The OBJ parser in MB is a very optimized but simple parser. There may be a flavour of OBJ we haven't seen.
@unhuman I've downloaded Alembic and I will work to integrate it soon, but I can't commit to a version release until I've worked on it some more.
As for the OBJ, IIRC it comes from Mudbox, as the modelers here are using it to sculpt all creatures, although I loaded it fine in Maya to export an FBX. Hope that helps. Cheers!
MightyBake, the account page is now working again, but none of our licenses work. It says the license count has been exceeded, and even if I delete a person's machine from their license to release it, we can never reassign it. We're completely locked out of using any of the licenses right now.
Nevermind -- back up and running.
I've just released the 1.5.0 beta at http://www.mightybake.com/beta. It has fixes for the UDIM limits. It also has rounded edge projection to create low-distortion bakes and rounded normal maps. I'll post a tutorial soon.
Here's a quick preview of each of the different directions:
Surface Normals
Geometry Normals
Rounded Normals (New)
I'm trying out the demo at the moment, and I keep getting a 'bad allocation' error every time I try to bake out something that isn't a cube. The model itself isn't super complex, but there's a large number of UDIMs. Unfortunately, due to NDAs, I can't post examples.
I'm running Windows 7 with 32 GB of RAM and a Quadro K2000 graphics card.
Cheers, Mark
Problem solved. Switching to FBX worked! It wasn't happy with OBJ.
Now for my next question: can you specify the width of the curvature map in a similar way to the occlusion? Or is that the sampling? I'm on the demo version, so at the moment I can only select 1x1.
Based on several users' comments I've taken a deeper look at the OBJ parsing. There's a serious bug in it that is hit if you use multiple objects. This is unlikely to ever happen using ZBrush but happens often using Mudbox and Blender. I will work on a fix immediately and publish it ASAP. Sorry to all the OBJ users.
Just posted 1.5.0 beta 3 (Win 64) here. This should be the final beta for 1.5.0. I will be posting the Linux and Mac betas shortly. I've added two new map types, 'Hard Edges' and 'Alpha Mask'. I'll post some examples soon. You can use 'Hard Edges' to do edge wear. It's based on the low poly mesh topology, so if you have interpenetrating meshes, they won't create edges.
Version 1.5.0b3 - Feb 21 2016
(Feature) Added 'Alpha Mask' bake type - anti-aliased mask for showing the transferred areas
(Feature) Added 'Hard Edge' bake type - low poly edge highlighting for hard edges
(Fix) Added layers option to command line
(Fix) Fixed command line transfer baking - broken in previous 1.5.0 beta
I've been messing around with the rounded corners baking feature today. It looks very promising, but I think I've found some bugs. I haven't had a chance to fully investigate the rounded corners on high-to-low to solve distortion, but that looks incredible. I need to try it on a harder example.
Bugs:
1. If you look very closely at the rounded corners bake on low-to-low, there is a seam; it doesn't go away with dilation or increasing the resolution.
2. The rounding effect on low-to-low looks a bit strange; it doesn't actually look rounded when you apply it to flat cubic shapes.
3. The roundness on low-to-low turns black as it gets farther away from the camera angle.
4. The roundness on low-to-low does not handle ngons very well; for the tops of cylinders you must add quads to keep the rounded edge from pinching.
Some images of my tests. This is super cool; keep up the good work. I can't wait for this feature to be issue-free. It's super fun just loading up a single low poly model, hitting bake, and watching the bevels appear. There have been numerous times at work when I've needed this.
Does MightyBake support name matching if the high poly FBX is exported from Mudbox and the low poly/envelope is exported using the shelf in Maya? I am getting issues where the name matching only works partially for some objects.
Just downloaded the latest beta -- really loving the blending of geo and surface normals. Saves me a ton of time.
I have been getting a number of crashes, though; I even rolled back to 1.4.9 but am still getting them. They are somewhat sporadic, and I've had a hard time debugging the potential cause. This time it's pretty consistent, so I think I have a good test case.
Here are my steps (using 1.4.9):
- create low and high (smooth mesh)
- export each using the appropriate button on the MightyBake shelf
- load MightyBake and plug each FBX into the appropriate field
- bake Maya NM, Unity 5.3+ NM, and AO
- Width 2048, Height 1024, Dilation 4, Format PNG 16-bit, Search Method Furthest, Search Distance 2, Direction Geometry Normals
The bake commences; sometimes it gets as far as showing where the UVs are located in the bake, but other times it crashes as soon as the bake screen appears.
Some troubleshooting steps I've tried:
- using an envelope and setting Search Distance to 0
- baking each map out individually (the AO completed, but the two normal maps didn't)
- importing the models into a fresh scene and re-exporting with the MB shelf buttons
- doing a cube combine on all the models to ensure there is no corruption in them
- for the high model, converting the smooth mesh preview to polygons in Maya and re-exporting with the MB shelf (a minimal manual-export sketch follows this list)
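As a minimal sketch of that manual re-export cross-check, run from Maya's script editor (paths and group names are hypothetical, and this uses the stock FBX plug-in rather than whatever the MB shelf scripts do):

```python
# Manual FBX export from Maya, bypassing the MightyBake shelf buttons,
# to rule the shelf scripts out as a crash source.
import maya.cmds as cmds

cmds.loadPlugin('fbxmaya', quiet=True)   # make sure the FBX plug-in is loaded

cmds.select('low_GRP')                   # hypothetical group holding the low poly
cmds.file('C:/bakes/low.fbx', force=True, exportSelected=True,
          type='FBX export', options='v=0')
```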
I can provide you with the FBX files and the models. Or perhaps you can see what I may be doing wrong? I've experienced this a couple of times in the past, but not consistently, and could work through it; those cases usually involved multiple models of each type. This time it's just a single model of each, so I'm at a bit of a loss as to how to get this to work. Thanks so much! Hope you can help me out.
I don't understand how to get material IDs to work without loading a texture on the high res. I'm just getting a pure white map when trying to bake it.
edit: ahh, now I see; you just assign material IDs in the FBX. What would be great, though, is an option to bake color maps from a high res to the low. Also, TGA output would be nice.
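For anyone hitting the same wall: the material IDs usually come from the shader assignments the FBX carries, so one way to set them up in Maya is to give each part of the high poly its own shader before export. A hedged sketch (object and shader names are placeholders, and MightyBake's exact FBX expectations may differ):

```python
# Give each part of the high poly its own shader before the FBX export,
# so the baker can read the assignments as material IDs.
import maya.cmds as cmds

def assign_new_material(objects, name):
    shader = cmds.shadingNode('lambert', asShader=True, name=name)
    sg = cmds.sets(renderable=True, noSurfaceShader=True,
                   empty=True, name=name + 'SG')
    cmds.connectAttr(shader + '.outColor', sg + '.surfaceShader')
    cmds.sets(objects, edit=True, forceElement=sg)
    return shader

assign_new_material(['trim_GEO'],  'matID_trim')   # placeholder object names
assign_new_material(['panel_GEO'], 'matID_panel')
```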
Well, the rounded normals feature is nice so far. But it would be great if I could get rid of manually scaling my models to fit the rounding value range. Feature request: global model scaling.
@mightybake I believe he's asking for a value to scale the size of a model uniformly similar to how xNormal handles it - arbitrary values you can assign to increase or decrease the size of the model to potentially alleviate some issues with baking models that are too large or too small.
For example, I have a model that required values of 0.005 for rounded corners - scaling it up to 100 would have made the value for rounding closer to 1 instead of a small fraction.
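Until a global scale option exists, a hedged workaround is to scale the asset up in Maya before export and multiply the rounding value by the same factor, since the radius is in scene units. A sketch with a hypothetical group name:

```python
# Scale the model up before export so the rounding radius lands in a
# friendlier range; the radius scales linearly with the mesh.
import maya.cmds as cmds

SCALE = 100.0  # e.g. turns a 0.005 rounding radius into 0.5

cmds.scale(SCALE, SCALE, SCALE, 'asset_GRP', relative=True)  # placeholder group
cmds.makeIdentity('asset_GRP', apply=True, scale=True)       # freeze the scale
# ...export, then bake with the rounding value multiplied by SCALE...
```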
Personally, I look forward to the rounded corners being 100% done - what you have so far is great. With a little more polish on that feature, I can see a lot of time being devoted away from subdivision modeling for some of my work.
@Synaesthesia Cool. It's not a perfect solution for all cases, but it allows you to iterate really quickly. The latest development version has very clean edges, but the pinched corners still need some work.
I still have plans to eventually integrate subdivision surfaces, but that's not going to be soon.
We've received a few support requests from people running Windows 7. There may have been a system update that adjusted some settings. We've posted a quick tutorial at http://www.mightybake.com/license-issues/ if you are having trouble activating or using your license on Windows.
Hi Rob, just bought a copy, and so far I'm absolutely loving the software. I'm working in the prerendered world, so I'm looking to use MB as a replacement for painfully slow and often glitchy ZBrush displacement and normal map exports. I'll probably have a few questions as I go, but so far so good.
One question I had is about the Maya shelf. There's scant documentation, so I've been relying on Malcolm's tutorials. Anyway, can you tell me what is happening when I hit the Low shelf button? I realise there are some scripts running to export my object, but I'd love to know what they're doing and why. Also, it'd be great to get some form of progress bar or feedback about that process; right now Maya just sits there if I'm exporting a slightly heavier mesh.
After a few days of experimenting I'm generally getting great results. It's allowing me to iterate and try things rather than waiting forever for ZBrush MME to export. However, I'm getting an issue with my displacement maps: I see faceted artifacts within them.
At first I thought it might be due to some of the special sauce applied by the Maya script when exporting the low version, but this doesn't seem to be the case, as I get the same result if I manually export my low mesh. On a hunch I also tried scaling down the low surface a little so there'd be no overlap, but I still get the same result. I have also tried subdividing the low mesh, but then I get lots of facets rather than a few. No combination of soft edges, normals, or whatnot seems to make any difference.
I'm guessing it's due to the varying offset between edges and faces (see attachment).
Searching around, this seems to be a common displacement map problem, but I can't find a good answer from anyone. The common suggestion is to up the number of subdivisions in the low mesh; however, that still produces facets, just lots of them. Plus, if I'm to use the special Maya shelf buttons, exporting a highly subdivided mesh takes forever. Does anyone here have any ideas?
@bencowellthomas Hey Ben, I wrote this all out, but it looks like the forum deleted my post for some reason. Anyway, I messed around with this for a bit, and I think you might need to override the normals for your offline renderer. I'm not an expert in offline rendering, but when I use the Maya realtime viewport shader it renders fine; that shader literally only renders the displacement, no normal information, so you have to plug in a normal map. I thought I'd see if that idea works in an offline renderer as well. See the example images below. I'm not sure if that would be the ideal setup for film-industry high-to-low bakes; maybe everyone has a special shader to compensate for those square patterns in the displacement map, or maybe MightyBake has a better solution. It would be interesting to bake your test case in Maya's Transfer Maps and see whether what Autodesk outputs as displacement has the same square pattern. I would imagine it does, but I haven't tried it.
Ben, just to check, are you rendering UDIM displacement maps?
The displacement maps are computed based on the underlying mesh. Your explanation of the problem is accurate. If you find or have an idea on how to fix it, please let me know.
The only way I can think of is to compute a very subdivided version of the low poly in the baker itself. The resulting displacement map wouldn't directly apply to the base mesh anymore.
To get rid of the seams, I wonder if you could use an un-normalized displacement map as an EXR and try that instead.
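To make the faceting mechanism concrete, here's a toy 2D cross-section in plain Python/NumPy (an illustration only, not MightyBake's actual algorithm): the 'high' surface is a smooth arc and the base is a polyline of chords, so the measured gap scallops to zero at every base vertex, and refining the base shrinks the scallops but multiplies them, exactly as described above.

```python
# Toy 2D cross-section of the faceting problem: the "high" surface is a
# smooth unit arc, the "low" base is a polyline of chords. The measured
# gap hits zero at every chord endpoint, producing one scallop (facet)
# per segment; more segments means smaller but more numerous scallops.
import numpy as np

def displacement_along_base(n_segments, n_samples=1000):
    # Base polyline: chords of the unit quarter-arc between 0 and pi/2.
    corner_angles = np.linspace(0.0, np.pi / 2, n_segments + 1)
    t = np.linspace(0.0, 1.0, n_samples)
    disp = []
    for a0, a1 in zip(corner_angles[:-1], corner_angles[1:]):
        p0 = np.array([np.cos(a0), np.sin(a0)])
        p1 = np.array([np.cos(a1), np.sin(a1)])
        pts = p0[None, :] * (1 - t)[:, None] + p1[None, :] * t[:, None]
        # Measure the gap to the arc radially for simplicity: for the
        # unit circle it is just 1 - |p|.
        disp.append(1.0 - np.linalg.norm(pts, axis=1))
    return np.concatenate(disp)

for segs in (2, 8):
    d = displacement_along_base(segs)
    print('%d segments: max bulge %.4f, kinks at %d interior vertices'
          % (segs, d.max(), segs - 1))
```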
I was able to get a fairly smooth displacement map out using:
- Non-normalized displacement map
- EXR output
- A mentalrayDisplacementApproximation set to 'fine view high quality'
- Scaled to 1.0
(Images: the Mental Ray render and the underlying geometry.)
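For reference, a hedged sketch of wiring such a 32-bit EXR into a Maya shading group (file path and node names are placeholders; the mental ray approximation from the list above is set separately in the Approximation Editor):

```python
# Wire a 32-bit EXR displacement map into an existing Maya shading group.
import maya.cmds as cmds

file_node = cmds.shadingNode('file', asTexture=True, name='dispMap')
cmds.setAttr(file_node + '.fileTextureName',
             'C:/bakes/asset_disp.1001.exr', type='string')

disp = cmds.shadingNode('displacementShader', asShader=True, name='dispShader')
cmds.connectAttr(file_node + '.outAlpha', disp + '.displacement')

# Assumes the mesh already has a shading group called 'assetSG'.
cmds.connectAttr(disp + '.displacement', 'assetSG.displacementShader')
```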
I tried non-normalized on my test scene, but how do you put in a search distance for positive and negative? When the map is not normalized, it only appears to capture height data starting at the surface, so any indents are not captured in the bake. The eyes on my guy's face are just missing now. EXR did not help with a normalized bake.
Hmmm. Good point. My test was all positive. I'll go back and see if I can figure it out.
I think to make this work properly in Maya, it will need to be a vector displacement. I added the shader for that ages ago, but I can't seem to find it in the UI. I will add that.
Thanks for the input. I'll try the Maya Transfer Maps and the other ideas this morning and get back to you. My suspicion is that to get an accurate bake from Maya you'd turn on render-time subdivision on your low mesh, which would fix it... but I'll try.
I am using UDIMs, yes; this was what initially brought me to MightyBake. I can't find another tool out there that supports them.
Normalising doesn't work for negative values, surely, as it puts the map into a 0-1 range? It's also not an option for accurate displacement, as all your values become relative and a scaling factor needs to be added. Ideally I'd stick to non-normalised 32-bit EXR displacements, as they remove the inaccuracies of midpoints and scale factors.
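A quick numeric illustration of the midpoint/scale point (assuming a convention where 0.5 encodes zero displacement and the map spans plus or minus the search distance; bakers differ):

```python
# Why a normalized displacement map needs a midpoint and a scale factor
# to reconstruct (assumes 0.5 encodes zero and the map spans +/- the
# search distance; conventions vary per baker).
import numpy as np

true_disp = -0.32                 # actual offset in scene units
search = 2.0                      # search distance used for the bake

# Normalized 16-bit map: remap [-search, +search] into [0, 1], quantize.
stored = round((true_disp / (2 * search) + 0.5) * 65535) / 65535.0
recovered = (stored - 0.5) * 2 * search
print('16-bit normalized error: %.6f' % abs(recovered - true_disp))

# Non-normalized 32-bit float EXR: the stored value IS the displacement.
stored_exr = np.float32(true_disp)
print('32-bit float error:      %.9f' % abs(float(stored_exr) - true_disp))
```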
Adding a feature to subdivide your low mesh within MightyBake would be a very efficient workflow. As you say, it'd be important that the target renderer and MightyBake subdivide using the same scheme, most importantly in the handling of UV border smoothing. I'd vote for Pixar's OpenSubdiv, as it's becoming an industry standard for computing subdivisions, although frustratingly nearly every renderer seems to name and implement the various UV border smoothing options in its own unique way. You can read more here: http://graphics.pixar.com/opensubdiv/docs/intro.html
Also, can I ask about that Low button in Maya? If I did go the pre-subdivided route when exporting out of Maya, these scripts are far too slow. What are they doing and why?
Oh, and a small request: could you make it an option that adding a new preset clears out the output file path? I'm forever accidentally overwriting my last bake output with my new one because I forget to change this value.
OK, your idea of applying the object space normal map to my object fixed the visible facets. However, I'm not sure how I'd blend this with the tangent-based normals I'd be generating in Mari etc. for the detailed texture.
Also, facets in the displacement maps are still an issue, as I'm using the displacement map in Maya to recreate my high mesh (Modify > Displacement to Polygons) for painting on in Mari (exporting the high from ZBrush gives mismatching UVs with the low geo).
Also, it's worrying to see what the object space normals do to my normals pass (often used in compositing).
Vector displacement is not a great option at this stage, as various render engines implement it differently; I've never managed consistent results with it.
I think subdividing the geo inside MightyBake would be awesome, as long as the UV boundary smoothing matches what appears in the renderer. The speed boost using MightyBake compared to the ZBrush MME exporter is incredible, so I think this workflow would be extremely popular.