Delaunay doesn't work in 3 dimensions, right? Do you know of an alternative that does? Specifically for non-solids - imagine a letter "e" with a bend modifier on it.
That is a very strange website. But it might be just what I'm looking for - that's a project for next week at least (one day they're going to realise they're basically paying me to dick around and this will all be over)
Though not using that code (I can't remember where I got mine from, though I made some mods to remove thin tris etc.). It's still a pretty ugly Delaunay-style mesh - my point emitters convert to mesh using it. Badly cut diamonds 'r' us.
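For anyone curious what thin-tri removal looks like in practice, here's a rough sketch of the idea (not the code the post refers to): triangulate in 2D and throw away any sliver whose smallest angle falls under a threshold. `filter_thin_tris` and the 15-degree default are my own invention, and the sketch assumes SciPy is available:

```python
import numpy as np
from scipy.spatial import Delaunay

def filter_thin_tris(points, min_angle_deg=15.0):
    """2-D Delaunay triangulation with sliver (thin) triangles removed."""
    tri = Delaunay(points)
    keep = []
    for simplex in tri.simplices:
        a, b, c = points[simplex]
        # edge lengths opposite each vertex
        la = np.linalg.norm(b - c)
        lb = np.linalg.norm(c - a)
        lc = np.linalg.norm(a - b)
        # law of cosines gives each interior angle
        angles = []
        for l0, l1, l2 in ((la, lb, lc), (lb, lc, la), (lc, la, lb)):
            cosang = (l1 ** 2 + l2 ** 2 - l0 ** 2) / (2.0 * l1 * l2)
            angles.append(np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0))))
        if min(angles) >= min_angle_deg:
            keep.append(simplex)
    return np.array(keep)
```

On a square with a centre point this keeps all four (45-45-90) triangles; a near-degenerate point set would see its slivers dropped.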
Hey! This is my first post and I must say I'm genuinely overwhelmed by the work posted in here. I'm not as experienced as most of you, but I love this area.
So, I just finished a tool for UE4 that lets you pack your RGBA channels into a single texture, choosing from different textures and customising resolution, output folder and compression settings from a single Blueprint actor.
Since the launch of free Quixel for UE4, I've been struggling with packing textures from multiple materials gathered from Quixel; most of the time all I want is some mask variation, but there's also the texture sampler budget to think about.
The tool was created using only Blueprints, no C++. It takes advantage of the Render Material to Texture function to generate the packed textures. Alongside this, I created a custom material function that can switch between a normal map texture and a regular RGB texture, and to top it off I wanted a real-time viewport preview of the result, so I added some planes with the texture and channel assigned to them.
Here's a video showing how it actually works and it's NOW available on the UE Marketplace in case you'd like to support a fellow (aspiring) tech artist.
Yup, it works with a blank render target, so anything you throw at it will be packed into the channel you chose, including alpha. The baked texture is then saved into the folder of your choosing and you can get rid of the textures you used before. So if you imported, let's say, three 8K textures, you can pack them into a single 2K RGB texture and then delete the 8K ones.
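Conceptually, channel packing is just stacking greyscale maps into the channels of one image and resampling down. A minimal NumPy sketch of that idea - nothing to do with the actual Blueprint internals; `pack_channels` and its crude striding "downsample" are placeholders for illustration:

```python
import numpy as np

def pack_channels(r, g, b, a=None, out_size=None):
    """Pack separate greyscale maps into one RGB(A) array.

    Each input is a 2-D float array in [0, 1]. out_size optionally
    downsamples by integer striding (a stand-in for proper filtering)."""
    channels = [c for c in (r, g, b, a) if c is not None]
    h, w = channels[0].shape
    if out_size is not None:
        step = h // out_size
        channels = [c[::step, ::step] for c in channels]
    # last axis becomes the R/G/B(/A) channel index
    return np.stack(channels, axis=-1)
```

So three 8192x8192 maps with `out_size=2048` come back as one 2048x2048x3 array, matching the 8K-to-2K example above.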
That's a great tool. The thing I hate most about working with UE4 is that it doesn't have flexible compile-time channel packing, and this could solve some of that.
Thanks! I really don't know why UE doesn't have anything like it built in, which made me want to build it even more haha. As you can see it's quite a simple tool and can definitely save a ton of time, at least for artists.
I had to do a lot of manual hole punching on spaceships, so I got lazy and decided to make a script to automate the process. Just wish I'd had the time to build this tool when I actually needed it. Not the most realistic, but it's a good base to start from.
Wow, that's interesting. Would you be willing to share what's going on under the hood, conceptually? I see you're doing an extrude, but how are you randomly deleting faces and getting the smaller jaggy edges?
It's not actually randomly deleting faces; it's just deleting the original faces that were selected, at the end, after most of the operation is done. The jaggy edges come from the Random attribute of the transform components.
I've posted my old vert-to-quad script (F2MAX) to Gumroad, as well as a new script that connects selected verts to all others on their selected face (FanConnect).
That's cool. In Photoshop I've been using a script which sets up a layer rig to blend the normal maps correctly; it's part of the Normally Blue panel: https://gumroad.com/l/NormallyPanel I believe the NM Blue2 button blends them correctly like the old Quixel tools did. I set a hotkey for the script so I can toggle the normal layer from default to blended correctly.
This would be incredibly handy for damaged edge trims. It was a real pain in Maya duplicating and floating damaged edge trims in front of the model, because Maya doesn't offer any parametric way to offset the geometry without skewing the shape as it gets farther away from the surface.
Yes, but it skews the shape of the trim as it gets farther away from the model, as I mentioned above. The only clean way I've found to do this is with extrude face, since it will stay straight, but that's time consuming - and the route I took, which was painful.
If you harden the normals / set to face, it should project straight out.
I tested it recently after the same point was raised by an artist at the office and didn't see it behaving wrongly (not that I'd put it past Maya to have special cases)
Unfortunately no - I just tried that with all hard normals and the same problem occurs. Everything skews, because transform component only really works in one axis at a time if you want to keep it flat (unless I'm doing something wrong - feel free to correct me). You could of course move the top faces up manually along Y, then use transform component to do the side faces, but that's the same amount of work as just extruding the faces and then deleting the excess, which is how I've been doing it in the past.
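For what it's worth, the skew being described here can be shown with a bit of arithmetic: at a 90-degree corner the averaged vertex normal points 45 degrees out, so offsetting the corner vertex a distance d along it leaves each adjacent face only d/sqrt(2) from its original plane - that shortfall is the visible skew. A small NumPy check of that claim:

```python
import numpy as np

# Two faces meet at a right angle; their unit normals:
n1 = np.array([0.0, 1.0])   # top face
n2 = np.array([1.0, 0.0])   # side face

# Averaged (vertex) normal at the shared corner:
avg = (n1 + n2) / np.linalg.norm(n1 + n2)

d = 1.0                      # desired offset distance
corner_move = d * avg        # what a naive along-the-normal offset does

# Perpendicular distance the top face actually ends up from its old plane:
print(np.dot(corner_move, n1))        # ~0.707, not 1.0 -> the skew

# To keep both faces exactly d away, the corner must move further out:
scale = d / np.dot(avg, n1)           # 1/cos(45 deg) = sqrt(2)
print(np.dot(scale * avg, n1))        # ~1.0
```

This is why an extrude (which pushes each face along its own face normal) stays straight while a uniform vertex-normal offset doesn't.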
Why are you using "Offset" only and not "Local Translate Z" in this gif example? I think you can get better results using "Local Translate Z" first, and then you can fiddle with Offset to try to fix things a bit if you are going very far out.
Because offset is much better than dragging the manipulator, and in most cases offset will actually create perfectly aligned faces just like the extrude tool. I've tried both since posting the .gif, and I also tried selecting edges, but there's no way to do this as far as I can tell unless you do it in two steps, which is not parametric. Local Translate Z actually creates an even worse bubble-looking skew.
Further to our discussion above about broken edge trims, I figured out how to do this parametrically without any fiddling. Or at least I think I did, until someone tells me it doesn't work - but it works on the shapes I need it for. The trick is to select edges and translate on X using the Component Move mode setting in the Move tool. I wish I'd figured this out back when I was working on Gears 5!
It operates on faces/polys/elements, with the option to hide "valid" faces/polys/elements... At the end I'm setting the min spacing of the emitter that generates the points the mesh is created from. I have to say the trickiest part was finding a "decent", robust tri-vs-tri C routine on the interweb (there are some real shockers about); in the end I had to implement my own.
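A min-spacing pass like the one mentioned can be sketched as simple greedy rejection - this is a generic illustration, not the poster's implementation:

```python
import numpy as np

def enforce_min_spacing(points, min_dist):
    """Greedy dart-throwing filter: keep a point only if it sits at least
    min_dist from every point kept so far. Brute force O(n^2); a spatial
    hash grid with cell size min_dist makes it near-linear in practice."""
    kept = []
    for p in points:
        if all(np.linalg.norm(p - q) >= min_dist for q in kept):
            kept.append(p)
    return np.array(kept)
```

The order of the input points decides which of two close neighbours survives, which is usually fine for emitter-style point clouds.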
cylinder with nasty noise on it (while adjusting the seed and roughness)
Added the ability to adjust the transition point in the UV to the texture, and matched the "blend" geometry to the shape more, as opposed to the circular version, since it distorts the map less.
I'm starting on a set of UV visualisation / analysis / manipulation tools for FBX files using standalone Python, with a view to being able to grab stats and find possible optimisations on big libraries of assets.
Step 1 is obviously to visualise some UVs.
Step 2 will probably be determining coverage and other obvious stats, but it's bedtime soon so I'm quitting for the evening.
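The step-2 coverage stat is cheap once you have the UV triangles - just sum their areas. A sketch of the idea (the function name is mine):

```python
import numpy as np

def uv_coverage(uvs, faces):
    """Fraction of 0-1 UV space covered, as the summed area of the UV
    triangles. Ignores overlaps, so overlapping shells inflate the number."""
    total = 0.0
    for i, j, k in faces:
        a, b, c = uvs[i], uvs[j], uvs[k]
        # unsigned triangle area from the 2-D cross product
        total += abs((b[0] - a[0]) * (c[1] - a[1])
                     - (b[1] - a[1]) * (c[0] - a[0])) / 2.0
    return total
```

A unit square split into two triangles reports exactly 1.0; real assets land well below that, which is where the "%UV wasted" style checks come in.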
Neat - what would be the use case for this? We tried to check the UVs from the outsourcing team before we approved them to go on to the texturing stage, to avoid costly re-texturing if they had created wasteful UVs. We used Maya's %UV-wasted HUD to make sure everything hit at least 70%, unless it was something funky where the shells couldn't be packed tightly.
So I created this Maya tool for someone who had a really specific workflow. They wanted to group UV shells and then pack the shells based on a scale they assigned to each group. I thought this was a really interesting idea and a neat challenge, but once I finished the tool I realised I'd never use it for anything - the person who wanted it was happy, though. I haven't released the tool yet because I'm not sure it's useful, but I'll share it here because I think it's neat. The worst part of the tool is that there's no way to visually colour the UV shells to show which group they're in, or at least no way I could figure out how to do it in MEL.
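The core of that pack-by-group-scale idea is just scaling each shell about its own centroid by its group's factor before packing. A toy sketch of that step, not the actual tool (the data layout here is invented):

```python
import numpy as np

def scale_shells(shells, group_scale):
    """Scale each UV shell about its centroid by its group's factor.

    shells: dict of shell id -> (group id, (n, 2) array of UVs)
    group_scale: dict of group id -> uniform scale factor"""
    out = {}
    for shell_id, (group, uvs) in shells.items():
        centroid = uvs.mean(axis=0)
        # scale about the centroid so the shell doesn't drift in UV space
        out[shell_id] = centroid + (uvs - centroid) * group_scale[group]
    return out
```

A packer would then lay the rescaled shells out; scaling about the centroid keeps each shell where the packer expects to find it.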
That's one use case; texel density checking is another. I should also be able to use image analysis / computer vision stuff to compare LOD UVs and suggest optimisations in the form of shell overlapping and so on, too.
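One way that LOD comparison could work, sketched with plain NumPy: rasterize each LOD's UV triangles to a boolean occupancy mask with a barycentric inside test, then compare masks with intersection-over-union. This is my guess at an approach, not the author's implementation:

```python
import numpy as np

def rasterize_tris(uvs, faces, res=64):
    """Boolean occupancy mask of 0-1 UV space at res x res, via a
    point-in-triangle (barycentric) test at each texel centre."""
    ys, xs = np.mgrid[0:res, 0:res]
    px = (xs + 0.5) / res
    py = (ys + 0.5) / res
    mask = np.zeros((res, res), dtype=bool)
    for i, j, k in faces:
        a, b, c = uvs[i], uvs[j], uvs[k]
        d = (b[1] - c[1]) * (a[0] - c[0]) + (c[0] - b[0]) * (a[1] - c[1])
        if abs(d) < 1e-12:
            continue  # degenerate UV triangle, nothing to rasterize
        w1 = ((b[1] - c[1]) * (px - c[0]) + (c[0] - b[0]) * (py - c[1])) / d
        w2 = ((c[1] - a[1]) * (px - c[0]) + (a[0] - c[0]) * (py - c[1])) / d
        w3 = 1.0 - w1 - w2
        mask |= (w1 >= 0) & (w2 >= 0) & (w3 >= 0)
    return mask

def uv_iou(mask_a, mask_b):
    """Intersection over union of two occupancy masks; 1.0 = identical."""
    inter = np.logical_and(mask_a, mask_b).sum()
    union = np.logical_or(mask_a, mask_b).sum()
    return inter / union if union else 1.0
```

An IoU near 1.0 between two LODs suggests their shells could share texture space; a low score flags layouts that diverged.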
The main thing is that it's not in Maya so it's fast and not crippled by python 2.7
Oh interesting - yeah, we needed to do A LOT of LOD optimizations to get Gears 5 to run at 60fps all the time. The tech art guys would use view modes in Unreal to identify problems and send us huge lists of things that needed to be adjusted in certain ways.
My general philosophy is that reactive optimisation of this kind is the result of you not beating your art team severely enough at the beginning of the project.
Ha ha, I can guarantee 100% we were beaten to a pulp at the beginning of the project with the harshest budgets you've ever encountered for any AAA game, and stuck to them - then further dried out and ground into dust near the end of the project to get a solid 60fps @ 4K, especially in the open-world sections.
Replies
edit: cos I'm dumb...
Cheers!!
https://www.youtube.com/watch?v=eejd2uQKzIg&t=7s
https://www.dropbox.com/s/p2px8v4yxqiidv8/spaceSelObjs.ms
EDIT: Also made this script today, which selects all faces in an Unwrap_UVW that have no area in UV space (for fixing meshes with bad UVs): https://www.dropbox.com/s/rimir0n2ivg734s/selZeroAreaUVFaces.ms
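Presumably the script's core check is something like the one below: flag any face whose UV triangle area is (near) zero. A Python sketch of that test - I haven't read the MAXScript, so this is an assumption about what it does:

```python
def zero_area_uv_faces(uvs, faces, eps=1e-9):
    """Return indices of faces whose UV triangle has (near) zero area.

    uvs: sequence of (u, v) pairs; faces: sequence of (i, j, k) index
    triples into uvs. eps absorbs floating-point noise."""
    bad = []
    for idx, (i, j, k) in enumerate(faces):
        a, b, c = uvs[i], uvs[j], uvs[k]
        # unsigned triangle area from the 2-D cross product
        area = abs((b[0] - a[0]) * (c[1] - a[1])
                   - (b[1] - a[1]) * (c[0] - a[0])) / 2.0
        if area <= eps:
            bad.append(idx)
    return bad
```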
// First revision of the blend mode: just normalize the sum of the raw
// colour values. Wrong for tangent-space normals, since the channels are
// still encoded in [0, 1]. (Struct declaration restored here to match the
// final revision below; the snippet as posted started at the enum.)
struct VNormal : public BlendMode< VNormal > {
    enum { NameResourceID = IDS_BLENDMODE_VNORMAL };
    ::Color operator()( const ::Color& fg, const ::Color& bg ) const
    {
        return ::Color(Normalize(fg + bg));
    }
};
// Second revision: decode both normals from [0, 1] to [-1, 1], add them,
// renormalize, then re-encode to [0, 1]. (Struct declaration restored to
// match the final revision below.)
struct VNormal : public BlendMode< VNormal > {
    enum { NameResourceID = IDS_BLENDMODE_VNORMAL };
    ::Color operator()( const ::Color& fg, const ::Color& bg ) const
    {
        Point3 fgn = 2.0f * (Point3(fg.r, fg.g, fg.b) - 0.5f);
        Point3 bgn = 2.0f * (Point3(bg.r, bg.g, bg.b) - 0.5f);
        ::Color res(Normalize(fgn + bgn)/2.0f + 0.5f);
        return res;
    }
};
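For anyone who wants to sanity-check the second revision's maths outside Max, here's a direct NumPy port of the same steps: decode from [0, 1] to [-1, 1], add, renormalize, re-encode:

```python
import numpy as np

def blend_normals(fg, bg):
    """Linear normal blend: decode both encoded normals to [-1, 1],
    add, renormalize, and re-encode to [0, 1] (mirrors the C++ above)."""
    fgn = 2.0 * (np.asarray(fg, dtype=float) - 0.5)
    bgn = 2.0 * (np.asarray(bg, dtype=float) - 0.5)
    s = fgn + bgn
    s /= np.linalg.norm(s)
    return s / 2.0 + 0.5

# Blending two copies of the flat normal (0.5, 0.5, 1.0) gives it back,
# since (0,0,1) + (0,0,1) renormalizes to (0,0,1):
print(blend_normals([0.5, 0.5, 1.0], [0.5, 0.5, 1.0]))
```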
// (incomplete fragment as posted - looks like part of the normal_basis
// helper used below; the rest was not included)
{
    Point3 y(0.0f, 1.0f, 0.0f);
}
// Third revision: rotate the detail (fg) normal into a basis built around
// the base (bg) normal before combining - a detail-normal style blend.
struct VNormal : public BlendMode< VNormal > {
    enum { NameResourceID = IDS_BLENDMODE_VNORMAL };
    ::Color operator()(const ::Color& fg, const ::Color& bg ) const
    {
        // decode from [0, 1] to [-1, 1]
        Point3 fgn = 2.0f * (Point3(fg.r, fg.g, fg.b) - 0.5f);
        Point3 bgn = 2.0f * (Point3(bg.r, bg.g, bg.b) - 0.5f);
        // build a tangent frame around the base normal and rotate fg into it
        Matrix3 tm;
        normal_basis(bgn, tm);
        fgn = tm.VectorTransform(fgn);
        // add, renormalize, re-encode to [0, 1]
        ::Color res(Normalize(fgn + bgn)/2.0f + 0.5f);
        return res;
    }
};
Technically, step 1 was actually building the FBX SDK bindings for Python 3.8.
Actual, effective instructions for doing this are here if anyone's interested:
https://www.ralphminderhoud.com/blog/build-fbx-python-sdk-for-windows/