seems like old ground, and it is in a way. Created an Outliner interface to better integrate some of my tools.... basically anything with an "outline" (mesh, spline or modifier) can be used as a "cutter",
a plane in a plane
I'm still rooting for you Klunk, awesome to see something like this progressing. What do you think the performance of this would be on a 20k mesh with multiple cutters? Trying my hand at making a modifier that does something similar but with decals, still in progress. This sort of thing has so many applications. Thanks for continuing this work and posting, thought it might be a dead end since your post a while back lol.
Klunk, you could literally save people's lives, time and energy if you end up releasing this, even for a price. I'm sure the interest would be high. Hope it eventually happens, fantastic work by the way!
the trouble is it works well in the application it's designed for, and reasonably robustly at the moment; it doesn't work as well in a broader sense, so to speak, and making it robust enough to be a universal tool is still a lot of work
as for performance, it wasn't programmed with it in mind but it does ok
max 2010 on a 6600 @ 2.4GHz and a GeForce 8800 GTS
as you can see, a few bugs to iron out on the A and the e
In the context of game meshes it would be rare for one "cutter" to span an entire mesh and cross potentially thousands of tris. Instead details would be fairly local, with the exception of seams, which I would handle differently than just projecting an outline. 20k is a good estimate for hero items like vehicles and large structural items, or even first person weapons. If you're lucky enough to be able to use deferred rendering for a project you can take advantage of all the deferred decal awesomeness seen in Star Citizen/Doom, but not all of us are that fortunate :P Which is why I'm so interested in the ability to cut into the mesh instead of floating geometry on top.
I've worked in deferred rendering for many years as an artist, and it's still expensive to do lots of decals, so I was surprised to see the amount of mesh decals in Star Citizen and Doom. When we were working on Gears 5, deferred decals were a per-pixel cost, so the more you have on screen the higher the cost; perhaps SC/Doom figured out an optimized mesh decal shader that helped them render so many?
Interesting. Given your experience, what would you say is a reasonable amount of decals per scene? Sucks to hear it's that limited. I'm not holding out hope of using deferred decals on our project any time soon tho, so we'll still be cutting in much of our detail.
It's just a matter of how many pixels they take up in screen space, so one huge decal will kill performance. We were told to use them sparingly; I think in the hundreds per area, not the thousands like I've seen in Doom. Alpha is really expensive in Unreal.
Getting creative with Vector Displacement Mapping. I wish we had better tools, but I think we're making some progress with Mudbox and ZBrush. Obviously not for every project, but it's super fun to play with. We now include a sample with our shader editor; really looking forward to seeing what ppl build.
Here's a sneak peek of another possible use, still a bit rough.
Multi-layered impact decal/damage.
Tessellation can be taxing on older systems, but the fact that LOD control is pretty much free is quite handy.
Sorry, one more! I promise it's the last one on VDM. This time with a guide.
We were trying to create something more practical, a real-world example of sorts, so we went for wall damage, but this could very well be adapted to other types of stuff, even using a cutout with some fancy shader to render it over something else. The cool thing about this is that you have multiple levels of debris; unlike regular displacement, you extrude on all axes, in negative and positive directions.
We used Mudbox, Substance Painter and Amplify Shader Editor with Unity; we would love to see other ppl experiment with this as it can be done in other engines/tools.
another variant of one of my spline fill modifiers for max.... this time using a lofted spline.
been wanting to implement it for a while, but was put off by having to implement the control gizmo and not too keen on using another scene node. Then it struck me: just add the lofting spline to the spline being filled, and bob's yer uncle.... there's a few more things I had to be wary of: it needs a kd-tree to keep a minimum distance where the loft wants to bunch the verts (also good as it eliminates the chance of duplicate points, which the delaunay triangulation code doesn't like)
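For anyone curious, the min-distance rejection step can be sketched like this. This is a hypothetical Python helper, not the modifier's actual code; a uniform grid stands in for the kd-tree, and it has the same handy side effect of dropping exact duplicate points before triangulation.

```python
import math

def _too_close(grid, cell, x, y, min_dist):
    """True if any already-accepted point lies within min_dist of (x, y)."""
    ix, iy = int(math.floor(x / cell)), int(math.floor(y / cell))
    for dx in (-1, 0, 1):
        for dy in (-1, 0, 1):
            for px, py in grid.get((ix + dx, iy + dy), ()):
                if (x - px) ** 2 + (y - py) ** 2 < min_dist ** 2:
                    return True
    return False

def filter_min_distance(points, min_dist):
    """Greedily keep points at least min_dist apart (also drops duplicates)."""
    cell = min_dist  # neighbours within min_dist sit in the same/adjacent cell
    grid, kept = {}, []
    for x, y in points:
        if not _too_close(grid, cell, x, y, min_dist):
            kept.append((x, y))
            ix, iy = int(math.floor(x / cell)), int(math.floor(y / cell))
            grid.setdefault((ix, iy), []).append((x, y))
    return kept
```

Each point only has to check the nine cells around it, so the whole filter stays near-linear, much like the kd-tree query.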
scratched another itch.... reworked normalize spline to work on the number of segments rather than knot-to-knot length. It also lets you change just one curve in a shape (or all, or selected)... there are some accumulation errors when the segment count is high, and it's quite a bit slower than the original, but I can live with that.
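The segment-count idea, reduced to a polyline, looks roughly like this (a hypothetical sketch, not the modifier's code: a real spline would first be sampled into a dense polyline, which is also where the accumulation error mentioned above creeps in):

```python
import math

def resample_polyline(points, num_segments):
    """Place num_segments + 1 knots at equal arc-length steps along a polyline."""
    # cumulative arc length at each input knot
    cum = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        cum.append(cum[-1] + math.hypot(x1 - x0, y1 - y0))
    total = cum[-1]
    out, seg = [], 0
    for i in range(num_segments + 1):
        target = total * i / num_segments
        # advance to the source segment containing the target length
        while seg < len(points) - 2 and cum[seg + 1] < target:
            seg += 1
        span = cum[seg + 1] - cum[seg]
        t = 0.0 if span == 0.0 else (target - cum[seg]) / span
        x0, y0 = points[seg]
        x1, y1 = points[seg + 1]
        out.append((x0 + t * (x1 - x0), y0 + t * (y1 - y0)))
    return out
```

Dividing `total` fresh for each knot (rather than repeatedly adding a step length) keeps the float accumulation error from compounding too badly at high segment counts.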
scratched another itch.... reworked normalize spline to work on number of segments not (knot to knot length) also lets you only change just one curve in a shape (or all or selected)... there's some accumulation errors when segment count is high and it's quite a bit slower than the original but I can live with that.
Good stuff.. I feel like it wouldn't be too hard to adjust tangents to better match the original shape. Then again, I might be missing the point.
Nice! I should Google that algorithm - I implemented a version of this in ue4 a while back and never felt like it was done the right way - certainly wasn't elegant 😁
I have begun writing a pixel painter in the most unnecessarily awkward way I could think of (Python, OpenGL, and the pixels are generated with a geometry shader). You can't tell from the screengrab, but the pixels are slowly spinning, which is nice cos it makes your eyes go funny.
I'm mainly doing this so i can better understand data transfer between bits of the rendering pipeline - not cos i want a pixel painter.
Future plans include switchable geometry and fragment shaders, saving images, and one day a 3D view (cos why not complicate matters further). First things first though: it needs an eraser.
cracked the fixed corners/bezier corners in my resample spline modifier
bloody infuriating sometimes, working in max..... why would you set a bezier corner's out vector to effectively a zero-length vector? Why? Why? It's not only making my life difficult, it makes it bloody tricky for anyone else to edit....
*and relax*
it works so well I'm going to have to add max's length option to mine. A note for anyone using the Curve Fit code I mentioned previously: you probably only need the least-squares generate-Bezier routine for this application; there's very little to be gained from the Newton-Raphson reparameterization.
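For reference, that least-squares generate-Bezier step on its own looks roughly like this. This is a from-scratch Python sketch of the idea, not the Graphics Gems code: endpoints fixed, chord-length parameterisation, and the two inner control points solved from a 2x2 normal-equation system, with no reparameterisation pass.

```python
import math

def fit_cubic_bezier(points):
    """Least-squares cubic Bezier through fixed endpoints, chord-length params."""
    # chord-length parameterisation in [0, 1]
    d = [0.0]
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        d.append(d[-1] + math.hypot(x1 - x0, y1 - y0))
    ts = [di / d[-1] for di in d]
    p0, p3 = points[0], points[-1]
    # accumulate the 2x2 normal equations for the free control points P1, P2
    a11 = a12 = a22 = 0.0
    bx1 = by1 = bx2 = by2 = 0.0
    for (x, y), t in zip(points, ts):
        b0 = (1 - t) ** 3            # Bernstein basis at t
        b1 = 3 * t * (1 - t) ** 2
        b2 = 3 * t ** 2 * (1 - t)
        b3 = t ** 3
        rx = x - b0 * p0[0] - b3 * p3[0]   # residual after fixed endpoints
        ry = y - b0 * p0[1] - b3 * p3[1]
        a11 += b1 * b1; a12 += b1 * b2; a22 += b2 * b2
        bx1 += b1 * rx; by1 += b1 * ry
        bx2 += b2 * rx; by2 += b2 * ry
    det = a11 * a22 - a12 * a12
    p1 = ((a22 * bx1 - a12 * bx2) / det, (a22 * by1 - a12 * by2) / det)
    p2 = ((a11 * bx2 - a12 * bx1) / det, (a11 * by2 - a12 * by1) / det)
    return p0, p1, p2, p3
```

The Newton-Raphson step in the Graphics Gems version only refines the `ts` values and re-solves; for resampling an already-smooth spline the chord-length guess is usually close enough, which is the "very little to be gained" above.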
delaunay doesn't work in 3 dimensions, right? Do you know of an alternative that does? Specifically for non-solids: imagine a letter e with a bend modifier on it
That is a very strange website. But it might be just what I'm looking for - that's a project for next week at least (one day they're going to realise they're basically paying me to dick around and this will all be over)
though not using that code (I can't remember where I got mine from, though I made some mods to remove thin tris etc.), it's still a pretty ugly delaunay-style mesh
my point emitters convert to mesh using it. Badly cut diamonds 'r' us
Hey! This is my first post and I must say that I am certainly overwhelmed by the work posted in here. I'm not as experienced as most of you, but I love this area.
So, I just finished a tool for UE4 where you are able to pack your RGBA channels into a single texture, with the ability to choose from different textures and customize resolution, output folder and compression settings from a single blueprint actor.
Since the launch of free Quixel for UE4, I've been struggling with packing textures from multiple materials gathered from Quixel; most of the time all I want is some mask variation, but there is also the texture sampler budget to think about.
The tool was created using only blueprints, no C++. I took advantage of the render material to texture function to generate the packed textures. Alongside this, I created a custom material function where I can switch between a normal map texture and a regular RGB texture, and to top it off, I wanted a real-time viewport update of what my result would be, so I added some planes with the texture and channel assigned to them.
Here's a video showing how it actually works and it's NOW available on the UE Marketplace in case you'd like to support a fellow (aspiring) tech artist.
Yup, it works with a blank render target, so anything you throw at it will be packed into the channel you chose, including alpha. The baked texture is then saved into the folder of your choosing, and you can get rid of the textures you used before. So if you imported, let's say, three 8K textures, you can pack them into a single RGB 2K texture and then delete the 8K textures.
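Conceptually the packing step itself is tiny; it's the in-engine render-target plumbing that's the work. A hypothetical Python sketch of the idea (downscale each source mask, then interleave the three maps into one RGB image), not anything from the actual blueprint:

```python
def downsample(gray, factor):
    """Naive nearest-neighbour downscale of a grayscale map (e.g. 8K -> 2K)."""
    return [row[::factor] for row in gray[::factor]]

def pack_channels(r_map, g_map, b_map):
    """Interleave three equally sized grayscale maps into one RGB image."""
    return [[(r, g, b) for r, g, b in zip(rr, gg, bb)]
            for rr, gg, bb in zip(r_map, g_map, b_map)]
```

Three samplers collapse into one this way, which is exactly the texture-sampler-budget win described above.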
That's a great tool. The thing I hate most about working with ue4 is that it doesn't have flexible compile time channel packing and this could solve some of that.
Thanks! I really don't know why UE doesn't have anything like it built in, which made me want to build it even more haha. As you can see it's quite a simple tool and can definitely save a ton of time, at least for artists.
I had to do a lot of manual hole punching on spaceships, so I got lazy and decided to make a script to automate the process. Just wish I had the time to build this tool when I actually needed it. Not the most realistic, but it's a good base to start from.
Wow, that's interesting. Would you be willing to share what's going on under the hood, conceptually? I see you're doing an extrude, but how are you randomly deleting faces and getting the smaller jaggy edges?
It's not actually randomly deleting faces; it's just deleting the original faces that were selected, at the end, after most of the operation is done. The jaggy edges come from the Random attribute of the transform components.
Posted to Gumroad my old vert-to-quad script (F2MAX), as well as a new script that connects selected verts to all others on their selected face (FanConnect).
That's cool. In Photoshop I've been using a script which sets up a layer rig to blend the normal maps correctly; it's part of the Normally Blue panel. https://gumroad.com/l/NormallyPanel I believe the NM Blue2 button blends them correctly like the old Quixel tools did. I set a hotkey for the script so I can toggle the normal layer from default to blended correctly.
This would be incredibly handy for damaged edge trims. It was a real pain in Maya duplicating and floating damaged edge trims in front of the model, because Maya doesn't offer any parametric way to offset the geometry without skewing the shape as it gets farther away from the surface.
Yes, but it skews the shape of the trim, as I mentioned above, as it gets farther away from the model. The only clean way I've found to do this is with extrude face, since it will stay straight, but that's time consuming and the route I took, which was painful.
If you harden the normals /set to face it should project straight out.
I tested it recently after the same point was raised by an artist at the office and didn't see it behaving wrongly (not that I'd put it past Maya to have special cases)
Unfortunately no, I just tried that with all hard normals and the same problem occurs; everything skews, because transform component only really works in one axis at a time if you want to keep it flat, unless I'm doing something wrong (feel free to correct me). You could of course move the top faces up manually along Y, then use transform component to do the side faces, but that's the same amount of work as just extruding the faces and then deleting the excess, which is how I've been doing it in the past.
Why are you using "Offset" only and not "Local Translate Z" in this gif example? I think you can get better results using "Local Translate Z" first; then you can fiddle with Offset to try to fix things a bit if you are going very far out.
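For what it's worth, the skew comes from moving each corner vertex a fixed distance along its averaged normal; scaling that displacement by the miter length (1 over the cosine of the half-angle between the two edge normals) keeps flat runs parallel to the original surface. A 2D sketch of the idea, a hypothetical helper assuming a closed counter-clockwise polygon, not any of the Maya tooling discussed here:

```python
import math

def _edge_normal(ax, ay, bx, by):
    """Outward unit normal of edge a->b for a CCW polygon."""
    l = math.hypot(bx - ax, by - ay)
    return (by - ay) / l, -(bx - ax) / l

def offset_polygon(points, dist):
    """Offset a closed CCW polygon outward by dist using mitred corners,
    so flat runs stay parallel to the original edges (no skew)."""
    n = len(points)
    out = []
    for i in range(n):
        x0, y0 = points[i - 1]
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        n1x, n1y = _edge_normal(x0, y0, x1, y1)
        n2x, n2y = _edge_normal(x1, y1, x2, y2)
        mx, my = n1x + n2x, n1y + n2y
        ml = math.hypot(mx, my)
        mx, my = mx / ml, my / ml
        # miter length 1/cos(half-angle) restores a full edge-parallel offset
        scale = dist / (mx * n1x + my * n1y)
        out.append((x1 + mx * scale, y1 + my * scale))
    return out
```

This is the same reason an extrude stays straight: extrude effectively applies the miter correction per face, while a plain per-vertex normal move does not.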
Cheers!!
https://www.youtube.com/watch?v=eejd2uQKzIg&t=7s
https://www.dropbox.com/s/p2px8v4yxqiidv8/spaceSelObjs.ms
EDIT: Also made this script today which selects all faces in a Unwrap_UVW that have no area in UV space (for fixing meshes with bad UVs): https://www.dropbox.com/s/rimir0n2ivg734s/selZeroAreaUVFaces.ms
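The zero-area test boils down to a shoelace sum over each face's UVs; roughly what a script like that has to check, sketched here in hypothetical Python rather than the actual MaxScript:

```python
def uv_face_area(uvs):
    """Signed area of a UV-space polygon (shoelace); ~0 means collapsed UVs."""
    area = 0.0
    n = len(uvs)
    for i in range(n):
        x0, y0 = uvs[i]
        x1, y1 = uvs[(i + 1) % n]
        area += x0 * y1 - x1 * y0
    return area / 2.0

def zero_area_faces(faces, eps=1e-9):
    """Indices of faces whose UV area is (near) zero."""
    return [i for i, f in enumerate(faces) if abs(uv_face_area(f)) < eps]
```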
First version, which just normalised the summed colours (wrong for normal maps, since it ignores the [0,1] to [-1,1] encoding):

    enum { NameResourceID = IDS_BLENDMODE_VNORMAL };
    ::Color operator()( const ::Color& fg, const ::Color& bg ) const
    {
        return ::Color(Normalize(fg + bg));
    }
};

Corrected version, unpacking both colours to [-1,1] vectors before normalising and repacking:

    enum { NameResourceID = IDS_BLENDMODE_VNORMAL };
    ::Color operator()( const ::Color& fg, const ::Color& bg ) const
    {
        Point3 fgn = 2.0f * (Point3(fg.r, fg.g, fg.b) - 0.5f);
        Point3 bgn = 2.0f * (Point3(bg.r, bg.g, bg.b) - 0.5f);
        ::Color res(Normalize(fgn + bgn)/2.0f + 0.5f);
        return res;
    }
};