While working on a project and relearning some MaxScript, I made a couple of scripts for 3ds Max to help with tedious tasks, such as exporting lots of models in one go, adjusting TurboSmooth on many objects, etc. I've uploaded them to my website in case anyone else has a use for them: http://www.mcgreed.dk/maxscripts.html
Working on a landscape generator for WebGL in Unity that uses tilesets. Depending on the calculated bitmask value, it chooses the correct 3D mesh. Now I need to make some 3D tiles.
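For anyone unfamiliar with the technique, here is a minimal Python sketch of the usual 8-bit neighbour bitmasking; the grid layout, bit order and `is_filled` callback are illustrative placeholders, not taken from this project:

```python
# Minimal sketch of 8-bit neighbour bitmasking for autotiling.
# Each of the 8 neighbours contributes one bit; the resulting value
# (0-255) indexes into a lookup table of tile meshes.

NEIGHBOUR_BITS = [
    ((-1, -1), 1),   ((0, -1), 2),   ((1, -1), 4),
    ((-1,  0), 8),                   ((1,  0), 16),
    ((-1,  1), 32),  ((0,  1), 64),  ((1,  1), 128),
]

def tile_bitmask(x, y, is_filled):
    """Return the 0-255 bitmask for the cell at (x, y).

    is_filled(x, y) -> bool tells whether a neighbouring cell is solid.
    Corner bits are commonly ignored unless both adjacent edges are set,
    which reduces the 256 raw values to the usual 47 distinct tiles.
    """
    mask = 0
    for (dx, dy), bit in NEIGHBOUR_BITS:
        if is_filled(x + dx, y + dy):
            mask |= bit
    return mask

# Example: a 3x3 grid with only the centre row filled.
grid = {(0, 1), (1, 1), (2, 1)}
print(tile_bitmask(1, 1, lambda x, y: (x, y) in grid))  # -> 24 (left + right)
```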
Finally I can share a preview: a fast blockout tool for 3ds Max, done as part of the keyHydra suite, with customized boolean cutters and more interactivity by manipulating the operands right in the viewport:
I've made some tools to speed up the process of renaming objects, specifically for 'Match by name' baking in Substance Painter.
This next script is for deleting suffixes. It checks whether the suffix is '_high'; if it is, it deletes the last 5 characters of the name. If '_high' isn't the suffix, it deletes the last 4 characters instead, which removes '_low' or '.001'.
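For reference, here is a minimal standalone sketch of that suffix rule in Blender Python. It is not the author's tool (which is built to be called from Pie Menu Editor), just the logic described above, and it assumes every selected object actually ends in one of those suffixes:

```python
import bpy

# Minimal sketch of the suffix-deletion rule described above:
# strip 5 characters for '_high', otherwise strip 4 ('_low' or '.001').
for obj in bpy.context.selected_objects:
    if obj.name.endswith('_high'):
        obj.name = obj.name[:-5]
    else:
        obj.name = obj.name[:-4]
```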
Useful one! I'd appreciate it if you shared it with us C:
@Yondr I'm happy to share this, but I created it to be called from another addon called Pie Menu Editor, so the script as of right now will not work as a standalone addon. That said, enjoy! Just be sure one of the objects has the _low suffix!
I have just released version 0.3 of my ZBrush plugin. It offers some renaming options and a prefix-based grouping system, with which you can do stuff like this:
I'm not a tech artist (I won't even say I fully understand HLSL/GLSL code), but I decided to make a custom tri-planar projection shader (calculating 6 faces to avoid mirroring errors) for Marmoset 3, since they now allow custom shaders. It supports albedo, normal, roughness, metalness, and AO maps for now.
I currently have it working to a certain extent, but there are still some iffy bugs and I need to see whether I can fix them (based on my skills and implementation). I might make a thread for this later, for possible future distribution.
Hi, I am working on a simple gear creation script, fully made in Python and Maya! It's still very simple, but I think I will extend it to support alignment, animation, and more creation styles.
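Not the author's script, but a minimal sketch of how a simple gear outline can be generated with maya.cmds; the tooth count, radii and function name are placeholder parameters:

```python
import math
import maya.cmds as cmds

def make_gear_profile(teeth=12, inner_radius=4.0, outer_radius=5.0):
    """Build a closed, linear curve shaped like a simple gear outline.

    Each tooth is approximated by two points at the outer radius and two
    at the inner radius; a real gear script would use an involute profile.
    """
    points = []
    step = (2.0 * math.pi) / (teeth * 4)  # 4 profile points per tooth
    for i in range(teeth * 4):
        radius = outer_radius if (i % 4) in (1, 2) else inner_radius
        angle = i * step
        points.append((radius * math.cos(angle), 0.0, radius * math.sin(angle)))
    points.append(points[0])  # repeat the first point to close the loop
    return cmds.curve(point=points, degree=1)

# Example: make_gear_profile(teeth=16, inner_radius=3.5, outer_radius=4.5)
```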
Following up on a free parallax-based fake interior shader we did a while back with our shader editor at Amplify (Amplify Shader Editor).
Currently working on a new version based on the original paper by Oogst 3D for fake interiors done entirely via shader sorcery. The Unreal implementation by Stefander was an awesome example, definitely recommend it to anyone using UE4. For even more information on how to build something like this in Unreal, check out the old UDK Example.
Interior Mapping in Unity3d
Possible extra features:
Add extra layers for props (people, desks, boxes, etc.)
Lighting (emission)
Randomize rooms
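For the curious, the core of the original interior mapping idea is just a ray-versus-room-planes intersection per pixel. Here is a rough CPU-side sketch of that maths (room size, names and the coordinate space are placeholders; the real shader of course does this per fragment):

```python
import math

def interior_hit(frag_pos, view_dir, room_size=1.0):
    """Nearest intersection of the eye ray with the virtual room planes.

    For each axis, step to the next ceiling/floor/wall plane in the ray
    direction and keep the closest hit. The returned axis tells which
    texture (wall vs. floor/ceiling) to sample at the hit point.
    """
    best_t = math.inf
    hit_axis = None
    for axis in range(3):
        d = view_dir[axis]
        if abs(d) < 1e-6:
            continue  # ray parallel to the planes on this axis
        # index of the next room boundary along this axis, in ray direction
        cell = math.floor(frag_pos[axis] / room_size)
        plane = (cell + (1 if d > 0 else 0)) * room_size
        t = (plane - frag_pos[axis]) / d
        if 0.0 < t < best_t:
            best_t, hit_axis = t, axis
    if hit_axis is None:
        return None, None
    hit_point = [frag_pos[i] + view_dir[i] * best_t for i in range(3)]
    return hit_axis, hit_point
```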
I post often on the RTVFX forums; I should probably start doing that here too. Anyway, here's my WIP shader experiment: a slope-aware rain spatter/drip shader. I'm in the process of making it look prettier; here are some gifs. I'm also wrapping the entire thing into a material function so it can be easily layered on top of any surface material.
I'm sorting the faces. The modifier has a gizmo with position or direction options: if set to direction, the order is determined by the Z position of the face center in modifier space; otherwise it's determined by the distance from the gizmo position. There are some oddities to fix: even though I sort the whole face array within the mesh, the material IDs remain fixed. When adjacent faces within polys that have "different" orders are put together, Max adjusts the orders "internally" so they are correct, which causes the UV shearing issues. The material ID behaviour is odd because the smg "moves" with the face.
Whoaa this is awesome, how does it even work?
It's actually pretty simple: the ripples and streaks are world-projected and have a BP vector to control rain direction. The slope awareness is just a simple Z-gradient that blends between ripples and streaks, which is nice because it takes the object's curvature into account as well. I've started writing a breakdown on my blog if you're interested: https://deepspacebanana.github.io/deepspacebanana.github.io/blog/
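To make that concrete, the slope mask described above boils down to something like the sketch below. This is a hedged illustration only; the actual material also world-projects the ripple/streak textures and drives direction from a Blueprint vector, and the threshold values here are made up:

```python
def clamp01(x):
    return max(0.0, min(1.0, x))

def smoothstep(edge0, edge1, x):
    t = clamp01((x - edge0) / (edge1 - edge0))
    return t * t * (3.0 - 2.0 * t)

def rain_blend(world_normal_z, edge_min=0.3, edge_max=0.7):
    """Slope-aware blend factor: 1.0 on upward-facing surfaces (ripples),
    0.0 on vertical surfaces (streaks). edge_min/edge_max are placeholder
    thresholds controlling how quickly one effect fades into the other."""
    return smoothstep(edge_min, edge_max, world_normal_z)

# final = lerp(streaks, ripples, rain_blend(world_normal.z))
```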
@PixelGoat Here's the material network. The basic idea is to generate a spherical distance field gradient from the player's location and use it to add a spherical offset to the basic wind math. We could easily reduce the number of instructions, but I haven't bothered with that for now. If you need a more detailed explanation, I'll probably make a blog post, if I ever get some free time.
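In other words, the "distance field" here is an analytic spherical falloff around the player position that gets added on top of the regular wind displacement. A minimal sketch of that idea follows; the radius, strength and push-away direction are illustrative placeholders, not the actual material parameters:

```python
import math

def clamp01(x):
    return max(0.0, min(1.0, x))

def player_push_offset(world_pos, player_pos, radius=200.0, strength=30.0):
    """Spherical gradient around the player: 1 at the player position,
    fading to 0 at `radius`. The result is an extra offset added on top
    of the regular wind displacement for each foliage vertex."""
    delta = [w - p for w, p in zip(world_pos, player_pos)]
    dist = math.sqrt(sum(d * d for d in delta))
    if dist < 1e-6:
        return [0.0, 0.0, 0.0]
    falloff = clamp01(1.0 - dist / radius)
    push_dir = [d / dist for d in delta]  # push foliage away from the player
    return [d * falloff * strength for d in push_dir]
```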
How would you handle it so that it worked on all players and not just one? By using a render target centered around camera and drawing to that using BP?
Honestly, I'm not sure; you'd have to find some way to feed the closest player location per instance of the foliage. That gets into gameplay code and is quite beyond me. Render targets, as I understand it, have a pretty heavy overhead and aren't really feasible in-game. I'm not even sure render targets are an option here, since there are multiple meshes and render targets live in 2D texture space; for this shader I needed a 3D spherical gradient in world coordinates.
And this collection with "TexLoc" is the texture with your distance field?
"TexLoc" was just a temporary name I set a long time ago; it's actually the player location, fed from the player Blueprint. The distance field is not a texture; it's a three-dimensional gradient generated using math inside the shader, in this case the distance function. "TexLoc" is literally just a vector3 value holding the player's location in the world.
AssetGen is a free addon that automates the tasks needed to get a game asset ready for video games from a high-poly model. While it usually takes several hours to get an asset ready from a high poly, this addon does it in a matter of a few clicks. It is ideal for all your static assets.
It is developed by Srđan Ignjatović aka zero025 and supervised by Danyl Bekhoucha aka Linko.
Hi guys, I gave this talk and the recording was released. It's about the creative use of textures for things other than color - for example, storing position data in a texture to be able to pause and rewind time in the game (and therefore play back the effect). I hope you'll like it! Note: extra content can be found on the blog post below the video: https://simonschreibt.de/gat/cool-stuff-with-textures/
@gilesruscoe That looks great. Could you explain how it works? My best guess would be several extruded and masked shells, but it looks definitely better than any fur I have seen using this technique, especially in motion.
Yep, you got it, this is a bunch of shells (I think 12 shells in the images, I don't remember). Specifically, it's a geometry shader that dynamically creates the shells and shell spacing depending on parameters. The only things I'm doing differently are that the shells are offset by a noise vector, giving random direction to the strands; the colour is multiplied by the layer step (top layers are brighter than inner layers); and the length of the strands is randomised using a single channel of the noise vector, to give patches of different length. Besides that there are a few other things on top which are fairly straightforward, like a two-colour gradient over the length of the strands, a mask texture to control differently coloured areas, etc.
For the lighting, one addition that I found to give a good result is to do a fresnel effect (N dot V) and add the strand length to it. This gave longer strands more fresnel regardless of angle and makes it look like light scattering through the hair.
The motion is done by rotating each normal around its local xy plane using a 2x2 matrix; this was just plugged into a sine wave for the gif, but I hope to plug the rotational velocity of the mesh into it at some stage.
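A rough sketch of the per-shell offset and shading tricks described above, in plain Python for readability; the real version lives in a geometry shader, and the noise handling and parameter names here are placeholders:

```python
def shell_vertex(position, normal, noise_vec, layer, num_shells, fur_length=0.1):
    """Offset one vertex for shell `layer` (0 = base mesh, num_shells-1 = tip).

    step          root-to-tip gradient, also reused to brighten outer shells
    strand_length randomised per vertex via one channel of the noise vector
    direction     normal bent by the noise vector for unruly strands
    """
    step = layer / float(max(num_shells - 1, 1))
    strand_length = fur_length * (0.5 + 0.5 * noise_vec[0])
    direction = [n + 0.5 * nv for n, nv in zip(normal, noise_vec)]
    offset = [d * step * strand_length for d in direction]
    new_pos = [p + o for p, o in zip(position, offset)]
    brightness = 0.3 + 0.7 * step  # inner shells darker than outer ones
    return new_pos, brightness

def fresnel_with_length(n_dot_v, strand_length, power=3.0):
    """N dot V fresnel plus strand length, so longer strands pick up more
    rim/scatter regardless of viewing angle (the lighting trick above)."""
    return (1.0 - max(0.0, n_dot_v)) ** power + strand_length
```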
Calculating a wallride trajectory on the fly and its offset for the character (exaggerated here; the offset algorithm was so fricking hard!!!). Next will be how to tackle the character movement mechanism to make use of the trajectory (which has velocity info embedded in it). We'll see...
I would like to ask you guys if this idea makes any sense in your opinion before I build anything. I think you could technically voxelize a scene by taking slices along the positive and negative XYZ axes with a scene capture and rendering them into 6 volume textures, or something like that. This would be a render of some short scene-depth range, 1/steps per side, for example. Then I would either do a procedural mesh pass by reading the pixel values from the render targets, or just render a volumetric result by raymarching. I would assume generating a mesh would give an overall more optimized result. All unfiltered, so you get planes or cubes if doing per-side rendering.
You could also just read the render target pixel value in the mesh loop. I would think if it's done like that, you only need a single render target with 1x1 resolution.
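As a sanity check of the idea, here is roughly what the CPU-side readback could look like, assuming each capture slice is a 2D array of normalised depth within its slab; the names and threshold are purely illustrative, not an actual UE4 API:

```python
def voxels_from_depth_slices(depth_slices, threshold=0.99):
    """Build a boolean voxel grid from a stack of depth captures.

    depth_slices: list of 2D arrays (one per slab along the capture axis),
    each value being normalised scene depth (0..1) within that slab.
    A depth below `threshold` means the capture hit geometry inside the
    slab, so the matching voxel is marked solid.

    Returns voxels[z][y][x] -> True where geometry was found.
    """
    return [[[d < threshold for d in row] for row in depth]
            for depth in depth_slices]
```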
An optional drop-in mesh helper for 3ds Max: my solution for multiple optional meshes in game, dependent on the game params. The helper has a custom attach constraint that binds it to the mesh that requires the optional geometry (the attach constraint returns all the data needed to fit the optional mesh to the existing geometry). The helper uses a NodeTransformMonitor so it can update when the target changes. The first part shows the helper updating to changes in the target mesh, then I change some of the mesh params in the helper, and finally I change the barycentric position of the helper in the face using the attach constraint. I still have to implement the surface blending properly (positions, UVs, vertex colours and normals).
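For context, fitting the helper to the host face comes down to barycentric interpolation of the face data the attach constraint returns. A minimal sketch of that in plain Python (argument names are made up; this is not the actual MaxScript plugin code):

```python
def fit_to_face(bary, corners):
    """Interpolate any per-vertex attribute (position, UV, normal, colour)
    across a triangle using barycentric coordinates.

    bary:    (u, v, w) with u + v + w == 1
    corners: three equal-length vectors, one per triangle vertex
    """
    u, v, w = bary
    return [u * a + v * b + w * c for a, b, c in zip(*corners)]

# Example: position of a helper sitting at the centre of a face
centre = fit_to_face((1/3, 1/3, 1/3), ([0, 0, 0], [10, 0, 0], [0, 10, 0]))
```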
to show changing the face index in the attach constraint
https://youtu.be/Iy9fCdcrNSo Fixed the blend to the target mesh, with the added bonus of flagging the drop-in mesh as invalid if it exceeds the bounds of the faces.
It works by taking Max hair and fur, creating splines from the guides, converting those to geometry (so they have UVs), then generating a unique grayscale value per strand and flooding the vertex color red channel with it, putting a root-to-tip gradient in the green channel, and a depth gradient in the blue channel.
The vertex colors can then be baked down in your app of choice.
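The per-strand packing described above is roughly this (a hedged Python sketch; the actual tool works on 3ds Max hair-and-fur splines, so the strand data structure here is purely illustrative):

```python
import random

def pack_strand_vertex_colors(strands, max_depth):
    """Assign the RGB vertex-colour channels described above.

    strands: list of strands, each a list of vertices ordered root to tip,
             each vertex carrying a `depth` value (distance into the groom).
    R = one random grayscale ID per strand (for strand masks / variation)
    G = root-to-tip gradient along the strand
    B = depth gradient through the hair volume
    """
    colors = []
    for strand in strands:
        strand_id = random.random()
        n = max(len(strand) - 1, 1)
        strand_colors = []
        for i, vertex in enumerate(strand):
            r = strand_id
            g = i / n
            b = vertex["depth"] / max_depth
            strand_colors.append((r, g, b))
        colors.append(strand_colors)
    return colors
```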
Big thanks to Dennis Lehmann for helping me iron out some issues I was having.
I've looked into a way to improve the "anime" look with a shader so the facial features are always right:
Also, I've been experimenting with AR and Vuforia to get good light and environment integration:
Next step will be to change the bitmasking from 8 to 32 bits, so I can have intersections between the height segments.
I'm developing Animation Sketch in Blender to experiment with simplifying the whole rig-skinning-animation part (à la Mosketch):
If you are interested, check out the thread!
It's going to be free and fully editable.
Still some shearing problems between neighbouring faces sharing UVs that need to be ironed out, but it's a much better visualization than without (mid video).
Works well enough to help with some prototyping.
https://i.gyazo.com/f0a8ab15ad2837d41567e06214fa35e2.mp4
If you are curious about distance functions, I highly recommend checking out Inigo Quilez's website; he explains them really well: http://www.iquilezles.org/www/articles/distfunctions/distfunctions.htm
First time creating hair with XGen; process so far: