yeah, it's a bezier ease control (it's actually a "double" bezier to give it more of a kick), same as the one at the top (which is an overall control, whereas the bottom one is a per-stop variant).
on a side note, this is quite weird. I'm not sure what's causing it, but when adding the gradient with noise as a vertex color angled across a 100 by 100 segment plane you get an odd interference pattern when the octaves are 8 or higher. And it only happens when angled.
I don't think it's a "real world" issue anyway, it's a pretty unlikely scenario... though there's not a tool on the planet an artist can't break in short order
been mucking about with a simple stone generator for max. It's basically a geosphere with random clipping planes (the demo object has a mesh smooth with 2 iterations on it plus some box mapping). Anyway, the script is attached for anyone who wants to play
interestingly (well, to me anyway) I can randomize scripted plugin objects in my scatter tools.... you can invalidate the mesh using SimpleObject::MeshInvalid(), which forces the mesh to be rebuilt. Then in the script you change "seed rseed" to "seed (random 0 65535)" and voilà, a new mesh at every spot
for some applications using my point/ray clouds I need some objects to use all the points and some to use only some of the points. So I created a limiter helper (I would have used a modifier, but then it's still up to the caller to decide which to use, the mod or the base object, and that created some issues). The limiter just creates a point cloud with a percentage of the original points, with the ability to bias that pick with the color of the point. And as all my point clouds are now pipeline objects I can use all my vertex shading mods to handle the bias
the above demo wasn't what it was designed to do, but it's a useful idiot. Also I could get a similar result with map based distribution with fractal noise shading
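Not the actual limiter code, but the pick-a-percentage-biased-by-color idea can be sketched in a few lines of Python - the function name, the luminance weighting and the bias knob below are my own assumptions, not Klunk's implementation:

```python
import random

def limit_points(points, colors, keep_fraction, bias=1.0, rng=None):
    """Keep roughly keep_fraction of the points, preferring brighter colors.

    points - list of (x, y, z) tuples
    colors - list of (r, g, b) tuples in 0..1, parallel to points
    bias   - higher values favour bright points more strongly
    """
    rng = rng or random.Random()
    n_keep = round(len(points) * keep_fraction)
    # weight = luminance raised to the bias power (tiny epsilon so black
    # points remain selectable when everything else is black too)
    weights = [
        (0.299 * r + 0.587 * g + 0.114 * b) ** bias + 1e-6
        for r, g, b in colors
    ]
    # weighted sampling without replacement (Efraimidis-Spirakis keys:
    # draw u^(1/w) per point, keep the largest keys)
    keyed = sorted(
        range(len(points)),
        key=lambda i: rng.random() ** (1.0 / weights[i]),
        reverse=True,
    )
    picked = sorted(keyed[:n_keep])  # restore original point order
    return [points[i] for i in picked]
```

Driving the weights from a vertex shading mod rather than the raw colors would then be a matter of feeding whatever channel that mod writes into `colors`.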
added some phong specular lighting to a normal mapped camera facing shader.
the top and bottom spheres are actual spheres for reference, all the others are just quads
which in reality looks like... though enlarged somewhat.
something a bit more subtle...
probably not that subtle without the phong spec....
thanks, the maths is really "weird", especially getting the light and view directions for the pixel shader (screen space) and a different light direction for the vertex shader (view space), as the lighting and normals come in two parts.... one for the overall clump of points handled in the vertex shader, and then each billboard handled in the pixel shader. Tuning the two together is a bit hit and miss too
seems like old ground, and it is in a way... created an Outliner interface to better integrate some of my tools.... basically anything with an "outline" (mesh, spline or modifier) can be used as a "cutter",
a plane in a plane
Crawling back to the forums nearly 4 years later since I initially posted the question to re-ask if this had become available for use in the meantime by any chance? The "cut 'n' stitch" dynamic here is just wonderful and would save a lot of manual work. Fingers crossed!
I've been building a little OpenGL shader development environment loosely inspired by shadertoy.
notable features:
- you can use any editor you like (nvim or you're weak)
- works on linux
- autoloads saved glsl files
- supports #includes
- reports shader compilation errors properly
- poops out your full shader with line numbers to the console so you can debug the aforementioned errors
- clicking on a pixel prints the output color as 32f - which is good
- saves output to a 32bit exr
It's in python 3.12 and untested on windows - I'll likely open it up on github at some point fairly soon. Here's a really exciting picture of a test shader and some windows with text in...
Edit: didn't think it was worth a new post really but since last time..
- rewrote fragment shader construction code
- added texture includes
- made sure it works on windows
wrote (i.e. stole from IQ) some sdf libraries and mucked around with them (picture below)
There's some quality of life stuff to sort out before I open this up, but I'm getting there; mainly I want to make the use of all uniforms optional so I can get rid of the filthy hack at the top of main()
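For what it's worth, the #include support mentioned in the feature list can be done with a fairly small recursive substitution. This is my own naive sketch, not shed's actual code - the regex, the search-path order and the repeat-include guard are all assumptions:

```python
import re
from pathlib import Path

# matches lines like:  #include "noise.glsl"
INCLUDE_RE = re.compile(r'^\s*#include\s+"([^"]+)"\s*$')

def resolve_includes(source, search_dirs, _seen=None):
    """Recursively inline `#include "file"` lines in a GLSL source string.

    search_dirs - directories to look in (e.g. a lib_glsl folder, then cwd)
    Files already inlined once are skipped, which also breaks include cycles.
    """
    _seen = set() if _seen is None else _seen
    out = []
    for line in source.splitlines():
        m = INCLUDE_RE.match(line)
        if not m:
            out.append(line)
            continue
        name = m.group(1)
        if name in _seen:
            continue  # already inlined once - skip repeats
        _seen.add(name)
        for d in search_dirs:
            path = Path(d) / name
            if path.is_file():
                out.append(resolve_includes(path.read_text(), search_dirs, _seen))
                break
        else:
            raise FileNotFoundError(f'#include "{name}" not found')
    return "\n".join(out)
```

GLSL itself has no preprocessor file inclusion (outside the `GL_ARB_shading_language_include` extension), which is why tools like this do the splice on the CPU side before compiling.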
found a way to conform the cutter to the surface it's cutting... it requires a reference to the surface, otherwise you'd end up in the "death loop of hell" followed by max going down
useful clean drop-in mesh... it requires a pre "normals to map" mod with a post "map to normals" to appear seamless
all of the above led to a map to vdata (soft selection) modifier.... the plane is using a gradient ramp map (radial to box etc) piped through the conform mod
another variation: a straight map channel to vdata mod
this is vert color noise (varying the phase) converted to a weighted vertex selection and conformed to the sphere
added an option for a look-at constraint node for the projection direction of the conform.... the plane has a path constraint to a circle with a noise on the radius and is conformed to the sphere.
this is so weird... the end result of making sure my outliner modifier (which converts any open edges into a cutting shape for my mesh cutter mod) actually creates something sensible in extreme cases (I got a crash early on with max's extrude modifier on complex text leaving random holes). Anyway, here's a simple box with random faces deleted to create the open edges; the gizmo defines the projection direction..... I don't think there are any practical applications other than making the mod a bit more robust when something is slightly amiss
This is a lot less flaky now I've rewritten half of it - that doesn't mean it's any good, but it does function and it's been really good fun actually using it to make silly stuff.
includes are supported from lib_glsl in the installation folder/package
you can add <n> textures (Texture0, Texture1, etc.) - it looks in cwd for the files.
edit: I fixed it so you don't need to restart the app to add textures; the hack is no longer needed
I'm not building packages yet but it is buildable (I use poetry, you should also use poetry). I will do that once I've got my head around github actions, but that's for another day. If you can't be bothered / want to run from source, you can find the dependencies in the .toml - I'd still recommend installing it since it's designed to be run as an executable module (python -m shed -f my_shader_file.glsl)
Last tested on windows 10 with an nvidia gpu. It should work on linux (kubuntu, amd gpu) although I've not gone back to check since I fixed the most recent windows bug.
some more random mundane win32 controls for me.... some gradient display controls (working off the base gradient data) so you can have a visual representation from the Win32 Open File dialog... has three options: diagonal (should be square or it just looks skewed), horizontal and vertical
everyone loves a windows open file dialog
test.rmp being the correct file name but incorrect format
I reverse engineered the photoshop gradient format reader..... though that comes with multiple gradients per file... quite a bit of work to code nicely!
it's done in python/opengl because I'm far too lazy and stupid to write it in C
TLDR for those that don't want to read the paper: you define a volume for the canopy, fill it with points, grow branches from the trunk towards the points. The major benefit vs the more conventional 'start with a trunk and grow branches from it' approach is that you are able to explicitly define a canopy shape - rather than just accepting what you're given
This isn't where I'm actually heading with this project - I have a similar but different approach in mind, but I needed a known good reference
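For anyone curious, the paper's core loop is small enough to sketch in plain Python. This is a minimal, unoptimised reading of the algorithm (parameter names and defaults are mine), not the code from the posts above:

```python
import math

def _norm(v):
    l = math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2])
    return (v[0] / l, v[1] / l, v[2] / l) if l > 1e-9 else (0.0, 0.0, 0.0)

def grow(attractors, root, step=0.15, influence=3.0, kill=0.2, iters=100):
    """Minimal space colonisation: nodes grow towards the attractor points
    that claim them; attractors die once any node gets close enough."""
    nodes, parents = [root], [-1]
    attractors = list(attractors)
    for _ in range(iters):
        if not attractors:
            break
        # 1. each attractor pulls on its nearest node within the influence radius
        pull = {}
        for a in attractors:
            best, best_d = -1, influence
            for i, n in enumerate(nodes):
                d = math.dist(a, n)
                if d < best_d:
                    best, best_d = i, d
            if best >= 0:
                pull.setdefault(best, []).append(_norm((a[0] - nodes[best][0],
                                                        a[1] - nodes[best][1],
                                                        a[2] - nodes[best][2])))
        if not pull:
            break  # nothing in range - the tree can't reach the canopy
        # 2. each pulled node grows one child along its mean pull direction
        for i, dirs in pull.items():
            m = _norm((sum(d[0] for d in dirs),
                       sum(d[1] for d in dirs),
                       sum(d[2] for d in dirs)))
            if m == (0.0, 0.0, 0.0):
                continue  # opposing pulls cancelled out
            nodes.append((nodes[i][0] + m[0] * step,
                          nodes[i][1] + m[1] * step,
                          nodes[i][2] + m[2] * step))
            parents.append(i)
        # 3. attractors reached by any node are consumed
        attractors = [a for a in attractors
                      if min(math.dist(a, n) for n in nodes) > kill]
    return nodes, parents
```

The nested loops in step 1 are the "literally just nested loops" cost referred to below, and step 3 is why it gets faster as it goes - information is removed each iteration.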
I used the same paper for the basis of my tree generator.... though they missed a trick in the original paper..... the point cloud should also have an associated direction vector for each cloud point; it gives you far more control over the tree,
also, the tree is all about the point cloud and its distribution!
@Eric Chadwick - it's done in literally the worst way possible at the moment, so it's too slow to grow realtime after a depressingly small number of iterations. I know how to fix it, but since it's just reference I'm saving the optimisation for the proper version.
@Klunk yep, I spotted that issue pretty quickly. They do allude to it in the blurb but don't really go into what sort of direction makes most sense. I've got them tending upwards for the moment, which seems to largely work.
the whole space colonisation thang works very well as a "fractal" system.... twigs->fronds->tree..... as the second trick missed in the paper is that the tree/branch/twig itself should work as a point/vector generator.... but that plant modelling paper is a new one to me
I had quite a lot of experience with speedtree (and had some really good results, but boy does it throw out a lot of crap....). The space colonization method seems to produce a more "natural" result and less crap, though the process is not particularly intuitive
rather embarrassingly that second algorithm completely stumped me for a long time. then I grew a brain and made it happen
There's plenty of refinement left to do including:
- not having gaps where branches should meet,
- multiple target points on the trunk (the trunk doesn't contribute at all, they're just converging on a point)
- direction biasing
- biasing the distribution of points - it makes sense for there to be more towards the outside of the volume (anyone know the right search terms for that sort of thing - the internet is not playing ball)
but.. even the dumb dumb implementation is really fast - as in it's instant with no acceleration structures at all, this is literally just nested loops
it gets faster as it goes (because you're effectively removing information)
you need far fewer points to start with than you'd think
also gif (of me advancing the growth steps manually)
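On the "more points towards the outside of the volume" item: the usual search terms are inverse transform sampling (inverse-CDF sampling) and radial density. For a sphere, uniform filling needs radial density proportional to r²; raising that power pushes points toward the shell. A sketch - the `power` knob is my own, not from the thread:

```python
import math, random

def shell_biased_sphere_points(n, radius=1.0, power=2.0, rng=None):
    """Sample n points in a sphere with radial density proportional to r^power.

    power = 2 gives a uniform volume fill; larger values crowd points toward
    the outer shell. Inverse-CDF sample: r = R * u^(1/(power+1)).
    """
    rng = rng or random.Random()
    pts = []
    for _ in range(n):
        # uniform direction via the Gaussian trick (normalise a 3D normal sample)
        d = (rng.gauss(0, 1), rng.gauss(0, 1), rng.gauss(0, 1))
        l = math.sqrt(sum(c * c for c in d)) or 1.0
        r = radius * rng.random() ** (1.0 / (power + 1.0))
        pts.append(tuple(r * c / l for c in d))
    return pts
```

The same inverse-CDF trick generalises to any closed volume once you have a "distance from surface" value per candidate point to weight by.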
I implemented a "density by map" option for my emitters, and I also implemented several 3d texture maps... eg a 3d volume depth map (which also has a helper object) etc. One of the strangest but most welcome things I find with this algorithm is that no matter how many branches there are and how complicated the tree gets, there is never any self intersection
I don't relish the thought of implementing anything like that from scratch - I'd probably have to read documentation, and that makes my soul sad.
Given the need to support 'any' closed volume for a canopy I'm inevitably going to end up doing some sort of voxelisation, which means it should be fairly straightforward to generate a distance field that I can use as a mask. I don't relish the thought of that either, but it's maths (mild headache) rather than openGL docs (car battery clipped to your sensitive bits)
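To illustrate the voxel distance-field idea on a toy scale: given a boolean occupancy grid, the distance from each filled voxel to the nearest empty voxel is exactly the mask described above. This brute-force version is nothing like production speed (a real one would use a proper distance transform), it just shows the shape of the data:

```python
import math

def distance_to_surface(grid):
    """Brute-force distance field for a boolean voxel grid (grid[x][y][z]):
    for each filled voxel, the distance to the nearest empty voxel, i.e.
    roughly the distance to the volume's surface. O(filled * empty) - fine
    only for tiny reference grids."""
    nx, ny, nz = len(grid), len(grid[0]), len(grid[0][0])
    empty = [(x, y, z) for x in range(nx) for y in range(ny)
             for z in range(nz) if not grid[x][y][z]]
    field = {}
    for x in range(nx):
        for y in range(ny):
            for z in range(nz):
                if grid[x][y][z]:
                    field[(x, y, z)] = (min(math.dist((x, y, z), e)
                                            for e in empty)
                                        if empty else math.inf)
    return field
```

Feeding this field into a rejection or weighting step would give the shell-biased distribution for arbitrary closed volumes, not just spheres.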
Intersection between branches is possible with the outside in approach I think - that might actually kill the idea. With the original I think it would be impossible
I like how fast outside-in is though so I will persevere for a bit
I maintain a kdtree of the tree node positions as it grows to speed up the find-nearest..... then enumerate the kdtree to generate new nodes. In my case the biggest bottleneck is reorganizing the tree into a more suitable state for generating the mesh.
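The kd-tree speedup generalises to any spatial index. As an illustration of the same find-nearest job, here's a uniform-grid spatial hash - a simpler stand-in sketch, not Klunk's kd-tree:

```python
import math
from collections import defaultdict

class GridHash:
    """Uniform-grid spatial hash for radius-limited nearest-point queries.
    Pick a cell size around the query radius so only a few cells get scanned."""
    def __init__(self, cell):
        self.cell = cell
        self.cells = defaultdict(list)

    def _key(self, p):
        return tuple(int(math.floor(c / self.cell)) for c in p)

    def insert(self, p):
        self.cells[self._key(p)].append(p)

    def nearest(self, q, radius):
        """Nearest stored point strictly within radius of q, or None."""
        kx, ky, kz = self._key(q)
        r = int(math.ceil(radius / self.cell))
        best, best_d = None, radius
        # scan only the cells that could contain a point within the radius
        for dx in range(-r, r + 1):
            for dy in range(-r, r + 1):
                for dz in range(-r, r + 1):
                    for p in self.cells.get((kx + dx, ky + dy, kz + dz), ()):
                        d = math.dist(p, q)
                        if d < best_d:
                            best, best_d = p, d
        return best
```

In the space-colonisation loop, the attractor-to-nearest-node search is exactly a radius-limited query like this (the influence radius), which is why either a kd-tree or a grid turns the quadratic inner loop into something close to linear.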
the point generation became so critical to the tree creation process I thought it best to implement them as max pipeline objects (a rabbit hole to hell that was!!). Though they do have other uses.... they make for great scatter type stuff and mesh instancing etc
it all gets a bit nuts after a while
and the face hugger
it all gets a bit bonkers
I wanted a break from trees, and because I am an idiot I decided to have a go at the overlapping-model wave function collapse without reading anyone else's implementation (tbh - I got bored reading the descriptions of the algorithm so I didn't even bother with that)
The overlapping model is the one where you take a source image, slice it up into overlapping tiles and then generate a bigger image by automatically selecting appropriate neighbours for a placed tile - as opposed to the more common model where tile neighbours are defined ahead of time
This was certainly a mistake, but it does appear to be mostly working. It doesn't reliably fill the image, but I don't think it's supposed to - I really should read the words... (top image is the result, the other one is the debug view). I think filtering the tileset for duplicates will increase the likelihood of it finishing, but I'd rather play video games this afternoon.
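In case the description above is too terse: slicing into overlapping tiles and testing overlap agreement is nearly all the machinery the overlapping model needs. A toy sketch - the wrap-around slicing and the count-as-weight convention are assumptions on my part, not this thread's implementation:

```python
def extract_tiles(image, n):
    """All n*n overlapping tiles of a 2D grid (list of lists), sliced with
    wrap-around, plus their occurrence counts (usable as selection weights)."""
    h, w = len(image), len(image[0])
    counts = {}
    for y in range(h):
        for x in range(w):
            tile = tuple(tuple(image[(y + dy) % h][(x + dx) % w]
                               for dx in range(n)) for dy in range(n))
            counts[tile] = counts.get(tile, 0) + 1
    return counts

def compatible(a, b, dx, dy):
    """True if tile b can sit at offset (dx, dy) from tile a - i.e. their
    overlapping cells agree. This overlap test is the entire adjacency rule
    in the overlapping model; no neighbour tables are authored by hand."""
    n = len(a)
    for y in range(n):
        for x in range(n):
            bx, by = x - dx, y - dy
            if 0 <= bx < n and 0 <= by < n and a[y][x] != b[by][bx]:
                return False
    return True
```

Deduplicating `extract_tiles` output (it already merges identical tiles into counts) is the "filtering the tileset for duplicates" idea mentioned above - fewer distinct tiles means fewer contradictions during collapse.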
uniform distribution across a non-planar quad..... most of the versions online suggested splitting it across the 2 tris, but in the above case that would produce a hard edge. First generate a biased unit quad distribution (the bias is computed from the ratio of the top and bottom edges), then do four polynomial interpolations to fit the polygon surface. I'll post it up once it's cleaned up and the bugs are ironed out. This enables me to loft "flat" and uniform random points around a spline.... It was one of those... "lofting a volumetric cylinder of points was easy peasy, how tough would a planarish loft be?"
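A sketch of the "biased unit quad distribution" step as I understand it - treating the bias as a linear density along v set by the two edge lengths, and inverting its CDF. The exact bias Klunk uses may differ:

```python
import math, random

def biased_unit_quad(w_bottom, w_top, rng=None):
    """Random (u, v) in the unit square with v-density proportional to
    lerp(w_bottom, w_top, v), so points stay roughly uniform per unit area
    once fitted onto a quad whose bottom/top edges have those lengths."""
    rng = rng or random.Random()
    t = rng.random()
    if abs(w_top - w_bottom) < 1e-9:
        v = t  # equal edges - plain uniform
    else:
        # invert CDF(v) = (w_bottom*v + (w_top - w_bottom)*v*v/2) / mean_width
        # which is a quadratic in v; take the root in [0, 1]
        v = (-w_bottom + math.sqrt(w_bottom * w_bottom +
                                   (w_top * w_top - w_bottom * w_bottom) * t)) \
            / (w_top - w_bottom)
    return rng.random(), v
```

The resulting (u, v) then goes through the four-corner fit (the C snippet further down the thread) to land on the actual quad.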
had a late night epiphany... resulting in a new way to generate point normals..... using a top angle and a bottom angle (with adjustable distance between)... need to change the gizmo colour when it passes 90° to show it's "inverted"
had a sleepless night and wondered whether, after this thread, I could convert the edge rendering tool to generate the normals directly. Turns out I can
obviously the UI will need some tweaking, there are some seam issues that could be improved, and I haven't tried anything more complicated than the cube. All I can say is: tangent space normal maps, what a ball ache... long live object space! Will post it up when it's in an acceptable state.
seems I've got work to do.... thought I could get away without actually having to compute the tangents (you don't if your UVs are correctly aligned). Oh well, time to roll out the Lengyels
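Since rolling out the Lengyels came up: the per-triangle tangent computation is compact enough to sketch. This is the textbook formulation (solve the UV-to-edge linear system for the direction of increasing u), not the code from the tool above:

```python
def triangle_tangent(p0, p1, p2, uv0, uv1, uv2):
    """Per-triangle tangent vector (Lengyel's formulation): the object-space
    direction along which the surface's u coordinate increases. A real
    exporter would accumulate these per vertex, then orthonormalise each
    against the vertex normal (Gram-Schmidt) and store a handedness sign."""
    e1 = tuple(b - a for a, b in zip(p0, p1))  # edge p0 -> p1
    e2 = tuple(b - a for a, b in zip(p0, p2))  # edge p0 -> p2
    du1, dv1 = uv1[0] - uv0[0], uv1[1] - uv0[1]
    du2, dv2 = uv2[0] - uv0[0], uv2[1] - uv0[1]
    det = du1 * dv2 - du2 * dv1
    if abs(det) < 1e-12:
        return (1.0, 0.0, 0.0)  # degenerate UVs - pick an arbitrary tangent
    r = 1.0 / det
    # T = (e1 * dv2 - e2 * dv1) / det
    return tuple((dv2 * a - dv1 * b) * r for a, b in zip(e1, e2))
```

This is also where the "you don't need it if your UVs are correctly aligned" observation comes from: when the UV axes already line up with object axes, the solved tangent degenerates to a constant direction.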
I genuinely sat down with the intention of having another crack at the trees but... I wrote a shit game engine that runs in the terminal, and a shit game to run on it instead (who knew there was a bong glyph?)
Replies
does it happen at rotations other than 45 degrees?
I've seen this sort of artefacting before but the exact cause escapes me for the moment
inline HWND getColorPickerHWND(ColorPicker* picker)
{
    if (!picker) return NULL;
    return (HWND)((int**)((int**)picker)[1])[1];
}
you could enumerate all the windows looking for windows with class "#32770" and title text beginning "Color Selector:..." ... far too much dull coding
combo of the last two....
So... I stuck it on github
https://github.com/poopipe/shed.git
This is a (very) naive implementation of the tree generation algorithm described in this paper
https://algorithmicbotany.org/papers/colonization.egwnp2007.large.pdf
Next I'll try this one
https://www.researchgate.net/publication/266472675_Particle_Systems_for_Plant_Modeling
Which appears to be exactly what I was going to try and work out for myself
triangulate it, I want to see what happens
//********************************************************************************************
// where quad is defined as....
//  c----d
//  |    |
//  |    |
//  |    |
//  b----a
inline void UnitPoint2Quad(const Point3& unit, const Point3& a, const Point3& b,
                           const Point3& c, const Point3& d, Point3& p)
{
    float u1mx = 1.0f - unit.x;
    float u1my = 1.0f - unit.y;
    // some inlining
    // p = a * (1 - u.x) * u.y
    p.x = a.x * u1mx * unit.y;
    p.y = a.y * u1mx * unit.y;
    p.z = a.z * u1mx * unit.y;
    // p += b * (1 - u.x) * (1 - u.y)
    p.x += b.x * u1mx * u1my;
    p.y += b.y * u1mx * u1my;
    p.z += b.z * u1mx * u1my;
    // p += c * u.x * (1 - u.y)
    p.x += c.x * unit.x * u1my;
    p.y += c.y * unit.x * u1my;
    p.z += c.z * unit.x * u1my;
    // p += d * u.x * u.y
    p.x += d.x * unit.x * unit.y;
    p.y += d.y * unit.x * unit.y;
    p.z += d.z * unit.x * unit.y;
}
it's max and c but should be pretty easy to understand. If anyone's interested I can post the random biased code snippet too.
Not totally finished, but I rewrote how bloom in the engine works.
it's a rather basic "keep mod n" and junctions at the moment... planning to add weighting based on most deviation.
Lengyels?
seems like he knows a thing or two
almost a chamfer
I'm pretty confident you can't make the seam perfect (at least in terms of generating a texture) cos it's a sampling/filtering artefact.