Hi
I have watched the Houdini Bridge demo twice now but am still confused about how exactly to mix two materials using the Bridge Houdini Livelink feature.
In my case I want to simply layer a grunge texture on top of a marble texture.
Also, with Redshift, is it necessary to use the Triplanar workflow when mixing materials, or can I use the Redshift textures workflow (with UVs)?
Any examples, .hip files most welcome!
Replies
Depending on your render software, mixing two materials works slightly differently, but in all cases it uses the native capability of each render engine to do the blending. What the Quixel tools offer is the "quixel mixer" node, which is provided for all the supported renderers.
The idea with this node is to take in displacement maps from two sources, i.e. two materials, and from those generate a mask that you use together with your render engine's blending capability.
The "quixel mixer" node can also take in a threshold value in the "th" input pin that can be either render properties or attributes on the geometry. This value is meant to be able to "fade" the mixing between the 2 materials. and if used together with 2 displacement maps it will generate a gradual transition looking natural between the 2 materials.
If you are using Redshift for the blending I would look into the Material / Displacement Blend nodes for Redshift; these are the ones you supply with the mask that the "quixel mixer" node generates. In the end you don't have to use the "quixel mixer" node to blend between materials, it is just there to make generating more advanced masks easier.
Think of it a little bit like Photoshop layers, where the "bot" input is the bottom layer: there you put the displacement from the material you want at the bottom, underneath the other material, and in the "top" pin you put the displacement from the other material.
I have attached a still from the video. If you look at the "ParticleAttributeLookup" node, it is just reading a point attribute from the geometry that is set with attribute = fit( @P.y, -0.2, 0.2, 0, 1); so it blends between 0 and 1 based on height. This feeds into the "quixel redshift mixer" node's 'Th' input to drive the transition between the top and bottom displacement textures, and in turn the 'mask' output is fed into the Redshift blender nodes.
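If it helps, a minimal Point Wrangle that creates such an attribute on the geometry could look like this (the attribute name 'blend_th' is just an example; use whatever name you then look up in "ParticleAttributeLookup"):

    // Point Wrangle SOP on the geometry, before it is assigned the material.
    // Remap the point's Y position from the range [-0.2, 0.2] to [0, 1] so the
    // blend fades from the bottom material to the top material with height.
    f@blend_th = fit(@P.y, -0.2, 0.2, 0, 1);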
So in short... use the blend/mix node of your favorite renderer, generate the mask with the 'quixel_mixer' node from the displacement inputs of the two materials, and "offset" it with point attributes via the 'th' input.
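Written out as a rough wiring sketch (based on the description above, with your marble/grunge example filled in):

    marble displacement  -> 'bot' input of quixel_mixer
    grunge displacement  -> 'top' input of quixel_mixer
    point attribute (via ParticleAttributeLookup) -> 'th' input of quixel_mixer
    'mask' output of quixel_mixer -> mask input of the Redshift Material / Displacement Blend nodes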
For your second question, it works with both the TriPlanar and the Texture (UV) workflow.
/Magnus