For a simulation project I need to create an accurate library of materials based on a specific building, one that can later be reused and modified.
Having tested the photogrammetry route, I find that simply editing the captured photos to generate what I need is difficult and doesn't scale.
So I was wondering whether it makes more sense to use the photo only as a reference, extract parametric data from it, and then generate the material with a non-destructive, scriptable tool such as Filter Forge or Substance Designer.
Is there a way to extract, say, the noise pattern parameters from a photo so they can then be reused programmatically? Or is there a reliable database where the needed parameters are already defined for the various materials?
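To make concrete what I mean by "extracting parametric data", here is a rough sketch of the kind of analysis I had in mind (Python with numpy/Pillow; the particular statistics are just my assumption, not an established method): pull a handful of values out of a reference photo that could later drive a procedural noise generator.

```python
# Rough sketch: estimate a few "noise-like" parameters from a reference photo
# so they can later be fed to a procedural generator. The choice of statistics
# (mean, contrast, dominant spatial frequency) is an assumption, not a standard.
import numpy as np
from PIL import Image

def estimate_noise_parameters(path):
    # Load as grayscale, normalised to [0, 1]
    img = np.asarray(Image.open(path).convert("L"), dtype=np.float64) / 255.0

    # Basic tonal statistics (usable as levels/contrast inputs)
    mean, std = img.mean(), img.std()

    # Radially averaged power spectrum: the peak radius gives a dominant
    # feature frequency, which maps loosely to a noise "scale" parameter.
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(img - mean))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    y, x = np.indices(spectrum.shape)
    radius = np.hypot(y - cy, x - cx).astype(int)
    counts = np.bincount(radius.ravel())
    radial = np.bincount(radius.ravel(), weights=spectrum.ravel()) / np.maximum(counts, 1)
    dominant_radius = int(np.argmax(radial[1:])) + 1  # skip the DC term

    return {
        "mean": float(mean),
        "contrast": float(std),
        "dominant_frequency": dominant_radius,                # cycles across the image
        "approx_feature_size_px": max(w, h) / dominant_radius,
    }

if __name__ == "__main__":
    # "reference_photo.jpg" is a hypothetical input file
    print(estimate_noise_parameters("reference_photo.jpg"))
```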
Replies
I would search for existing materials in those libraries and match them.
If you need to generate the data programmatically, then Designer and the Substance Automation Toolkit are probably where you want to start looking.
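In case it helps, this is roughly how a batch job looks once a graph is published to .sbsar: you call the toolkit's sbsrender from a script and override exposed parameters per variation. Treat the file names, the parameter identifier and the exact flags below as assumptions to check against your SAT version's command-line help.

```python
# Sketch of driving the Substance Automation Toolkit's sbsrender from Python
# to batch-generate texture variations from a published .sbsar file.
# The .sbsar path, the "roughness_amount" input and the exact CLI flags are
# assumptions -- verify them against sbsrender's help output for your version.
import subprocess
from pathlib import Path

SBSRENDER = "sbsrender"            # assumed to be on PATH
SBSAR = "materials/plaster.sbsar"  # hypothetical published material
OUT_DIR = Path("out")

def render_variant(name, **params):
    OUT_DIR.mkdir(exist_ok=True)
    cmd = [SBSRENDER, "render", "--input", SBSAR,
           "--output-path", str(OUT_DIR),
           "--output-name", name]
    for key, value in params.items():
        # Override tweakable inputs exposed by the graph
        cmd += ["--set-value", f"{key}@{value}"]
    subprocess.run(cmd, check=True)

if __name__ == "__main__":
    for i, roughness in enumerate([0.2, 0.5, 0.8]):
        render_variant(f"plaster_v{i}", roughness_amount=roughness)
```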
It also doesn't seem cheap: you need the Pro version to handle large 16-bit images, so you immediately need to spend $400.
The individual subscription cost for Designer is considerably less.
Designer's weaknesses are that you cannot develop your own atomic nodes and you cannot effectively iterate across an image, which makes blurs and other multisample-type filters difficult to create.
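To show why that limitation bites, here is a trivial box blur written as a plain per-pixel loop (Python/numpy, purely illustrative): the gather in the middle is exactly the kind of iteration you can't author as an atomic node, so inside Designer you end up approximating it with chains of offset-and-blend nodes.

```python
# Tiny illustration of why "iterating across an image" matters: a box blur is
# just many samples gathered per output pixel. Without custom atomic nodes,
# a node graph has to fake this loop with stacked offset/blend nodes.
import numpy as np

def box_blur(img, radius=2):
    h, w = img.shape
    out = np.zeros_like(img, dtype=np.float64)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - radius), min(h, y + radius + 1)
            x0, x1 = max(0, x - radius), min(w, x + radius + 1)
            out[y, x] = img[y0:y1, x0:x1].mean()  # multisample gather
    return out

if __name__ == "__main__":
    noise = np.random.rand(64, 64)
    print(box_blur(noise, radius=3).shape)  # (64, 64)
```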
For transparency...
I'm a huge fan and advocate of Allegorithmic pipelines - I build them for a living so I have to be
And
As I said above, I don't know much about FF
but
if I were faced with the choice, I would be looking at how FF handles large-scale automation, modularity, and change propagation rather than the type of image it's geared to create: the art part is essentially an implementation detail.