I was wondering if there is any software or plugin for a theoretical kind of simulation-based content creation. I have often been frustrated by the way digital content is currently created: there is a significant disconnect between how digital assets are made and how matter exists in our universe (digital objects have no inherently defined volume, for example).

The advent of physically based rendering has made renders both easier to produce and more realistic and flexible, because it more accurately replicates the physical processes in our universe, so I'm wondering whether the same philosophy can be applied to digital assets as a whole. For example, say you wanted to create a scene with a wooden table being broken in half by a person falling onto it. Is there any way to build that scene so that the table is a collection of interconnected particles that react roughly as real matter would? E.g. the grain and species of the wood would affect how it appears, both on the surface and in its internal volume, and how it fractures and splinters. Are there any digital systems that can create content this way?
If not, is the main restriction a lack of computational power? Memory?
Just curious,
Thanks in advance
Replies
https://docs.unrealengine.com/en-US/Engine/Chaos/ChaosDestruction/ChaosDestructionOverview/index.html
Things like Chaos or working with voxel data are about as close as you'll get in games.
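To make the idea in the question concrete, here is a minimal toy sketch of an object built from "interconnected particles": a 2D mass-spring lattice whose bonds snap when stretched past a strain threshold, crudely mimicking brittle fracture under an impact. This is not how Chaos or any production system works internally; all the numbers (stiffness, break strain, the impact impulse) are made-up illustrative values.

```python
import math

N = 10                 # particles per side of the square lattice
SPACING = 1.0          # rest distance between neighbours
STIFFNESS = 200.0      # spring constant (arbitrary toy value)
BREAK_STRAIN = 0.15    # bond snaps if stretched >15% past rest length
DT = 0.005             # integration timestep

# Particle state: positions and velocities (unit masses assumed).
pos = [[x * SPACING, y * SPACING] for y in range(N) for x in range(N)]
vel = [[0.0, 0.0] for _ in range(N * N)]

# Bonds: connect each particle to its right and upper neighbour.
bonds = []
for y in range(N):
    for x in range(N):
        i = y * N + x
        if x + 1 < N:
            bonds.append((i, i + 1))
        if y + 1 < N:
            bonds.append((i, i + N))

# "Impact": give the middle column of particles a downward impulse,
# like something heavy landing on the middle of a plank.
for y in range(N):
    vel[y * N + N // 2][1] = -30.0

initial_bonds = len(bonds)
for _ in range(400):
    forces = [[0.0, 0.0] for _ in range(N * N)]
    surviving = []
    for (a, b) in bonds:
        dx = pos[b][0] - pos[a][0]
        dy = pos[b][1] - pos[a][1]
        dist = math.hypot(dx, dy)
        strain = (dist - SPACING) / SPACING
        if abs(strain) > BREAK_STRAIN:
            continue  # bond snaps: the "material" fractures here
        surviving.append((a, b))
        f = STIFFNESS * (dist - SPACING)  # Hooke's law along the bond
        fx, fy = f * dx / dist, f * dy / dist
        forces[a][0] += fx; forces[a][1] += fy
        forces[b][0] -= fx; forces[b][1] -= fy
    bonds = surviving
    for i in range(N * N):  # explicit Euler step (fine for a toy demo)
        vel[i][0] += forces[i][0] * DT
        vel[i][1] += forces[i][1] * DT
        pos[i][0] += vel[i][0] * DT
        pos[i][1] += vel[i][1] * DT

print(f"bonds broken: {initial_bonds - len(bonds)} of {initial_bonds}")
```

Real engines take shortcuts instead of simulating every bond: Chaos pre-fractures a mesh into geometry-collection pieces and simulates the pieces as rigid bodies, which is why the full particle-level version the question imagines remains expensive. The cost above is already O(bonds) per step, and a solid object at anything near material resolution has astronomically many bonds, so the main restriction really is compute and memory.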