I don't know a lot about hardware. I looked up this card and it appears to be at least 10 years old.
A play tester reports that my game crashes on their computer. Is it reasonable to expect a modern game - one with realistic graphics - to run on hardware that old?
Currently the game loads straight into the full world without running a benchmark - that's something I'm updating - but even with the option to turn graphics settings down, are there performance measures I can look at so I can say unequivocally, "the game uses ____ amount of VRAM or whatever, therefore this ____some GPU__ is the minimum spec"?
thanks for any advice
Replies
It's not really reasonable to accommodate decade-old hardware, no. Whether you do is a decision for you and your project. Depending on the kind of crash it might still be worth investigating, since it might point to something incorrect in your programming, but if it's just running out of memory or overloading the card, there's not much worth doing about that.
Your 'minimum supported hardware' is really going to be the lowest-spec hardware you've actually tested on. You won't be able to give a definite answer to this question without testing, and it's probably best not to anyway (lest you 'mislead' customers). You can give indications of expected performance on certain memory/clock budgets based on your own profiling.
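To get those memory budgets, one low-effort approach is to poll GPU memory use while you play through a heavy scene and take the peak. Here's a minimal sketch, assuming an NVIDIA GPU with the `nvidia-smi` CLI available - you'd capture a log with something like `nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits -l 1 > vram.log` during a session, then parse it afterwards (the sample log below is made up):

```python
def peak_vram_mib(log_text: str) -> int:
    """Return the peak memory.used sample (MiB) from nvidia-smi CSV output,
    one sample per line (csv,noheader,nounits format)."""
    samples = [int(line.strip()) for line in log_text.splitlines() if line.strip()]
    return max(samples)

# Hypothetical captured log: one memory.used sample (MiB) per second.
sample_log = """
1843
2210
2754
2691
"""

print(peak_vram_mib(sample_log))  # peak VRAM observed during the session
```

That peak (plus some headroom, since `memory.used` includes other processes and driver overhead) gives you a defensible number for "needs a card with at least N GB of VRAM".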
Thanks @rexo12 . I am developing alone so it helps to get an idea how other developers make decisions about this sort of thing.
I have a laptop with an NVIDIA GTX 1060 GPU, and so far I've just made sure the game can run on high settings and maintain 30fps on that. So far, the only people who've had trouble had really old hardware or bargain-level, non-gaming hardware.
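For what it's worth, a 30fps target translates to a per-frame time budget of about 33 ms, which is an easy thing to log and check in any engine. A minimal sketch of the arithmetic (the sample frame times are made up):

```python
def frame_budget_ms(target_fps: float) -> float:
    """Time budget per frame, in milliseconds, for a given FPS target."""
    return 1000.0 / target_fps

def slow_frames(frame_times_ms, target_fps=30.0):
    """Return the logged frame times (ms) that exceed the budget."""
    budget = frame_budget_ms(target_fps)
    return [t for t in frame_times_ms if t > budget]

print(frame_budget_ms(30.0))                   # ~33.3 ms per frame at 30fps
print(slow_frames([16.2, 35.1, 40.0, 28.9]))   # the frames that missed budget
```

Counting how often frames miss that budget on a given machine is a more concrete pass/fail than eyeballing an FPS counter.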
It kills me though when somebody wants to play the game but can't :)