I've got a question for any of you hardware gurus out there:
I’ve had a report from a playtester about a CTD because of running out of video memory (error reports below):
They’ve shared their system specs - this is an Inspiron 5570 laptop. I don't see the GPU mentioned, but I searched Google and it seems like it might have an AMD Radeon 530.
I am not a hardware guy, but from what I can tell this is a very low-end laptop that should only be expected to run basic games. Am I correct in that assumption?
I am trying to determine if it is reasonable to try to optimize the lower graphics settings so the game can perform on hardware like this. For reference, here are some images of what my project looks like:
It is an open world with dynamic lighting and lots of foliage. My development machine is this:
and on high graphics settings the game runs around 30fps on my machine. On ultra it sometimes dips below 30fps during sunrise/sunset when shadows are longer. Usually the GPU draw time is ~20 ms.
If I reduce the screen resolution scale to 90%, the framerate usually gets closer to 60fps.
Besides the obvious stuff of reducing texture size, view distance, shadow resolution, etc., I am not sure there is much I can do?
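Just to be concrete about what I mean by the obvious stuff - my (untested) understanding is that most of it maps onto Unreal's built-in scalability group cvars, so it can be forced down from code or the console without building anything custom. The values below are placeholders I'd still have to tune, not tested numbers:

```cpp
// Rough sketch (Unreal C++, untested): force the "obvious" knobs down via the
// engine's scalability group cvars. The values are placeholders, not tuned numbers.
#include "HAL/IConsoleManager.h"

static void SetCVarInt(const TCHAR* Name, int32 Value)
{
    // Look the cvar up by name and only touch it if it exists
    if (IConsoleVariable* CVar = IConsoleManager::Get().FindConsoleVariable(Name))
    {
        CVar->Set(Value);
    }
}

static void ApplyLowEndOverrides()
{
    // 0 = "low" on the standard scalability scale
    SetCVarInt(TEXT("sg.TextureQuality"), 0);
    SetCVarInt(TEXT("sg.ShadowQuality"), 0);
    SetCVarInt(TEXT("sg.ViewDistanceQuality"), 0);
    SetCVarInt(TEXT("sg.FoliageQuality"), 0);

    // Render at a fraction of the output resolution (same idea as my 90% test)
    if (IConsoleVariable* ScreenPct = IConsoleManager::Get().FindConsoleVariable(TEXT("r.ScreenPercentage")))
    {
        ScreenPct->Set(75.0f);
    }
}
```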
I’d appreciate any advice from you hardware gurus. Thanks!
Update:
The user has posted an update about their GPU:
Replies
But the question remains: would it be reasonable to expect a machine like that to run a game like this if shadows and grass are disabled, textures reduced, draw distance reduced, and so on? Or would that still be a lost cause?
In another week or so I'll be delivering an update that loads into an empty level with UI only, giving players a chance to lower graphics settings ahead of time, so we'll see. But it would still be nice to hear some more experienced thoughts on the matter.
Since (I assume) you're not a large studio that's already making tens of millions off already-shipped PC and console versions of the game, I'd suggest you have better things to spend your time on.
My recommendation would be to...
a: build a settings menu ASAP - even if it's just low/med/high (rough sketch below)
b: target the average Steam hardware survey machine (which is roughly what you're already running, iirc) and make sure you get the actual game (with all the logic / vfx / AI etc. running) consistently hitting your target frame rate & resolution at low/medium settings.
This puts you in the ballpark of getting it to run on last-gen consoles and gives you breathing room for turning the shiny shit on for the new hardware.
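And a low/med/high menu doesn't need much behind it. Something like this (untested sketch, assuming the stock UGameUserSettings; QualityIndex is just my made-up 0/1/2 mapping):

```cpp
// Minimal sketch: a low/med/high preset can just wrap the engine's overall
// scalability level instead of exposing every individual knob.
#include "GameFramework/GameUserSettings.h"

void ApplyQualityPreset(int32 QualityIndex) // hypothetical mapping: 0 = low, 1 = med, 2 = high
{
    if (UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings())
    {
        // Sets the texture/shadow/foliage/view distance/etc. groups in one go
        Settings->SetOverallScalabilityLevel(QualityIndex);
        Settings->ApplySettings(/*bCheckForCommandLineOverrides=*/false);
        Settings->SaveSettings();
    }
}
```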
just a note on settings - I'm sure you've thought about it though
if this is multiplayer you have to consider the impact of disabling things like foliage and shadows on game balance - plenty of people have died to players hiding in grass that was invisible on their end in games like DayZ and PUBG
yeah, my main concern is whether it's a good use of time or not, because there is too much work to do already and I didn't know if a machine like this is typical or at the extreme end.
Now I am thinking that if this guy can load into an empty level first and then adjust graphics settings to low (which pretty much disables all grass and shadows and reduces LODs to minimum), it should be able to run on that machine. I'd hope so anyway, but if not I am not too worried because, as you've described, the game runs with satisfactory performance on my machine at high and even ultra settings most of the time.
About disabling grass and gameplay: in this case I don't have to worry too much - it's single player only. It will make gameplay slightly easier for players, but I'd rather they just be able to play, period.
About detecting video memory, I'll have to research that. The rest is things I've done a little of, but yeah - you end up swapping out so much stuff back and forth that it's kinda pointless to do too much until the end. At that point I'll just have to go through everything with a fine-tooth comb and get it all squared away.
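On the video memory point, my current (untested) understanding is that the engine already exposes the adapter name and some texture memory stats, so something along these lines might be enough for a first pass - I'd double-check the RHIGetTextureMemoryStats fields against the engine version before trusting it:

```cpp
// Untested sketch: log the GPU adapter and reported dedicated VRAM at startup.
// Needs the "RHI" module in Build.cs; field names should be verified against
// the engine version in use.
#include "CoreMinimal.h"
#include "RHI.h"
#include "DynamicRHI.h"

void LogGpuInfo()
{
    // GRHIAdapterName is an engine global holding the GPU's name string
    UE_LOG(LogTemp, Log, TEXT("GPU adapter: %s"), *GRHIAdapterName);

    FTextureMemoryStats MemStats;
    RHIGetTextureMemoryStats(MemStats);
    UE_LOG(LogTemp, Log, TEXT("Dedicated video memory: %lld MB"),
           MemStats.DedicatedVideoMemory / (1024 * 1024));
}
```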
Just updating here for posterity:
regarding running a hardware benchmark, Unreal has some built-in blueprints for that here:
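and in case that link goes stale: my understanding is those blueprint nodes are thin wrappers over UGameUserSettings, so the C++ equivalent is roughly this (untested):

```cpp
// Rough equivalent of the benchmark blueprint nodes, via UGameUserSettings.
#include "GameFramework/GameUserSettings.h"

void AutoDetectQuality()
{
    if (UGameUserSettings* Settings = UGameUserSettings::GetGameUserSettings())
    {
        // Runs a short synthetic CPU/GPU test and stores suggested quality levels
        Settings->RunHardwareBenchmark();
        // Applies (and keeps) the suggested scalability levels
        Settings->ApplyHardwareBenchmarkResults();
    }
}
```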