When judging a game's visual quality, there's a lot to consider.
Personally, I separate the graphics into two categories before judging the game's visual quality as a whole.
1: The game's rendering and art itself.
Basically, that's the textures, the shading, the modeling, the lighting, and you can even add in the LOD and all that stuff.
2: The game's output resolution.
Is the game clean? Basically, that's where I judge the framerate, the anti-aliasing and the screen resolution.
Sadly, it seems like a lot of devs have trouble juggling these two factors (especially on console) and prefer the first outcome instead of delivering a clean 60 fps HD game.
I myself thought I preferred the first, but since I started experiencing PC gaming, I've realized that I really care more about "wow, it's clean" than "wow, it's beautiful".
I'm not sure if you guys understand my point there?
For example, if a game on my PC eats too many resources at max settings, I'll simply downgrade the shadow quality, the LOD detail, the texture quality, etc., and leave the resolution, the AA and the framerate as high as possible.
Replies
60fps always makes a game play better. High resolution stops your eyes from getting distracted by ugly pixels.
Hehe, that's because you have a good rig, but if you were a console user, you wouldn't have that flexibility.
If you were a developer who had influence on this, what would you optimize your game for: visual detail or output resolution?
As for the developer question, it's tricky to answer. In reality, budget, deadlines and publisher pressure can all influence whether you end up making a good-looking game or a good-playing game.
If there were no restrictions, I would still aim for both. I'd pick 1080p and 60fps, and make the best game possible under those limitations.
So you have never enjoyed any console games except Call of Duty? Interesting.
The WORST thing about this generation is that at the start everyone said, oh, we need to all do 60 fps now since it's next gen. And so gamers and critics latched onto the idea that with the next gen everything has to be 1080p 60 fps, or else it's shit and they can't enjoy a game if it's not that.
People somehow forget almost every single game they have played and loved has been at 30FPS.
On one hand, 60fps was actually the standard in both the 8-bit and 16-bit era. Games rarely went below that unless there was slow down.
It wasn't till the 32-bit era, when games got polygons, that developers made compromises and cut the frame rate in half. However, this led to much bigger and better games than anything the Genesis or Super Nintendo could handle.
But instead of returning to a 60fps standard like before, frame rates kept getting slashed each generation, which in turn harms gameplay.
60 fps on consoles would be cool, but as long as there's some form of decipherable motion, as opposed to a PowerPoint slideshow, I can settle for framerates above 24 or so, though 60 would be ideal.
Games that don't dwell on twitch reactions can afford this kind of thing, but when it comes to games like CoD or fighting games that have little to no "virtual skill" and require constant game-user feedback, higher framerates are preferable.
I've noticed a lot of lower-framerate games use motion blur to mask the low framerate and emphasize phi-motion.
Funny that you say this. The only console I've ever owned was a Wii and it has loads of titles that run at 60 fps, and low framerates are a big reason why I avoid most consoles.
I'll probably never play any of your studio's more recent games until the PC has a PS3 emulator that works well because I just don't find a third person walking-around-and-climbing-ladders game interesting enough to justify spending $200 on a PS3 and $20 on a used copy of it just to enjoy the long-ass loading times, sub-30 framerates and locked field of view on a controller that I'm not used to when I have my "monster rig" with loads of 60 fps games that I would like to play but just haven't gotten around to yet.
I've been PC gaming since I was a wee lad with Richard Scarry's Busytown so high framerates are quite important to me. Some other games that I grew up with and loved: Starcraft and Brood War, Lego Racers 2, Sid Meier's Alpha Centauri (a bit of an odd duck that didn't really need much of a framerate to play but was still really responsive, especially compared to later installments like Civ V where 60% of the game is waiting on enemy turn to finish), Roller Coaster Tycoon, The Sims, and Half-Life (didn't play this myself but watched my dad play it. Jump scares made me jump, lol.) Not all of these games necessarily ran at 60 fps but they were all responsive and ran with very little added latency, which was usually good for the gameplay.
If it's a game that requires motor skills, it needs to be at 60fps - I can't play shooters, racing games, platformers and that ilk at lower framerates, it just feels horribly wrong. 4X games and anything else that isn't particularly fast paced can be fine at 30fps, so long as the framerate is consistent.
You're speaking for yourself there.
Unlikely. People who advocate 60fps tend to have played games at 60 and preferred it. It's never a theoretical idea.
Gotta keep baiting them masses into buying sub 20fps games on 10 year old hardware.
Really though, I think the 60 fps matters more.
Dark Souls 2, for example, felt like a whole new experience when running at a higher framerate. Not having to adjust to laggy inputs and extreme frame drops made things a hell of a lot easier.
This is true for FFXI. I played the PC version for about 4 years, and the game was locked at 30; recently I decided to play on a private server just for nostalgia's sake.
The login tool comes with a bunch of little hacks like higher resolutions, downsampled anti-aliasing and the option to change the frame limiter.
And at 60 fps the game does feel different; you could even say it moves like a next-gen game, and mind you, this is a PS2 game.
Yeah, this is a really weird sentiment and only applies to console games. Most people who grew up playing shooters on PC have certainly played their favorite games at 60 fps, unless their video card sucked or something. Most PC games vsync at 60 Hz.
I've been doing video capture lately (using a dedicated capture card to get uncompressed video), and one of the limitations of my workflow is that I need to pipe 30 Hz to the capture card and use an HDTV as a preview monitor, as virtually no computer monitors support 30 Hz. Working at 30 Hz is painful.
When you get down to 24/30hz, you really need motion blur to make it feel smooth.
Overall I'm always in favor of better fps/clearer image/quicker loading times etc. Graphics age quickly, but smooth gameplay lasts forever.
Also, I really like it when games have art direction made with respect to the limitations of the hardware they're on. Metal Gear Solid 2 is probably one of my favorite examples. Not only does it run at 60 fps, but thanks to its clean art style it looks crisp and clear to this day. By comparison, Metal Gear Solid 3 and 4 didn't age as gracefully.
For anything else, resolution is the priority; I dislike the idea of stacking filters on top of a lowered resolution. And the frame rate needs to be solid.
4 still has a CG-like appeal to me.
Not when the DLC is just the parts of the original game that they didn't complete in time.
Amen brother! Give me all the pretties over 60fps any day.
You're so edgy bro
e: Has anybody tried sampling input at a higher rate than the rendering? Maybe that would help alleviate some of the perceived delay.
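For what it's worth, decoupling the input/simulation rate from the render rate is a common pattern in game loops. This is just a generic sketch, not any particular engine's code, and the 120/30 rates are purely illustrative:

```python
INPUT_HZ = 120   # poll input and step the simulation at 120 Hz (illustrative)
RENDER_HZ = 30   # render at 30 Hz
TICKS_PER_FRAME = INPUT_HZ // RENDER_HZ  # 4 sim ticks per rendered frame

def run(ticks, poll_input, update, render):
    """Fixed-timestep loop: input/simulation run more often than rendering."""
    frames = 0
    for tick in range(ticks):
        update(poll_input())             # input is sampled every sim tick
        if tick % TICKS_PER_FRAME == 0:  # but only every 4th tick is drawn
            render()
            frames += 1
    return frames

# One simulated second: 120 input polls, but only 30 rendered frames.
frames = run(INPUT_HZ, poll_input=lambda: None,
             update=lambda _inp: None, render=lambda: None)
print(frames)  # 30
```

Note this doesn't eliminate display latency: the simulation reacts between frames, but you still don't see the result of an input until the next rendered frame.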
A game I wish I could get to run at 60fps is Simcity 4. Cities Skylines runs better on my computer.
EDIT: After reading again, it's not exactly what you were asking.
Echoing some of what was said before, pretty much any high-twitch/multi-player game requires a solid 60fps, with no performance dips. If sections of a game can dip by more than 15 fps, I'd rather have a locked frame rate.
In a story based, single player game, where 10ms isn't going to kill you, I'll gladly take beauty over 60fps. If I can play a game at my own pace, and it's not chuggy as hell, I'm perfectly content with 24-30 locked frame rate.
I'm coming from the perspective of a primarily console gamer, and I do understand why a PC gamer would be more upset about sub-60 fps.
Mouse input, given how we're used to controlling the cursor in Windows, plus the direct feedback in first-person shooters, is why we're so sensitive to input latency on PC.
At 30 fps we're still limited by the amount of time between frames before the result of our input can be rendered to the screen; there's no magic to fix that. Even if we had the input logic run more often, we'd still be limited by the time it takes to render the given state.
Some games do it worse than others; at the very best, we should see feedback on the very next rendered frame.
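To put rough numbers on that, here's a back-of-the-envelope sketch. It only models the frame interval itself; real pipelines add driver, display and input-device latency on top:

```python
def frame_time_ms(fps):
    """Time between rendered frames, in milliseconds."""
    return 1000.0 / fps

def avg_input_latency_ms(fps):
    """Input arrives at a random point in a frame: wait half a frame on
    average, then one full frame to render the result (best case)."""
    return 0.5 * frame_time_ms(fps) + frame_time_ms(fps)

for fps in (30, 60, 144):
    print(f"{fps:>3} fps: {frame_time_ms(fps):5.1f} ms/frame, "
          f"~{avg_input_latency_ms(fps):5.1f} ms avg input-to-screen")
```

So even in this best-case model, 30 fps means roughly 50 ms between pressing a button and seeing the result, versus about 25 ms at 60 fps.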
I recently tried Skyrim at 144 fps on a G-Sync monitor; it's possibly the most mind-blowing input feedback I've ever had in gaming.