Question: What graphics settings do you play games at?

What graphics settings do you run games at?

  • Lowest Settings

    Votes: 0 0.0%
  • Medium Settings

    Votes: 0 0.0%
  • Highest Settings

    Votes: 0 0.0%
  • Ultra

    Votes: 6 40.0%
  • You just let the game choose

    Votes: 5 33.3%
  • Something else - please explain below

    Votes: 4 26.7%

  • Total voters
    15

Brian Boru

King of Munster
Moderator
I generally just leave it as is
Well then, may I suggest you vote for that :)

Me too. I'll only go into the settings if something visual bothers me in the game. One thing: with those 'Move the slider until you can barely see the image' setup jobs, I usually cheat a little on the side of over-bright, since I dislike dark places in games, mainly due to weaker eyesight.
 
One thing: with those 'Move the slider until you can barely see the image' setup jobs, I usually cheat a little on the side of over-bright, since I dislike dark places in games, mainly due to weaker eyesight.
I have done this myself in Diablo; its dark areas were too dark to see.

I wouldn't really classify the amount of light you can see in the game as a graphics setting, though. It sort of is, but I meant more along the lines of texture quality, etc.

Since getting my new GPU I have looked at the settings in a few of my games, but I haven't really changed any. I need to play more modern games that have higher demands, I think.
 
Was gonna vote "let game choose", but then I noticed that those settings were always less than what my GPU can handle at 1440p, and things like vsync and motion blur are usually left on. I use Nvidia's settings from time to time, but usually just for reference. Me, most of the time? It's usually ultra, and then I decrease from there if my FPS aren't up to what I want. For the occasional game, say CP2077, I pretty much knew I couldn't run it at ultra when it first came out, but the overwhelming majority of games, new and old, run great at ultra at 1440p, so I never start lower.
 
Yea, it's ultra and reduce from there for me too. Can't really get ultra with my current card in most games anymore, so I reduce until I can hit a decent frame rate.

Obviously motion blur gets turned off. I also usually lower AA to the lowest level, as it's usually pretty expensive in terms of performance. If the game's a few years old and I have a newer card, then I don't lower the AA and see what happens.
 

mainer

Venatus semper
Something else - please explain below

I play, or re-play, many older games over the course of a year as opposed to new releases, and most of those games only have high/medium/low presets, some without any individual options to tweak. But my general philosophy/procedure is this: I shoot for the max settings, and then gradually tweak backwards to get what I consider an acceptable FPS balanced with graphical quality.

My monitor is big (43"), has great detail (4K), but is slow by most gamers' standards (60 Hz). I've always prioritized graphical fidelity over FPS as I don't play competitively, so getting a solid 40-60 FPS is fine for me if it gives me more graphical detail and animations. Some older games, like Bethesda RPGs, I actually mod for more detail as well as tweak the INI files.

The newest game I've played, The Witcher 3 Next-Gen, was a nightmare for me to configure. Granted, the remaster wasn't optimized very well at release (though CD Projekt is continually patching it), but it also had settings beyond Ultra with the Ray Tracing and Ray Tracing Ultra options. There isn't a graphics card in existence today that can run that game in RT Ultra.
 
Something else - I spend a ridiculously long time tweaking my settings to get the best mixture of visuals and performance. With an RTX 2060 and Ryzen 5 1600X, my hardware is a bit outdated, so presets typically have a few settings turned on that need to be off or lowered for better performance. For instance, I spent far too much time getting my graphics settings perfect in Cyberpunk 2077, to where the game played at a constant 45-60 FPS with visuals that still looked good and not like a PS2-era game.
 
There isn't a graphics card in existence today that can run that game in RT Ultra.
Specs beyond current abilities just give you something to look forward to when you do have a PC that can run it. That's one way to look at it. I used to always play older games on a new PC to see the difference.

That doesn't work so well now, as all my games are so old that my previous GPU could play them at about the same settings. The only thing I notice now is no screen tearing.
 

Frindis

Dominar of The Hynerian Empire
Moderator
Something else - please explain below:

In most games, I play on Ultra if my PC can take it and I get reasonable FPS, nothing below 60. In games like PUBG/Warzone, I'll adjust the settings so the game looks visually sharp without the cost of major FPS loss. In some games, ultra settings don't really do much at all, making you invest in FPS sinks that make you believe it looks better when it really doesn't, or by such a small increment that it's not noticeable at all. I'll throw in Red Dead Redemption 2 as an example of a game that does just that with some of its ultra (and even high) settings.
 
I don't really play on ultra, as I rarely see the differences; it's a graphical improvement I might not even notice. High is fine, I find, for 90% of the games I play. That said, ultra probably isn't the best idea for my 1070 these days anyway, as the card can't really support it. Perhaps when I get a new graphics card I'll notice, but with current prices and a massive game backlog? I'll wait a bit.
 
It appears that kaamos is incorrect, because people do try to play at ultra and did vote for it. GPU reviews also show that ultra actually does make sense.
My choice of games probably reduces the number that would expect high settings or even have them. Some of them would benefit more from a better CPU than GPU (that's next), mostly strategy games and open-world ones like Tropico 5. They start to chug after a while.
I will boost some and see if it makes any difference in the games I play.
 
For me, if a game "optimises for GPU", then I let it do that. If I'm not happy with the results, or it doesn't have that option, I start at Ultra and then tweak the specific settings that have the best trade-offs.

Although I have a 1440p, 165 Hz monitor, I don't play competitive shooters, so I typically set the FPS limit to 60. This generally lets me run much higher settings without getting horrible fan noise from my 6700 XT.

PC Gamer had a good article about this. Although it's 4 years old, it's still relevant today.
 
