1080p 60fps Champions Club

Hey guys,

Anyone else here still rocking 1080p 60fps and happy with it?

I've found the hardware required to hit this isn't nearly as expensive as it used to be, and I'm pretty content with 1080/60.

What would the perfect 1080/60 next-gen card be? The 3050 perhaps? All the next-gen stuff seems to be heavily geared towards 144Hz and higher-than-1080p resolutions.
 
I think all the next-gen cards will crush games at 1080p and sit well over 60fps, aside from demanding high-end AAA games with RTX on, etc. The last two generations of cards have handled 60fps at 1080p very well, even older ones, so if you're content with just 60fps at 1080p there's no real need to get a new 30-series GPU, aside from maybe power consumption, looks, etc. If a 3050 or a 3060 comes out and sits at around 250 or 300, then by all means get one when they do; it's your choice, and I'm sure they will be releasing those.

As for 1080p/60fps, no lol, but that's part of the reason I built the system I've got: I like to get as high an fps as I can at ultrawide resolutions above 1080p.
 
Not here.

1080p 60's real problem is the paucity of adaptive sync for monitors of that class. There are lots with FreeSync, but most have very narrow refresh rate ranges, unlike 144Hz monitors, which will have LFC.

Adaptive sync removes the need to target 60fps so hard, and smooths out the experience when you're not hitting that mark. 60fps is somewhat arbitrary with a modern monitor + GPU combo.
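To put some rough numbers on that, here's a quick sketch (my own illustration; the 2x figure is just the commonly cited rule of thumb for when LFC kicks in, not anything from a spec sheet):

```python
# Rough rule of thumb (assumption, not a spec): LFC needs the panel's max
# refresh to be at least about double its min refresh, so frames below the
# floor can be doubled back into the sync range.

def supports_lfc(min_hz: float, max_hz: float, ratio: float = 2.0) -> bool:
    """True if the adaptive sync range is wide enough for frame doubling."""
    return max_hz >= ratio * min_hz

print(supports_lfc(48, 75))    # False: on a narrow 48-75Hz window, a 45fps dip falls out of range
print(supports_lfc(48, 144))   # True: a 48-144Hz window can show 45fps at 90Hz and stay smooth
```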

So the next perfect 1080p card is probably a monitor with adaptive sync that frees you from the number 60, assuming your GPU supports it.

As for lower end GPUs themselves, it's not clear yet how soon and/or how far down the stack the new GPU families will go. For instance, the best entry level 1080p gaming card from AMD in the age of its 5000 series cards is still its RX 570 from the previous gen.

We're also at a point where 1440p 144Hz monitors can cost little more than 1080p 144Hz offerings, and GPUs supporting high-settings 1440p gaming are more affordable than they have ever been (RX 5600 XT / RTX 2060+). Plus new consoles are coming out, targeting 1440p or 4K, which probably makes sense both in terms of what PC can offer and the widespread availability of 4K TVs. The consoles are also bringing HDMI 2.1, with the headline gaming feature being variable refresh rate.

Obviously there are still people who play even below 1080p too.
 

Zoid

Community Contributor
I'm still on that 1080p 60Hz life.

I would love to invest in a fancy high-res monitor and a beefy graphics card to push frames to it, but alas, these things cost money, and I have yet to be granted my free unlimited supply, no matter how passionately I lobby for it :p
 
I'd make the case that getting a monitor with a high refresh rate and adaptive sync is a relatively small cost, in that you can claw back some of the cost of the monitor through savings on the GPU.

If GPUs X and Y are ~£30 (or your local equivalent) apart in price and one pushes 5fps more than the other, you might be tempted to get the one that's £30 more if it keeps you closer to 60, whether because 60Hz is all your monitor does or because it only has a 55-75Hz FreeSync window. Whereas with a large adaptive sync window you're not so bothered about the exact numbers.
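As a back-of-envelope version of that (made-up framerates, and LFC approximated as simple frame doubling):

```python
# Sketch of why a narrow sync window makes the last few fps matter
# (illustrative numbers only; real LFC behaviour is driver-dependent).

def in_sync_range(fps: float, min_hz: float, max_hz: float) -> bool:
    """Is this framerate covered by the window, directly or via frame doubling?"""
    if min_hz <= fps <= max_hz:
        return True
    if max_hz >= 2 * min_hz:      # range wide enough for LFC-style doubling
        while fps < min_hz:
            fps *= 2              # e.g. 40fps presented as 80Hz
        return fps <= max_hz
    return False

for fps in (50, 55, 58, 63):
    print(fps,
          "narrow 55-75Hz:", in_sync_range(fps, 55, 75),
          "wide 48-144Hz:", in_sync_range(fps, 48, 144))

# On the narrow window, anything under 55fps falls out of sync entirely, so
# the extra 5fps from the pricier card matters; on the wide window, every one
# of these framerates is covered and the extra £30 buys you very little.
```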

You might also be able to defer a GPU upgrade for longer for the same reason. The -ish part of 60-ish FPS gets a lot more flexible.

And for games that are going to run at 100fps+ anyway, like Fortnite or older / less demanding titles, the high refresh monitor has an advantage to offer there too.

So you benefit both in titles that run at high framerates, and in titles that run at 60-ish FPS. The ish is much less of an issue.

While the expense certainly isn't viable or may plain not make sense for everyone (e.g. you game on your 1080p 60hz TV from the comfort of your sofa), refresh rates and/or resolutions above 1080p/60 shouldn't be seen as premium goods any more. They're mainstream and not just for enthusiasts with deep pockets or people who want MOAR FPS / pixels.

Although people who want MOAR FPS often still buy a -70 Super class GPU for 1080p 60, which is just tragic :(

I die inside every time I see someone say they want to enjoy max settings and max ray tracing at 1080p. The point that ray tracing and all the other settings are about improving fidelity, and that resolution plays a huge part in fidelity, often seems lost.
 

Zoid

Community Contributor
While the expense certainly isn't viable or may plain not make sense for everyone (e.g. you game on your 1080p 60hz TV from the comfort of your sofa), refresh rates and/or resolutions above 1080p/60 shouldn't be seen as premium goods any more. They're mainstream and not just for enthusiasts with deep pockets or people who want MOAR FPS / pixels.
For people budgeting out a new system I definitely agree - I think an adaptive sync monitor (potentially above 1080p) should be worth inclusion, even for medium budget builds.

As for myself, though? Step right up to procrastination station. As long as I keep enjoying my games at 1080p 60Hz, I don't have to upgrade my GTX 1070 and I can keep $800 in my pocket :p
 
For people budgeting out a new system I definitely agree - I think an adaptive sync monitor (potentially above 1080p) should be worth inclusion, even for medium budget builds.

As for myself, though? Step right up to procrastination station. As long as I keep enjoying my games at 1080p 60Hz, I don't have to upgrade my GTX 1070 and I can keep $800 in my pocket
I've been chasing 1080p 60Hz for so many years; now that I have it, I just don't want to let it go.
 

Sarafan

Community Contributor
Still at 1080p and not going anywhere. :) I have a GTX 1060 card which limits me from going higher. I'm planning to buy an RTX 3060, but even then I'll stick to 1080p. I'm satisfied with this resolution and have no need to upgrade it. I suspect, however, that we'll slowly be moving towards 4K as a standard. The new RTX GPUs offer a chance that this will become possible.
 
Still at 1080p and not going anywhere. I have a GTX 1060 card which limits me from going higher. I'm planning to buy an RTX 3060, but even then I'll stick to 1080p. I'm satisfied with this resolution and have no need to upgrade it. I suspect, however, that we'll slowly be moving towards 4K as a standard. The new RTX GPUs offer a chance that this will become possible.
It seems 4k TVs are aggressively rolling out, so it's definitely becoming a standard thing. I just don't feel the resource cost of hitting 4k using either PC or console currently is worth the payoff at all, plus I'm too damn blind to tell the difference!
 

Zoid

Community Contributor
It seems 4k TVs are aggressively rolling out, so it's definitely becoming a standard thing. I just don't feel the resource cost of hitting 4k using either PC or console currently is worth the payoff at all, plus I'm too damn blind to tell the difference!
4K TVs are a little bit ahead of the PC monitor game in becoming affordable. There are some darn cheap 4K TVs out there. Whenever I upgrade my TV I'll definitely go 4K, but until then I am actively avoiding going into a Best Buy, because if I do then I'll spend 20 minutes ogling some 80" 4K TV playing demo reel footage and I'll realize what I'm missing :LOL:
 

spvtnik1

Community Contributor
Well... I kind of wish I had ponied up for the 1440p ROG Swift instead of the 1080p version. However, in the same light, I'm glad I didn't purchase a $500+ TN panel monitor. I intend to let my monitor become the bottleneck soon, with the vision of replacing it once 1440p/4K IPS panels are more affordable. I've decided clearly on the RTX 3080, or whatever variations could be in store for us. I do not care if it's too much for 1080p. Just look at Flight Simulator 2020... there is plenty of room to add more density and fidelity to our gaming experiences. In the interim, I think 1080p 120fps would not be terrible, nor would dynamically scaling to higher resolutions at lower fps. Personally, although I haven't really experienced gaming on a 4K monitor, I think I would prefer more fps over more pixels.

At the end of the day, I would be tempted to keep the PG248Q because it is just that nice. There is a chance I would sell it along with my current computer, to a GOOD HOME only, haha. Likely not; the parts will instead find their way into a media PC. The monitor has plenty of potential as a secondary.
 

spvtnik1

Community Contributor
Yes. By increasing the resolution.

Yes, I agree. I just read that up there. XD I will say that any time I've used the scaling feature in a game, I can usually tell there's a difference. I'm just assuming it would look 100 times better at the native scale with anti-aliasing... but it still looks better than the default resolution, at least in some titles, so I'm willing to work with that.
 
Ultimately it's mostly down to personal preference.

That said, the above emphasises another concern I have about high end GPU + low res monitor pairings, which is the view that it's a choice between resolution and framerate.

Of course it is to some degree, but by a much smaller degree than is often made out.

People look at benchmarks of graphics cards or games at various resolutions and often conclude they'll take a 30%-ish framerate hit going from 1080p to 1440p. And you can understand why: it's what the graphs show.

Except that because of the higher resolution, you don't necessarily need to run the game at the same settings to enjoy better fidelity. You can often turn the AA down for one thing, and you might find that lowering other settings still leaves an overall much nicer picture. And if you're using Dynamic Super Resolution or the equivalent at 1080p, you've already taken a lot of the framerate hit of going to a higher resolution anyway.
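For a rough sense of the pixel maths (a naive model that assumes performance scales with pixel count, which it doesn't quite, hence the gap between these numbers and the ~30% the graphs show):

```python
# Back-of-envelope pixel counts, for illustration only.

def pixels(width: int, height: int) -> int:
    return width * height

p1080 = pixels(1920, 1080)        # ~2.07 million pixels
p1440 = pixels(2560, 1440)        # ~3.69 million pixels

print(p1440 / p1080)              # ~1.78x the pixels at 1440p

# If performance scaled linearly with pixel count (it usually doesn't),
# 100fps at 1080p would become roughly 56fps at 1440p, a ~44% hit;
# the hit observed in benchmarks is nearer 30%.
fps_1080 = 100
print(fps_1080 * p1080 / p1440)   # ~56.3

# And if you're already supersampling from 1080p (DSR or similar) to ~1440p,
# you're paying most of that pixel cost without getting a native 1440p image.
```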

For people playing competitive games at competitive (mostly low) settings, it is of course a choice between framerate and resolution in many cases. It's relatively niche, but it's its own domain. For the rest of us who generally like to enjoy games at 'pretty high' settings and 'decent' framerates, in some combination, it needn't actually be much of a choice between resolution and framerate (if you're spending the cash on a relatively expensive monitor and/or GPU anyway).

Especially with ludicrously powerful graphics cards like the RTX 3080 that don't really have a role in 1080p.

Also, things like FS2020 are fine at 30fps, or at least that's how many people versed in these things seem to feel. It's not a 120fps kind of game, partly because it's arguably not exactly a game.
 
Still at 1080p and not going anywhere
Ditto. Our GPUs are 970 & 1060, and we both use 2 TVs as monitors, so it would be quite expensive to build 2 new PCs worthy of 30X0 cards & 4 x 4K TVs.

Mind you, the price of 4K TVs is getting quite ridiculous: $250 for a 50". I paid that for our current 1080p 36-42" sets only a few years ago. At that kind of pricing, 4K will become standard very quickly.
 

Sarafan

Community Contributor
It seems 4k TVs are aggressively rolling out, so it's definitely becoming a standard thing. I just don't feel the resource cost of hitting 4k using either PC or console currently is worth the payoff at all, plus I'm too damn blind to tell the difference!

We'll see how the next-gen consoles behave when it comes to 4K. TVs are ready, but still a little expensive in comparison to Full HD ones. I still don't believe that RTX 30X0 GPUs will be able to handle 4K at maximum settings with ray tracing on. It will be possible to play comfortably in 4K with ray tracing off, however. That will make 4K available for those who can afford the high-end graphics cards. We're far from 4K adoption in the mid-end and low-end segments though.
 
Max settings is somewhere between meaningless and dangerous though.

It's like how Deus Ex: Mankind Divided needed a GTX 1080 (Sept 2016) to hit just a 60fps average with all the settings turned up. But if you turned just a few settings down a bit, a GTX 970 could hold 60fps almost without fail.

Games frequently have a few settings that are ludicrously demanding to run and add little, if any, appreciable fidelity.

While max settings exist in that you can push the sliders all the way up, "max settings" really oughtn't to be treated as a goal. The extent to which "max settings" or "ultra settings" get talked about and hold mindshare is unhelpful. At best.

At best, it leads to people who care - or think they care - about fidelity trapping themselves at a lower resolution and/or refresh rate than they can achieve and benefit from. At worst it leads to people shelling out for a $1000 GPU to play at 1080p 60fps when they could get a GPU and a monitor for the same money that would give them 1440p 60+ framerates at very high settings and a much better experience overall.

The only thing they lost was being able to say "I play my games at MAX settings".

Sorry, I get triggered by max settings ;)
 

Zoid

Community Contributor
Max settings is somewhere between meaningless and dangerous though.

It's like how Deus Ex: Mankind Divided needed a GTX 1080 (Sept 2016) to hit just a 60fps average with all the settings turned up. But if you turned just a few settings down a bit, a GTX 970 could hold 60fps almost without fail.

Games frequently have a few settings that are ludicrously demanding to run and add little, if any, appreciable fidelity.

While max settings exist in that you can push the sliders all the way up, "max settings" really oughtn't to be treated as a goal. The extent to which "max settings" or "ultra settings" get talked about and hold mindshare is unhelpful. At best.

At best, it leads to people who care - or think they care - about fidelity trapping themselves at a lower resolution and/or refresh rate than they can achieve and benefit from. At worst it leads to people shelling out for a $1000 GPU to play at 1080p 60fps when they could get a GPU and a monitor for the same money that would give them 1440p 60+ framerates at very high settings and a much better experience overall.

The only thing they lost was being able to say "I play my games at MAX settings".

Sorry, I get triggered by max settings
I agree with all of this.

I've always thought that game developers / producers / designers could do a better job of educating players about the different settings available and what they actually do. I'm not sure how much blame can be levied on the player for wanting to turn everything up to ultra when they're presented with a slider that goes from low to ultra and no other info. Ultra sounds great, and nobody wants low, right?

Kudos to Nvidia and various game review websites for putting out guides like this one going into great detail about what each setting does and how it affects performance, but I'd like to see graphics settings menus in games include more of these kinds of details so that players can understand why going up to ultra from high might not benefit them.

While I'm daydreaming, I think it would be great if graphics settings were handled in a live preview where you could tweak settings and watch their effects change in real time. I know different engines have limitations that prevent this though. Or how about the eye doctor model (better A or B)? While you're playing you can flip between two settings on the fly and see which one gives you better fidelity or framerates, and then you can move on to the next setting?
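To make the daydream concrete, here's a toy sketch of what that 'better A or B' flow could look like; every name in it (the presets, apply_settings, measure_fps) is hypothetical rather than any real engine's API:

```python
# Toy "eye doctor" settings comparison. Everything here is made up for
# illustration; a real implementation would hook into the engine's renderer.
import random  # stands in for a real frame-time probe in this sketch

preset_a = {"shadows": "ultra", "volumetrics": "high", "aa": "TAA"}
preset_b = {"shadows": "high", "volumetrics": "medium", "aa": "TAA"}

def apply_settings(preset: dict) -> None:
    # a real engine would reconfigure its renderer here, ideally without a restart
    print("applying:", preset)

def measure_fps() -> float:
    return random.uniform(55, 90)  # placeholder; a real probe would average frame times

def eye_doctor_round(a: dict, b: dict) -> str:
    """Flip between two presets, show the fps each gets, then ask which looked better."""
    for label, preset in (("A", a), ("B", b)):
        apply_settings(preset)
        print(f"preset {label}: ~{measure_fps():.0f} fps")
    return input("Better, A or B? ").strip().upper()

# winner = eye_doctor_round(preset_a, preset_b)
```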

I think as long as graphics options are obscured in poorly explained sliders that require you to restart the game every time you touch them, players are just always going to shoot for ultra and hope they make it.

Ok I'm done derailing the thread now :p
 
I'm not sure how much blame can be levied on the player for wanting to turn everything up to ultra when they're presented with a slider that goes from low to ultra and no other info. Ultra sounds great, and nobody wants low, right?
It's the same issue with PC hardware in general when people say "What is the best X for gaming?"

The problem is with the word "best" and how we're encouraged to chase it.

I blame the particularly energetic combination of general consumer culture and gaming culture.

Gaming culture itself combines enthusiasm for tech (at least to some degree) with an often macho-esque attitude where one's gaming performance and PC's technical performance are both somehow extensions of one's self. Obviously not saying that's universal in PC gaming, just that it's an attitude which seems more prevalent than it ought to be... Too much e-appendage comparison.

PC Gamer feeds into it too, with ludicrous articles like:
(One could be charitable and say that they are trying to educate: attracting attention by using terms people will respond to and then explaining PSUs in more detail. But 1) the 'more detail' is poor, 2) the recommendations are poor (a £70-£100 750W Bronze PSU as the best budget option... really?), and 3) it still plays to "the best" mentality, and I just can't perform enough mental gymnastics to kid myself the article was trying to meaningfully educate more than it was to post affiliate links.)

People then end up with The Best System For Gaming on their low-res, low-refresh monitor, which is like advertising that your 19-year-old on his diet of steroids is the fastest swimmer in the shallow end. But at least they maxed out the settings.

Ok I'm done derailing the thread now

Anyone else here still rocking 1080p 60fps and happy with it?

Actually, we do have a 1080p 60hz gaming PC in the house, which features my old 970 from 2015 and my old 1080p 60hz monitor from 2012. Which both only got replaced a year and a half ago. However, the user is increasingly lusting after my monitor, even on games like Borderlands 2....
So we probably fail the 2nd half of the test above :(
 
