Question What would it take for you to make the jump to 4K gaming?

The end of 2020 marks the release of both the Xbox Series X and PlayStation 5 consoles, both of which will likely push for at least 4K60 gaming. Depending on the game, these machines could even push into 4K120 territory over HDMI 2.1. We still know little about their real-world performance, but it's likely that they'll be able to push beyond the capabilities of mainstream gaming PCs at launch. What's more, with a guesstimated price of about $500 or €550, it's likely that they'll obliterate any PC in their price range.
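Quick aside on why 4K120 specifically needs HDMI 2.1: it comes down to raw bandwidth. Some napkin math in Python, using the commonly quoted link rates; real signals add blanking overhead, and chroma subsampling or compression changes the picture entirely:

# Uncompressed video data rate vs. HDMI link limits (approximate).
# HDMI 2.0 runs 8b/10b encoding on an 18 Gbps link; HDMI 2.1 runs
# 16b/18b FRL on a 48 Gbps link. Blanking overhead is ignored here.
HDMI_2_0_GBPS = 18 * 8 / 10   # ~14.4 Gbps usable
HDMI_2_1_GBPS = 48 * 16 / 18  # ~42.7 Gbps usable

def video_gbps(width, height, fps, bits_per_pixel=24):
    # Raw pixel data only: 8-bit RGB, no subsampling, no blanking.
    return width * height * fps * bits_per_pixel / 1e9

for label, fps in (("4K60", 60), ("4K120", 120)):
    rate = video_gbps(3840, 2160, fps)
    print(f"{label}: ~{rate:.1f} Gbps | "
          f"fits HDMI 2.0: {rate < HDMI_2_0_GBPS} | "
          f"fits HDMI 2.1: {rate < HDMI_2_1_GBPS}")

4K60 at roughly 12 Gbps squeezes into HDMI 2.0, but 4K120 at roughly 24 Gbps only fits through HDMI 2.1.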

The question then becomes: how much would you be willing to spend to get a solid 4K60 experience on PC? You likely already have a 4K TV in your living room with at least one HDMI 2.0 or 2.1 port, but do you own a 4K monitor? If you don't, you would have to factor the cost of one of those in as well. Does your TV support HDR and variable refresh rates? Does your monitor? Do you even care?

Myself, I'm wondering if it's even worth the trouble and investment at this point. Right now there are only two graphics cards that could feasibly push 4K60 in the majority of games: the RTX 2080 Super and RTX 2080 Ti. They cost about $800 and $1300 on average. Assuming you already own a gaming rig that's otherwise capable enough to keep up, you'd still have to invest in the monitor. The only monitor that ticks most of the boxes for me (4K, HDR, 144Hz, variable refresh rate) is the Acer Predator XB273K, which comes in at a cool $999 if you are lucky, because its actual MSRP is about $1299. Other monitors are just hideously expensive at $2k or more. So you're looking at an investment of about $2000 for just a GPU and a monitor to maybe have a shot at beating the experience that Xbox Series X and PS5 offer.

Of course, next gen GPUs could change all this. Rumour has it the prospective RTX 3060 should be about on par with the current gen RTX 2080 Ti. I don't believe that for a second, but let's assume it's true for the sake of argument. A current gen RTX 2060 Super goes for about $400-450 and I would venture a guess Nvidia will hike the prices of their newest offerings a bit. So we're probably looking at about $450-500 for the RTX 3060, which should perform about as well as an RTX 2080 Ti. If that's the case, we're still investing $1500 for 4K60 gaming on PC, assuming we already have the rest of the PC. If you have to build your setup from scratch, you're likely looking at $2000-2500 in total for your setup.

Now let's compare that to the holy grail of TVs right now, the LG B9 and C9. I would argue these models are the ultimate choice for gaming and content consumption, and they're about $1100 and $1300 respectively. Roughly the same price as that monitor we mentioned, but this is a 55" OLED TV that can push 4K120 signals over HDMI 2.1 and it supports G-Sync and VRR as well. Add to that the cost of your new Xbox or PS5 and you're still 'only' looking at about $1500-1800 depending on how good a deal you can get on that TV. If you're getting a TV at all, of course. As we discussed, you might very well have a 4K TV already.

Now that I've run the numbers on this, I think I'll stick to ultrawide 1080p gaming for a bit and just get myself an Xbox Series X when it drops. That and a Game Pass subscription (the biggest steal in gaming right now) should be all I need to get the most out of my gorgeous TV for the foreseeable future. 4K gaming on PC really doesn't seem to be worth it for quite a while yet.
 

Inspireless Llama

Community Contributor
The question then becomes: how much would you be willing to spend to get a solid 4K60 experience on PC? You likely already have a 4K TV in your living room with at least one HDMI 2.0 or 2.1 port, but do you own a 4K monitor? If you don't, you would have to factor the cost of one of those in as well. Does your TV support HDR and variable refresh rates? Does your monitor? Do you even care?

I don't have a 4K monitor, and the costs certainly count. Not only the monitor itself (which I think isn't always more expensive than a 1440p monitor), but especially the hardware required to run at 4K. Also, I don't have a 4K TV :p My TV is a lovely 1080p, but sometimes I feel like that's actually better. My PS3 games look better on my 1080p TV than they do on my 1440p monitor. I have no intention of upgrading my TV either.

The MSI Optix MAG 321 CURV, for example, has exactly the same specs as the monitor I have: 32", same looks. The only difference is that my monitor is 1440p @ 144Hz, while the 321 CURV is 4K at 60Hz. It's also exactly the same price.

Myself, I'm wondering if it's even worth the trouble and investment at this point. Right now there are only two graphics cards that could feasibly push 4K60 in the majority of games: the RTX 2080 Super and RTX 2080 Ti. They cost about $800 and $1300 on average. Assuming you already own a gaming rig that's otherwise capable enough to keep up, you'd still have to invest in the monitor. The only monitor that ticks most of the boxes for me (4K, HDR, 144Hz, variable refresh rate) is the Acer Predator XB273K, which comes in at a cool $999 if you are lucky, because its actual MSRP is about $1299. Other monitors are just hideously expensive at $2k or more. So you're looking at an investment of about $2000 for just a GPU and a monitor to maybe have a shot at beating the experience that Xbox Series X and PS5 offer.

Yup. I personally don't feel like investing $800 to $1300 in a single card. So far the highest I've considered going is the RTX 2070, but my mobo broke and I got an RTX 2060 + mobo instead. The 2060 is a great card as well. Even at 1440p I'll reach 60 fps (or 55) in games with high settings.

Of course, next gen GPUs could change all this. Rumour has it the prospective RTX 3060 should be about on par with the current gen RTX 2080 Ti. I don't believe that for a second, but let's assume it's true for the sake of argument. A current gen RTX 2060 Super goes for about $400-450 and I would venture a guess Nvidia will hike the prices of their newest offerings a bit. So we're probably looking at about $450-500 for the RTX 3060, which should perform about as well as an RTX 2080 Ti. If that's the case, we're still investing $1500 for 4K60 gaming on PC, assuming we already have the rest of the PC. If you have to build your setup from scratch, you're likely looking at $2000-2500 in total for your setup.

From what I've read, it's the ray-tracing features of the 3060 that will outperform the 2080 Ti, not the general card performance. That's what some sites stated, at least. Even if the 3060 were on par with the 2080 Ti I wouldn't upgrade, because I don't feel like upgrading my GPU every two years, and then I'd have to get that 4K monitor as well. That's not worth it for me right now. I'd rather look at the series after that (the 4060, for example).

EDIT: I put my PC specs into PCPartPicker the other day; turns out I paid around that price for 1440p gaming :p. Now, I did get some things more for looks than performance, and I did include m+kb, but still, I invested a lot into this PC haha.


Now let's compare that to the holy grail of TVs right now, the LG B9 and C9. I would argue these models are the ultimate choice for gaming and content consumption, and they're about $1100 and $1300 respectively. Roughly the same price as that monitor we mentioned, but this is a 55" OLED TV that can push 4K120 signals over HDMI 2.1 and it supports G-Sync and VRR as well. Add to that the cost of your new Xbox or PS5 and you're still 'only' looking at about $1500-1800 depending on how good a deal you can get on that TV. If you're getting a TV at all, of course. As we discussed, you might very well have a 4K TV already.

The difference, though (time will tell), is that monitors will be capable of going to 144Hz or higher. Am I right in thinking there aren't (or barely are) any TVs that go higher than 60Hz? I thought there are a few 120Hz ones, but not that many, and surely not at 4K.

Now that I've run the numbers on this, I think I'll stick to ultrawide 1080p gaming for a bit and just get myself an Xbox Series X when it drops. That and a Game Pass subscription (the biggest steal in gaming right now) should be all I need to get the most out of my gorgeous TV for the foreseeable future. 4K gaming on PC really doesn't seem to be worth it for quite a while yet.

You could go for 1440p too; it really seems like the sweet spot between resolution and price right now.
I personally lean more towards the PS5, because Xbox games tend to be playable on PC too. Some of the PlayStation exclusives appear to be coming to PC now, but the PS5 still seems more exclusive than an Xbox.
Also, another thing for me to consider for 4K gaming is how much space I have. I'm thinking about DPI here: surely higher DPI is good, but 4K gaming on a 24" monitor seems weird to me. So what's the recommended DPI for 4K gaming? I mean, my 32" 1440p monitor feels perfect, but if I went for the same DPI on a 4K monitor it wouldn't fit on my desk, so there would be no point in getting one :p
 
I sold my Xbox One and PS4 and got a great deal at CeX. I'd ideally like to get both again next gen.

I'm tempted to get a PS5 this year. I know the Series X is more powerful, but I've heard rumours that Gran Turismo 7 may be out on PS5 and that supply of the console might be well below demand.

Still, my mates are normally in the Xbox crowd, and I'm not going to make any decisions till after we've seen more, but that's my current thinking.

Also, I had heard something about some games still only aiming for 4K30 with ray tracing. I'm under the impression that not all games will achieve 4K60.

Edit: https://www.gamesradar.com/uk/assas...l-run-at-least-30-fps-at-4k-on-xbox-series-x/

It doesn't explicitly say that it won't achieve 4K60, but at the moment they are only guaranteeing 4K30.
 
From what I've read, it's the ray-tracing features of the 3060 that will outperform the 2080 Ti, not the general card performance. That's what some sites stated, at least. Even if the 3060 were on par with the 2080 Ti I wouldn't upgrade, because I don't feel like upgrading my GPU every two years, and then I'd have to get that 4K monitor as well. That's not worth it for me right now. I'd rather look at the series after that (the 4060, for example).

EDIT: I put my PC specs into PCPartPicker the other day; turns out I paid around that price for 1440p gaming. Now, I did get some things more for looks than performance, and I did include m+kb, but still, I invested a lot into this PC haha.
You could be right; I didn't exactly research or fact-check it. I just remember reading that tidbit somewhere, and it seemed a little extreme. I'm the kind of person that will upgrade their GPU every generation, or maybe every two generations, simply to keep up with the increasing demands of triple-A games. Then every 4-5 years I will upgrade the entire build.

The difference, though (time will tell), is that monitors will be capable of going to 144Hz or higher. Am I right in thinking there aren't (or barely are) any TVs that go higher than 60Hz? I thought there are a few 120Hz ones, but not that many, and surely not at 4K.

You could go for 1440p too; it really seems like the sweet spot between resolution and price right now.
I personally lean more towards the PS5, because Xbox games tend to be playable on PC too. Some of the PlayStation exclusives appear to be coming to PC now, but the PS5 still seems more exclusive than an Xbox.
Also, another thing for me to consider for 4K gaming is how much space I have. I'm thinking about DPI here: surely higher DPI is good, but 4K gaming on a 24" monitor seems weird to me. So what's the recommended DPI for 4K gaming? I mean, my 32" 1440p monitor feels perfect, but if I went for the same DPI on a 4K monitor it wouldn't fit on my desk, so there would be no point in getting one
I know that my TV at least will go up to 4K120 if the input device can drive it and is hooked up properly, but I think that's where it tops out. I honestly wouldn't be able to tell the difference between 120 and 144Hz, particularly because it's so incredibly hard to reach framerates like that and deliver them at a steady frame pace.

You're going to think me crazy for saying this, but... I own a 1440p 144Hz panel; I've had it since 2015. Last year I decided I wanted to hop on the ultrawide bandwagon, but because I was already struggling to run the most demanding titles at 1440p I decided to go for ultrawide FHD instead of ultrawide 1440p. Effectively, I gave up some definition and clarity for higher framerates.

I don't think there's a recommended DPI/PPI for gaming; it's just up to your preference. In general, I think it's simply a matter of higher = better. I was just running the numbers on this and came to the realisation that my ultrawide monitor at 2560x1080 resolution is about the same DPI as my 55" OLED TV. That's just insane to me! I'm guessing DPI alone doesn't tell the whole story, though, because my TV is definitely much, much sharper and better defined than this monitor.

I think that for razor sharp image quality in PC gaming, a 4K monitor between 28" and 32" is the sweet spot. You might run into some scaling issues when you're just browsing Windows 10 and the web, but if gaming is the main purpose then I think that's where it's at.
 
To answer the original question: whenever I have a bit of money that I'm going to allow myself to upgrade the PC with, I identify which components I think will give the most benefit and buy the best thing I can afford within the budget I allowed myself. At the moment, running 4K at high-and-above settings on an IPS display at 120Hz and above costs more than I want to pay. Once it comes within my budget, I'll go there.

I know that my TV at least will go up to 4K120 if the input device can drive it and is hooked up properly, but I think that's where it tops out. I honestly wouldn't be able to tell the difference between 120 and 144Hz, particularly because it's so incredibly hard to reach framerates like that and deliver them at a steady frame pace.

The difference, though (time will tell), is that monitors will be capable of going to 144Hz or higher. Am I right in thinking there aren't (or barely are) any TVs that go higher than 60Hz? I thought there are a few 120Hz ones, but not that many, and surely not at 4K.

I was under the impression that most TVs that can do 3D are capable of 120Hz. I could be wrong though; maybe it's to do with interlacing or something and not 'really' 120.

I don't think there's a recommended DPI/PPI for gaming; it's just up to your preference. In general, I think it's simply a matter of higher = better. I was just running the numbers on this and came to the realisation that my ultrawide monitor at 2560x1080 resolution is about the same DPI as my 55" OLED TV. That's just insane to me! I'm guessing DPI alone doesn't tell the whole story, though, because my TV is definitely much, much sharper and better defined than this monitor.

It depends how far away you're sitting from the screen. I'm sure if you calculate the PPI of a 4K cinema screen it sounds pretty terrible compared to a 6" phone with a 500 PPI QHD display.

For desktop monitors the market has mostly agreed that 24 inches is about the maximum for 1080p and 27 is the 'sweet spot' for 1440p, which work out to roughly 92 and 109 PPI respectively. It seems most monitors before 4K were manufactured around that 90-110 PPI range too. You'd need to go up to about 42 inches to land in the same PPI range at 4K.

Generally I guess people are sitting around 50cm to 1m away at a desk when using a monitor. I'm not sure at what point we hit diminishing returns, or subjectivity based on the quality of a person's eyesight. An 8K monitor would have to be 85 inches to hit around 100 PPI, and I wouldn't want that on my desk :D
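If anyone wants to check those numbers or plug in their own screen, the maths is just the diagonal pixel count divided by the diagonal size in inches. A quick Python sketch:

# PPI = diagonal resolution in pixels / diagonal size in inches.
from math import hypot

def ppi(width_px, height_px, diagonal_in):
    return hypot(width_px, height_px) / diagonal_in

print(f'24" 1080p: {ppi(1920, 1080, 24):.0f} PPI')  # ~92
print(f'27" 1440p: {ppi(2560, 1440, 27):.0f} PPI')  # ~109
print(f'42" 4K:    {ppi(3840, 2160, 42):.0f} PPI')  # ~105
print(f'85" 8K:    {ppi(7680, 4320, 85):.0f} PPI')  # ~104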

Anyway, people used to (falsely) argue that the human eye can't perceive more than 60 fps, and now we have 360Hz monitors. They'll find a way to keep us upgrading whatever happens. :D
 

Inspireless Llama

Community Contributor
Anyway, people used to (falsely) argue that the human eye can't perceive more than 60 fps, and now we have 360Hz monitors. They'll find a way to keep us upgrading whatever happens.

It's not really to do with this topic but:

I'm reading into this now, actually. First things first, human eyes do not work in Hertz :p. What I'm reading now is that it has to do with the "Flicker Fusion Threshold": how fast does something need to flicker before it starts looking steady?

Apparently most people stop noticing the flicker above 400Hz, but there have been reports of people noticing it close to 500Hz.
For gaming that's usually not that important, because displays don't flicker (depending on the technology). So apparently the backlighting (or rather, reduced screen flicker) is more important than the actual framerate.

I notice it myself too: I don't care about having 60 fps in every game. I'd rather have a game that's stable at 30 fps than one that runs at 100 fps and keeps dipping to 70 fps. That's way more disturbing, because it destroys how fluid it looks.
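The stutter thing is easier to see in frame times than framerates. Rough numbers in Python:

# A dip from 100 to 70 fps is a sudden ~4.3 ms jump in frame time, which
# the eye reads as a hitch, while a locked 30 fps delivers every frame at
# an even 33.3 ms. Smoothness tracks frame-time consistency, not just fps.
def frame_time_ms(fps):
    return 1000.0 / fps

print(f"locked 30 fps: {frame_time_ms(30):.1f} ms every frame")
print(f"100 fps dipping to 70 fps: {frame_time_ms(100):.1f} ms -> "
      f"{frame_time_ms(70):.1f} ms "
      f"(a {frame_time_ms(70) - frame_time_ms(100):.1f} ms swing)")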



Back on topic though, short answer from me: I don't think I'll start going for 4K before it becomes mainstream (it wouldn't surprise me if 8K is becoming a thing by then too).
 

Frindis

Dominar of The Hynerian Empire
Moderator
Been playing at 1080p for over 12 years now; somehow I always seem to end up with it. Right now I am using a Lenovo 27" 144Hz (G-Sync). If I were to jump to 4K I would definitely need something better than my ugly bugly Acer PC.
 
It's not really to do with this topic but:

I'm reading into this now, actually. First things first, human eyes do not work in Hertz. What I'm reading now is that it has to do with the "Flicker Fusion Threshold": how fast does something need to flicker before it starts looking steady?

Apparently most people stop noticing the flicker above 400Hz, but there have been reports of people noticing it close to 500Hz.
For gaming that's usually not that important, because displays don't flicker (depending on the technology). So apparently the backlighting (or rather, reduced screen flicker) is more important than the actual framerate.

I notice it myself too: I don't care about having 60 fps in every game. I'd rather have a game that's stable at 30 fps than one that runs at 100 fps and keeps dipping to 70 fps. That's way more disturbing, because it destroys how fluid it looks.


Interesting. I don't think I'll do a deep dive right now though. I guess at some point the response time of current monitors becomes slower than the refresh rate, making any faster refresh rates pointless? I'm out of my depth there for the moment; I haven't gone that deep before.
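Putting rough numbers on that intuition (the 5 ms grey-to-grey figure below is just illustrative, not any specific monitor's spec):

# Once pixel response time exceeds the frame time, pixels can't finish
# transitioning before the next refresh arrives.
GTG_MS = 5.0  # assumed grey-to-grey response time

for hz in (60, 144, 240, 360):
    frame_ms = 1000.0 / hz
    verdict = "keeps up" if GTG_MS <= frame_ms else "can't keep up"
    print(f"{hz} Hz: {frame_ms:.2f} ms per frame -> a 5 ms panel {verdict}")

By that measure a 5 ms panel stops keeping up somewhere past 200Hz, though in practice overdrive and partial pixel transitions muddy the picture.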

I'm definitely not any sort of pro gamer, but I feel that as long as the framerate generally doesn't dip into low framerate compensation territory, I don't notice. Then again, I've only had VRR for about 4 months, so maybe I haven't come across anything too bad yet.

Assassin's Creed Odyssey is the worst yet for me, but I think it's probably just quite hard on the CPU, which sometimes causes small dips and stutters, because changing graphics options makes no difference to it. It doesn't disturb me too much anyway.

As far as 30 fps goes, I think you can get used to it, but it is worse. I honestly think developers will target 4K30 on PS5 and Xbox Series X so that they can crank up the fidelity as far as possible. Personally I'd much rather stay above 60 fps at a lower resolution even if it varies; that's just my subjective experience though :)
 
As far as 30 fps goes, I think you can get used to it, but it is worse. I honestly think developers will target 4K30 on PS5 and Xbox Series X so that they can crank up the fidelity as far as possible. Personally I'd much rather stay above 60 fps at a lower resolution even if it varies; that's just my subjective experience though
You are probably right about that; most developers would much rather push the visuals extra hard, even if that means a lower (but steady) framerate. However, on Xbox One X in particular, the raw power of the system is already being translated into higher and more stable framerates. Plenty of console games from this generation get quite close to a locked 60 fps on the X.

Xbox top dog Phil Spencer at least sees higher framerates as a huge contributing factor to the experience and is constantly pushing developers to strive for them wherever possible. Their new machine will be able to handle it, if nothing else: Codemasters' Dirt 5 will do 4K60 or 1080p120 on Xbox Series X. That's a racing game, of course, a genre that 'needs' high framerates more than some, but still... if they can do it, why not others?
 
I guess it depends on the game.
I have a 4K TV and a 144Hz 1440p monitor. Half a year ago, before I built a new rig, I played a lot of my games on the TV with a 1070 card. I pushed the graphics as high as I could as long as I was over 30 fps, preferably in the 40s.
If it's a fast-paced game I will maybe go for the fps, but if it's something like Red Dead, I will go for 4K anytime.
It's just too pretty looking...

My new rig has a 2080 Ti, just because I wanted better graphics and frames in 4K for games like Cyberpunk. I spent about $2200 on that.
Still, I find myself back on the TV with Red Dead, toggling graphics settings just to get a stable 50 fps.
I don't know if I will get the next-gen GPUs though, because I am pretty happy right now, and unless it can get above 100 fps at max settings at 4K I don't see the point. Plus I would also need a new monitor, and the 4K ones just seem too expensive compared to what I already have.

TL;DR: it depends. I was willing to spend a lot to upgrade from the 1070. I don't think I am willing to spend as much again next gen unless the gains are at least similar.
 
@Rensje Yeah, you're right, it will depend on the game. I was mainly thinking about 3D open-world flagship-type games. But who knows.

I figured that with the Xbox One X they have to reduce the settings a fair amount to get things to work at 4K. Looks like the GPU in that thing is somewhere between an RX 580 and 590 from the specs, and the CPU is based on a very slow (even at the time) laptop chip from 2012 that received a bit of a clock speed bump.

These new consoles are comparatively looking pretty amazing, especially considering it's easier to get more performance out of them. The PS5's SSD has some voodoo too that we don't really know about. Maybe they will manage 60 fps in most games if they are already getting close now.
 
I made the switch to 4K back in '17, with a GTX 1080. At the time I had an Acer Predator 4K monitor with G-Sync to pick up the GPU's slack.

I've since transitioned to an RTX 2080 Super and a 4K/UHD TV (complete with a wireless Razer Turret keyboard/mouse setup), and games look incredible in 4K Ultra/60. Particularly epics like AC: Odyssey and Red Dead Redemption 2. The level of immersion you can achieve when gaming on a 65" 4K/HDR panel, even at 45-50 fps on Ultra (DLSS goes a very long way towards this), is beyond what words can reasonably describe.

These days I only play indies and throwback point and click adventure games on my PC monitor, anything flashy goes on the TV.

Given the hardware of the new consoles, 4K/60/Ultra will become the new norm, and frankly, it'll be much cheaper to achieve on console than PC, no doubt about that. Even if both the Xbox Series X and PS5 sell for $600, that's still about 1/3 of the price you'd need to get the same gaming performance out of a PC.

I will upgrade to Zen 3 as soon as they become available and see if my RTX 2080 Super is sufficient to remain at or above next-gen console level, but I fully recognize this as a luxury and caprice that I can afford because I'm a thousand years old and have succumbed to the terrors of 'Normal Life'.

If a kid were looking for sheer gaming pleasure these days? It's a no-brainer for the Xbox Series X.

Of course, with the PC you have access to more frequent sales and sites like Instant Gaming, and you may actually get to learn a thing or two (or a thousand) about computers if you are so inclined. But if gaming is your main concern? A console will be a no-brainer for most households come the holiday season.

PCs will then of course continue evolving, and we'll be gaming at 8K long before console gamers can even dream of it, but it all feels a bit ridiculous to consider, honestly.
 
I purchased a 28" AOC 4K60 monitor almost 5 years ago for $350. I first ran it with 970s in SLI, then 980s in SLI, followed by a 1080 Ti and now a 2080 Super. I don't play shooters, where frame rates can become super important.

I mostly play the strategy game Empire: Total War (11,000+ hours), which is intellectually challenging, and each game is slightly different with details maxed at 4K. I play the campaign on hard and battles on very hard, and I can pause to watch interesting TV and return to play during commercials.
 
I've had a 4K 45-60Hz FreeSync monitor for a couple of years now. It's paired with a GTX 1060 6GB, which is actually starting to show its age. However, for a lot of the games I play it's perfectly fine.

I would have bought something in the RTX 2000 range, but I thought it was way too expensive and the ray tracing features were just not right.

So I'll wait until the RTX 3000 series and Big Navi come out and possibly buy something towards the end of the year or early next year.

Ideally I want something that I can use in my current system and then transfer into a new build in 2021/2022, when AM5, DDR5, etc. support comes out.
 
I've never played any games in 4K; I'd have to try it out and see what it's like before I'd commit to paying anything to upgrade. I game at 1080p and think the quality is great as is... However, I do have an older computer and my monitor is only 23", so right now I probably couldn't game at better quality even if I wanted to.
 
Why stop at 4K? I own a PS5, and I looked at the box: it supports 8K. I mean, I guess it's probably not worth buying a whole new monitor/TV JUST to play the next-gen consoles, but if you already have an 8K TV, go ahead and buy the 8K HDMI cable (I THINK, not sure, but I think you need a special HDMI cord to use 8K) and play the most advanced graphics ever on console. Pair that with ray tracing and you might confuse that PS5 for a PC.
 
I don't think the PS5 will play AAA games rendered in 8K.

I think even at 4K there is a degree of upscaling.

But I'm not bashing the PS5, it's a great console. I wish I had one.

Unfortunately adulting has caught up with me and I have to be more sensible with my cash.

Edit: as for the box saying it supports 8K, I think that just means it will output an 8K signal. But I don't think it will render graphically complex games at 8K at a playable framerate.
 
