Question: Are Intel K CPUs worth it if you don't overclock?

Hi,

I'm wondering whether an Intel K CPU is worth it compared to its non-K counterpart if you don't overclock.

Base and boost frequencies are higher on K CPUs than on their non-K counterparts, in particular the all-core boost frequency, but does that really make much of a difference, particularly for gaming?

I have an i5-9600K and I don't know whether to upgrade to an i7 or i9 of the same generation, and more particularly whether I should get a K version. I don't overclock at all.

Thanks.
 
There's more than one question there and nobody can answer any of them in isolation. There's no one-size-fits-all answer to any of these.

Why the upgrade?

What is it you want the PC to do that it does not do currently?

What do you use the PC for?

What is the full spec of the system? (everything, plus the monitor resolution and refresh rate)

What country and currency are you shopping in?

What's your budget for the upgrade?

Are you buying new? What are you going to do with the old CPU?

My instinct says that if you're not overclocking, and if you bought an i5 CPU in the first place, you shouldn't buy an i7 or i9 from the same (now superseded) generation.
 
Hi Oussebon,

I knew it was you just by reading your answer :)

I use my PC mainly for gaming, plus some activities like app dev, game dev (marginally), and of course Internet browsing and office stuff.
Here's my current config:
- Aorus Pro Z390 motherboard;
- Intel i5-9600K;
- 16GB of RAM at 3000MHz;
- GTX 1080ti;
- 1x 1TB NVMe Samsung 970 Evo Pro drive;
- Asus ROG PG27UQ.

Why the upgrade? Because I plan to get the next top NVIDIA GPU, the 3080 Ti or 3090, whatever it ends up being called. I currently have a 4K gaming monitor I don't fully use because my GPU is a little too weak for 4K gaming at 60fps or more. I fear my i5-9600K is gonna be too weak to go with a 3080 Ti / 3090. I tend to prefer a higher refresh rate over pixel density.

I don't want to change CPU generation just yet. I got my motherboard and CPU not that long ago because my pre-built Alienware motherboard just fried after the warranty period expired and it was before the 10th gen release. I think a 9th gen i7 or i9 could go well with the next top NVIDIA GPU. But, man, they still cost around 500€ even after the 10th gen release. I was wondering whether getting a non-K CPU and saving a little money would be worth it compared to a K CPU. I don't overclock at all.
 
Wait, are you saying you have a PG27UQ (a ~£1800 monitor that costs £1300 more than some of the best 1440p 144Hz monitors on the market) but you play at sub-4k resolution, and you intend to keep playing at sub-4k res even with a next gen GPU?

Playing at below native res on a monitor often looks worse than playing at native res on a lower res panel.

I fear my i5-9600K is gonna be too weak to go with a 3080 Ti / 3090.

Assuming you intend to play at 4k resolution on your super expensive 4k monitor with a next gen GPU, then:

1) you are unlikely to see a difference moving from an i5 to anything else in nearly all current gen games.

What CPU is a good match for what GPU is not a constant, fixed value. It depends on the games, and on the resolution. The higher the resolution, the more GPU-bound you are, and the less the CPU matters relative to the GPU.
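To make that concrete, here's a toy sketch of the bottleneck (a minimal illustration in Python; the frame-rate caps are made-up numbers for the sake of the example, not benchmark results):

```python
# Toy bottleneck model: every frame needs a CPU pass (game logic, draw
# calls) and a GPU pass (rendering), so the slower side caps the frame rate.

def effective_fps(cpu_cap: float, gpu_cap: float) -> float:
    """Effective frame rate is set by whichever side is slower."""
    return min(cpu_cap, gpu_cap)

# Hypothetical caps: the CPU can prepare ~140 frames/s more or less
# regardless of resolution, while GPU throughput falls as pixel count rises.
cpu_cap = 140.0
gpu_caps = {"1080p": 180.0, "1440p": 110.0, "4k": 60.0}

for res, gpu_cap in gpu_caps.items():
    bound = "CPU" if cpu_cap <= gpu_cap else "GPU"
    print(f"{res}: {effective_fps(cpu_cap, gpu_cap):.0f} fps ({bound}-bound)")

# 4k: 60 fps (GPU-bound) -> a faster CPU wouldn't move that number at all.
```

In this sketch the 4k result is entirely set by the GPU cap, which is exactly why swapping the i5 for an i7 or i9 barely registers at that resolution.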

At 4k with a 2080 ti, there's almost no difference between anything. You're more or less at margin of error with a 9600k vs even a 10900k.

2) Obviously that's with a 2080 ti, and it could look a little different with a next gen GPU. However, only a little different.

Napkin maths: Even at 1440p, the differences between CPUs can be quite slim, and 1440p is only about half as demanding as 4k. Given that next gen GPUs won't be twice as powerful as current gen GPUs, the differences between CPUs at 4k with a next gen card will still likely look closer to what they currently look like at 4k (i.e. very small).
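As a rough sanity check on that napkin maths, here are the raw pixel counts (resolution only; real GPU load doesn't scale perfectly linearly with pixel count):

```python
# Pixel counts behind the 1440p-vs-4k comparison.
pixels_1440p = 2560 * 1440   # 3,686,400 pixels
pixels_4k    = 3840 * 2160   # 8,294,400 pixels

print(pixels_4k / pixels_1440p)   # 2.25  -> 4k pushes ~2.25x the pixels of 1440p
print(pixels_1440p / pixels_4k)   # ~0.44 -> 1440p is a bit under half of 4k
```

So "about half as demanding" roughly holds on pixel count alone; a next gen GPU would have to be well over twice as fast before 4k started looking as CPU-limited as 1440p does today.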

3) Never mind when next gen titles launch that are even more demanding, and we go back to being even more GPU-limited within a few months on new titles.

There's certainly no point in buying a new CPU until you know for a fact you need one, rather than on the off chance you might need one.

I don't want to change CPU generation just yet. I got my motherboard and CPU not that long ago because my pre-built Alienware motherboard just fried after the warranty period expired and it was before the 10th gen release.
This would likely be throwing good money after bad.

With an i7 or i9 from 9th Gen you're paying nearly as much as for 10th Gen, and you're going to see almost no real-world performance gains today. By the time you do need more performance from your CPU, far better stuff will be on the market. It already is (Intel 10th Gen), and there's a decent chance AMD's Zen 3, out in November, could offer better performance than, say, a 9700k, for a lower price even factoring in a new mobo.

Keep what you have for as long as you can. Then replace it + mobo (+ RAM if we're on DDR5 by then) for something far more impactful.

Don't sink more cash into an obsolete platform.
 
Yeah, don't tell me, I think getting a 4K monitor was the worst decision I've made for my config. But, hey, this monitor was rated so well that I took an extra side job for months, and when I got the cash I bought it. I wanted my gaming monitor to be future-proof. But actually it still requires too many compromises for this insane amount of money, and I figured that out too late.
But I won't complain, and I hope the new NVIDIA GPUs, at least the top ones, will enable playing all games at 4K at a playable framerate and high settings. When I said I prefer a higher refresh rate over pixel density, I mean I try to play at 4K as much as I can, but sometimes it's just not possible currently, and I'd rather go for a lower resolution than play at a low framerate.

I think you're right and your advice is the wisest. At least I've already got a good gaming monitor, even for the next gen. I'll wait for newer tech to renew my whole rig, except the monitor of course. Thank you.
 
There's some expectation that Nvidia will release the 3080 first (announcement expected 1st Sept), then something more powerful afterwards (3090?), perhaps even a month later. AMD will also be releasing its RDNA2 GPUs, as well as its Zen 3 CPUs, in or by November. November is when the next gen consoles, and so next gen games, hit (and so get benchmarked).

Might be worth letting all that swordplay play out and reviewing options in Nov / at the tail end of the year, which could tie in with the Black Friday period.
 
Well, having a G-Sync Ultimate monitor, I'm kinda stuck with NVIDIA GPUs.
I thought they would announce the most powerful GPU first, or at least the top two, like the 3080 and 3090, on Sept 1st?
 
There's some speculation that the first thing out the door won't be the most powerful one this time.

Whether that's because they're keeping something up their sleeve to whale on RDNA 2, or the card's not quite ready, or something else, I don't know.

But I mention it in case you hadn't heard the rumours and intended to pounce on the first thing out of Nvidia's gate. It may be best to wait a bit.
https://youtu.be/iSM1wCY2mcw?t=394


I think I'd heard and seen similar from the rumour mill before too.
 
Thanks for mentioning it, I indeed hadn't heard about this.
 