Serious doubt :O

Sep 24, 2020
Hi all! I'm losing some hair over this one and I'd much appreciate your help with your PC experience. At the moment I'm running an i7-7700, a B150M-C, a pair of Corsair Vengeance sticks (16GB/2400MHz), and my current GPU is an RTX 2070 Super. I'm considering getting a new motherboard + CPU (maybe a Ryzen 5 3600X or i5-10600K) + a new pair of RAM sticks too (upgrading to 32GB). But there's also the new RTX 3000 line coming (and the next-gen AMD Ryzen 4000 CPUs as well). So my question is: should I put my money on the items above, wait for the new AMD parts to arrive, or go all-in on the new RTX 3000? My main aim is to play Cyberpunk 2077 (and next-gen games) to its fullest.
 
If you really only want to upgrade because you want to play Cyberpunk 2077, then I say wait until it releases to see how it plays on whatever components you were thinking of upgrading to that are available at that time. Your current build should be able to play it based on what CDPR recommend for Cyberpunk. Heck, I'll be running Cyberpunk on a 4790K / GTX 1080 build lol.
 

Zoid

Community Contributor
What @DXCHASE said. If your primary goal is to be able to play Cyberpunk, then wait to upgrade until it comes out. Speculative upgrading is never going to be as optimal as upgrading once you have benchmarks to go off of. Your current build is really pretty solid, so you may not need to upgrade at all. What monitor and resolution are you playing on?

Even if you do upgrade, in addition to the benefits of waiting for Cyberpunk benchmarks, it's also a good idea to wait at least another month so that you can see what the Ryzen 4000 series looks like, and what AMD announces about their new GPUs. There's a lot of new hardware coming out right now, so if you're not in a hurry (and with an RTX 2070 I'd expect you're not hurting for performance) it will benefit you to wait.
 
If you really only want to upgrade because you want to play Cyberpunk 2077 then i say wait until it releases to see how it plays on the whatever components you were thinking of upgrading to that are available at that time.
This.

By the time Cyberpunk is out, or shortly after, we'll hopefully have seen reviews of the AMD CPUs, AMD GPUs, and the RTX 3070 - and maybe even the 3060 if Nvidia launch that in response to AMD.

As exciting as all the new hardware launches are, I wouldn't get caught up in the upgrade fever before we know what's what with the games, and with the components :)

in addition to the benefits of waiting for Cyberpunk benchmarks, it's also a good idea to wait at least another month so that you can see what the Ryzen 4000 series looks like
I think the timeline is:
8 Oct - Zen 3 announce
15 Oct - RTX 3070 release
28 Oct - RDNA2 (AMD GPUs) announce
10 Nov - Xbox Series X + AC Valhalla
12/19 Nov - PS5
19 Nov - Cyberpunk 2077 launch

So I expect we'll have seen benchmarks of the new stuff by the time Cyberpunk launches, and also that performance reviews of Cyberpunk will include the new hardware. Probably... AMD would be mad to miss the initial launch of 'next gen' games with their GPUs.
 
First of all, thank you for your attention and support :)
Well, undoubtedly there are A LOT of things hitting the fan at the end of this crazy year (and I'm referring only to PC matters here, lol).
I feel that my RTX 2070 Super's potential is being limited by the i7-7700 and RAM (16GB 2400MHz). For example, when I play CPU-intensive games such as Red Dead Redemption 2, I get around 59-72 fps in crowded places like Blackwater or Saint Denis. Then I think to myself: "imagine Cyberpunk... it will be even more demanding!" Of course the game will support DLSS 2.0, and that's already an advantage. But I don't want a bottlenecking CPU to limit my gaming experience. So that's why I'm anxious about this.
But you guys have reasoned with me well on this subject; it's always a good idea to wait and see benchmarks of how the newest CPUs, and even my old i7-7700, perform in this one - although I have a bad feeling it will struggle.
It is indeed expected that AMD will keep up the pace, and that's truly exciting. However, I'm probably not gonna be able to buy the newest stuff, since I'll have to purchase a motherboard, RAM, and CPU altogether :/
 
We shall pray for miraculous CDPR optimization, my friend... hahah
 
I'm playing on a 1080p 144Hz LG 24GM79G. So the performance will depend a lot on the CPU too, right? Sometimes I use Dynamic Super Resolution and go to 1440p, but I don't really know how that works - I mean, whether it will get more from my GPU, since it's above 1080p, or more from both GPU and CPU lol. But being able to play Cyberpunk smoothly at 1080p/high (maybe some stuff on ultra?) will put a smile on my face :)
 
However I'm probably not gonna be able to buy the newest stuff,

It depends on your budget, but the idea that the newest stuff costs a fortune and the older generation is a "good deal" is a myth that just won't die the death it deserves.

Sometimes the older gen of (e.g.) AMD CPUs really does come down in price, as we saw with the 2000 series, where you could snag a 2700/2700X for the price of a 3600. Which was great if you had certain production-type workloads. But, since the 3600 would generally match or beat the 2700X in gaming, the 3600 was still the smarter play for gaming. Also, AMD have so far generally been releasing Ryzen parts at the same-ish price their predecessors launched at.

Old gen GPUs are rarely a good deal. And when Intel launched the i5 8400, which all but equalled the 7700k in gaming, the 7700k still stayed more expensive and was a poor deal. So new gen stuff, if it provides a big leap in performance, can give you more performance for the same price as the old stuff / the same performance for a lower price.

Also, since your existing RAM is compatible with new Intel and AMD motherboards, you need not replace the RAM right away.

Which means you shouldn't have much trouble affording an upgrade.
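One way to make that old-gen vs new-gen comparison concrete is cost per frame. A quick sketch in Python - the prices and fps figures below are made-up placeholders, not real benchmarks; only the metric itself is the point:

```python
# Cost-per-fps: divide a part's price by the average fps it delivers.
# All figures below are hypothetical placeholders for illustration only.

parts = {
    "old-gen CPU": {"price": 250, "avg_fps": 100},
    "new-gen CPU": {"price": 260, "avg_fps": 125},
}

def cost_per_fps(part):
    return part["price"] / part["avg_fps"]

for name, part in parts.items():
    print(f"{name}: {cost_per_fps(part):.2f} $/fps")

# Lower $/fps is the better gaming deal: a small price premium on the new
# part can still win if the performance leap is large enough.
```

With these placeholder numbers the newer part wins despite costing slightly more, which is exactly the pattern described above.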

As for this:
Red Dead Redemption 2, I get around 59 - 72 fps at crowded places - like Blackwater or Saint-Denis. Then I think to myself: "imagine on cyberpunk.. it will be even more demanding!" Of course the game will support DLSS 2.0 and that's already an advantage. But I don't want my bottlenecking cpu to limit my gaming experience.
DLSS helps with improving GPU performance. If you're CPU limited, it may not be much of an asset.

However, we don't know whether you will be CPU limited or not in Cyberpunk. Often it can vary depending on the area of the game.

In The Witcher 3, your CPU wouldn't generally affect performance much. Novigrad tended to be where you'd see any CPU limitation, but tbh it was a non-issue for most people. In AC: Odyssey, Athens is a different performance scenario to the unpopulated middle of nowhere. RDR2 is a different beast again. Cyberpunk will be different in other ways too. We can't extrapolate from one game to another. Hence why we need to wait for benchmarks.

I'm playing on a 1080p 144hz LG 24GM79G. So the performance will depend a lot on the cpu too, right? Sometimes I use Dynamic Super Resolution and go to 1440p, but I don't really know how that works - i mean if that will get more from my GPU, since it's above 1080p or it will even get more from both GPU and CPU lol. But being able to play Cyberpunk smoothly on 1080p/high (maybe some stuff on ultra?) will put a smile on my face
The following is partly me guessing at why you're saying what you're saying, and then trying to unpack it a bit, so apologies for any faulty assumptions on my part :)

1) I'm playing on a 1080p 144hz LG 24GM79G. So the performance will depend a lot on the cpu too, right?

I'm assuming you're saying this because you've heard that playing at lower resolutions like 1080p shows more difference between CPUs in performance. That can be true, but it depends on the game, on the settings, and on the graphics card.

If you're playing a really demanding game at ultra settings, and you have things like raytracing on too, actually you might be GPU limited such that the CPU has little influence on performance.

Articles on tech websites and YouTube that benchmark PC performance use the most powerful GPU they can in order to expose CPU limitations. Take this article with a 7700K and a 10600K, among other CPUs:
They use a 2080 Ti, the most powerful gaming GPU available at that time. And still, some games like Breakpoint showed basically no difference between CPUs. Other games did - at least with a 2080 Ti.

A 2070 Super may have shown smaller differences i.e. have been less CPU-limited.

2) Sometimes I use Dynamic Super Resolution and go to 1440p, but I don't really know how that works - i mean if that will get more from my GPU, since it's above 1080p or it will even get more from both GPU and CPU

DSR is essentially a form of anti-aliasing. AA is that magic you'll know of that tries to remove 'jaggies' from games - where lines, instead of appearing straight and smooth, look jagged / stepped. There are various techniques; DSR renders the image above native quality (so at 1440p instead of 1080p in your case) and then downsamples the image back to 1080p.

DSR will load the GPU intensively, not the CPU. AA is often one of the most GPU-intensive settings. With DSR, you do sort of get 'more from your graphics card' in that it can improve fidelity, and if you have the horsepower to spare, why not. However, arguably, if you're doing that you might as well get a better monitor and run games at native 1440p, which will still be leagues better visually. If you have that kind of horsepower to spare in an AAA title, you're probably running a monitor that is 'beneath' what the GPU is worth.

Also, using DSR and DLSS together is kind of strange, conceptually at least. Because DSR renders the image above 1080p and then reduces it to 1080p. Using spare performance to try to enhance fidelity. While DLSS renders the image at (e.g.) 720p, then upscales to 1080p (using Nvidia's Tensor cores and AI wizardry). The primary goal of which is to improve performance where the GPU might otherwise struggle. If you're using DLSS, it's because you need the extra performance - often because you have RTX raytracing features switched on which tank performance. If you're using DSR, it's because you have performance to spare.

The exact settings you'll use will depend on how Cyberpunk behaves, but using both together may to some degree cancel each other out.
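Since the GPU cost of these modes comes down to how many pixels get rendered per frame, a quick back-of-the-envelope calculation (Python, purely illustrative; the 720p DLSS render resolution is an assumed "quality-mode"-style figure) shows why DSR costs performance while DLSS buys it:

```python
# Relative GPU shading load for the modes discussed above. The *render*
# resolution, not the output resolution, dominates the GPU's workload.

def pixels(width, height):
    return width * height

native = pixels(1920, 1080)  # plain 1080p
dsr = pixels(2560, 1440)     # DSR: render at 1440p, downsample to 1080p
dlss = pixels(1280, 720)     # DLSS: render at ~720p, upscale to 1080p

print(f"DSR load vs native:  {dsr / native:.2f}x")   # ~1.78x the pixels to shade
print(f"DLSS load vs native: {dlss / native:.2f}x")  # ~0.44x, hence the fps gain
```

Roughly 1.78x the pixels for DSR versus roughly 0.44x for DLSS - opposite directions, which is why stacking them is conceptually odd.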
 
Thank you so much for the deep explanations!!

I've already taken a look at these comparisons, and the 10600K and 3700X are still, pardon me saying, the "winners" for gaming in terms of price. I won't worry about RAM for now; you've convinced me :)
Yes, each game is gonna put different demands on CPU/GPU usage, and I may see similar results with my current i7-7700. But Cyberpunk and next-gen games seem to scare the **** out of my current build if I wanna have a smoother experience (in terms of CPU). But hey, that's my impression, and we shall wait to see Cyberpunk's benchmarks - counting the days for this, actually...
I'll quote CDPR's words here:
"Please note that the game is both graphics- and processor-intensive, so make sure these components meet or exceed the minimum requirements. Also note that the minimum is created with Low settings and 1080p gaming in mind and Recommended with High and 1080p. " (it doesn't specify the average FPS on high settings, though)

Another problem that appeared recently with my current CPU is high temperature... it's now reaching 90-100°C while gaming, and yes, I stuck with the stock cooler since I naively thought an upgrade wouldn't be necessary as long as I didn't OC it. The tables have turned, however. So now I'm pretty much at a crossroads: should I buy a 10600K already (and a cooler too!!) and not fry it 'till Cyberpunk's release, or try to fix this issue first and wait for the next-gen CPUs :p

As for DSR, I seldom use it tbh; mixing it with DLSS truly seems to cancel out each one's functionality. As you said, I'd only use it if I had extra GPU performance to spare. I assume I'll be running Cyberpunk at 1080p with some of the ray-tracing features and DLSS 2.0 on to do the trick. I'll also be tweaking the in-game settings until I find a reasonable FPS/quality balance.

So, returning to the point: I know my RTX 2070 Super will be capable of handling the situation reasonably well for now. I just have a bad feeling about the temperature and CPU usage, a problem I only discovered yesterday tbh.
 
Another problem that appeared recently with my current CPU is high temperature... it's now reaching 90-100°C while gaming, and yes, I stuck with the stock cooler since I naively thought an upgrade wouldn't be necessary as long as I didn't OC it. The tables have turned, however. So now I'm pretty much at a crossroads: should I buy a 10600K already (and a cooler too!!) and not fry it 'till Cyberpunk's release, or try to fix this issue first and wait for the next-gen CPUs
Fix the existing issue.

Clean the system - blow compressed air into the heatsink and fans to remove dust. You can find plenty of guides online.

Repaste the CPU cooler. The stock cooler is easy to remove (so easy sometimes it comes a little loose). Clean the paste off the CPU and cooler with isopropyl alcohol, reapply paste, reinstall the cooler.

You could use one of these ArctiClean kits:

The Intel stock cooler is not great at all but it's basically functional for a locked 65W TDP part, so 100 degrees is surprising and I think not what you should expect of it.

Edit: Two things to note.
1) You can consider it practice for a CPU upgrade. Cleaning the PC is essential maintenance anyway, and knowing how to paste/repaste is potentially useful for the upgrade and may need to be done periodically too.
2) If the CPU hits 100 degrees, it's thermal throttling, which means it's dropping its frequency, which will reduce the CPU's performance in games - possibly causing or exacerbating any CPU bottleneck.
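To make point 2 concrete, here's a small sketch of the throttling check in Python. The helper name and the 3-degree margin are my own invention, not from any monitoring tool; the 100°C thermal limit and 4.2 GHz max turbo are the i7-7700's published specs:

```python
# Thermal throttling shows up as two symptoms together: temperature pinned
# at the chip's limit AND clocks dropping below rated speed.

TJ_MAX_C = 100    # Kaby Lake thermal limit
BOOST_GHZ = 4.2   # i7-7700 max turbo frequency

def is_throttling(temp_c, freq_ghz, margin_c=3):
    """True if the CPU sits near TjMax while its clocks have fallen."""
    return temp_c >= TJ_MAX_C - margin_c and freq_ghz < BOOST_GHZ

# Readings like those reported later in this thread (from e.g. HWiNFO):
samples = [(62, 4.1), (97, 3.4), (100, 3.2)]
flags = [is_throttling(t, f) for t, f in samples]
print(flags)  # [False, True, True]
```

A CPU that holds 4.1 GHz at 62°C is fine; one pinned at 97-100°C with clocks down around 3.2-3.4 GHz is throttling.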
 
Well, I bought the Cooler Master thermal paste today (it wasn't cheap), and yes, I removed the cooler, cleaned all the dust off the CPU and around it, replaced the paste, and reinstalled the cooler... all really tight against the CPU. Did everything correctly. And you know what? My CPU is even WORSE now. I can't believe this... before I replaced the paste, I was getting around 30°C at idle; now I'm at 65°C. While gaming it's still hitting 100°C, but now my clock frequency is 3.2-3.4 GHz, where I was getting 4.1 GHz before. I'm almost giving up and buying a new CPU. Damn
 
Buying a new CPU to "fix this" is a bad idea. Because if there's something you missed, you might then have the same issue with the new processor. You need to work out what has not gone right.

What paste did you buy (it's not normally expensive)?

Did you clean off the old paste off both the CPU and the CPU cooler before applying new paste?
If you did, what did you use to clean it?

When applying the new paste, how did you do it? What guide did you follow?

Did you clip the Intel stock cooler back in properly? They are very easy to not clip in properly - check if a leg is loose.

Is the fan plugged into the motherboard header?

Is the fan working? Is it spinning, and if so, at what RPM under load? Something like HWiNFO would be able to tell you.
 
I bought Cooler Master High Performance Thermal Paste.
The cooler is extremely tight; there are no loose legs. The fan is plugged into the motherboard header, and I rebooted my system to check the fan's RPM - the CPU fan is hitting 3200 RPM.
 
And the other questions? :)

The possibilities that come to mind, based on what you say, are that the old paste wasn't cleaned off properly, there's too much new paste or too little, the new paste isn't spread evenly, or some combination of those.
 
I put not too much and not too little - just a reasonable dot in the middle and four little dots near the CPU's edges.
 
I cleaned it with a dry, clean sponge (the soft side, of course). I took the cooler off again and removed some of the paste. Still getting the same results. I've checked some videos showing how to apply the paste. I think I did it correctly (now you've got me haha)
 
Ok, sorry for flooding here - this is my first time handling my hardware. So, I did see some improvement now... nothing huge, but it's something. I redid the operation and tried to seat the cooler even more firmly, and now my idle temperature is about 40°C. I put some serious action on Red Dead Redemption and my CPU's temp was around 86-92°C... but that's still worrying, right?
*** yeah, the stock cooler definitely sucks. I just put a big fan in front of my open PC and now it's 72-80°C LOL
 
The first time I applied paste to a CPU and mounted a cooler, I had to redo it; if you're not used to it, it's easy to get it not quite right.

I took the cooler off again and took off some of the paste.
Whenever you remove a cooler, you will want to remove all of the paste from both the CPU, and the CPU cooler, and put on a fresh dose. Otherwise you can end up with uneven spread/areas that aren't covered.

The temps you quote with the case open and a fan pointing at the system are in line with what you should expect from the Intel stock cooler and a 7700 non-K. Temps in the 90s still sound too hot.

If taking the side panel off and blowing a fan into the case is helping temps, it may not be the CPU cooler itself that's the issue as much as system airflow (i.e. airflow through the case and whether heat generated by all components in the case is getting out).

What is the full spec of your system including the case?

And you've dusted all the case fans and the filters throughout the system?
 