It depends on your budget, but the idea that the newest stuff costs a fortune and the older generation is a "good deal" is a myth that just won't die the death it deserves.
Sometimes the older gen of (e.g.) AMD CPUs really does come down in price, as we saw with the 2000 series where you could snag a 2700X for the price of a 3600. Which was great if you had certain production-type workloads. But since the 3600 would generally match or beat the 2700X in gaming, the 3600 was still the smarter play for gaming. Also, AMD have so far generally been releasing Ryzen parts at the same-ish price their predecessors launched at.
Old gen GPUs are rarely a good deal. And when Intel launched the i5 8400, which all but equalled the 7700k in gaming, the 7700k still stayed more expensive and was a poor deal. So new gen stuff, if it provides a big leap in performance, can give you more performance for the same price as the old stuff / the same performance for a lower price.
Also, since your existing RAM is compatible with new Intel and AMD motherboards, you need not replace the RAM right away.
Which means you shouldn't have much trouble affording an upgrade.
As for this:
DLSS helps improve GPU performance. If you're CPU limited, it may not be much of an asset.
However, we don't know whether you will be CPU limited or not in Cyberpunk. Often it can vary depending on the area of the game.
In The Witcher 3, your CPU wouldn't generally affect performance much. Novigrad tended to be where you'd see any CPU limitation, but tbh it was a non-issue for most people. In AC: Odyssey, Athens is a different performance scenario to the unpopulated middle of nowhere. RDR2 is a different beast again. Cyberpunk will be different in other ways too. We can't extrapolate from one game to another, which is why we need to wait for benchmarks.
The following is partly me guessing at why you're saying what you're saying, and then trying to unpack it a bit, so apologies for any faulty assumptions on my part.
1) I'm playing on a 1080p 144hz LG 24GM79G. So the performance will depend a lot on the cpu too, right?
I'm assuming you're saying this because you've heard that playing at lower resolutions like 1080p shows bigger performance differences between CPUs. That can be true, but it depends on the game, on the settings, and on the graphics card.
If you're playing a really demanding game at ultra settings, and you have things like raytracing on too, you may well be GPU limited to the point where the CPU has little influence on performance.
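As a rough mental model (the numbers below are invented purely for illustration, not real benchmarks), the frame rate you get is set by whichever of the CPU or GPU takes longer per frame:

```python
# Toy bottleneck model: frame time is roughly whichever of CPU or GPU takes longer.
# All numbers are made up for illustration, not measured from any real hardware.

def fps(cpu_ms, gpu_ms):
    return 1000 / max(cpu_ms, gpu_ms)

cpu_fast, cpu_slow = 6.0, 9.0        # hypothetical per-frame CPU times for two CPUs
gpu_1080p, gpu_1440p = 7.0, 12.0     # hypothetical GPU times; GPU cost grows with res/settings

print(fps(cpu_fast, gpu_1080p))   # ~143 fps - GPU limited, but only just
print(fps(cpu_slow, gpu_1080p))   # ~111 fps - CPU limited, so the CPU gap shows
print(fps(cpu_fast, gpu_1440p))   # ~83 fps  - GPU limited
print(fps(cpu_slow, gpu_1440p))   # ~83 fps  - still ~83: the CPU difference is hidden
```

The same picture explains the DLSS point above: DLSS shrinks the GPU number, so if the CPU number is the bigger of the two, the frame rate barely moves.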
Articles on tech websites and YouTube channels that benchmark PC performance use the most powerful GPU they can in order to expose CPU limitations. Take this article with a 7700k and a 10600k, among other CPUs:
The Core i5-10600K is arguably the most compelling of the 10th-gen unlocked desktop parts. First, it's the most affordable of the bunch, set to retail for $262....
They use a 2080 ti, the most powerful gaming GPU available at that time. And still some games like Breakpoint showed basically no difference between CPUs. Other games did - at least with a 2080 ti.
A 2070 Super may have shown smaller differences, i.e. been less CPU-limited.
2) Sometimes I use Dynamic Super Resolution and go to 1440p, but I don't really know how that works - i mean if that will get more from my GPU, since it's above 1080p or it will even get more from both GPU and CPU
DSR is essentially a form of anti-aliasing. AA is that magic you'll know of that tries to remove 'jaggies' from games - where lines, instead of appearing straight and smooth, look jagged or stepped. There are various techniques; DSR renders the image above native resolution (so at 1440p instead of 1080p in your case) and then downsamples it back to 1080p.
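If it helps to see that concretely, here's a toy sketch of the simplest case - a 4x factor where every 2x2 block of the oversized render gets averaged into one native pixel. (Nvidia's actual DSR filter is a configurable Gaussian-style one, and factors like 1.78x aren't clean integer ratios, so treat this purely as an illustration of the idea.)

```python
import numpy as np

# Toy 4x "DSR": pretend we rendered at 2x the width and 2x the height of 1080p,
# then average each 2x2 block back down to one native pixel.
native_w, native_h = 1920, 1080
hi_res = np.random.rand(native_h * 2, native_w * 2, 3)  # stand-in for a 3840x2160 frame

# Average every 2x2 block -> one 1080p pixel; hard edges get smoothed instead of stair-stepped
native = hi_res.reshape(native_h, 2, native_w, 2, 3).mean(axis=(1, 3))
print(native.shape)  # (1080, 1920, 3)
```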
DSR will load the GPU intensively, not the CPU. AA is often one of the most GPU-intensive settings. With DSR, you do sort of get 'more from your graphics card' in that it can improve fidelity, and if you have the horsepower to spare, why not. However, arguably, if you're doing that you might as well get a better monitor and run games at native 1440p, which will still be leagues better visually. If you have that kind of horsepower to spare in an AAA title, you're probably running a monitor that is 'beneath' what the GPU is worth.
Also, using DSR and DLSS together is kind of strange, conceptually at least. DSR renders the image above 1080p and then reduces it to 1080p, using spare performance to enhance fidelity. DLSS renders the image at (e.g.) 720p, then upscales it to 1080p (using Nvidia's Tensor cores and AI wizardry), with the primary goal of improving performance where the GPU might otherwise struggle. If you're using DLSS, it's because you need the extra performance - often because you have RTX raytracing features switched on, which tank performance. If you're using DSR, it's because you have performance to spare.
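Rough pixel-count arithmetic shows how the two pull in opposite directions (720p as the DLSS input resolution is just the example above - the real internal resolution depends on the quality mode you pick):

```python
# Pixels the GPU has to render per frame, relative to native 1080p.
native_1080p = 1920 * 1080   # 2,073,600 px
dsr_1440p    = 2560 * 1440   # 3,686,400 px
dlss_720p    = 1280 * 720    #   921,600 px

print(dsr_1440p / native_1080p)   # ~1.78x the rendering work - spare performance spent on fidelity
print(dlss_720p / native_1080p)   # ~0.44x the rendering work - performance saved, then upscaled
```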
The exact settings you'll use will depend on how Cyberpunk behaves, but using both together may to some degree cancel out the point of each.