RTX 5000 Series Review discussion


But in this timeline, the RTX 5090 is an ultra-enthusiast graphics card that is begging us to be more realistic. Which, I will freely admit, sounds kind of odd for what has always been an OTT card. But, in the real world, a GB202 GPU built on a more advanced, smaller process node, with far more CUDA cores, would have cost a whole lot more than the $1,999 the green team is asking for this new card. And it would still maybe only get you another 10–20% higher performance for the money; after all, how much better is TSMC's 3 nm node than its 4 nm one, really?


The RTX 5090 is a lot like this initial review: it's a bit of a messy situation — a work in progress. We're not done testing, and Nvidia isn't done either. Certain games and apps need updates and/or driver work. Nvidia usually does pretty well with drivers, but new architectures can change requirements in somewhat unexpected ways, and Nvidia needs to keep tuning and optimizing its drivers. We're also sure Nvidia doesn't need us to tell it that.

Gaming performance is very much about running at 4K with maxed-out settings. If you only have a 1440p or 1080p display, you're better off saving your pennies and upgrading your monitor — and probably the rest of your PC as well! — before spending a couple grand on a gaming GPU.

Unless you're also interested in non-gaming applications and tasks, particularly AI workloads. If that's what you're after, the RTX 5090 could be a perfect fit.



At 4K resolution, with pure rasterization and no ray tracing or DLSS, we measured a 35% performance uplift over the RTX 4090. While this is certainly impressive, it is considerably less than what we got going from the RTX 3090 Ti to the RTX 4090 (+51%). NVIDIA still achieves its "twice the performance every second generation" rule: the RTX 5090 is twice as fast as the RTX 3090 Ti. There really isn't much on the market that the RTX 5090 can be compared to; it's 75% faster than AMD's flagship, the RX 7900 XTX. AMD has confirmed that it is not going for the high end with RDNA 4, and it's expected that the RX 9070 Series will end up somewhere between the RX 7900 XT and RX 7900 GRE. This means the RTX 5090 is at least twice as fast as AMD's fastest next-generation card. Compared to the second-fastest Ada card, the RTX 4080 Super, the performance increase is 72%—wow!
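
A quick sanity check on those generational claims, using only the percentages quoted above (the ~18% gap between the RX 7900 XTX and RX 7900 XT is my own rough assumption, not a figure from the review):

```python
# Back-of-the-envelope check of the generational scaling figures quoted above.
# All ratios except the 7900 XTX vs 7900 XT gap come straight from the review text.

r_5090_vs_4090 = 1.35      # RTX 5090 vs RTX 4090, 4K raster, no RT/DLSS
r_4090_vs_3090ti = 1.51    # RTX 4090 vs RTX 3090 Ti, same test

# "Twice the performance every second generation":
print(f"RTX 5090 vs RTX 3090 Ti: {r_5090_vs_4090 * r_4090_vs_3090ti:.2f}x")  # ~2.04x, i.e. roughly double

# If the RX 9070 tops out around RX 7900 XT level, and the 5090 is 75% faster
# than the 7900 XTX, the "at least twice as fast" claim follows:
r_5090_vs_7900xtx = 1.75
r_7900xtx_vs_7900xt = 1.18  # assumption: roughly an 18% gap at 4K
print(f"RTX 5090 vs RX 7900 XT class: {r_5090_vs_7900xtx * r_7900xtx_vs_7900xt:.2f}x")  # again roughly 2x
```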


My own initial thoughts. On the same process node as the 4090, we have 33% more cores, providing 35% more performance on average in non-RT games at 4K and 32% better RT performance, without any frame gen. At the expense of 42% more power than the 4090, anyone calling this efficient is 'avin a laff. That heatsink is a ******* marvel, very cool, pun intended.

Frame gen better be really good, then. Going by this, the lesser cards will be very disappointing compared to the 40 series without it.
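
To put a number on the efficiency complaint above, a minimal perf-per-watt sketch using only the two ratios from that post:

```python
# Relative perf-per-watt implied by the figures above (RTX 4090 = 1.0 baseline).

perf_ratio = 1.35    # +35% average 4K raster performance vs the RTX 4090
power_ratio = 1.42   # +42% power draw vs the RTX 4090

print(f"Perf/W vs RTX 4090: {perf_ratio / power_ratio:.2f}")  # ~0.95, i.e. ~5% worse
```
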
5090 draws too much power for me. My wife is way too lazy to pedal the renewable energy stationary bike that fast.

According to TechRadar:

there was a noticeable improvement in image quality between the 5090 and 4090 thanks to the new Transformer-based AI upscaling tech and improved Frame Generation techniques... I couldn't distinguish between real and generated frames while playing. Previous versions of Frame Generation could sometimes introduce a bit of blurring. That appears to be fixed here. Even though the RTX 5090 FE was now putting out 8K content at a frame rate that we just can't see at the moment (due to the 8K@60Hz limitation of current 8K TVs), I cranked up Multi-Frame Generation to 4X, so it was now generating three frames for every one rendered frame, allowing the RTX 5090 FE to hit 148.89fps, and without any noticeable reduction in image quality.
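
For context on that 4X figure: Multi-Frame Generation at 4X displays three generated frames for every rendered one, so the natively rendered frame rate is roughly a quarter of what's shown. A rough sketch of the arithmetic (real scaling isn't perfectly linear, so treat it as an approximation):

```python
# Approximate natively rendered frame rate behind the MFG 4X number quoted above.

displayed_fps = 148.89   # TechRadar's quoted figure at 8K with MFG 4X
frames_per_rendered = 4  # 1 rendered frame + 3 generated frames

print(f"Rendered: ~{displayed_fps / frames_per_rendered:.1f} fps")  # ~37.2 fps at 8K
```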
 
The 5090 shines at 4K.
It's the fastest thing out there by a hell of a long way; they just needed to use obscene amounts of power to do it. It's definitely hitting CPU limits hard at anything lower than 4K.


Will be interested to see for myself how much better the Frame Gen is sometime. Something is a bit blurry with all the toys turned on with the 4000 series, as I said before, so that might have to be the saving grace.

AMD is not even in the picture at the upper end, whatever type of rendering we talk about, but hopefully the 9070 will surprise on price/performance.
 
Seems pretty impressive after the general dismissal of AI Frame Gen; the promise of high-FPS, high-quality gaming at 4K is intriguing, but it's all just too rich for my blood.

5090 is definitely a halo product.

I'm still not sure what to think, really. Going by the specs, it's likely that the 5080, 5070 Ti and 5070 are really not going to be much of an improvement over the previous gen without Multi Frame Gen. So it depends on how many games have it over time and what AMD and Intel come up with in the next couple of years.
 
Really, it just shows you don't need to go crazy with the motherboard purchase, as a PCIe 4.0 slot is close enough. The B850 motherboard in my next PC should be good enough... not that I am buying a 5090, but if it doesn't see a massive gain from PCIe 5.0, I doubt many others will either.

Really, not many cards need it yet. I don't think any were saturating PCIe 4.0 yet.
Marketing for sure... PCIe 6 is on the horizon... I guess that one is for the 60 series...
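
For anyone weighing the PCIe 4.0 vs 5.0 point above, the theoretical x16 bandwidth gap is easy to work out (a sketch of the spec numbers; real-world throughput is lower):

```python
# Theoretical one-direction bandwidth of an x16 slot per PCIe generation.
# Gens 3-5 use 128b/130b encoding; PCIe 6.0 (PAM4 + FLIT) is noted separately.

LANES = 16
transfer_rates = {"PCIe 3.0": 8, "PCIe 4.0": 16, "PCIe 5.0": 32}  # GT/s per lane

for gen, gts in transfer_rates.items():
    gbytes = gts * (128 / 130) / 8 * LANES   # GT/s -> GB/s after encoding overhead
    print(f"{gen} x16: ~{gbytes:.1f} GB/s")
# -> ~15.8, ~31.5, ~63.0 GB/s; PCIe 6.0 lands around ~121 GB/s per direction at x16.
```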
 
Well, it does kind of seem that we're reaching the point of diminishing returns with PC hardware, not to mention graphical fidelity; but maybe I'm just an old curmudgeon.

I don't generally see a huge difference between Low and High graphical settings in most games these days, and running things at higher resolution on my laptop doesn't appear any crisper than 1080p on my desktop. But again, I might just be old.
 
Diminishing returns combined with bumping up against the limits of silicon.

You're right though, most 3D games from the 2010s still look pretty good by today's standards; visual improvements are more and more incremental. Frame generation and DLSS aren't bringing anything more than smoothness, and RT is just icing on the cake.

Bracing myself for the 5070 reviews anyway. I want it to be great, but there's just no way it's going to be worth an upgrade from the previous generation. I want to see double what I have in my price range before I think about buying, so I might be waiting a while.
 
Graphics will never be finished, as it's hard to sell a new GPU if there isn't some advantage to it.

I've been waiting for it to reach an end point so gameplay might have a chance to be seen as important again, but after 20 years I am starting to think it won't ever happen. Just keep adding fake frames and making it look amazing; who cares if it feels like crap.
 
