Nvidia's Ampere architecture and VRAM?

I'm sure a lot of people agree that the RTX 3070's 8GB will fall short as time goes on, but I do wonder what Nvidia were thinking when they designed the RTX 30 series GPUs. The 3060 gets a rather generous 12GB, yet the card itself doesn't seem to have enough horsepower to make full use of it, while the 3070 is a powerful card whose smaller VRAM pool means you won't be targeting higher resolutions as time goes on.

As someone who has an RTX 3070 myself, I can say the following:

Cyberpunk 2077 is effectively a 1080p game on that card, even after tweaking settings to balance decent performance with real-time ray tracing, and you might run into thermal throttling after extended play

You can forget about 4K with Kingdom Hearts III -- 1620p is the highest resolution you can go to at max settings before your framerates really take a hit

In Killing Floor 2 I'm already using a custom low config to minimize performance hiccups, because it's still UE3 in 2021

Necromunda: Hired Gun on the RTX 3070 can really only match the PS5 and Xbox Series X's 1800p ultra

I dunno, I just thought I'd make this thread to voice my confusion about Nvidia's decisions regarding the amount of VRAM they put in their cards.
 

In this case you can believe the Quora guy (except the last bit about 3GB 1060s; he doesn't seem to realize there were more than just VRAM differences between those cards). They didn't think 6GB would cut it, and because of the way memory buses work on graphics cards, the choice was between 6GB and 12GB.
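To illustrate the bus-width point, here's a rough back-of-the-envelope sketch in Python. It assumes standard GDDR6 parts with a 32-bit interface per chip and 1GB or 2GB densities, which is where the "6 or 12" (and "8 or 16" for the 3070) split comes from; treat the numbers as illustrative rather than anything official:

```python
# Why bus width constrains the VRAM options: each GDDR6 chip has a
# 32-bit interface and came in 1 GB or 2 GB densities at the time,
# so capacity = (bus width / 32) * chip density.

CHIP_BUS_BITS = 32          # interface width of one GDDR6 chip
CHIP_DENSITIES_GB = (1, 2)  # common GDDR6 densities: 8 Gb and 16 Gb chips

def vram_options(bus_width_bits):
    """Possible VRAM sizes (in GB) for a given total memory bus width."""
    chips = bus_width_bits // CHIP_BUS_BITS
    return [chips * density for density in CHIP_DENSITIES_GB]

for name, bus in (("RTX 3060, 192-bit", 192), ("RTX 3070, 256-bit", 256)):
    print(f"{name}: {vram_options(bus)} GB")
# RTX 3060, 192-bit: [6, 12] GB
# RTX 3070, 256-bit: [8, 16] GB
```

Clamshell designs that put two chips on each 32-bit channel can double those figures again (that's how the 3090 gets 24GB on a 384-bit bus), but that adds cost and board complexity.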

Nvidia have been skimping on VRAM compared to AMD for a long time; it cuts their manufacturing costs and makes their cards obsolete more quickly, so customers have to upgrade sooner. What's not to like? :p
 
Nvidia have been skimping on VRAM compared to AMD for a long time; it cuts their manufacturing costs and makes their cards obsolete more quickly, so customers have to upgrade sooner.
I mean, that seems to be the likely scenario, though a more optimistic read would be that Nvidia figured DLSS would help carry the 3070 at higher resolutions despite it not having the VRAM for them.

Then again, I wouldn't know personally, since I'm on a 1080p Samsung monitor that's only SLIGHTLY less old than my old LG one, and I've mostly been using downsampling wherever the option's available.
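For what it's worth, the way DLSS would help is by rendering internally at a lower resolution and upscaling to the target. Here's a quick sketch of those internal resolutions using the rough per-axis scale factors of the DLSS 2.x modes (about 0.67 for Quality, 0.58 for Balanced, 0.5 for Performance, 0.33 for Ultra Performance); the actual VRAM savings vary per game, so take it as a ballpark:

```python
# Internal render resolutions that DLSS 2.x upscales from, using the
# approximate per-axis scale factor of each quality mode.

DLSS_SCALE = {
    "Quality": 2 / 3,
    "Balanced": 0.58,
    "Performance": 0.5,
    "Ultra Performance": 1 / 3,
}

def internal_resolution(out_w, out_h, mode):
    """Approximate internal render resolution for a given output and DLSS mode."""
    s = DLSS_SCALE[mode]
    return round(out_w * s), round(out_h * s)

for mode in DLSS_SCALE:
    w, h = internal_resolution(3840, 2160, mode)
    print(f"4K {mode}: renders at {w}x{h}")
# 4K Quality: renders at 2560x1440
# 4K Performance: renders at 1920x1080
```

So "4K with DLSS Quality" is really a 1440p render under the hood, which is a big part of why it eases the load on both the GPU and the framebuffer.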
 
I mean, that seems to be the likely scenario, though a more optimistic read would be that Nvidia figured DLSS would help carry the 3070 at higher resolutions despite it not having the VRAM for them.

True, DLSS might help the newer cards, but it's only been around for about two years; I'm not sure what their excuse was before that for the 2GB GTX 680, for example.

This generation there was a hike in the cost of VRAM from the memory manufacturers. I believe that when Nvidia costed out putting 16GB of VRAM on the 3070 and up, they couldn't get to a point where their margins were acceptable at a price the market would accept.

Now that everything has gone nuts, with chips and cards flying out of the factory before anyone sees them and Nvidia making more money than ever, I don't think they have much incentive to adjust that behaviour next time around.
 
