This is just my observation, but there are really very few AAA PC games coming out year after year... and 90% of the games coming out are indie games.

I play "indie" games almost exclusively. Last year I bought a large amount of games and only 1 of them, if I remember correctly was AAA, and this year there's only one I see myself buying.

The complication is that I'm really into running and developing AI on my computer, but my computer with the 2080 in it already runs the AI just fine. It's slow, but it's workable.

So I've been going back and forth on whether I'm going to go big this time. A 5090 would run a training workload about 16 times faster than my 2080, and that's quite a step up.
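For anyone weighing a similar upgrade, the easiest sanity check is to time your own training loop on the card you already have and compare steps per second. Below is a minimal sketch of that kind of micro-benchmark; it assumes PyTorch with CUDA, and the toy model, batch size, and step counts are placeholders rather than anything from my actual workload.

```python
# Rough training-throughput check: time a fixed number of optimizer steps.
# The model, batch size, and step counts below are arbitrary placeholders.
import time

import torch
import torch.nn as nn

assert torch.cuda.is_available(), "This sketch assumes a CUDA-capable GPU."
device = torch.device("cuda")

# Toy stand-in model; swap in whatever you actually train.
model = nn.Sequential(
    nn.Linear(1024, 4096), nn.ReLU(),
    nn.Linear(4096, 4096), nn.ReLU(),
    nn.Linear(4096, 1024),
).to(device)
optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)
loss_fn = nn.MSELoss()

x = torch.randn(256, 1024, device=device)
y = torch.randn(256, 1024, device=device)

def step():
    optimizer.zero_grad()
    loss_fn(model(x), y).backward()
    optimizer.step()

# Warm-up so one-time CUDA setup doesn't skew the timing.
for _ in range(10):
    step()
torch.cuda.synchronize()

steps = 100
start = time.time()
for _ in range(steps):
    step()
torch.cuda.synchronize()

print(f"{torch.cuda.get_device_name(0)}: "
      f"{steps / (time.time() - start):.1f} steps/sec")
```

Run the same script on the old and new card (or against someone else's result) and the ratio of steps/sec is the real-world speedup for that particular workload.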
 
I play "indie" games almost exclusively. Last year I bought a large amount of games and only 1 of them, if I remember correctly was AAA, and this year there's only one I see myself buying.

The problem is that I'm really into running and developing AI on my computer, but my computer with the 2080 in it runs the AI just fine. It's slow, but it's workable.

So I've been back and forth as to whether I'm going to go big this time. A 5090 would run a training program about 16 times faster than my 2080, and that's quite a step up.

So... how about those people who mostly play indie games? Because let's face it,

there are a lot coming... new ones... every year,

and the majority of them run on a 1660, for crying out loud.
 
It's not just indies either. The digitalisation of video games is often criticised because you no longer "own" your games, but it's easier than ever to play old games.

I use a GTX 1050 Ti in my PC, an 8-year-old GPU that was considered a budget-friendly card when it was released. But I'm not just playing indie games; there are plenty of AA or AAA games from 5+ years ago that I haven't had the time to play yet which run just fine.

And if you're okay with running on the lowest graphics settings, you can run a surprising number of modern games on old hardware as well. Turn-based games especially are usually not a problem, since they aren't as graphically demanding and it doesn't matter as much if they only run at 30 FPS.
 
And there are very few AAA games worth buying among the ones that do get released. They are made by dinosaurs; I'm waiting for the asteroid.

But anyway, yes... there's no reason to buy a new GPU when most of the games don't use the features, or push your card to a point where it really notices. I mostly play indie games, which explains my frame rates:
[attached screenshot of frame rates]
 

Frindis

Dominar of The Hynerian Empire
Moderator
Seems to me that the latest GPUs are mainly for 4K+ bragging rights, getting photorealistic screenshots, or playing games on a multi-monitor setup, like some do with Flight Simulator in their own home-built cockpit. I'm sure there's more use for a strong GPU if you are doing a lot of video/photo rendering, but then you are probably also looking for something completely different from a gaming PC for more power, like a Threadripper workstation.
 
When the majority of players are still using 1080p monitors, there isn't a lot of demand for 4K.
Being able to run 4K also means updating most of the PC to support it.
A 5090 probably requires a new PSU as well; I've seen 1600 watts thrown around.

Then you add in the cost. A 5090 costs $2K in the US, $4K in Australia.
I put together a new PC build idea yesterday and the entire PC costs less than the 5090 would, mainly because my GPU is free, who knew (https://au.pcpartpicker.com/list/s7z9Kq).

So are the games worth spending the money to play?

No one has mentioned the distinct lack of MTX in most indie games. Not all of them... but there are fewer than in AAA. Indie doesn't think it owns your wallet.

However, if faced with paying a subscription for a streaming service or upgrading my PC, I would still choose the latter personally.
Yes, well, owning is almost always better than renting.
 
You don't have to spend $2,000 on a GPU, though. The 5070, which costs $540, was 40 fps faster than the 4090 in one test, per that article from PCG I posted. You don't have to move to the best neighborhood in town; you just have to get out of the abandoned building you've been camping in.

But, no, AAA games are not worth spending $2,000 on a GPU for.
 

The xx70 iteration has always been the sweet spot. But I wonder how much longer 12 GB will be adequate for AAA games? I played two new AAA games last year, and they were both using upwards of 8-9 GB of VRAM. And that's with no ray tracing.

Edit:
Some new AAA games also no longer give an option to disable ray tracing entirely, so the only option is to use the lowest possible settings.

Anything above an xx70 isn't worth it to me at the moment, since I game at 1440p (and sometimes at 5120 x 1440 if I'm on my actual monitor and not my TV). I can hit a locked 120 fps on most AAA games on my 4070S (again, with no ray tracing). I tried out a 4080S briefly and the price/performance just wasn't worth it at 1440p.
 
I would take a guess that it'll be fine at least until the next console generation. If the PS5 and Xbox both have 16GB of combined RAM, then developers will need to aim to stay within that, and I assume that should translate roughly to PC. I'm not a developer or anything though, so I could be wrong, and obviously 4K is going to push more than lower resolutions.

As I understand it, if you're looking at the Afterburner OSD or GPU-Z or something, the reported VRAM used is the amount allocated or set aside for use; it doesn't necessarily mean the game is using all of it. If you actually run out of VRAM, I think the slowdowns are quite obvious, because spilling over into system RAM is so much worse.

If devs want to sell a decent volume of games on PC, then they will have to aim for an 8GB minimum at 1080p for a few years, I would think.
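If you want the number the driver reports rather than a per-game overlay, a minimal sketch like the one below works, assuming an NVIDIA card and the nvidia-ml-py package (import name pynvml). Note this is card-wide usage, the same figure nvidia-smi shows, not a per-game breakdown.

```python
# Minimal sketch: read current VRAM usage for the first GPU via NVML.
# Assumes an NVIDIA driver and the nvidia-ml-py package (pip install nvidia-ml-py).
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)  # first GPU in the system

mem = pynvml.nvmlDeviceGetMemoryInfo(handle)
print(f"VRAM used: {mem.used / 1024**3:.2f} GiB of {mem.total / 1024**3:.2f} GiB")

pynvml.nvmlShutdown()
```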
 
Whatever is laziest, AAA developers will do. If they think they can get away with fewer optimization passes by increasing VRAM requirements, they will do it. But they still have to be able to sell games, and this would kill that right now. Hardly anyone has over 8GB of VRAM if you look at the Steam hardware surveys.
 

I do use AB from time to time, but one of the games I was referring to has an in-game VRAM usage bar. I'm fairly certain those numbers are actual usage, as Star Wars Outlaws in particular has a memory leak, and you can see the darned VRAM slowly going up over time. When it gets above 10GB, the game starts to stutter.
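If you want to confirm that kind of creep rather than eyeball an overlay, a rough sketch like the one below logs card-wide VRAM usage to a CSV while you play, again assuming an NVIDIA card and the nvidia-ml-py (pynvml) bindings; the one-minute interval and file name are arbitrary choices, not anything from the game.

```python
# Rough sketch: log card-wide VRAM usage once a minute so a slow leak shows up
# as a steady climb in the CSV. Assumes an NVIDIA GPU and nvidia-ml-py (pynvml).
import csv
import time

import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

with open("vram_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["elapsed_minutes", "vram_used_gib"])
    start = time.time()
    try:
        while True:
            used_gib = pynvml.nvmlDeviceGetMemoryInfo(handle).used / 1024**3
            writer.writerow([round((time.time() - start) / 60, 1), round(used_gib, 2)])
            f.flush()  # keep the log intact even if the game (or script) crashes
            time.sleep(60)
    except KeyboardInterrupt:
        pass  # Ctrl+C to stop logging

pynvml.nvmlShutdown()
```

Plot the CSV afterwards; usage that keeps climbing over a play session without levelling off is what a leak looks like.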
 
What one gamer considers "AAA", another might not, so it's really about how one looks at it, like the title suggests.

GGG, who makes POE/POE 2, isn't considered "AAA" because of the studio that released it, but the production, community, updates, etc. of its early access POE 2 suggest otherwise. It looks better than a lot of games considered AAA (better than Ubisoft's "AAAA" game Skull and Bones, or whatever).

Also, the studios that are considered "AAA" are always releasing games; you just might not hear about or see them. If you are a huge indie player, your feeds (Steam feed, game feed) are going to surface more and more indie games and bury what are supposed to be "AAA" releases.

Either that, or a big AAA company like Bungie decides to release live service "AAA" games where they have to sink huge amounts of money and time into one game year after year. Blizzard is another example. But remember, these two companies are owned by bigger AAA companies that release multiple AAA games.

I love that more and more indie games are coming out as the tools to make them become extremely simple to use, but there is still a huge amount of AAA; I'd say it's more like 60-40 rather than 90-10, imo.
 
How does a topic about AAA vs indie game release frequency turn into a hardware performance topic?
Oh what a tangled web we weave. :)
So, it's really hard for some people to decide whether to upgrade their GPUs when you don't need the 50 series for most of the games coming out...

heck, even the 40 series.
I think it ends up there no matter what.
Eventual GPU and hardware tech references/topics are the PC gamer/tech nerd equivalent of all conversations leading to sex.
Damn, I've been having conversations here for 5 years and not even once.
 

Zloth

Community Contributor
I'm not afraid to buy any game with my 3080. The main thing I see improving is ray tracing. It's not a big deal, certainly not compared to days of yesteryear when we would get a bunch of new features every generation, but it is nice. Bigger VRAM helps some things, too. But yeah, whatever class of gaming you talk about, the days of needing a new video card every couple of years are long gone.

Speaking of gaming classes.... HELLO!?!? It's not just indies and AAA! Remember B games - now more commonly called AA because everyone forgot about the B games, too. Plenty of those have very nice graphics, too.
Hardly anyone has over 8GB of VRAM if you look at the Steam hardware surveys.
I did. 35% have exactly 8GB. It looks like those above and below that point are split about evenly, weighted a little toward below.

(And over half of people are now using Windows 11. Time for me to get upgrading. Which I'm pretty sure I said last year, too.)
 
So you'd be able to sell your games to 30 percent or less of the market (in terms of your game being properly optimized). Given that AAA supposedly uses marketing data to reach the most people possible, I'm guessing this is the last thing they'd do, particularly if the consoles have 8 GB.
 

Zloth

Community Contributor
Given that AAA supposedly uses marketing data to reach the most people possible, I'm guessing this is the last thing they'd do, particularly if the consoles have 8 GB.
Yeah, if it were really a requirement. But it isn't. The new FF7R only recommends a GeForce RTX 2070, which has 8GB, and I think it can deal with the 2060, which has just 6GB. Is there a game requiring more?

(Oh, it does require an RTX card. I wonder why? Ray tracing is just a graphics option. Are they using something else that RTX brings?)
 
How does a topic about AAA vs indie game release frequency turn into a hardware performance topic?
notices... bans everyone who went off topic... gee, it's awfully quiet around here. :)

I guess it's one of the main differences between most AAA/indie titles... what sort of PC you need to run them.
It mainly stems from the fact that AAA games take so long to make because of all the time wasted on ray tracing and other graphics features and not on the actual gameplay.
Indies take risks; AAA buys the indie companies that succeed... and rips them to shreds trying to understand why they succeeded... often just ending up with a dead carcass at the end. That is why you could call them vultures.
 
