RX 7900 XTX/XT Announcement day thread

Since we don't have one yet...

All I can offer are reviews of what they showed; no actual comparison tests yet.

RX 7900 XTX - $999 USD - https://www.amd.com/en/products/graphics/amd-radeon-rx-7900xtx
RX 7900 XT - $899 USD (may be reduced) - https://www.amd.com/en/products/graphics/amd-radeon-rx-7900xt
The XT is only ~85% of the performance of the XTX (quick value math below).
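Quick back-of-the-envelope on value, assuming that ~85% figure holds. A sketch in Python; the numbers come from the announcement above, nothing official:

```python
# Dollars per unit of relative performance, taking the XTX as the
# 100% baseline and the XT at ~85% (AMD's own comparison).
xtx_price, xtx_perf = 999, 1.00
xt_price,  xt_perf  = 899, 0.85

print(f"XTX: ${xtx_price / xtx_perf:.0f} per performance unit")  # ~$999
print(f"XT:  ${xt_price / xt_perf:.0f} per performance unit")    # ~$1058
```

So on paper the cheaper card is actually the worse deal per frame.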


All the videos show the same things, so no point posting more.

Seems some people expected AMD to catch Nvidia in RT; I never assumed that, since they're a generation behind. The XTX might make the 4080 look bad on price.

I was waiting for a 7800 XT, assuming power draw would be the same or higher this year. Instead, the 7900 XT has the same power draw as the 6800 XT... so I am now looking at it.
 
Shame some YouTubers come across as clueless about how things work, seeing as some of them fell for the 8K resolution trick AMD pulled (they say 8K, but what they present as 8K is half of 8K in pixel count). Other than that, I can't wait to see real tests on these cards, seeing as AMD has actually gotten interesting to me for the first time in 15 years.
 
The mention of 8K by AMD & Nvidia was probably promoted by panel makers who want us constantly upgrading our screens. There aren't many actual 8K screens now, so it's a little premature; 5K monitors are more common. Only idiots think it makes sense to play above 4K now, and good luck convincing some gamers that 1440p is good enough... I mean, DP 2.1 can only run 1440p at 900Hz, so obviously that's too slow (sarcasm) to play any games now.
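For anyone wondering where that 900Hz figure comes from, here's a rough sanity check. The 8-bit RGB assumption is mine, and it ignores blanking intervals and DSC, both of which matter in practice:

```python
# Raw (uncompressed) video bandwidth for 1440p at 900 Hz.
pixels_per_frame = 2560 * 1440
bits_per_pixel = 24            # assumed 8 bits per channel, RGB
refresh_hz = 900

raw_gbps = pixels_per_frame * refresh_hz * bits_per_pixel / 1e9
print(f"Raw video rate: {raw_gbps:.1f} Gbit/s")  # ~79.6 Gbit/s

# DP 2.1 UHBR20 is an 80 Gbit/s link with roughly 77-78 Gbit/s of
# payload after 128b/132b encoding, so the 900 Hz figure leans on DSC.
```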
 
The mention of 8K by AMD & Nvidia was probably promoted by panel makers who want us constantly upgrading our screens. There aren't many actual 8K screens now, so it's a little premature; 5K monitors are more common. Only idiots think it makes sense to play above 4K now, and good luck convincing some gamers that 1440p is good enough... I mean, DP 2.1 can only run 1440p at 900Hz, so obviously that's too slow (sarcasm) to play any games now.
It was more a grudge toward AMD using "8K" for a resolution that isn't really 8K. 8K UHD is 7680 × 4320, while what they referred to was 8K ultrawide, which is 7680 × 2160, i.e. half the pixels. Even though they're technically correct, it's a bit of a shady thing to do IMO, since most people have no clue about the difference, and also because Nvidia got a ton of flak for doing the same thing.
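For the record, the pixel math (a quick sketch using the consumer 8K UHD figure):

```python
uhd_8k = 7680 * 4320   # 8K UHD: ~33.2 million pixels
uw_8k  = 7680 * 2160   # the "8K" ultrawide AMD showed: ~16.6 million

print(uw_8k / uhd_8k)  # 0.5, i.e. exactly half the pixels
```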

My next card will probably be a 7900 XT or XTX depending on price and performance, unless they turn out really disappointing, which I kinda doubt.
 
I didn't know AMD used a slightly misleading definition of 8K, but really, no one will be playing at that resolution any time soon (I realise by saying that, someone will post "I do", but the majority of the population won't), so it's more a reply to Nvidia saying they can in the 4090 reveal.

Tangentially, you could blame whoever controls screen resolution names, as there are too many that are too similar. 4K & 8K shouldn't both be "UHD". And yet...

I know RT won't be as good as Nvidia's. That doesn't affect my decision, as most of the games I play won't use it anyway.
Have to wait until December before any actual tests can be run. I hadn't noticed they both only require a 750-watt PSU, so I'll have to look at partner board models to see if any of those need more. If the price difference here isn't too great, I might get an XTX. But I don't expect to see any of them here much before January.

The unreleased 4080 8GB sure looks like a dead duck compared to these two. It was meant to be $900, and it would have been lined up against a card with 24GB of VRAM.
 
I'm in no rush to buy a new one; the cooling seems stable on my current card atm after a paste replacement. Just want more FPS in Warhammer 3 tbh :) Both 8K and 4K have a lot of different resolutions inside their "definitions", making it really confusing sadly :(
 
The 40(7)80 had/has 12GB of VRAM, but still, yeah. I heard somewhere (the PC Perspective podcast?) that they had already released the reference cards to some partners and had to compensate some of them for box art design and so on.

Wonder when the review embargo is up for the 7900s; a site I'd never heard of says 14 November, and Guru3D says more info next week.
 
only require a 750-watt PSU
My concern is the large number of fans required to cool such systems. They'll be spinning so fast in an enclosed space near each other, that they'll generate enormous heat which will have nowhere to go—cos there'll inevitably be another fan in the way blocking its path.

There's gotta be a way to adapt solar tech to capture all that heat and turn it into electricity. That's the final big hurdle to making PCs self-contained units, ideal for the doomsday prepper or Antarctic explorer.

ETA: this post is a joke, should've made that clearer :)
 
My concern is the large number of fans required to cool such systems. They'll be spinning so fast in an enclosed space near each other, that they'll generate enormous heat which will have nowhere to go—cos there'll inevitably be another fan in the way blocking its path.
I don't foresee having to change the number of fans I have, though part of my plan was always to replace all of my exhaust fans with Noctua chromax.black.swap fans. I don't know what you envision cases look like inside, but the number of obstructions between the fans and the outside world is small.

It's early, but it's possible the cards don't generate any more heat than previous models and might not be as hot as an Nvidia 4090 would be. That, and the reference design isn't a brick; air can flow around it pretty easily.
 
Not sure what the market will try to sell us when bigger screens become impractical for the size of normal houses and apartments, and resolution no longer makes any difference to the human eye. There won't be much point in a GPU that performs a huge amount faster every two years when we plateau at something like 8K 240Hz at 55 inches, and no one at home can tell the difference between a xx90 and a xx60 except for a 10°C difference in room temperature. I'm sure they'll think of something.
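Rough math on whether 8K is even resolvable from the couch. The 2m viewing distance and the ~60 pixels-per-degree figure for 20/20 acuity are both my assumptions:

```python
import math

# Pixels per degree of visual angle for a 55" 16:9 8K panel at 2 m.
diag_in, distance_m = 55, 2.0
width_m = diag_in * (16 / math.hypot(16, 9)) * 0.0254  # panel width

fov_deg = math.degrees(2 * math.atan(width_m / 2 / distance_m))
print(f"{7680 / fov_deg:.0f} px/degree")  # ~227, way past ~60 ppd acuity
```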

It's early, but it's possible the cards don't generate any more heat than previous models and might not be as hot as an Nvidia 4090 would be. That, and the reference design isn't a brick; air can flow around it pretty easily.

I noticed the board power of the 7900 XTX is 20 watts higher than the 6950 XT's. Considering that 6950 XT custom cards were hitting 450 watts, we can definitely expect more heat compared to last gen. That's around the same as the 4090.


The 7900 XT is rated the same as the 6900 XT at 300W though, so custom cards will probably land around the same ~350W.
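A crude extrapolation from those last-gen numbers; pure guesswork on my part, using only the figures mentioned above:

```python
# Reference TBP vs. typical hot custom cards last gen, applied as
# absolute bumps to the new reference TBPs. Very rough.
last_gen = {"6900 XT": (300, 350), "6950 XT": (335, 450)}
bumps = [custom - ref for ref, custom in last_gen.values()]  # +50, +115 W

for name, ref in [("7900 XT", 300), ("7900 XTX", 355)]:
    print(f"{name}: {ref} W ref, maybe {ref + min(bumps)}-{ref + max(bumps)} W custom")
```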
 
It makes me think we're reaching a stage where:
  • screen resolution doesn't matter (past 4K anyway)
  • refresh rates are getting so fast you can't tell the difference anymore
Soon only graphical effects will separate cards: how good they are at RT, and what other effects they can pull out to sell people.

Getting people to buy your next card will get harder as the gains will be harder to show.

Could be why AMD only half-heartedly tested 8K, as really it's not something many of us will own - https://www.digitaltrends.com/home-theater/eu-efficiency-standards-would-ban-8k-4k-tvs/

I might try to get a reference card, as both max out at a 750W PSU.
 
I guess there'll always be a market for faster compute, datacentres, supercomputers and AI stuff.


Looks like datacentre is almost double the gaming revenue for Nvidia.

I think RT will just be in every 3D game in a gen or two and won't be tested separately like it is now. Stuff like FSR and DLSS might become irrelevant as card power increases and resolutions and refresh rates plateau.

That's all my crystal ball has for now; they'll think of something to sell gamers more performance for sure.
 
My concern is the large number of fans required to cool such systems. They'll be spinning so fast in an enclosed space near each other, that they'll generate enormous heat which will have nowhere to go—cos there'll inevitably be another fan in the way blocking its path.

There's gotta be a way to adapt solar tech to capture all that heat and turn it into electricity. That's the final big hurdle to making PCs self-contained units, ideal for the doomsday prepper or Antarctic explorer.
There has been tech to use waste heat for ages, but it's usually used to heat entire buildings etc. I'm no expert on this, but I have a feeling that doing it on as small a scale as a single PC is probably really inefficient.
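For a feel of why: even the thermodynamic ceiling is low at PC temperatures. The exhaust and room temperatures here are assumed, and real thermoelectric generators recover only a fraction of the Carnot limit:

```python
# Carnot limit on turning GPU exhaust heat back into electricity.
t_hot = 273.15 + 60    # ~60 C exhaust air, in kelvin (assumed)
t_cold = 273.15 + 25   # ~25 C room air (assumed)

carnot = 1 - t_cold / t_hot
print(f"Theoretical max efficiency: {carnot:.1%}")  # ~10.5%
```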
 
I didn't know AMD used a slightly misleading definition of 8K, but really, no one will be playing at that resolution any time soon (I realise by saying that, someone will post "I do", but the majority of the population won't), so it's more a reply to Nvidia saying they can in the 4090 reveal.

Not sure who was first, but they've both been doing it for a long time; it was the same when 4K was the shizzle. Doubt it's a reply to Nvidia; it's more of a "they did it, so we can do it too" to make their stuff look better than it really is. Yes, no one is gaming at 8K now, but the same was said about 4K not long ago :p
 
Yes, no one is gaming at 8K now, but the same was said about 4K not long ago :p

I know, I had a 4K screen 7 years ago. Not being able to play games at its native resolution was a problem until I got this PC 5 years later. I've since gone back to 2K (1440p), as I had been running the 4K desktop at that resolution anyway - Win 10 defaulted to it.

So I won't be sold on a bigger screen any time soon. My current one is only 2 years old.

Shame there won't be a lot of info on the 7900 cards for a while now. I'll probably get the XT since it will be cheaper. It will still do everything I need and beat my 2070 Super.
 
I know, I had a 4K screen 7 years ago. Not being able to play games at its native resolution was a problem until I got this PC 5 years later. I've since gone back to 2K (1440p), as I had been running the 4K desktop at that resolution anyway - Win 10 defaulted to it.

So I won't be sold on a bigger screen any time soon. My current one is only 2 years old.

Shame there won't be a lot of info on the 7900 cards for a while now. I'll probably get the XT since it will be cheaper. It will still do everything I need and beat my 2070 Super.
Honestly, I am so content with my 27" 1440p 144Hz monitor that I really just don't see the need for anything better for the foreseeable future. It looks great and provides substantially more performance overhead relative to 4K. I'm sure 4K does look better at this screen size, but I doubt it's worth the performance hit. But monitor size is obviously a major factor in this discussion. If you're running something bigger, like 32"+, then 4K is significantly more compelling.
 
I know, I had a 4K screen 7 years ago. Not being able to play games at its native resolution was a problem until I got this PC 5 years later. I've since gone back to 2K (1440p), as I had been running the 4K desktop at that resolution anyway - Win 10 defaulted to it.

So I won't be sold on a bigger screen any time soon. My current one is only 2 years old.

Shame there won't be a lot of info on the 7900 cards for a while now. I'll probably get the XT since it will be cheaper. It will still do everything I need and beat my 2070 Super.
I'm still at 1440p, and I really wouldn't consider changing if it wasn't for a dead pixel mid-screen, but anything that's an upgrade from my current monitor is so damned expensive that I decided I'll live with that pixel a while longer.
 
Getting a little tired of the same old line: the XTX is better value for just $100 more... which is fine unless you need a new PSU to cover the extra 55 watts it needs to run. Then the price gap looks different.
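To put numbers on that; the PSU price below is purely illustrative:

```python
# How the "$100 more" gap changes if the XTX's extra 55 W forces
# a PSU upgrade. The $120 figure is a made-up placeholder.
xtx_price, xt_price, psu_upgrade = 999, 899, 120

xtx_total = xtx_price + psu_upgrade
print(f"XT:  ${xt_price}")
print(f"XTX: ${xtx_total} (real gap: ${xtx_total - xt_price})")  # $220
```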

Also depends on whether you must have the best 4K performance, and as I have a 1440p monitor and won't be changing anytime soon... that argument doesn't work on me. I'm all for everyone else wanting an XTX, as it makes things easier for me.

Also depends on location, as I don't expect the price difference to be that close in Australia.

It will be an upgrade over what I have now; I don't care about having the best.

Also, 3 x 8-pin...

Seems I only have two 8-pin connectors... damn it. I could get an AMD card and not worry, as they are 2 x 8-pin.
Goes back to ignoring them until they're actually released.
 


I think the interesting thing here is that AMD really feels like a "new generation": they're releasing direct replacements for the old cards that hit similar price points with better performance. And that improved cost-to-performance ratio is big.
Paul's Hardware: see this video
 
