Random Hardware topics

The 7800 XT might be rejected 7900 XT dies that didn't quite have enough working cores. So it would be based on chiplets, as opposed to the old monolithic dies Nvidia still uses, and it seems they have plenty of spare silicon. It won't be as powerful, obviously. I'm not sure what VRAM it would have.

Until consoles get more than 16GB of unified RAM, there won't be many games that need more than 16GB of VRAM at most. So it makes me wonder why some cards have more, unless it's for non-game tasks. Having more than you need is a nice position to be in, though.

I probably need to find a game that pushes the card. I wonder what Diablo 4 will be like; I know it doesn't use RT. I suspect my CPU is likely the bottleneck in my system, but the secret with bottlenecks is you will always have one, regardless of how hard you try.
 
Exactly, bottlenecking is a bit of a noob trap. If your CPU only gets a game to 130 FPS but your card could hit 200 if it weren't there, it probably doesn't matter in most cases. A lot of games are poorly optimized as well, so they're hard on the CPU whatever happens. Diablo 4 looks pretty good from the screenies I've seen, though GameGPU makes it look not overly demanding, at least at 1080p.
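To put rough numbers on the noob-trap point, here's a toy sketch in Python (the limits are made up, not benchmarks): the frame rate you actually see is just the slowest stage in the chain, whether that's the CPU, the GPU, or even the monitor's refresh rate.

```python
# Toy model: your effective frame rate is capped by the slowest stage.
# All numbers here are made up for illustration, not benchmarks.

def effective_fps(cpu_limit, gpu_limit, monitor_hz=None):
    """Frames you actually see: min of the CPU and GPU limits,
    optionally capped again by the monitor refresh rate."""
    fps = min(cpu_limit, gpu_limit)
    if monitor_hz is not None:
        fps = min(fps, monitor_hz)
    return fps

print(effective_fps(130, 200))       # CPU-bound: 130, the GPU headroom is moot
print(effective_fps(300, 152, 144))  # GPU could do 152, monitor caps it at 144
```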


I read they had two more chips, Navi 32/33, that the new cards were going to be based on. I also read a rumour somewhere that the 7800 and 7700 might have been delayed partly because they had figured out fixes for some architecture bugs, improving clock speeds to 3GHz plus, and they couldn't retroactively apply the fixes to the 7900 series, which would put the cards closer together than they intended. It was probably on Videocardz, so grains of salt and all that.

Personally I'm hoping it's true, and also that they improve RT as well; I can dream. 4070 pricing is at a level here I wouldn't be against paying, so I'm hoping the Radeons land thereabouts with faster raster and more VRAM at least.
 

Brian Boru:
does anyone actually play games at ultra settings?
I played Crysis amped to the max, just to see what the fuss was about. That was around 5-7 years after it released tho :D
Otherwise I usually play with defaults, or try High if the default is lower. Visuals aren't a big deal for me unless they interfere with gameplay, like making dark places unplayable.

secret with bottlenecks is you will always have one
One of the most user-friendly business books I've read was The Goal by Eli Goldratt. It was written as a novel, so it's much more readable than most biz books, and it espoused his Theory of Constraints, which centers on bottleneck management as an operations-management technique. Aren't you glad I shared this? :rolleyes:
 
Diablo 4 shows how the more VRAM you have on the GPU, the less RAM it has to steal from the system:
UDn32Vz.jpg

The test was conducted on a base configuration with a Core i9 13900K and 32GB of DDR5-6400 memory. Total RAM in use across the whole system was the measured indicator, with no extraneous applications (browsers, etc.) running. In the charts you can add and remove resolutions and video cards as desired.
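The pattern in that chart is easy to model: whatever graphics data doesn't fit in VRAM spills over into system RAM. A back-of-envelope sketch, with a made-up footprint and demand rather than GameGPU's measurements:

```python
# Back-of-envelope model of VRAM spillover into system RAM.
# The footprint and demand figures are illustrative assumptions,
# not numbers from the GameGPU test.

def system_ram_use(base_footprint_gb, gfx_demand_gb, vram_gb):
    """Game's normal RAM footprint plus whatever graphics
    data overflows a too-small VRAM pool."""
    spill = max(0.0, gfx_demand_gb - vram_gb)
    return base_footprint_gb + spill

for vram in (8, 12, 16, 24):
    total = system_ram_use(base_footprint_gb=10, gfx_demand_gb=14, vram_gb=vram)
    print(f"{vram:>2} GB card -> ~{total:.0f} GB system RAM used")
# 8 GB card -> ~16 GB, 16 GB card -> ~10 GB: more VRAM, less RAM stolen.
```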

Annoyingly, the website lists a 7900 XT but doesn't include it in most test results; it shows the XTX instead.

I should have known a Diablo game would hardly stress a PC out... hmm, I need to look elsewhere.

I can tell that if the new 7800 XT is just a cut-down 7900 XT, there will still be more reviews for it than there are for my GPU. Four months after release there still aren't any good reviews of the card I bought.
 
Sigh, no one is buying our GPU... I know, let's cover the fans with Spiderman stickers and call it the Spiderman version.
 
Missed that question: motion blur off and AA at low levels, but otherwise all Ultra if possible.

If Ultra doesn't run well enough, then I start reducing expensive settings or enabling FSR etc. So yea, I always play at Ultra if I can get over 60 FPS minimums while doing it.

GameGPU has always been a bit weird; if you click the drop-down menu and select the i3 10100, for some reason it's the fastest of all the CPUs :p

Still, they benchmark stuff that others sometimes don't, and the results mostly line up with other places, even if their web page sometimes has mistakes.

I think they have the 7900 XT though. I didn't realize the drop-down menu on the left lets you select resolution too.
 
The difference between minimum and max detail in D4 isn't that massive. It likely looks different in action. Diablo isn't really a spectacle, and it appears way more gritty in this edition.

I could set my games to ultra, but as most of mine are old, it's unlikely to make that much of a difference. My newest games don't really push the graphics; Last Epoch doesn't do a lot.
No one I watch play games is playing anything that makes me want to play it. I tend to watch games I will never play; it's one way to see the end, I guess.

They include the 7900 XT in some charts; here is one where I compared it against my last GPU on my CPU:
Jddu0lN.jpg

Looks a little CPU-bound at 1080p and 1440p. Not that I can use the speed it's hitting, and I don't care about 4K. If grey is the lows, then 152 is still faster than my monitor.
I can't afford this CPU:
UhXYddh.jpg

The i9 can't help the 2070, but it sure helps the 7900 XT.
I assume the 5800X3D can run the GPU faster, but my monitor can't keep up.
 
iGPUs have come a long way.

It can run Crysis at 75 FPS - sure, it's at 1080p.


low end Nvidia cards may be unnecessary soon

It's impressive, but people have been saying for years that low-end cards are at risk from onboard graphics as they improve. It hasn't really moved anywhere yet. I guess until the latest gen, AMD's CPUs didn't have integrated graphics, and people still use crappy GT 730s to output to multiple displays in offices and stuff.

Steam Deck and all the handhelds coming out are basically using integrated graphics and getting decent results. I think the new Asus thing is using the Radeon 780M.
 
Nice, didn't realize there were two configs. The lower one has quite a bit lower GPU specs though.

I'm tempted to get a Steam Deck; sometimes I just want to lie on the sofa and play something. The Asus looks great and all, but the price on the Deck is better, and I like the idea of the trackpads for playing mouse-based games.
 
This is related: Jedi Survivor needs 18GB of VRAM to run at 1440p alone. As the test was run on a 4090, we have to assume the 4K total is more than 24GB, or they would have shown it. No GPU outside the pro market has that much. Yet.
The silly thing is the game only uses 4 cores, so it is CPU-bound on the 4090; the GPU is mostly idle.
If it used all the cores on the 5950X, it would be better. They want speed over multithreading... sure sounds like Crysis revisited. CPUs are about as fast as they can get on silicon; use more cores.
Once my internet works I will add a Daniel Owens video as evidence.
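That said, "use more cores" only goes so far: Amdahl's law caps the gain by the fraction of a frame's work that can actually run in parallel. A quick sketch, where the 60% parallel fraction is an assumption for illustration, not anything measured from Jedi Survivor:

```python
# Amdahl's law: speedup from n cores when only a fraction p of the
# work is parallelizable. p = 0.6 is an assumed value for illustration.

def speedup(p: float, cores: int) -> float:
    return 1.0 / ((1.0 - p) + p / cores)

for cores in (4, 8, 16):
    print(f"{cores:>2} cores -> {speedup(0.6, cores):.2f}x")
# 4 -> 1.82x, 8 -> 2.11x, 16 -> 2.29x: sharply diminishing returns,
# which is why extra cores only help if the engine actually scales.
```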
 
4060 Ti

Y3BjF3y.jpg


15% faster than the 3060 Ti with the same amount of VRAM on the base model, so it's basically a 3070.

They're leaning really hard on DLSS 3 frame generation; has anyone tried it? Techspot didn't like it much, last I heard:

Based on this it's hard to recommend anyone use DLSS 3 at moderate to low frame rates. To avoid any latency issues and effectively playing the game at a lower performance level than you started with, you need a high baseline FPS before enabling frame generation. We would recommend a minimum of 120 FPS before DLSS 3, targeting around 200 FPS with frame generation enabled. In this configuration you'll probably see latency equivalent to playing in the 100-120 FPS range, but with a higher level of smoothness, which we feel is acceptable for single-player titles where latency isn't a huge factor.

Maybe it's improved since then, and it probably doesn't matter as much in games where reaction times aren't a factor. Then again, those are the types of games where high FPS doesn't matter as much anyway.
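That Techspot recommendation makes more sense with a rough model: frame generation roughly doubles what the monitor shows, but the game still only samples your input once per real frame. The 2x factor and the latency model in this sketch are simplifying assumptions, not measured DLSS 3 behaviour:

```python
# Simplified frame-generation model: displayed FPS roughly doubles,
# but the game still only reacts to input once per *real* frame.
# The 2x factor and latency model are simplifying assumptions.

def with_frame_gen(base_fps: float):
    displayed_fps = base_fps * 2            # one generated frame per real pair
    input_interval_ms = 1000.0 / base_fps   # responsiveness tracks the base rate
    return displayed_fps, input_interval_ms

for base in (50, 120):
    shown, ms = with_frame_gen(base)
    print(f"base {base} FPS -> ~{shown:.0f} FPS shown, ~{ms:.1f} ms between real frames")
# base 50 still *feels* like 50; that's why a high baseline matters.
```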
 
The RX 7600 doesn't appear to be in competition with it, but is aimed at the lower end again. Not out for a few weeks still.

I don't know why I keep watching videos on new GPUs; it's not like I am going to buy one... How many cases could even take 2 GPUs now that most of them are triple-slot? Not like I could buy a 4060 and use it as a PhysX card... I guess it would be one way to avoid sag on my 7900 XT, use a 4060 under it to hold it up. Who needs fresh air anyway?

DLSS... fake frames. Could be jealousy at not having anything like it on the AMD side. But DLSS is used as a reason to buy Nvidia GPUs with lower VRAM totals. The promise of compression algorithms on the horizon that mean you don't need more VRAM is what reduces the cries for more on new Nvidia cards going forward. Anything to not increase it... Nvidia's pro cards start around 24GB of VRAM, so putting more on the gamer cards reduces profits, especially since they're based on the same cores. AMD doesn't care so much, as their pro cards still have more VRAM; I think they start at 32GB.

Originally I was going to wait for the 7800 to come out... I would still be waiting. I wouldn't have any idea what to buy now. So glad I was impatient and lucky enough to get the card I wanted.
 
The RX 7600 looks roughly 1/3 of a 7900 XTX (XTX / 7600; quick ratio check below):

Shading Units: 6144 / 2048
TMUs: 384 / 128
ROPs: 192 / 64
Compute Units: 96 / 32
RT Cores: 96 / 32
Memory Bus: 384 bit / 128 bit
Bandwidth: 960.0 GB/s / 288 GB/s
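Running the list above through a quick ratio check (numbers straight from the specs, nothing else assumed):

```python
# Ratio check on the spec list above (7900 XTX first, RX 7600 second).

specs = {
    "Shading Units":    (6144, 2048),
    "TMUs":             (384, 128),
    "ROPs":             (192, 64),
    "Compute Units":    (96, 32),
    "RT Cores":         (96, 32),
    "Memory Bus (bit)": (384, 128),
    "Bandwidth (GB/s)": (960.0, 288.0),
}

for name, (xtx, rx7600) in specs.items():
    print(f"{name}: {rx7600 / xtx:.2f} of the XTX")
# Everything lands at 0.33 except bandwidth at 0.30, so "roughly 1/3" holds.
```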

It might clock higher and have improved drivers, but if it performs in line with the specs, it'll be around an RTX 3060 in raster. Not too exciting, for me at least.

Yea, 8GB of VRAM isn't what I'd be looking for right now. Then again, if I wanted to ultra stuff with RTX on, I wouldn't be looking at a 60-series card.
 
The elusive 7700/7800 might be more interesting, but there's no need to release those yet when your previous gen still competes well against the newest Nvidia cards. The only card AMD can't match is the 4090, and that won't happen this generation. I doubt they'll make a super 7950 XTX with more power just to compete, as Nvidia never released a 4090 Ti model.

So there's no rush to release new AMD cards besides the low end, for some reason. Could be that it offers AV1 encoding, which is about the only thing the 40-series low-end cards have over the AMD cards now; that will even things out.

Very likely all the GPUs above it in the AMD series will have somewhere between 8 and 20GB of VRAM on them. Possibly the next one has 12 if it's anything like last gen... they might just jump to 16. Unlikely to have more than 20GB.

FSR3 is meant to include frame generation:

But perhaps the biggest and most intractable problem for FSR 3 and DLSS 3 is CPU bottlenecking. It doesn't matter how many frames the GPU is putting out if the CPU can't keep up, and even the fastest CPUs have limits.

It seems we still need those 10GHz Pentium 4s.

Bottlenecks are fun. You just need to learn to stop trying to remove them all, as you can't.
 
