Random Hardware topics

Nvidia are hardly trying; they're giving less and less per generation for more money. They're boiling the frog, and so far most users like the heat.

If Nvidia really tried, I suspect they would win more convincingly, but why bother when the frogs seem to be happy?
 

ZedClampet

Community Contributor
I think the real issue is that Nvidia are concentrating on the AI buzzword more than on gaming. Even the 50-series has made greater gains in AI than in gaming.

They have a huge advantage in architecture. If you compare the specs of the 5070 Ti to the 9070 XT, the 9070 XT should blow the 5070 Ti out of the water. But it doesn't. The 5070 Ti is probably cheaper to make than the 9070 XT, but they are charging $150 more for it. That's amazing business if you can get away with it.
 

ZedClampet

Community Contributor
By the way, to expand on the AI statement above, AMD has finally started adding dedicated AI cores to their GPUs, which is great, but the 5070 Ti is still more than twice as fast as the 9070 XT at AI, and there are a ton of people using AI on their PCs now. That art site I visit has over 60,000 members. A lot of them go big with the 5090 or even the A100, which costs thousands of dollars more. But there are a lot of hobbyists like me who are content with general gaming GPUs.
 
Nvidia still use monolithic dies in their GPUs whereas AMD use chiplets. I wouldn't think there is a great variation in costs. Nvidia can use four templates for all their families and release cut-down versions as they get more failed dies. There's more chance of failures with large dies.

They all use the same factory to make all the chips: TSMC.

It's more profitable for AMD to use the silicon to make a Ryzen 7700 than it is to make the GPU. If they didn't have the foothold in consoles, it would be a good question as to why they make GPUs at all.

Stacked 3D cache = two layers of 96MB of cache each on one CPU; right now the 9800X3D only has one layer. If so, the PS6 is based on AM6, and we won't get CPUs based on it until 2026 at the earliest.

AMD can probably use the graphics core chiplets in other formats, such as in their APUs and mobile processors, given AMD don't have as many GPUs to use them in. They're only making the 9070 and 9060 this gen, four cards so far, though some people wish they would make a 9080 XT as well.
 

ZedClampet

Community Contributor
Honestly, I'm not sure they can make a 9080 XT except by stuffing a ton more hardware onto it. If they could, though, and charged $750 for it, that would be huge for them. But the power draw would be much higher than the 5070 Ti's, and it would be a lot bigger. Then again, I guess it would be competing with the 5080.
 
Videos I saw last year showed they were only after the mid-tier this generation. If they had intended to make a 9080 XT, some YouTube channels would know about it by now. The source of the 9080 XT story isn't one of the ones that get tips about what AMD is working on.

It would have come out first instead of the 9070.

There could be a better version of the 9070 XT, but I don't think it's based on facts or rumours he heard from AMD. He is one of the channels who would know for sure.
 

ZedClampet

Community Contributor
I don't think they need it right now. They already have an advantage in the most important segment. Maybe later in this generation they can refresh the market with something like a 9070 XT Extreme, kind of like Nvidia will do with the Super cards.
 
...covered under my new family plan Assurance 4 year warranty...

So I guess the warranty makes up at least some of the price difference. I can throw the thing down the stairs 3 years and 11 months from now and get a new one. Can't beat that.
I had the same warranty company when I bought my HyperX Cloud II headset in ~2016. In about 2019, I knew the warranty was coming to an end, so I contacted Best Buy, which is where I bought it, and complained that the headset was failing. It worked just fine, but because I had that warranty they agreed to replace it. At the time the store didn't have the Cloud II, so they let me get the Cloud Alpha, thinking they were the same. I basically got a free upgrade 3 years after buying the original headset and haven't spent money on headsets since.
 

ZedClampet

Community Contributor
They are apparently very reliable. I'm going to send back my Sager laptop. It has several things wrong with it. I have no idea what they are going to do about it. The GPU is a 2080, so it's pretty out of date.

****

Got my computer. Haven't set it up yet. Thing came in a 4 foot tall box. I thought maybe I'd accidentally ordered a Smart Car. Computer is big and heavy. The last desktop I had was tiny compared to this and probably half the weight. I'll have to find a demanding game on Game Pass to give it a try.
 

ZedClampet

Community Contributor
In my experience, Stalker 2 and Indiana Jones were the most demanding games I found on there, Indy especially. I was pushing 20fps on Low settings :ROFLMAO:
Weren't you only using half your RAM? I seem to remember reading something about you and RAM. I'm not interested in Stalker anymore, but I'd like to try the Indiana Jones game. Good recommendation. I'm already playing Avowed, and I know exactly what FPS my laptop gets on that, so I'll probably fire that up, too, so I can see the difference.
 
Definitely, and that could have caused a lot of the issues. At the same time, it is a damn demanding game, so it may be a good benchmark for your new PC.
 

ZedClampet

Community Contributor
Pay more for less, it's not just a feeling you have
The problem is that all of these people disregard the new DLSS, but the fact is that it's real and it's fantastic, and it makes this a huge leap rather than a small bump. I don't particularly like Nvidia, but most of these YouTubers are shock jocks who are just as biased as they think userbenchmark is.
 
I actually don't think this is related to DLSS. Gamers Nexus are probably the best of the tech YouTubers; their data is usually good and their methodology is explained, unlike a lot of them.

You can see here that 70-class GPUs up until the 3000 series were over half of a flagship, CUDA-core-wise; the current-gen '70' series is 28% of the flagship and costs more than ever. I think it's valid to bring up.
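To put rough numbers on that ratio: using CUDA-core counts as listed on Nvidia's public spec pages (treat the exact figures here as my own lookup, not part of the GN video), the shrinking 70-class share works out like this:

```python
# Rough flagship-ratio check using Nvidia's published CUDA-core counts.
# Core counts are from public spec sheets; treat them as illustrative.
cuda_cores = {
    "RTX 3070": 5888,
    "RTX 3090": 10496,   # Ampere flagship
    "RTX 5070": 6144,
    "RTX 5090": 21760,   # Blackwell flagship
}

def flagship_ratio(card: str, flagship: str) -> float:
    """Return a card's CUDA-core count as a fraction of its flagship's."""
    return cuda_cores[card] / cuda_cores[flagship]

print(f"3070 vs 3090: {flagship_ratio('RTX 3070', 'RTX 3090'):.0%}")  # 56%
print(f"5070 vs 5090: {flagship_ratio('RTX 5070', 'RTX 5090'):.0%}")  # 28%
```

So a 3070 buyer got about 56% of the flagship's shader hardware, while a 5070 buyer gets about 28%, which matches the point being made above.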
 
I wish older SSDs were cheaper to buy. My mobo only supports PCIe Gen3, so speeds theoretically max out around 4GB/s. I have been searching for Gen3 SSDs and they are quite rare these days. It also doesn't make sense to buy one, since Gen4 is backwards compatible and the few Gen3s they do have are almost the exact same price as Gen4. Then there are different tiers of Gen4, with some slower SSDs being cheaper than the ones that hit max speeds.
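That ~4GB/s ceiling falls straight out of the link math for a standard x4 NVMe slot (the numbers below are from the PCIe Gen3 spec: 8 GT/s per lane with 128b/130b line coding):

```python
# PCIe Gen3 NVMe link budget: 8 GT/s per lane, 128b/130b encoding, 4 lanes.
GT_PER_S = 8          # Gen3 raw transfer rate per lane (gigatransfers/s)
ENCODING = 128 / 130  # 128b/130b line-code efficiency
LANES = 4             # typical M.2 NVMe slot width

gb_per_lane = GT_PER_S * ENCODING / 8  # divide by 8 bits -> GB/s per lane
total = gb_per_lane * LANES
print(f"Gen3 x4 ceiling: {total:.2f} GB/s")  # 3.94 GB/s, before protocol overhead
```

Real drives land a bit under that once NVMe protocol overhead is included, which is why Gen3 SSDs are usually quoted around 3.5GB/s.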

My current issue is that my boot SSD has no DRAM cache, and I don't believe it is 3D NAND either. As far as I understand it, that hurts performance; those things make a drive perform better, especially when reading/writing multiple things at once. Windows is on that same drive, so it is already doing a lot that the drive can't handle when I start downloading games to it. When downloading games to that drive, the download speed is almost interlocked with the write speed, so somewhere in that mix downloads get very spotty, often stopping for upwards of 30 seconds and only hitting maybe 15MB/s when my internet can hit 45MB/s.

So, I'm considering buying a new Gen4 SSD with all those bells and whistles and reinstalling Windows to it. Maybe a 2TB drive, so I don't need to worry about sizes so much. I still have another 1TB SSD that works just fine, and a 500GB SATA SSD for extra storage, mainly programs on there.
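Before buying, it's worth confirming the drive really is the bottleneck. A quick-and-dirty sequential-write timing is enough to compare the old and new drives (this is my own sketch; the helper name is made up, and a proper benchmark like CrystalDiskMark will be more rigorous):

```python
import os
import tempfile
import time

def sequential_write_mbps(path: str, total_mb: int = 256, chunk_mb: int = 4) -> float:
    """Write total_mb of random data sequentially and return throughput in MB/s."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    fname = os.path.join(path, "write_test.bin")
    start = time.perf_counter()
    with open(fname, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())  # force data out of the OS cache onto the drive
    elapsed = time.perf_counter() - start
    os.remove(fname)
    return total_mb / elapsed

# Point the path at the drive you want to test; a temp dir uses the system drive.
with tempfile.TemporaryDirectory() as d:
    print(f"{sequential_write_mbps(d, total_mb=64):.0f} MB/s")
```

A DRAM-less drive will often look fine on a short burst like this but collapse on sustained writes once its SLC cache fills, so running it with a larger `total_mb` is the more telling test.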
 
Considering another "big" purchase for my PC. I really need to get a new monitor. I have been wanting one for a long time, and I've been window-shopping for a decent budget one. I had a pretty decent Philips monitor that was 32 inches, curved, 75Hz; it was a major upgrade over the old circa-2010 HP monitor I was using, but I destroyed it during a move a few years back. Since then I've been using a Samsung smart TV, but smart TV as in when they were first coming out in maybe 2012-14. It's old and looks like crap, but I'm so used to it that it doesn't bother me all the time. I'm not entirely sure this will fix the AA issues I mentioned in the general gaming thread when playing Atomfall, but it is still an upgrade that could last me a long time.

What I'm looking for is 32 inches, 1440p, at least 120Hz, 1ms MPRT, curved, with G-Sync or FreeSync, all under $200. Bonus features would be an IPS panel and HDR, but at this price point, for the size and resolution I want, that may not be possible. If I'm going to upgrade, then I want a major upgrade over my current TV monitor, and my requirements will definitely deliver that on their own. HDR would be a massive bonus as well, but all I know about it is that the standards get very confusing, so I'd have to do more research on what is best for me.

I've been looking at Amazon refurbished ones and there are a lot of good picks there. This AOC monitor ticks a lot of boxes and has good reviews. One slightly unusual thing about this monitor is that it has 1000R curvature, which is pretty much needed for a VA panel of this size. A lot of monitors, even at 32 inches, stick with 1500R curvature, which is a less extreme curve, and I have seen numerous posts online stating that isn't enough for a 32-inch screen. VA is semi-glossy, so the extreme curve theoretically helps with viewing angles; in practice I'm not sure how much I'd like it, but I'm definitely willing to try it out.

I'm just looking around really; I'm not sure how serious I am about actually getting one, and the options that check all my boxes are quite slim. I'll have to really look around to find a great deal. My GPU couldn't handle most modern AAA games at 1440p, but it would be very nice for older games and indies. My old monitor also has a low pixel density, and 1080p at 32 inches with my face two feet from the screen makes stuff like that a lot more noticeable.
 
