Random Hardware topics

Nvidia are hardly trying; they're giving less and less per generation for more money. They're boiling the frog, and so far most users like the heat.

If Nvidia really tried, I suspect they would win more convincingly, but why bother when the frogs seem to be happy?
 

ZedClampet

Community Contributor
Nvidia are hardly trying; they're giving less and less per generation for more money. They're boiling the frog, and so far most users like the heat.

If Nvidia really tried, I suspect they would win more convincingly, but why bother when the frogs seem to be happy?
I think the real issue is that Nvidia are concentrating on the AI buzzword more than on gaming. Even the 50XX cards have made greater gains in AI than in gaming.

They have a huge advantage in architecture. If you compare the specs of the 5070 Ti to the 9070 XT, the 9070 XT should blow the 5070 Ti out of the water. But it doesn't. The 5070 Ti is probably cheaper to make than the 9070 XT, but they are charging $150 more for it. That's amazing business if you can get away with it.
 

ZedClampet

Community Contributor
By the way, to expand on the AI statement above, AMD has finally started adding dedicated AI cores (their answer to Nvidia's Tensor cores) to their GPUs, which is great, but the 5070 Ti is still more than twice as fast as the 9070 XT at AI, and there are a ton of people using AI on their PCs now. That art site I visit has over 60,000 members. A lot of them go big with the 5090 or even the A100, which costs thousands of dollars more. But there are a lot of hobbyists like me who are content with the general gaming GPUs.
 
Nvidia still use monolithic dies in their GPUs, whereas AMD use chiplets. I wouldn't think there is a great variation in costs. Nvidia can use four templates for all their families and release cut-down versions as they accumulate failed dies. There's more chance of failures with large dies.

They all use the same factory, TSMC, to make their chips.

It's more profitable for AMD to use the silicon to make a Ryzen 7700 than it is to make a GPU.
If they didn't have their foothold in consoles, it would be a good question as to why they make GPUs at all.

Stacked 3D cache = 2 layers of 96MB of cache each on one CPU; right now the 9800X3D only has one layer. If so, the PS6 is based on AM6, and we won't get CPUs based on it until 2026 at the earliest.

AMD can probably use the graphics core chiplets in other formats, such as in their APUs and mobile processors, given AMD don't have as many GPUs to use them in. They are only making the 9070 and 9060 this gen: four cards so far, though some people wish they would make a 9080 XT as well.
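As a rough back-of-envelope on the "more chance of fails with large dies" point, a simple Poisson yield model shows why small chiplets survive manufacturing defects so much better than big monolithic dies. The die areas and defect density below are made-up illustrative numbers, not actual TSMC figures:

```python
import math

def die_yield(die_area_cm2: float, defect_density_per_cm2: float) -> float:
    """Poisson yield model: fraction of dies expected to have zero defects."""
    return math.exp(-die_area_cm2 * defect_density_per_cm2)

# Hypothetical numbers: a large ~750 mm^2 monolithic die vs a
# small ~150 mm^2 chiplet, at an assumed 0.1 defects per cm^2.
d0 = 0.1
big = die_yield(7.5, d0)    # roughly half the large dies come out clean
small = die_yield(1.5, d0)  # the great majority of small chiplets are clean
print(f"large die yield: {big:.2f}")
print(f"small die yield: {small:.2f}")
```

That gap is why a vendor with big dies leans so hard on selling cut-down versions: a die with one defect in the wrong block can often still ship as a lower-tier SKU.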
 

ZedClampet

Community Contributor
Nvidia still use monolithic dies in their GPUs, whereas AMD use chiplets. I wouldn't think there is a great variation in costs. Nvidia can use four templates for all their families and release cut-down versions as they accumulate failed dies. There's more chance of failures with large dies.

They all use the same factory, TSMC, to make their chips.

It's more profitable for AMD to use the silicon to make a Ryzen 7700 than it is to make a GPU.
If they didn't have their foothold in consoles, it would be a good question as to why they make GPUs at all.

Stacked 3D cache = 2 layers of 96MB of cache each on one CPU; right now the 9800X3D only has one layer. If so, the PS6 is based on AM6, and we won't get CPUs based on it until 2026 at the earliest.

AMD can probably use the graphics core chiplets in other formats, such as in their APUs and mobile processors, given AMD don't have as many GPUs to use them in. They are only making the 9070 and 9060 this gen: four cards so far, though some people wish they would make a 9080 XT as well.
Honestly, I'm not sure they can make a 9080 XT except by cramming a ton more onto the die. If they could, though, and charged $750 for it, that would be huge for them. But the power draw would be much higher than the 5070 Ti's, and the card would be a lot bigger. Then again, I guess it would be competing with the 5080.
 
Videos I saw last year showed they were only after the mid tier this generation. If they had intended to make a 9080 XT, some YouTube channels would know about it by now. The source of the 9080 XT story isn't one of the ones that get tips about what AMD is working on.

It would have come out first, instead of the 9070.

There could be a better version of the 9070 XT, but I don't think it's based on facts or rumours he heard from AMD.
He is one of the channels who would know for sure.
 

ZedClampet

Community Contributor
Videos I saw last year showed they were only after the mid tier this generation. If they had intended to make a 9080 XT, some YouTube channels would know about it by now. The source of the 9080 XT story isn't one of the ones that get tips about what AMD is working on.

It would have come out first, instead of the 9070.

There could be a better version of the 9070 XT, but I don't think it's based on facts or rumours he heard from AMD.
He is one of the channels who would know for sure.
I don't think they need it right now. They already have an advantage in the most important segment. Maybe later in this generation they can sort of refresh the market with something like a 9070 XT Extreme, kind of like Nvidia will do with the Super cards.
 
...covered under my new family plan Assurance 4 year warranty...

So I guess the warranty makes up at least some for the price difference. I can throw the thing down the stairs 3 years and 11 months from now and get a new one. Can't beat that.
I had the same warranty company when I bought my HyperX Cloud II headset in ~2016. In about 2019, when I knew the warranty was coming to an end, I contacted Best Buy, where I bought it, and complained that the headset was failing. It worked just fine, but because I had that warranty, they agreed to replace it. At the time the store didn't have the Cloud II in stock, so they let me get the Cloud Alpha, thinking they were the same. I basically got a free upgrade three years after buying the original headset and haven't spent money on headsets since.
 

ZedClampet

Community Contributor
I had the same warranty company when I bought my HyperX Cloud II headset in ~2016. In about 2019, when I knew the warranty was coming to an end, I contacted Best Buy, where I bought it, and complained that the headset was failing. It worked just fine, but because I had that warranty, they agreed to replace it. At the time the store didn't have the Cloud II in stock, so they let me get the Cloud Alpha, thinking they were the same. I basically got a free upgrade three years after buying the original headset and haven't spent money on headsets since.
They are apparently very reliable. I'm going to send back my Sager laptop. It has several things wrong with it. I have no idea what they are going to do about it. The GPU is a 2080, so it's pretty out of date.

****

Got my computer. Haven't set it up yet. Thing came in a 4 foot tall box. I thought maybe I'd accidentally ordered a Smart Car. Computer is big and heavy. The last desktop I had was tiny compared to this and probably half the weight. I'll have to find a demanding game on Game Pass to give it a try.
 

ZedClampet

Community Contributor
In my experience, Stalker 2 and Indiana Jones were the most demanding games I found on there, Indy especially. I was pushing 20fps on Low settings :ROFLMAO:
Weren't you only using half your RAM? I seem to remember reading something about you and RAM. I'm not interested in Stalker anymore, but I'd like to try the Indiana Jones game. Good recommendation. I'm already playing Avowed, and I know exactly what FPS my laptop gets on that, so I'll probably fire that up, too, so I can see the difference.
 
