I hope you don't mind. I kind of skipped through that after being tortured for the first two minutes 🤣
I think the real issue is that Nvidia are concentrating on the AI buzzword more than gaming. Even the 50XX cards have made greater gains in AI than in gaming. Nvidia are hardly trying; they're giving less and less per generation for more money. They're boiling the frog, and so far most users like the heat.
If Nvidia really tried, I suspect they would win more convincingly. But why bother, when the frogs seem to be happy?
Honestly, I'm not sure they can make a 9080 XT except by cramming a lot more onto the die. If they could, though, and charged $750 for it, that would be huge for them. But the power draw would be much higher than the 5070 Ti's, and it would be a lot bigger. Then again, I guess it would be competing with the 5080.

Nvidia still use a monolithic die in their GPUs, whereas AMD use chiplets. I wouldn't think there is a great variation in costs. Nvidia can use four templates for all their families and release cut-down versions as they accumulate failed dies. Larger dies have a higher chance of defects.
They all use the same foundry to make their chips: TSMC.
It's more profitable for AMD to use the silicon to make a Ryzen 7700 than it is to make a GPU.
If they didn't have the foothold in consoles, it would be a good question as to why they make GPUs at all.
Stacked 3D cache = two layers of 96 MB cache each on one CPU, right? Right now the 9980X3D only has one layer. If so, the PS6 is based on AM6, and we won't get a CPU based on it until 2026 at the earliest.
AMD can probably use the graphics core chiplets in other formats, such as in their APUs and mobile processors, given AMD don't have as many GPUs to use them in. They're only making the 9070 and 9060 this gen, four cards so far, though some people wish they would make a 9080 XT as well.
I don't think they need it right now. They already have an advantage in the most important segment. Maybe later in this generation they can refresh the market with something like a 9070 XT Extreme, kind of like Nvidia will do with the Super.

Videos I saw last year showed they were only after the mid tier this generation. If they had intended to make a 9080 XT, some YouTube channels would know about it by now. The source of the 9080 XT story isn't one of the ones that get tips about what AMD is working on.
It would have come out first, ahead of the 70.
There could be a better version of the 9070 XT, but I don't think it's based on facts or rumours he heard from AMD.
He is one of the channels who would know for sure.
...covered under my new family plan Assurance 4 year warranty...

I had the same warranty company when I bought my HyperX Cloud II headset in ~2016. In about 2019, I knew the warranty was coming to an end, so I contacted Best Buy, which is where I bought it from, and complained that the headset was failing. It worked just fine, but because I had that warranty they agreed to replace it. At the time the store didn't have the Cloud II, so they let me get the Cloud Alpha, thinking they were the same. I basically got a free upgrade 3 years after buying the original headset and haven't spent money on headsets since.
So I guess the warranty makes up for at least some of the price difference. I can throw the thing down the stairs 3 years and 11 months from now and get a new one. Can't beat that.
They are apparently very reliable. I'm going to send back my Sager laptop. It has several things wrong with it. I have no idea what they are going to do about it. The GPU is a 2080, so it's pretty out of date.
I'll have to find a demanding game on Game Pass to give it a try.

In my experience, Stalker 2 and Indiana Jones were the most demanding games I found on there, Indy especially. I was pushing 20fps on Low settings 🤣
Weren't you only using half your RAM? I seem to remember reading something about you and RAM. I'm not interested in Stalker anymore, but I'd like to try the Indiana Jones game. Good recommendation. I'm already playing Avowed, and I know exactly what FPS my laptop gets on that, so I'll probably fire that up, too, so I can see the difference.
Definitely, and that could have caused a lot of the issues. At the same time, it is a damn demanding game; it may be a good benchmark for your new PC.
Pay more for less — it's not just a feeling you have.

The problem is that all of these people disregard the new DLSS, but the fact is that it is real and it's fantastic, and it makes this a huge leap rather than a small bump. I don't particularly like Nvidia, but most of these YouTubers are just shock jocks who are as biased as they think UserBenchmark is.