There are a lot of things going on here.
From what I know, Nvidia cards aren't compatible with FreeSync or other "sync" technologies because it's actually a hardware thing.
That's not correct (thankfully).
FreeSync is essentially AMD's branding of adaptive sync, certifying various features depending on which 'level' of FreeSync is covered. Nvidia cards are compatible with adaptive sync; they do support it at a hardware level.
Nvidia's 1000-series and newer GPUs were always capable of supporting adaptive sync over DisplayPort, but Nvidia didn't enable it in drivers until January 2019. Once they did, you could use Nvidia GPUs with FreeSync monitors and get adaptive sync.
Nvidia, to avoid losing face entirely, introduced 'G-Sync Compatible', a certification programme in which they attest that an adaptive sync monitor performs to a given standard with one of their GPUs. But in reality, just about any FreeSync display works fine with an Nvidia GPU.
'Full' G-Sync uses a hardware module inside the monitor to handle the adaptive sync, and that module only works with Nvidia GPUs, so the monitor will not do adaptive sync with AMD or Intel's forthcoming GPUs; that rules them out as upgrade options. It isn't very futureproof either: as I noted above, even Nvidia is making sure that future 'full G-Sync' monitors also support adaptive sync, so you can use them with other companies' GPUs.
The better buy at this point is FreeSync or 'G-Sync Compatible'.
Also my understanding is G-SYNC has been tested and "proven" (independently) to be better than AMD's implementation.
Interestingly, that's arguably not really true either. On paper, G-Sync is technically better, in that it can sync the refresh rate all the way down to 1Hz / 1fps. Which sounds great until you consider that if you're gaming at those framerates, the experience is going to be awful, tearing or not.
Also, FreeSync high-refresh monitors can handle low framerates through LFC (low framerate compensation): the monitor repeats each frame enough times to keep the refresh rate inside the sync range. So if the monitor has a range of 40-144Hz, for example, and frames drop to 35fps, the monitor simply runs at 70Hz, showing each frame twice, and there's still no tearing.
The advantage G-Sync has here is proportionately bigger at 4K, where monitors' typically lower maximum refresh rates mean FreeSync models often can't manage LFC (frame doubling needs the top of the range to be at least roughly double the bottom). But you're not buying a 4K monitor, so...
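The frame-multiplication arithmetic behind LFC can be sketched in a few lines. This is a hypothetical helper to illustrate the idea, not any driver's actual implementation:

```python
def lfc_refresh(fps, vrr_min, vrr_max):
    """Return (multiplier, refresh_hz) for a given game framerate and a
    monitor's variable refresh range, or None if LFC can't help."""
    if fps >= vrr_min:
        return (1, fps)  # already inside the sync range, no compensation needed
    m = 2
    while fps * m <= vrr_max:
        if fps * m >= vrr_min:
            return (m, fps * m)  # repeat each frame m times to stay in range
        m += 1
    return None  # range is too narrow: no multiple of fps fits inside it
```

With a 40-144Hz monitor, 35fps maps to (2, 70), matching the example above; with a narrow range like 48-60Hz, 35fps returns None, which is exactly why such monitors can't do LFC.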
But in terms of "independently proven" to be better, well, not so much. Blind tests haven't given much to separate the pack. And a more formal review by Tom's Hardware ("We test two nearly identical monitors—one with Nvidia G-Sync, another with AMD FreeSync—to see if one adaptive sync tech is best") concluded:

When we first conceived this experiment, we expected the result to swing in favor of the G-Sync monitor. G-Sync monitors usually have more features and perform a little better in our color and contrast tests. We hypothesized that while you’re paying a premium for G-Sync, those monitors perform slightly better in other areas related to image quality.

Our testing of the AOC Agon AG241QG and AG241QX proved that theory wrong. Obviously, the FreeSync-based QX is superior in every way except for its maximum refresh rate. But why would one give up contrast and spend an extra $300 just to gain 21Hz that you can’t actually see when playing? Clearly, the AG241QX is the better choice.
i.e. the monitor itself matters more to your experience than which flavour of sync it supports.
But if what you're saying is that gamers really shouldn't concern themselves with whichever "sync", then what should we look for, and what should we expect from one vs the other?
I think we should generally avoid G-Sync-exclusive monitors, certainly at 1080p/1440p, as they lock us into Nvidia for no real benefit. Moreover, a lot of G-Sync-exclusive monitors use older panel models too.
And then look at reviews:
One thing I've had problems finding is just general reviews and comparisons for hardware. I remember as I was getting away from gaming, a few reviewers were renouncing "benchmarks" and taking a stance of "you can get benchmarks from other reviewers". I used to go to sites like Tom's, AnandTech, HotHardware, etc., but they seem to be the same way these days (their reviews are nothing you can't get out of a brochure).
Thankfully it hasn't really gone that way; there are lots of reviews out there.
- I'm not entirely certain on the tri-monitor thing. I first wanted an ultrawide screen, something like the LG 49BL95C-W 49" UltraWide (5120x1440, IPS, USB-C). But then I learned it's only 60Hz, and also it's bad for productivity (this particular monitor has a mask).
It's confusing because that's not a typical ultrawide; that's a 'super ultrawide' (32:9 aspect ratio), and those are extremely niche. There are lots of 21:9 ultrawide monitors out there that do support 100-120Hz refresh rates.
TFT Central and pcmonitors.info specialise in very in-depth reviews for monitors, while other hardware sites like Hardware Unboxed also do detailed monitor reviews.
For monitors, I'm looking at probably IPS, because I want to do some work too.
Not sure if you said what work, but if it's just general work and you want decent viewing angles, VA panels are good for that too. VA has become increasingly popular due to its deep blacks, strong colour accuracy, and lack of glow, while still offering much better viewing angles than TN.
It just seemed like a lot of people run multi-monitor setups these days. I don't even know; my first game will be Doom Eternal. Would DE even work with tri-monitors? I guess I assumed most games would these days, but I don't know that.
I don't think multi-monitor gaming is very common at all these days; if anything it feels far less common, thanks to higher resolutions, higher refresh rates, and ultrawides giving people many more options for a luxury gaming experience. In the past, if you wanted your gaming experience to be somehow 'more', you just had to buy more monitors. Now a single monitor can give you more (pixels, refresh rate, aspect ratio, smoothness through adaptive sync...).