Question Another PC build thread...

Mar 20, 2020
I'm looking to get back into gaming after a few years away (6-7 years approx). I haven't been paying much attention to the hardware market, but I'm experienced building PCs and that sort of thing (I used to be in IT). What caught my attention was the new Doom game, which I believe is being released today.

I want a higher-end machine, although not the highest. I'd like to consider running a tri-monitor setup, but that's one area where I'm not sure what kind of GPU power would be required. So here's where I'm at with what I've learned over the past few days:

Ryzen 7 3800X 3.9GHz
Windows 10 Pro 64bit
MSI MPG X570 GAMING PLUS
Patriot Viper 2x8 DDR4 4400
* Silicon Power 1TB NVMe PCIe Gen3 x4 3400/3000
EVGA SuperNOVA 850 G+ 80+ Gold, 850 watts, fully modular, 10-year warranty, Japanese caps
Enermax MAKASHI Addressable E-ATX Full Tower Gaming (comes with LiqMax III 120 CPU cooler)
* had good experience with 2 of their SSDs; good price, extremely fast, and they have lasted years now

So here's where we get to GPUs and displays
From what I've gathered, I want display(s) that are true G-SYNC, although they are more expensive

Monitor(s)
LG 32GK650G-B 32" QHD 2560 x 1440 (2K) 5ms (Faster) 144Hz (OC 165Hz) HDMI, DisplayPort NVIDIA G-Sync 3-Side Virtually Borderless Gaming Monitor

RTX 2080 Super - I'm assuming if I had one of the above monitors, with this GPU I'd be able to run the very latest games at full screen resolution and top quality settings

What if I were to use a tri-monitor setup of these same monitors?
Would 2 RTX 2080s in SLI work?
Would 1 RTX 2080 Super or TI work?
Would it take 2 Supers, or 2 TIs?

Thanks, and I look forward to getting to know you all.
 
Feb 17, 2020
You don't really want a 'true gsync' display as the advantages are relatively slim for 1440p 144hz gaming, while the downside of locking yourself into the Nvidia ecosystem is very real as it will limit your future upgrade choices.

Note Nvidia have said they will make future 'true gsync' monitors also support adaptive sync / FreeSync, which is frankly an indication of which way the wind is blowing.


RTX 2080 Super - I'm assuming if I had one of the above monitors, with this GPU I'd be able to run the very latest games at full screen resolution and top quality settings
Yes.

Though beware of talking about 'top' quality settings. Many games contain a few settings that will kill performance in even the most beastly powerful of PCs, often for little to no real fidelity gain. It's not a problem that can be solved by money, the only solution is to pay attention to the settings :)

What if I were to use a tri-monitor setup of these same monitors?
Would 2 RTX 2080s in SLI work?
Would 1 RTX 2080 Super or TI work?
Would it take 2 Supers, or 2 TIs?
Depends on the game.

In many games, SLI will give 0 extra performance. TAA in games like the Division 2 and Total War Three Kingdoms effectively precludes SLI as I understand it, and TAA is very popular these days. And then some other games don't use TAA but still don't support SLI and forcing it to work is challenging at best.

Some games would be fine with 3 x 1440p monitors and a single RTX 2080 ti (say) but others would want more grunt if you are really pushing the settings.
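To put rough numbers on that, here's the pixel-count arithmetic, a minimal sketch using only the resolutions mentioned in this thread:

```python
# Rough pixel-count comparison: why triple 1440p needs more GPU "grunt"
# than a single monitor. Pure arithmetic, no hardware assumptions.

def megapixels(width: int, height: int, screens: int = 1) -> float:
    """Total pixels rendered per frame, in millions."""
    return width * height * screens / 1e6

single_1440p = megapixels(2560, 1440)      # ~3.7 MP
triple_1440p = megapixels(2560, 1440, 3)   # ~11.1 MP
uhd_4k = megapixels(3840, 2160)            # ~8.3 MP

# Triple 1440p pushes 3x the pixels of one screen, and ~33% more than 4K,
# so GPU demand lands somewhere past "single 4K monitor" territory.
print(f"1x1440p: {single_1440p:.1f} MP, 3x1440p: {triple_1440p:.1f} MP, 4K: {uhd_4k:.1f} MP")
```

So a card that's comfortable at 4K is roughly the floor for triple 1440p, before settings are even considered.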

Why 3 monitors? Is it for something very specific like racing simulation gaming?

If it's more about just having a wide and immersive view, then what about an ultrawide monitor (21:9 aspect ratio)? Important to caveat that not all games support that either, but a lot more do, or can easily be modded to do so.
 
Mar 20, 2020
Thanks for the reply.

You hit a few key points that I'm not sure about.

- I get you on the game settings, like maybe running different levels of the different types of anti-aliasing. I'm not looking to run Doom Eternal with every setting at its max, but close to max.

- I don't completely understand the refresh rate stuff, such as G-SYNC and whatever. What I'm most concerned about is the tearing, shadows, etc. that people seem to complain about with modern setups that aren't quite optimal. From what I know, Nvidia cards aren't compatible with FreeSync or other "sync" technologies because it's actually a hardware thing. Also, my understanding is G-SYNC has been tested and "proven" (independently) to be better than AMD's implementation. But if what you're saying is that gamers really shouldn't concern themselves with whatever "sync", what should we look for, and what should we expect from one vs. the other?

- I'm not entirely certain on the tri-monitor thing. I first wanted an ultrawide screen, something like the LG 49BL95C-W 49" UltraWide Dual 5120x1440 5ms HDMI DisplayPort USB-C Anti-Glare HDR Built-in Speakers IPS Monitor. But then I learned it's only 60Hz, and also it's bad for productivity (this particular monitor has a mask).

For monitors, I'm looking at probably IPS, because I want to do some work too. It just seemed like a lot of people run multi-monitor setups these days. I don't even know; my first game will be Doom Eternal. Would DE even work with tri-monitors? I guess I assumed most games would these days, but I don't know that.

I like FPS, racing games (more sim than arcade), RTS, RPGs, etc.

One thing I've had problems finding is just general reviews and comparisons for hardware. I remember, as I was getting away from gaming, a few reviewers were renouncing "benchmarks" and taking a stance of "you can get benchmarks at other reviewers". I used to go to sites like Tom's, AnandTech, HotHardware, etc., but they seem to be the same way these days (their reviews are nothing you can't get out of a brochure).

 
Feb 17, 2020
There's a lot of things going on here :)

From what I know, Nvidia cards aren't compatible with FreeSync or other "sync" technologies because it's actually a hardware thing.
That's not correct (thankfully).

Freesync is essentially an AMD branding of adaptive sync, certifying various features depending on what 'level' of Freesync is covered. Nvidia cards are compatible with adaptive sync, they do support it at a hardware level.

Nvidia's 1000 and newer GPUs were always capable of supporting adaptive sync over Displayport - but Nvidia didn't enable it via drivers until January 2019. Once they did, you could use Nvidia GPUs with Freesync monitors, enabling adaptive sync.

Nvidia, to avoid losing face entirely, introduced 'gsync compatible' which is a certification programme where they say that an adaptive sync monitor performs with one of their GPUs to a given standard. But in reality, just about any Freesync display works fine with an Nvidia GPU:
techspot.com/article/1810-lg-freesync-and-nvidia-geforce/

'Full' gsync uses a hardware module inside the monitor to handle the adaptive sync, and that only works with Nvidia GPUs, so the monitor will not do adaptive sync with non-Nvidia GPUs. This rules out AMD and Intel's forthcoming GPUs as upgrade options. It also isn't very futureproof, as (like I noted above) even Nvidia are going to make sure that future 'full gsync' monitors also support adaptive sync, so you can use them with other companies' GPUs.

The better buy at this point is Freesync or 'Gsync Compatible'.

Also, my understanding is G-SYNC has been tested and "proven" (independently) to be better than AMD's implementation.
Interestingly, that's arguably not really true either. On paper, gsync is technically better, in that it can keep syncing the refresh rate all the way down to 1hz / 1fps.

Which sounds great until you think that if you're gaming at those framerates, the experience is going to be awful, tearing or not.

Also, FreeSync high-refresh monitors can handle low frame/refresh rates through LFC (low framerate compensation), where they double up the number of frames in order to keep the sync going. So if the monitor has a range of 40-144hz, for example, and frames drop to 35fps, the monitor just displays at 70hz and keeps the sync, so there's still no tearing.
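A minimal sketch of that doubling behaviour, illustrative only; real monitor scalers choose the multiplier in firmware-specific ways:

```python
# Sketch of how LFC (low framerate compensation) keeps adaptive sync
# engaged below the monitor's minimum refresh rate, per the description
# above. The exact behaviour is scaler-dependent; this is illustrative.

def lfc_refresh(fps: float, vrr_min: float, vrr_max: float) -> float:
    """Refresh rate the monitor would run at for a given fps, repeating
    each frame until the rate lands inside the VRR window."""
    if fps >= vrr_min:
        return min(fps, vrr_max)  # within range: refresh simply tracks fps
    rate = fps
    while rate < vrr_min:
        rate += fps               # display each frame one extra time
    return rate

# 35 fps on a 40-144hz panel: frames doubled, panel runs at 70hz.
print(lfc_refresh(35, 40, 144))  # 70
```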

The advantage gsync has here is proportionately bigger at 4k, where the lower refresh rates of monitors typically means Freesync monitors can't manage LFC. But you're not buying a 4k monitor, so...

But in terms of "independently proven" to be better, well, not so much. Blind tests haven't given much to separate the pack. And a more formal review by Tom's Hardware says:

When we first conceived this experiment, we expected the result to swing in favor of the G-Sync monitor. G-Sync monitors usually have more features and perform a little better in our color and contrast tests. We hypothesized that while you’re paying a premium for G-Sync, those monitors perform slightly better in other areas related to image quality.

Our testing of the AOC Agon AG241QG and AG241QX proved that theory wrong. Obviously, the FreeSync-based QX is superior in every way except for its maximum refresh rate. But why would one give up contrast and spend an extra $300 just to gain 21Hz that you can’t actually see when playing? Clearly, the AG241QX is the better choice.
i.e. the monitor itself matters more to your experience than the category of sync.

But if what you're saying is that gamers really shouldn't concern themselves with whatever "sync", what should we look for, and what should we expect from one vs. the other?
I think we should generally avoid gsync-exclusive monitors, certainly at 1080p/1440p, as it locks us into Nvidia for no real benefit to us. Moreover, a lot of gsync-exclusive monitors use older models of panel too.

And then look at reviews:

One thing I've had problems finding is just general reviews and comparisons for hardware. I remember, as I was getting away from gaming, a few reviewers were renouncing "benchmarks" and taking a stance of "you can get benchmarks at other reviewers". I used to go to sites like Tom's, AnandTech, HotHardware, etc., but they seem to be the same way these days (their reviews are nothing you can't get out of a brochure).
Thankfully it's not really gone that way. There are lots of reviews out there.

- I'm not entirely certain on the tri-monitor thing. I first wanted an ultrawide screen, something like the LG 49BL95C-W 49" UltraWide Dual 5120x1440 5ms HDMI DisplayPort USB-C Anti-Glare HDR Built-in Speakers IPS Monitor. But then I learned it's only 60Hz, and also it's bad for productivity (this particular monitor has a mask).
It's confusing because that's not a typical ultrawide; that's 'super ultrawide' (32:9 aspect ratio). And those are extremely niche.

There are lots of 21:9 ultrawide monitors out there that support 100-120hz refresh rates.

TFT Central and pcmonitors.info specialise in very in-depth reviews for monitors, while other hardware sites like Hardware Unboxed also do detailed monitor reviews.

For monitors, I'm looking at probably IPS, because I want to do some work too.
Not sure if you said what work, but if it's just general work and you want decent viewing angles, VA panels are good for that too. VA has become increasingly popular due to the deep blacks, strong colour accuracy, and lack of glow, while still offering much better viewing angles than TN.

It just seemed like a lot of people run multi-monitor setups these days. I don't even know; my first game will be Doom Eternal. Would DE even work with tri-monitors? I guess I assumed most games would these days, but I don't know that.
I don't think multi-monitor gaming is very common at all these days; if anything, it feels far less common. Higher resolutions, higher refresh rates, and ultrawides give people many more options for a luxury gaming experience than in the past, when, if you wanted your gaming experience to be somehow 'more', you just had to buy more monitors. Now you can have a single monitor that just gives you more (pixels / refresh rate / aspect ratio / smoothness through adaptive sync...)
 
Feb 17, 2020
Just to showcase some monitors and reviews:

1440p IPS, gsync compatible:

Asus TUF Gaming VG27AQ - tftcentral review

Gigabyte AORUS AD27QD (also NB the slightly newer FI27Q - apparently gigabyte offer a 0 bright pixel guarantee on both) - pcmonitors.info review, tftcentral review

LG 27GL850-B nano IPS - Techspot review, Linus Tech Tips review video, tftcentral review.

LG 27GL83A-B, IPS, gsync compatible, using the same panel as the LG 27GL850-B(?)


1440p ultrawide (21:9 aspect ratio): (VA)

Typically: Freesync, 100hz, 48-100hz range, with LFC. Mostly based on the same panel, 1800R curvature.

BenQ EX3501R- tftcentral review
AOC Agon AG352UCG - pcmonitors.info review, tftcentral review
ASUS ROG Strix XG35VQ - pcmonitors.info review

I think those (edit - those 3 1440p ultrawides) might all be 'the same' monitor in that they all use the same AMVA panel (AUO M350QVR01.1)
 
Mar 20, 2020
Thank you Oussebon you have been super helpful!

I'm going to go with the LG monitor with G-SYNC ($459) and an MSI 2080 Super Ventus X5 ($699) for now.
The main reason is that there's a new LG just releasing, the LG UltraGear 38GL950G-B 38" 21:9 Curved 144 Hz G-SYNC IPS Gaming Monitor, but it's only just becoming available. I'll probably wait a couple of months, let some of the early adopters get the bugs worked out, and use the 32" for now. Then when I get the 38" I can separate my "office" work area from my gaming area and use the 32" for that (at that point, it's just a display). The 32" isn't that expensive really, so I don't feel like I'll be out much by going with it for now.

Does this sound like a workable plan? Is 16GB of RAM to the point where I don't need to worry about it?

Also, a question about the M.2 NVMe drive: I forget what it's called, but when partitioning, is it still best practice to leave approximately 10% of the drive unallocated, to give the drive's wear-levelling algorithms some free space to work with?
 

Kaamos_Llama

Community Contributor
Jan 31, 2020
More for my own curiosity, @Oussebon. I was under the impression that with Ryzen, RAM clocked above 3600MT/s performed worse in gaming without specific tuning. Something to do with the memory divider?

@UnknownPriority 16GB RAM is perfect for gaming at the moment, not sure on the wear levelling question I'm afraid.
 
  • Like
Reactions: UnknownPriority
Feb 17, 2020
As above, I advocate against gsync monitors at the moment.

I would personally be looking at the 32GK850F-B instead. (Edit: Assuming you meant the 32GK650G-B from your initial post, which is gsync only)

It's also hard to really see the value in buying a top end gaming monitor only to replace it with another top end gaming monitor after 2 months. Just delay the purchase altogether.

It's your money of course but tbh if it were mine I'd not use it that way.
 
Feb 17, 2020
More for my own curiosity, @Oussebon. I was under the impression that with Ryzen, RAM clocked above 3600MT/s performed worse in gaming without specific tuning. Something to do with the memory divider?
I have no idea, it's not something I've looked at due to the pricing making it bad value.

If you're paying uber $$ for ultra-fast RAM, you might as well have bought a 9900K with 3200/3600MHz RAM. For gaming.

I haven't looked over the main build in detail tbh, was mostly focusing on the monitors.
 

Kaamos_Llama

Community Contributor
Jan 31, 2020
I have no idea, it's not something I've looked at due to the pricing making it bad value.

If you're paying uber $$ for ultra-fast RAM, you might as well have bought a 9900K with 3200/3600MHz RAM. For gaming.

I haven't looked over the main build in detail tbh, was mostly focusing on the monitors.
Agreed :)

Here's something about what I was referring to:


'Here’s a quote from AMD to clarify further: “Enabling 2:1 mode crosses clock domain boundaries, imparting a DRAM latency penalty of approximately 9ns that may be overcome with additional memory clocks, higher CPU frequencies, or sub-timing adjustments.” In other words, 2:1 mode hurts performance, but not so badly that it can’t be made up for elsewhere. '

I guess it turned out to be a nitpick, but that's one thing. Make sure that the RAM kit you pick is on the QVL list. The lower timings will be better tuned with XMP, and you won't miss out on any performance out of the box, if playing around with secondary and tertiary RAM timings isn't your bag.
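The trade-off AMD describes can be sketched with standard DDR latency arithmetic (first-word latency is CL cycles at the data rate). The ~9ns penalty is the figure from AMD's quote above; assuming CL16 at both speeds is my own simplification for illustration:

```python
# Back-of-envelope first-word latency, illustrating why faster RAM at the
# same CAS latency isn't automatically faster on Ryzen 3000 once the 2:1
# memory-controller divider kicks in above ~3600 MT/s.

def cas_latency_ns(mt_s: int, cl: int) -> float:
    """First-word latency in ns: CL clock cycles at the DDR data rate."""
    return cl * 2000 / mt_s  # 2000 = 2 transfers per clock * 1000 (ns scaling)

penalty_2to1_ns = 9.0  # AMD's stated cross-clock-domain penalty in 2:1 mode

l_3600 = cas_latency_ns(3600, 16)                    # ~8.9 ns, runs 1:1
l_4400 = cas_latency_ns(4400, 16) + penalty_2to1_ns  # ~7.3 + 9 ns, runs 2:1

print(f"3600 CL16: {l_3600:.1f} ns, 4400 CL16 (2:1 mode): {l_4400:.1f} ns")
```

On these assumptions the 4400 kit ends up with noticeably worse effective latency despite the higher clock, which matches the "worse without specific tuning" impression.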

EDIT: I'll also throw in: I have an LG GL850. I prefer IPS over VA, particularly for fast-paced gaming (Doom), as dark colors tend to smear on VA panels in fast action. The technology on some VA panels might be better than I've experienced, though, as I haven't played on one recently.

Totally agree that it's worth getting a G-Sync Compatible monitor over pure G-Sync, to keep future GPU options open.
 
  • Like
Reactions: Oussebon
Mar 20, 2020
Interesting about the ram speed.

Earlier this week I read that AMD's Ryzen Threadrippers perform better with faster RAM. I did wonder if there was a maximum speed the CPU might be able to use or benefit from.

After reading the review you linked, and a couple of others like https://www.techpowerup.com/review/amd-zen-2-memory-performance-scaling-benchmark/2.html, I am changing my mind.

I think I'll just go with 3600 for the moment. What it looks like is: with overclocking and tuning for specific conditions, higher-speed memory may perform better, but there is a risk of lower performance if things don't work out.

There's potentially an upgrade path at some point if it seems worth it later, and no doubt the 3600 will still be within +/- 2-3% in most cases.

Thanks Kaamos_Llama, very helpful!

 
Mar 20, 2020
I don't consider a sub-$500 monitor high end. It's a very nice monitor; I think it'll be great. I haven't spent money on computer stuff in a while. My 2 current computers are still Core 2s! But at least that might attest to the quality of the computers I typically build...

For what I've paid for displays in the past, the prices are mindblowingly CHEAP these days!

Around 1998 I bought a 17" Samsung 6Ne for $650. That would probably be around $1000-$1100 these days. I have a 24" Westinghouse LCD monitor that I paid $400 for (which was a bargain price) from around 10 years ago; again, that would probably be $500-600 in today's money. In neither case were those considered high end.

The 38" monitor that I may consider in a couple of months is the FreeSync / AMD version, and I think it'll be even better. If it were available right now, I would probably just be getting it.

I definitely appreciate your help; it has given me more to look at specifically. I think I will prefer the IPS vs. the VA panel (faster response / less input lag), plus it's quite a bit less.

 
Mar 20, 2020
Why partition the drive at all?
I'm fairly sure drives have to be partitioned. If you mean why use multiple partitions, there are a lot of advantages to doing so, in my professional and personal opinion.

I'll probably make at least 3 partitions: a small FAT partition for DOS boot, a Windows OS partition, and a general storage partition. This way, if the OS needs wiped, it doesn't affect the general partition (although apps will probably need reinstalled).

If you've never heard of this, here is an excerpt from https://www.diskpart.com/windows-10/ssd-optimization-windows-10-4348.html

Way 2. Leave some free space
To extend the lifespan, most SSDs use wear-balanced algorithm. An SSD will slow down if you fill it up. To improve performance, you would better not format the whole SSD but to leave some free space on your SSD and leave 25 percent of disk free space of the SSD for the best performance.
 

Kaamos_Llama

Community Contributor
Jan 31, 2020
RE: the wear-levelling free space. I had a quick sniff around, and it seems that yes, it's still recommended to leave 20-25% free per partition for wear-levelling purposes, to extend lifespan. That's just me parroting someone I trust, though, and I'm not sure how much it matters with a drive full of games vs. professional workloads, which might have a lot more writing going on.
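For what it's worth, the sizing arithmetic behind that advice is trivial. A sketch (the 10-25% reserve figures are the rules of thumb from this thread, not a vendor specification):

```python
# Quick sizing helper for the free-space / over-provisioning advice above:
# given a drive capacity and a reserve fraction, how much space to leave
# unallocated (or just keep free) for the wear-levelling algorithms.

def reserve_gb(capacity_gb: float, reserve_pct: float) -> tuple[float, float]:
    """Return (usable_gb, reserved_gb) for a given reserve percentage."""
    reserved = capacity_gb * reserve_pct / 100
    return capacity_gb - reserved, reserved

usable, reserved = reserve_gb(1000, 10)  # 1 TB drive, 10% reserve
print(f"Partition {usable:.0f} GB, leave {reserved:.0f} GB unallocated")
```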
 
Mar 20, 2020
Thanks to everyone who helped... my orders have been placed, and in about a week a semi-load's worth of computer parts is going to show up!

Ended up going with everything sorted out above plus....

Memory - switched to G.Skill Ripjaws 3600 DDR4, CAS 16 (16x2) (was same price as the 4400 16GB)
Optical drive - Asus Black Blu-Ray Burner Sata 16D1HT
UPS - Cyberpower CP1500 PFC Sinewave
Doom Eternal standard
 
Feb 17, 2020
Yes, obviously Windows creates the partitions it needs during the install process >.<

The advantages of then further partitioning the drive are frankly not worth it, which I say as someone who had a drive partitioned in more or less the way you describe, for more or less the reasons you describe, and then grew bored of balancing things (including free space) for each partition. And that was a 2TB drive. Partitioning does come with a performance cost, and juggling things between the partitions isn't fun.

Leaving unallocated space on an SSD isn't something I've heard of and smells of yet more SSD voodoo.

It's well known you shouldn't overfill the SSD, not just for wear levelling but often for performance especially depending on the model.

$500 is high end for a monitor tbf.
 
Mar 20, 2020
Windows will automatically create partitions if you select that during setup. But that doesn't mean that's the optimal way to partition a drive for performance or wear. Generally, giant partitions become unmanageable, and there are many advantages to multiple partitions, on SSDs and on HDDs.

Windows 7 and up (maybe Vista, I don't recall) have auto-defragmentation. At some point in the development of Windows 8 or 10, Microsoft changed the default for SSDs, disabling the auto-defrag. This is because the auto-defrag could cause excessive state changes, which was, and as far as I can tell still is, a concern with SSDs.

Partitioning doesn't have a performance cost with solid state drives. With hard drives, partitioning can affect performance based on where partitions are located on the physical area of the disk. This is something you should look into; no offense, but your current knowledge level is limited, based on your statements about partitioning.

My current SSDs have been running for years, and I have 10% unallocated for wear levelling (there is a term that specifically applies to this). Back when I got them, I think the general recommendation was 10-25%. I think 25% is overkill, though on extremely active drives it might be beneficial. I will probably stick to 10%.

Also, if > $500 is high end for monitors, then I'm safe, because the monitor I'm buying is only $460, so I'm not "wasting" a high-end monitor.

I used to be a member of these forums. On Delphi. When PC Gamer moved it from Delphi to PCgamer.com. When PC Gamer moved it off PC Gamer. And now apparently it's back on PC Gamer.com.

Professionally, I have been in IT for around 20 years now, with the first 14 or so years spent on the PC/network/administration side. I'm in software development now, so I've let some of my IT-side knowledge fade.

I've been around a while. Again, I appreciate your help, and I hope that you accept help from others, even if our ideas or knowledge are different from yours... it doesn't necessarily mean it's voodoo.

 
Feb 17, 2020
Also, if > $500 is high end for monitors, then I'm safe, because the monitor I'm buying is only $460, so I'm not "wasting" a high-end monitor.
Edit: The eyeroll smiley isn't showing up, so we can just take it as read.

I won't respond to the rest, which is variously out of date, incomplete, designed to be offensive, or a weird form of boasting(?).
 
