I'd make the case that getting a monitor with a high refresh rate and adaptive sync is a relatively small expense, in that you can claw back some of its cost through savings on the GPU.
If GPU X and GPU Y are ~£30 (or your local equivalent) apart in price, and one pushes 5fps more than the other, you might be tempted to pay the extra £30 if it keeps you closer to 60, for example if your monitor is fixed at 60Hz or only has a narrow 55-75Hz FreeSync window. With a large adaptive sync window you're not so bothered about the exact numbers.
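To put rough numbers on it, here's a back-of-envelope sketch. Everything in it is made up for illustration: the GPU names, prices and fps figures, and the "1% lows run ~15% below average" rule of thumb are all assumptions, not benchmarks.

```python
# Back-of-envelope sketch with made-up numbers, just to illustrate the logic.
# With a narrow FreeSync window, a few fps either way can push you out of the
# sync range (back to tearing/stutter); with a wide window it barely matters.

gpus = {"GPU X": (270, 58), "GPU Y": (300, 65)}  # (price in £, avg fps) - hypothetical
windows = {"narrow 55-75Hz": (55, 75), "wide 48-144Hz": (48, 144)}

for win_name, (lo, hi) in windows.items():
    print(win_name)
    for name, (price, fps) in gpus.items():
        # assume 1% lows land roughly 15% below the average (rough rule of thumb)
        low = fps * 0.85
        ok = low >= lo and fps <= hi
        print(f"  {name}: £{price}, {fps}fps avg, ~{low:.0f}fps lows "
              f"-> {'stays in sync range' if ok else 'drops out of sync range'}")
```

With the narrow window only the pricier card keeps its lows inside the range, so the £30 feels worth it; with the wide window both cards stay covered and the premium buys you very little.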
You might also be able to defer a GPU upgrade for longer for the same reason: the -ish part of 60-ish FPS gets a lot more flexible.
And for games that are going to run at 100fps+ anyway, like Fortnite or older / less demanding titles, the high refresh monitor has a clear advantage to offer too.
So you benefit both in titles that run at high framerates and in titles that run at 60-ish FPS, where the ish is much less of an issue.
While the expense isn't viable, or may plain not make sense, for everyone (e.g. if you game on a 1080p 60Hz TV from the comfort of your sofa), refresh rates and/or resolutions above 1080p/60 shouldn't be seen as premium goods any more. They're mainstream, not just for enthusiasts with deep pockets or people who want MOAR FPS / pixels.
Although people who want MOAR FPS often still buy a -70 Super class GPU for 1080p 60, which is just tragic.
I die inside every time I see someone say they want to enjoy max settings and max ray tracing at 1080p. The point that ray tracing and all the other settings are about improving fidelity, and that resolution plays a huge part in fidelity, often seems lost.
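For a sense of scale, the pixel-count arithmetic alone makes the point; nothing below is specific to any game or GPU, it's just resolution maths.

```python
# Plain pixel-count arithmetic: how much more image detail each common
# resolution carries relative to 1080p.

resolutions = {"1080p": (1920, 1080), "1440p": (2560, 1440), "4K": (3840, 2160)}
base = resolutions["1080p"][0] * resolutions["1080p"][1]

for name, (w, h) in resolutions.items():
    pixels = w * h
    print(f"{name}: {pixels / 1e6:.2f} MP ({pixels / base:.2f}x the pixels of 1080p)")
```

1440p is roughly 1.78x the pixels of 1080p and 4K is 4x, which is a lot of fidelity to leave on the table while maxing out every other slider.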