Weekend Question: Have you ever had to downgrade after an upgrade?

PCG Jody

Staff member
Dec 9, 2019
I ask the PCG staff a regular Weekend Question and post the answers on the site. If you'd like to throw in an answer here, I'll squeeze the best into the finished article!

This week's question is: Have you ever had to downgrade after an upgrade?

"I can never go back to seeing 60 images per second", a member of the PC Gamer staff who shall remain nameless recently said. "From here on over 100 images must be shot at my face each second or I will whine and complain. In 10 years I will refuse to use anything less than 1000Hz." While that was said with tongue in cheek, the sentiment is relatable. It's hard to go back after an upgrade. There are some conveniences you get so used to that a step backward seems unthinkable, like cleaning plates with your actual human hands after owning a dishwasher, or seeing a mere 1080p after you've upgraded to 1440, or 2K, or something well out of my price range.

Have you ever had to downgrade after an upgrade, and did you struggle to cope or shrug it off?
 
1080p after you've upgraded to 1440, or 2K,
I assume you mean 4K at the end, since 1440p = 2K :)

I intentionally downgraded from a 4K monitor to 2K as well... desktop icons are really hard to read at 4K, so even Windows defaulted scaling to 150%.
I have a 2K monitor now, so at least games don't ignore Windows scaling and load at 4K native.

I think using an HDD now would drive me mad. Start PC, walk away for a few minutes while it loads the desktop, compared to NVMe where it's start PC, wait a few seconds, and then log on.

Refresh rate is silly. I expect people will still imagine they can see flicker at 1000Hz or more. Much of it's in their head, and if displays with 2000Hz show up, they'll wonder whether that would be better. People would probably still see flicker at infinite refresh rates...
 

Brian Boru

King of Munster
Moderator
A few short spells while traveling, I had to make do with one monitor—oh the horror! On a similar note, I still chafe at the lack of dual monitor support in almost all games—I so enjoyed that in Supreme Commander.

cleaning plates with your actual human hands after owning a dishwasher
Had that too, but found a good work-around—lick the plate clean!
 
I think people probably downgrade their expectations more than their hardware. Like people spouting off about high frame rates and resolutions, and then RTX comes out and throws a wrench in that. New, cutting-edge effects tend to keep us held back all of the time in the resolution and frame rate departments. Rather than getting games to run at beastly speeds on modest hardware, they want to keep giving us new eye candy, and keep us crippled.
 
A few short spells while traveling, I had to make do with one monitor—oh the horror!

I've been making do with one monitor (a laptop one, even!) since I started working from home. It hasn't bothered me much, though once I return to two monitors I'll probably enjoy all the extra room.

We do have a desk, but it was buried under stuff upstairs in our old apartment and is buried now in the living room of our new apartment. That should be temporary, though, until we can get some more cabinets to put stuff in.
 
The closest I've come to this is dropping the resolution to 720p on an old laptop to get a game running acceptably, but I've never actually gone backward in hardware. I tend to keep my hardware far longer than your average PC gamer, so when I need something new, it usually isn't even possible to downgrade. Well, anything is possible these days with people selling 20-year-old computer parts on eBay, but I've never purchased used hardware, either.
 

Sarafan

Community Contributor
While I thankfully haven't had occasion to downgrade in terms of hardware, I often play older games. My PC is quite capable, but that doesn't stop me from returning to titles released many years ago. So basically I'm a witness to graphical downgrades on a daily basis. One day I can play the marvelous-looking Cyberpunk 2077, and the next enjoy pixelated Quake 1. And I don't have a problem with that. I can switch at any moment from a visually stunning game to some old gem released before 3D accelerators became a thing. I adapt to the situation in a matter of minutes. Switching from 60 fps to 30 fps is a little more painful, but it's only a matter of time to get used to it. I don't even mind the blurry image caused by running a game at a resolution lower than my monitor's native resolution. Looks like I'm bulletproof when it comes to graphical downgrades...
 
TLDR: I bought a new PC and had to drop it from 8x AGP to 4x AGP before going through a trial by fire to fix the problem.



Not technically a downgrade, but I once bought a new PC for university. It had all the specs I wanted at the time: a P4 3GHz, Radeon 9800, and 2GB of RAM, and my dad bought it for £900.

Unfortunately, I bought it from a company called Tiny. Those of you who live in the UK will remember Tiny had a reputation for poor-quality PCs, even worse than Dell or HP. I thought that was exaggerated. How wrong I was.

The PC worked fine, but start playing games at 8x AGP and the machine would just crash. I was heartbroken, and I didn't want to return the PC and tell my dad, nor deal with the hassle. As a temporary solution, I figured I could live with running the PC at 4x AGP instead. It was better than crashing after an hour.

Cue my personal plans to try and fix the problem. I thought it was a memory problem, so I doubled the RAM. No luck. I thought it was a power issue, so I replaced the flimsy PSU with something more powerful and reliable. Still didn't fix the problem. I decided to fork out for a new GeForce graphics card (a 512MB 7800, I think?), and that's when the problem was sorted. By that point I'd spent £500+ just to fix the issue. The PC was practically a new machine, and one I'd upgraded to the max.

By the end of it I had learnt a lot about PCs, and gained the courage to actually go down the route of building my own PC.
 
