Why computers are ridiculous now
There's nothing fundamental that most of us are doing on a computer in the year 2024 that requires 16GB of RAM or a brand new processor. So why has that much hardware become the new default? Why does a computer that runs fine today need to be replaced to run Windows 11? There's not one answer but several.
The footnotes are for technical folks
My goal with these articles is to be informative for people who don't know as much about computers. Don't feel the need to read the footnotes. They're a form of Craig Maloney statement.
More is better
Computer makers know that “more is better” when it comes to advertising. This used to be easy: we could “just” talk about the speed of the processor, and the bigger number was obviously better. Even back then, it was more complex than that, but most people didn't need to care one way or the other. They just needed the bigger number.
Manufacturers want you to buy a new computer regularly. That means making you feel inadequate about your computer.
The history of personal computing
Hardware used to be more expensive even before you adjust for inflation. Once you adjust for inflation, it becomes really clear that hardware was like “wowza 💸 💰 🤑”. Giving hardware manufacturers of the time the benefit of the doubt, economies of scale hadn't kicked in yet. Only a few companies did custom chip manufacturing, and those few companies could charge whatever they wanted for their products. That meant the computers those parts went into were expensive.
There was no “upgrade cycle” like phone companies try to get us into today. Most people expected to keep a computer for a long time. When there were options to upgrade parts of a computer, only some people had the expertise to take advantage of them. Part upgrades were common among enthusiasts, but the average person was probably stuck with whatever they bought until they had to buy a new one.
If you were a software maker, you had a good reason to make sure your software worked on the greatest number of compatible computers. You wanted to use less RAM and have lower CPU requirements because, in most cases, your potential customers weren't going to plonk down $2,000 to $3,000 (or more) for a new computer that could run your software. Most of them weren't going to install daunting upgrades either.
The software
Today, we have so much extra computing power lying around that we don't notice that a lot of the software we're running is secretly a website running in a locked-down web browser. Each of these pieces of software carries the memory and processing overhead of a modern web browser, on top of whatever else you have running (probably at least one actual web browser).
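For the technical folks (think of this as an in-line footnote): the usual way this happens is a framework like Electron, which bundles an entire copy of the Chromium browser with each app and loads the “app” into it as a web page. Here's a minimal sketch of what that looks like; the window size and URL are made-up placeholders, not any real app's code.

```typescript
// main.ts: a minimal sketch of an Electron-style desktop app.
// Assumes the Electron framework is installed (npm install electron).
// The window size and URL below are hypothetical placeholders.
import { app, BrowserWindow } from 'electron';

app.whenReady().then(() => {
  // This "window" is a full Chromium browser instance bundled with the app.
  const win = new BrowserWindow({ width: 1200, height: 800 });

  // The "app" itself is just a web page loaded into that embedded browser.
  win.loadURL('https://chat.example.com');
});

// Standard Electron boilerplate: quit when every window is closed.
app.on('window-all-closed', () => {
  app.quit();
});
```

Every app built this way ships and runs its own copy of Chromium, which is a big part of where all that extra RAM goes.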
Your computer may run slow but it's likely not running slow because of hardware limitations.
To put this in human terms: you could type the words you need on your keyboard. Or you could pull up the complete works of Shakespeare and copy and paste the words and letters you need out of them instead. Clearly, in most cases, copy-pasting from Shakespeare isn't an efficient use of your time.
The software nonsense isn't quite nonsense, though. There's a reason so much software is developed that way. We have two¹ viable consumer operating systems. No, wait. Make that four² (counting the two mobile OSes).
It takes time and money to develop software even once. You're not just making the software: it needs to look distinctive and it needs to look like it belongs on the operating system you're releasing it for. Mac does things a certain way and PC does things a certain way, and Mac users and PC users both expect their normal experience when they use your software. You probably have at least one piece of software that doesn't act like it belongs on your computer or phone. At best, it's annoying and feels unprofessional. Many software developers want an experience that feels the same across all OSes while still looking like it belongs on whichever OS you're running it on.
Which brings us back to why so much software is developed the way it is.
If you're not a programmer, you might find it weird to hear this, but ... there are no good options for developing software just once and having it work similarly across all platforms. Actually, that's wrong. There's one: making a web app. Which is how we got here, unfortunately.
Oh, but there's (...)
Tell that to Discord, Slack, Microsoft (Teams and VS Code), Atom, and countless other developers. Not me. I can tell you from my own experience as a hobbyist developer: the options aren't good. Large companies (like Microsoft, Discord, and Slack, at least) can afford to hire developers to bridge the gap.
Does that mean we have to keep letting our software use ever-increasing amounts of system resources? No, not really. The situation was similar ten years ago and our computers were still functional (and that ten-year-old computer can still be functional today!). There are potentially more performance-conscious ways of building these cross-platform applications, too. Developers aren't motivated to do it right now because they can just tell you to go buy a new computer.
¹ The two viable consumer OSes are Windows and macOS. Linux is my main OS and has been since 1999, more than 20 years now. My sound still doesn't work for some of the most ridiculous reasons imaginable. It's a joke but it's also extremely true, and I never recommend Linux for anything unless you're talking servers. In which case, you shouldn't need me to tell you that between zero and a very small number of things should be running on Windows servers.
² I'm not counting iPadOS as a separate OS for the purposes of this article. There are no other viable consumer OSes on mobile right now.
Unplanned obsolescence
If you have a Windows computer that was brand new 10 years ago, it is likely perfectly capable of running Windows 11. It has the RAM and CPU it needs to run it. Could it be faster if it were a newer computer? No doubt, but it's adequate to the task of running the software. This is an almost unique situation in the history of personal computing. Ten-year-old computers don't historically play nice with brand new operating systems.
Microsoft has introduced a new requirement: a TPM version 2.0 chip. If you have a ten-year-old laptop, you not only probably don't have a TPM 2.0 chip, you also probably can't install one. When you hack the requirement out of Windows 11³, your computer will probably work fine.
It may not be fair to say that Microsoft is intentionally forcing you to buy a new computer for nefarious reasons. Maybe the value of the security enabled by the TPM 2.0 chip is important enough to make us all eWaste a bunch of computers that are perfectly fine apart from not having one.⁴ So I'm not saying that.
I am saying that your ten year old computer is probably fine for what you need it for. It just may not be fine for upgrades to the specific OS you're using with it.
Similarly, Apple recently switched architectures from Intel to their own custom Apple Silicon. They're not providing as much support for the Intel-based Macs you already own. We've seen this in the past when Apple switched from the PowerPC architecture to the Intel architecture. If you got an Intel-based Mac at the start of that run, you got a computer that likely would have lasted you a very long time. If you got a PowerPC-based Mac near the end of the PowerPC run, you probably felt a little like your money had been wasted.
The way Apple initially rolled out its M-series processors probably led to more people than usual buying an Intel-based Mac when it wasn't really a good idea to do so.
³ I'm not getting into this fully, but they also introduced that requirement for new Windows 10 computers a few years back. The linked article has more info about that too.
⁴ Also, that should be your choice.
What to do?
It's not reasonable for me to tell you to use a workaround to install Windows 11 on a computer that Microsoft doesn't want to support. Even if you're pretty tech-savvy, it's easy for me to tell you to do something that's going to have no consequences for me but is going to be really annoying for you to live with.
There are some folks out there who are going to tell you to install Linux. I don't tell people to do that (unless we're talking servers; don't run servers on Windows unless you have a really good reason to do so).
This paragraph is for those of you who are in the U.S.: If the incoming president gets his way, electronics prices are going to shoot up dramatically very soon. Despite what he promised, tariffs are an almost direct tax on the consumer. Most of the stuff you buy on Amazon, Temu, AliExpress, etc. comes from outside of the U.S. Those things are going to become very expensive, and there really aren't viable alternatives available or forthcoming for manufacturing inside the U.S. What I'm telling you is ... if you're in the U.S. and you're reading this before the inauguration, you should go buy the nicest computer you can afford. It's going to have to last you a while. I'm not the only one saying that. Broadly speaking, this is how tariffs work. Anyone who tells you that tariffs will lower prices for people in the country levying the tariffs is a liar.
Now that everyone's back ... if you do end up wanting to try Linux, I recommend you get a new computer with Windows 11 on it, back up the data from your old computer, and then install Linux on the old one.
You'll still be able to do stuff on your new Windows 11 computer for the foreseeable future and you'll get to try out Linux in a way that's a lot less stressful.