Ok, we admit it... we have it pretty easy these days. Gaming is one of the largest facets of the technology industry today, and it benefits from a lot of cutting-edge technology. Often, you can experience incredible stories and blockbuster games without leaving your couch.
But gaming wasn’t always this way. Many gamers are nostalgic for the dawn of gaming in the 80s and 90s, but there’s an era we just don’t talk about that much - the 2000s. That’s mostly because this era was the equivalent of gaming’s awkward teenage years: things were improving, and some of the technology we know today was starting to develop, but there was a long road ahead before anyone would be playing Horizon Forbidden West on a PS5 hooked up to a 55-inch 4K TV. Here are 10 problems that gamers faced in the 2000s.
Not having an HD TV
I think we sometimes forget that TVs have come a long way since their invention in 1927 and the advent of colour TV in 1953. TV has had a long, fascinating technological history that is too complex to go into in this article (for an awesome history of American TV, we recommend The Columbia History of American Television by Gary Edgerton), but suffice it to say that the HDTV technology we use today wasn’t widely adopted until the late 2000s.
Before that point, most TV sets could only handle Standard Definition (SD), which was of significantly lower quality (though it did inspire a certain vintage aesthetic that modern games now attempt to emulate). In the transitional period between SD and HD TV, many games were made for HD but would be played on SD TVs. This could suck, especially if you had a friend whose family could afford an HD TV, and you could directly witness the difference.
Terrible PC Ports
I know this sounds crazy, but PCs weren’t always the highest-quality gaming hardware on the market. The 2000s were also the PC’s awkward teen years, and the PC versions of popular games released during this period are notoriously terrible. Some of the biggest offenders were games like Grand Theft Auto III (2001) and SpongeBob SquarePants: Battle for Bikini Bottom (2003). Often, the version of a game available on PC and the version available on consoles were completely different - with the PC version being much worse.
If you were too poor to buy a console - or you were just desperately trying to hold out while the PC transformed into the preferred gaming equipment - you would just have to play the worst version of the decade’s best games.
Trying to Run Crysis on Ultra Settings
As we’ve gone over already, PCs in the 2000s were far from the powerful machines we know today. Most commercially available PCs could barely run a browser, let alone a video game. But games kept being released for them, and one game in particular pushed gamers to build their own PCs - Crysis.
Crysis is a fairly standard FPS released in 2007. You play as a nanosuit-enhanced supersoldier, battling mercenaries, enemy soldiers, and a sinister alien race. It was one of the highest-rated games in PC Gamer at the time, earning a 98%. However, it had INSANE system requirements for its day, punishing everything but the highest-end GPUs.
So gamers at the time had a choice - play the game at minimum settings, or build a machine that could run it comfortably. Unsurprisingly, many hardcore gamers took their first leap into building PCs because of this game - it was so ubiquitous that it became a benchmark for a machine’s prowess. If you’ve ever been asked, "But can it run Crysis?", this is why.
Building your own PC
Nowadays, building a PC can be a rewarding project for tech nerds and hardcore gamers that want a perfectly customized gaming machine. And with the huge wealth of resources and advice on the Internet, it’s not as hard as it might sound.
Unfortunately for gamers in the 2000s, these resources did not exist. If you wanted to build your own PC - perhaps to run Crysis with half-decent settings - you had to rely on other local tech lovers or computer shops to guide you through the complicated process. And finding parts could be difficult - depending on where you lived, you might have had to order them online or through a shop, and they could take WEEKS to arrive. With the ongoing GPU shortage, that wait feels almost nostalgic - but not in a nice way.
That’s not to say that building your own PC was impossible back then, or that it’s easy now, but these limiting factors made custom-built PCs a lot rarer in the 2000s.
Manually Downloading Patch Files
When’s the last time you even thought about how your games receive patches? For modern gamers, updates and patches often just magically show up and integrate into our games - the most inconvenience it might cause us is a patch downloading when we want to play our favourite game.
In the 2000s, this technology didn’t exist yet - patches weren’t streamlined until Valve released Steam in 2003 (and even then, we had a long way to go). Before that point, developers would upload patch files to their websites, and players would have to seek them out and download them manually. This came with a whole bunch of problems - you could accidentally download the wrong file, or fall prey to fake patch files that contained viruses. On top of that, online games would often have to shut down until most players had installed the patch - this is why Valve originally developed Steam.
Early Steam
Modern gamers often regard Steam as essential for PC gaming - it’s the largest game store available, and it's often the only place to get blockbuster titles. But it wasn’t always the ubiquitous powerhouse it is now - when it launched in 2003, Steam was just a vehicle for Valve to update its games, particularly Half-Life 2. It was meant to fix a lot of the problems with patch files that we just mentioned, but at first, it often just made things worse.
The first and most prominent issue was that Valve hadn’t anticipated the huge amount of traffic Steam would attract. Between 80,000 and 300,000 players used the program during beta testing before its release in September 2003, and Valve’s servers were quickly overwhelmed by the sheer number of people trying to use it at the same time. Steam also hadn’t been well optimized to run alongside other software, and it would often gobble up valuable system resources that the very games it was responsible for needed to run.
Unfortunately for Half-Life 2 fans, Steam was required in order to play the game and receive Valve’s patch files for it. So you basically had to deal with this little vampire of a program being a drain on your system until Valve started making improvements to it in the late 2000s. Thank goodness that Valve put the time and effort into the program to make Steam into the cornerstone of PC gaming that it is today!
No Cloud Saves
Cloud saving is a relatively new technology - it wasn’t popularized until after Amazon launched its Elastic Compute Cloud in 2006. So gamers in the 2000s had far more limitations on how and when they could save their progress. Most games of the era shipped with limited save slots (three was the norm), and those slots were coveted real estate. Many a friendship was ruined or sibling rivalry intensified when someone overwrote your save file.
To get more space to save data, you often had to buy extra memory cards for your consoles. These cards were not cheap - they generally ran anywhere between $30 and $40, and they were just a bit bigger than your thumb. Which made them easy to lose, unfortunately. Not only did you lose all of your progress, but you then had to shell out another $40 for a replacement. No sadder story has ever been written.
Wires Everywhere
Ah, wires. You can’t live with them, and you can’t live without them. Today, I like to think that wiring is slowly becoming obsolete - with the advent of things like Bluetooth, wireless charging, and more robust internal batteries, maybe we won’t have to deal with wires at all one day.
Unfortunately, gamers in the 2000s were still living in an era when everything was wired. Controllers, ethernet cables, AV cables... everything you can think of was connected to everything else by wires. And frustratingly, very few of these wires were long enough to actually be functional. Often, controller cables weren’t long enough to let you sit on the couch while playing - you’d have to sit on the floor in front of the console, ruining your eyes and your back. Ouch!
Dial-up Internet
Nowadays, it’s hard to find a place that doesn’t have at least some access to the Internet. It feels like it's all around us, accessible at any time on phones, tablets, and laptops. But in the 2000s, consumer Internet was still young. Broadband, the type of connection many of us use today, was only just starting to be offered in the late 90s, and many gamers in the early-to-mid-2000s were still gaming over dial-up connections. As we know, dial-up wasn’t the best system, and it made online gaming a huge pain.
Not being able to find the game you want at a rental store
Ah, rentals. Is there anything that sparks more nostalgia than the image of a Blockbuster Video store, shelves lined with movies available for rent? As console gaming became more ubiquitous, Blockbuster and other video rental stores started offering games for rent as well.
This worked really well, especially if you wanted to try a game before buying it. The only problem was that you were essentially at the mercy of what the rental shops had in stock - if they didn’t think a game would make them money, they wouldn’t stock it. And getting them to order a copy of a game you wanted could be a long, inconvenient process. Many 2000s gamers found themselves calling rental stores every day to ask whether the latest game had come in.
And that’s all! Hopefully, this list sparked a little nostalgia in you or made you appreciate all of the technology and conveniences that we have at our fingertips today.