Stop me if you've heard this one before - with all the unplayable buggy launches, rampant monetization, season passes, chunks of games locked behind DLC paywalls, day-one updates, always-online single-player games, early access, lootboxes, massive patches and the rest, games were so much better back in the eighties, or the nineties, or the oughties (pick whichever applies to your age bracket), right?
But were they, really? Let's, for a moment, try to ignore Betteridge's law of headlines as well as kneejerk answers and take a look at things a little more closely. The video game community is one that, by its inherent nature, is terminally online, which deeply affects any and all related discourse. Nostalgia, selective memory, the sheer validation of hating things together, a lack of understanding of how the industry works, psychology and a dozen other factors also feed the prevalent idea that games are now passionless cash-grabs, that game companies are soulless megacorps run by hellspawn masquerading as executives who literally eat kittens for brunch, and that the hobby as a whole stinks compared to the golden age (the exact placement of said golden age varies, curiously).
This is also a topic that is nigh impossible to discuss to any serious degree, because depending on your stance you are either a blinded fanperson sheeple or a seething troll, with zero middle ground as far as the other side is concerned. Yet here we are, wading into this particularly putrid can of worms in the hopes of finding loot that increases our wisdom stat.
Granddaddy’s grass is always greener
Something you'll notice whenever the topic of contemporary gaming being awful comes up is what I've decided to, with the power vested in me by having access to a news outlet on the internet, dub the "Tongue of the Fatman Effect".
Do you remember Tongue of the Fatman, the 1989 fighting game also called Mondu's Fight Palace in places where ridiculous names like that just don't fly? Of course you don't - it sucked. Whenever people talk about how terrible modern fighting games are because they're all just costume-DLC-selling hackjobs, all they'll ever remember is Street Fighter and Final Fight, or Street Fighter II and Mortal Kombat, or any of the good ones.
It is an exceedingly common fallacy in the whole old-versus-new debate that the worst our current era has to offer gets compared to the cream of the crop from ages past, because that's all anyone ever remembers. The successful and popular games of previous decades are the ones you'll still hear about, see recommended, or watch receive remasters and re-releases. If you were around back then, those are the ones that stuck with you, because they were memorable.
Even more insidious, dare I say, are those old classics which came to be loved despite issues at launch; we just give them less crap because the good memories are the ones that stick. Let us demonstrate: the launch of Diablo 3 was a disaster - it was impossible to log in due to Error 37! Wasn't it so much better when you could log into Diablo 2 and... somehow come back to life next to your corpse without your items, get curb-stomped by Fire Enchanted monsters due to the FE bug, or wallow in poor drops?
Another: I can't play the single-player campaign of Call of Duty: Modern Warfare (the new one, not the old one) offline, because I need to log into Battle.net to launch the game. Wasn't it so much better when Half-Life 2, often lauded as one of the greatest games of all time, launched with mandatory Steam integration and the service utterly collapsed?
Here's the thing - gaming has problems. Big problems. DRM forcing us to always be online, even when playing single-player games, is a problem. Games launching with crippling bugs are a problem. Arguably predatory monetization practices are a problem. Professionals in the industry being forced to crunch only to be laid off while executives pocket millions in annual bonuses is a problem.
Nobody is saying that everything is perfect or that lodging complaints against the current state of the industry is bad - but we need to realize that gaming has always had problems, just different ones. A lot of things may have gotten worse, but a lot of things have also gotten better, and that nebulous temporal anomaly known as "Back Then" wasn't as rosy as most people make it out to be.
Just think about how much more available games are these days - you can get anything via digital distribution, which was a huge blessing during the lockdown measures we've endured this past year or so due to COVID-19. Even when there isn't a pandemic going on, people living in places with spotty retail availability can get any game they want, without paying for shipping or waiting for delivery as they would have had to previously.
Games are also vastly cheaper these days, especially on PC. Sure, that next-gen price hike to $70 is a bit shady, but come on - everything goes on sale soon and frequently, and there are a dozen subscription models across platforms which offer an affordable way to access multiple games.
Ah yes, buggy game launches. We love to hate them, and they sure get their fair share of scathing criticism. There's no shortage of recent high-profile examples to choose from, such as Cyberpunk 2077 or Outriders. Hyped-up AAA titles with embellished trailers released in unserviceable states often get brought up in these debates, and it's quite telling that there's always a recent example to roll out in this age-old argument. These huge, multi-million dollar projects have plenty of expectations levied against them, and rightfully so. It's true, there is a difference here between modern games and the titles of yore.
It may not be what you think it is, though - no, buggy games were always a part of the industry. The difference is that these days games have the potential to be fixed up, and very frequently are. Back in the blissful days of "plug and play", when a game was a dud, it would never be anything else.
Just think about Ultima IX: Ascension or Battlecruiser 3000AD and their game-breaking bugs. Think about the PlayStation 2 demo disc that nuked your save files. Then there's Pokémon Red and Blue's MissingNo., which is remembered almost fondly these days as a beloved easter egg even though it was a horrid bug that messed up your Hall of Fame and in-game NPCs, and potentially corrupted save files.
Modern gaming, with all of its patches and updates, offers botched launches a chance at redemption. Several games in recent memory that launched in disastrous states cleaned up their act and not only became functional and passable, but in some cases turned into delayed hits, rising like a virtual phoenix from the ashes. It took a while, but No Man's Sky not only delivered on the original vision but has since surpassed it. The laughably mismanaged Fallout 76 is now popular with a dedicated fanbase. EA's Star Wars: Battlefront 2 went through a forced metamorphosis following the lootbox controversy and turned into a fantastic shooter.
Something else to consider is that the perception of games being buggier these days comes down to the numbers. With the extremely healthy indie development scene, the overall increased popularity of the medium and the spread of accessible development tools, the sheer quantity of new games being released dwarfs that of past decades by orders of magnitude - even if we choose to ignore the heaps of shovelware on Steam. Of course there are more buggy games where there are more games.
Modern titles are also inconceivably more complex and complicated, leaving much more room for error. The virtual worlds we inhabit, the game mechanics, the storylines all consist of vastly more bits and pieces that need to work together and interact in hundreds of different ways, spawning problems developers ten or twenty years ago couldn't even imagine, let alone be forced to deal with.
Indie is dead, long live Indie
The entire "then versus now" discussion often touches on indie games - understandably, considering how gargantuan a role they play in the medium's history - and their role in making gaming in the past/present good/bad, whichever combination you prefer. The issue is that oftentimes the points being made are incorrect.
Indies have transformed significantly, yet are also sort of the same - and many people have rather different perceptions and memories of what indies are. The driest, most technical definition of an indie game is one that is published by its own developers. These self-published projects do not have a separate publishing company attached, unlike most AAA titles.
However, in some cases a publisher will help fund a project and assist with the business side of things while giving the developers complete creative control, without exerting oversight or influence. In general parlance, these could be called indies too. Sometimes suitably wealthy and influential developers self-publish, which would make a game both AAA and indie.
While not necessarily requirements for being an indie title, the moniker does carry a set of characteristics most people tend to assume or even expect: low budget, developed by a small team, featuring heavily stylized visuals and retro sensibilities. A common persona gamers have come to associate with indies - one often espoused by these indies too, for marketing purposes - is that they're the spiritual successors of old classics, through which the golden age survives.
Indie developers are often styled as rebellious underdogs daring to break from bland, focus-tested, big-budget projects designed by committee to cash in on the latest mainstream casual trends, and there is definitely truth to that - but this is a two-sided coin. To be entirely fair, it is more like a complex 3D geometric shape with several sides, because life is never that easy, but so goes the adage.
There are several factors that need to be remembered and considered when mixing indies into this already mired and complex topic. The history of indie gaming is the history of gaming, as the technical definition of the indie game is where the entire medium was born. Even if we fast forward to the early 90s, when the video game was hardly a new concept anymore, indies were a huge deal. The legendary, seminal shooters Wolfenstein 3D and DOOM were developed and published by id Software - they were indies. At the same time, this was the era of the SNES, so industry titan Nintendo was already operating in a manner comparable to the modern AAA segment. These are just some of the most apparent examples.
The roots of several of today's surviving huge AAA franchises lie in indie development - just think of the previous two examples, Wolfenstein and DOOM. Comparing Wolfenstein: Youngblood with the genre-defining Wolfenstein 3D (or even with the original Castle Wolfenstein, developed and published by MUSE) entirely misses the gargantuan differences between indie and AAA development. A fair comparison would be with a modern indie shooter instead.
The indie segment of the game industry has also grown to include a broader range of products since the major resurgence in the latter half of the 00s. When digital distribution and the increased availability of better consumer-grade development tools made indie development vastly more accessible, the initial gamut of 'true indie' one-developer and small-team productions was followed by indie projects of larger and larger scope. The popularization of crowdfunding also made larger budgets more attainable, and games with much grander scopes and daring designs could be finished without publishers.
Meanwhile, the AA market - mid-budget games with publishers attached that didn't quite fit the heavyweight category of larger titles - steadily evaporated. Most gamers who remember the hit-and-miss nature of AA titles, with their few diamonds in the rough, would lump them in with the AAA segment if we had to look at things in a binary manner, whereas modern big(ger)-budget indies are still indies. This replacement of an entire, rather large market segment in the common consciousness leads to nigh-infinite comparison fallacies between the "then" and the "now". For all intents and purposes, indie gaming has replaced the AA market entirely.
There's also the tiny detail of a lot of indie games being, well, crap. A common criticism of Steam nowadays is that the digital storefront is absolutely overflowing with zero-effort asset flips and shovelware that can be sold as games due to the nearly nonexistent moderation of content distributed on the platform. In a technical sense, all of these are indie games too. This phenomenon is the modern equivalent of the old bargain bins full of half-assed licensed movie tie-ins and the worst examples of the AA market, simply adapted to the new way most gamers buy their games.
When speaking about indie games, it's impossible to skip Early Access. A different take on the concept of crowdfunding, Early Access allows developers to put their work on the market before it is complete, letting players buy into the promise of what the game will be. Sometimes the added funds from interested players buying the Early Access version allow the developers to realize their dreams and deliver the complete experience. Often, this... doesn't happen.
The Early Access issue often invites reminiscing about how "back in my day, when you bought a game you got a finished product". Here we loop back to a point made earlier: in those good old days, if said finished product was a horrid, buggy, unplayable mess, it would never become anything else. While plenty of Early Access projects fail, or are outright scams from the get-go, cashing in on an idea the developers have no intention of actually finishing, it is also a fantastic resource that makes game development even more accessible with less financial risk involved. One might argue that it is a system to be exploited by people who want to make money off less effort, but people were doing that in this industry long before Early Access was a thing.
Head in the Cloud
Leading on from Early Access, the dynamic duo of the "finished product" and "plug and play" qualities of old games that people generally reminisce about can be unraveled fairly easily. Gaming as a whole being a lot more online than before comes with plenty of inconvenience and difficulty, and it also gives developers and publishers new tools which they admittedly sometimes (often?) exploit in ways that definitely aren't consumer-friendly. Nobody is denying this.
The prospect of being able to launch a buggy game to hold onto those pre-order bucks in the hopes of just fixing it later is an issue gamers didn't have to deal with in the past. Frequent server maintenance, large updates preventing you from actually just playing while hogging storage space, single player games demanding a constant internet connection... There are legitimate complaints here.
Gaming as a whole has always been a medium that, by its nature, is an early adopter of new tech, even before that tech has quite taken its ideal shape. The industry refuses to recognize the reality that a massive portion of the world just doesn't have access to great internet connections, or that top-of-the-line hardware just isn't affordable for most consumers; it has always been an industry that grabbed nascent advancements by the horns and charged ass-first into innovations that 99% of gamers wouldn't care about or have the means to enjoy until years later. As the advancement of technology speeds up, so does the industry's early adoption, and we struggle more and more to keep up.
Streaming games - as in, having the program run on a different machine with the display output and control input streamed to your location, not people broadcasting their overacted screaming at popular horror games - only started coming into its own recently, and it's still not viable for the majority of gamers. It's 2021. OnLive, a dedicated game streaming service, was already around in 2009, and probably wasn't the first.
So yeah, these arguments? We totally get you. Huge improvements are needed, and these really are issues we didn't have to deal with before. However - yes, there is a however, because that's the point of this rambling - we've also got access to experiences that were impossible back then. We've already mentioned how games can completely alter themselves and change for the better with constant support. Titles remain popular for years on end as developers bring new content with frequent updates. We can jump into huge multiplayer matches with 100 players all existing in the same virtual space. There is a silver lining, and the inconveniences that plague modern gaming are the warts on what also makes modern gaming great.
The More Things Change...
The gaming industry has problems. A lot of problems.
We didn't set out to write this article to deny that - there is no use denying it, because it stares everyone who cares about this hobby, this medium, this industry even a little bit right in the face. The point that gamers as a gestalt need to realize is that the industry has always had problems, just different ones. Pining for a mythical golden age that never happened does nothing to improve the state of gaming today.
The right complaints need to be made; awareness needs to be raised about the right issues; the appropriate solutions and responses need to be fielded. Ten or twenty years from now, people will still complain and reminisce about a better era, because they'll only remember the good parts of what video games are like today - it's on us to recognize those good parts and work on the bad ones.