11 things that ruined video games forever
It’s undeniable that video games are one of the most important and profitable entertainment mediums of the modern era. They’ve evolved from incredibly primitive, abstract experiences like Pong to fully immersive, realistic simulations that let you live a million fantasy lives. And, soon enough, Oculus Rift and other VR products will make them even more immersive.
You know what, though? Gaming could be even more awesome if different decisions had been made along the way. You might not agree with every entry, but here are 11 “features” that ruined games forever.
Continues as a form of extortion
The first arcade game that let players continue where they left off was Fantasy in 1981, and it changed the way we interacted with games forever. Previously, a quarter bought you one shot at the machine; when you died, you started over from the beginning, and everybody competed on the same level. But now, people with more money to spend could keep their game going longer, removing the skill barrier. It wasn’t long before continues were standard practice, and arcade titles like Double Dragon became artificially difficult to siphon money out of your pockets.
Celebrity game designers
The first company to give designers credit for their software was Activision, and in the decades since its founding, the cult of the “game designer” has only grown. The thing is, no one person is responsible for the success or failure of a big-budget game. That didn’t stop us from canonizing game developers as the celebrities of the industry. The nadir probably came in the late ’90s, when John Romero promised to make you his bitch with Daikatana, but people like Peter Molyneux should make us very cautious about putting too much trust in any designer who puts their personality ahead of their games.
Save scumming
As games got longer and more difficult, players wanted a way to step away and resume their progress later. The first Legend of Zelda introduced saving to console gamers, but PC players had been saving since 1981’s Zork, if not earlier. It wasn’t long before PC gamers got used to being able to save and reload at any opportunity. That’s all well and good, but one of the most important aspects of video games is using repetition to build skills. By saving and reloading obsessively (“save scumming,” as it’s sometimes known), we reduce games to weird, unpleasant trial-and-error experiences and take ourselves out of the flow state.
Cutscenes
The whole point of video games is that they let you influence a living story with your own skill and decisions. So why is just about every game that hits the market now loaded down with non-interactive cutscenes that play the game for you? Pac-Man had brief animated segments in between sets of levels; the form has grown and grown since then, to the point where Metal Gear Solid 4 contains a single cutscene sequence that runs a staggering 71 minutes. Let’s face it: many video game designers are lousy movie directors, and they should stick with what they know.
Quick time events
One of the biggest problems with video games is translating controller inputs into on-screen actions. It works out fine when your avatar only has to do a couple of different things (jump and shoot, for instance), but the more complex the simulation becomes, the less adequate the number of buttons at your disposal is. One way developers have addressed this is with “quick time events,” which map timed button presses to a variety of scripted on-screen actions. Arguably popularized by Yu Suzuki’s Shenmue, they’ve become an unavoidable part of most action games.
Walkthroughs
A game designer is, in many ways, like a god. He or she creates a little world with its own rules and inhabitants and then opens it up for gamers to explore. Part of the pleasure of gaming is diving into that new terrain and discovering what it has to offer. So when GameFAQs opened for business in 1995, it marked the end of an era for the medium. Instead of having to figure things out yourself, you could just go online and download a massive text file that spelled out where every secret was and spoiled every plot development. Why not just hire somebody to play the game for you in that case?
Online gaming causing dehumanization
When you wanted to play games against people twenty years ago, you did it in the arcade, where you had to look your opponent in the eye, win or lose. PC gaming had an equivalent in the LAN party, where gamers lugged their boxes into a big room to play Quake and Marathon. But as internet connection speeds increased, we stopped doing that and turned to competing online instead. Not being in the same room as the person you’re fragging is inherently dehumanizing, creating a generation of gamers whose main social interaction is yelling racial slurs over Xbox Live.
Pre-orders
In the old days, video games didn’t all hit stores on the same day, so there wasn’t a panicked rush to get your hands on the newest titles. Sega changed all that with “Sonic 2sday,” when it managed to get copies of its platformer sequel into stores everywhere at the same time, and now “release day” is a weekly event. Publishers saw this and created an atmosphere of artificial scarcity in which you could put money down in advance and guarantee yourself a copy via pre-order. The problem, obviously, is that if the game turns out to be no good, you’re out the cash. With digital distribution, a game can’t sell out, so there is literally no reason to ever place a pre-order. Hopefully this crap will die out soon, but publishers keep it alive by attaching in-game bonuses like exclusive items and maps to digital pre-purchases.
Downloadable content
In the old days, when you bought a game you knew exactly what you were getting: a complete experience that was worth the money. With the introduction of downloadable content, though, your $60 doesn’t even get you the whole game. Publishers now expect to nickel-and-dime gullible gamers out of extra cash with DLC that ranges from cosmetic additions you can live without to great heaping lumps of game. Now you can even buy fighting games where you have to pay extra for each combatant, and that’s without getting into the worst kind of DLC, where publishers make you pay just to unlock stuff that’s already on the disc.
Early access
On the surface, the idea of early access game builds seems like a winner for everybody. Developers get a pool of players to test new features and help refine them. Gamers get a window into the development process and a chance to play games before anyone else. But in practice, early access has been an unmitigated disaster. For every game like Vlambeer’s Nuclear Throne that keeps delivering updates, there are a dozen more that dissolve into never-released piles of broken promises and ambitious ideas. Early access has enabled studios to make money from unfinished games with few repercussions, and that’s bad for everyone. Perhaps worse, gamers tend to play an early access build to death, get bored of the work in progress, and then struggle to go back to the finished product when it finally releases.
Microtransactions
Once game publishers learned that they could get money out of us even after we bought the game, the floodgates were opened for all kinds of extra charges. Probably the most odious are microtransactions, which are rampant in mobile gaming and are starting to make their way into console and PC titles as well. Among the worst are the ones that sell you an in-game advantage, removing the need for skill entirely. If you’re not playing a game to get better at it, what’s the point? You might as well be watching anime on Netflix. Even worse, though, are the microtransactions that are all but required to make progress in a game.
So what do you think? Are there any game-ruiners we missed on this list? Argue about it in the comments. Oh, comments also ruined gaming. Deal with it.