Steam's economy of trading cards and other community items got a little more convoluted a few days ago with the introduction of gems, a virtual currency which can be created by recycling unwanted items. Gems can then be used to create booster packs of cards or to bid on games in the auction that's currently acting as a weird prelude to the usual winter sale. Predictably, the whole thing was initially a disaster — an exploit caused millions of gems to flood the market and the pre-auction gem-collecting frenzy was shut down temporarily — but everything was fixed in time for the main event. The bidding began yesterday and runs until December 18th, with rounds ending every 45 minutes.
Steam trading cards themselves were introduced in the summer of 2013. These cards are acquired primarily by logging playtime in certain games (or by spending money on optional content in free games), and they can also come from booster packs which are randomly distributed to eligible users. Once you've collected a full set of cards for a given game, those cards can then be used to craft a badge to be displayed on your profile. The cards are consumed in the process, but crafting a badge also generates some other items (which include game-themed emoticons and profile backgrounds, and sometimes coupons). If you don't want any of this stuff, trading cards (as well as the emoticons and profile backgrounds that come from crafting badges) can be sold to other users on the Steam Community Market for money (or, perhaps more accurately, for credit to be used in the market or the Steam store). Sellers can specify prices when they list items for sale, and buyers can place buy orders at the prices they choose. Typical transactions are only a few cents per item, but foil cards and other rare items sometimes cost significantly more.
The introduction of gems, like the market itself, is good for people who just don't care about crafting badges and would rather get rid of their cards and whatever community items they might have acquired. Now, in addition to selling those unwanted items, Steam users can recycle them into gems. In theory, having more options is a great thing. The added confusion, though, might be a bit too much. There's already an established Steam market operating with actual currency, so introducing a secondary currency at this point is a bit weird. It's clear that gems are not meant to be a replacement for cash — you can only use gems to bid on games in this auction and to create booster packs of trading cards — but since gems can be bought and sold in sacks of 1,000 on the Steam market, they definitely count as an alternate currency that needs to be considered in certain scenarios.
Let's say I have a bunch of trading cards and I want cash. I could:
1) sell the cards on the market;
2) recycle the cards for gems and then sell the gems on the market;
3) craft the cards into badges to produce community items and then sell the items on the market; or
4) craft the cards into badges to produce community items, recycle the items for gems, and then sell the gems on the market.
Alternatively, let's say I have a bunch of trading cards and I want gems. I could:
1) recycle the cards for gems;
2) sell the cards on the market and then use the money to buy gems;
3) craft the cards into badges to produce community items and then recycle those items for gems; or
4) craft the cards into badges to produce community items, sell the items on the market, and then use the money to buy gems.
The number of gems awarded for recycling each item varies, and this value in gems is not a function of an item's (user-driven and constantly changing) market price. My trading cards for Hammerwatch are apparently worth 24 gems each, while my McPixel cards are worth only 1 gem each, but Hammerwatch cards certainly don't sell for 24 times more cash on the market than McPixel cards do. One of my foil cards, from Europa Universalis III, can be recycled for 320 gems; another of my foil cards, from Droid Assault, is only worth 80 gems, but on the cash market it's currently worth more than twice as much as the Europa Universalis III foil card.
Since different items have different gem values, and since I don't have a list of them all, I'm not sure whether it's better in general to sell items or to recycle them. For trading cards in particular, however — whether you ultimately want money or gems — I would recommend selling or crafting them, instead of simply recycling them. A typical trading card seems to be worth more in cash than in gems, given the price of a sack of gems right now. The combined market price of all the dozens of cards I'd need to recycle for 1,000 gems exceeds the current market price of a single 1,000-gem sack. Selling the cards would be better than recycling them and selling the gems produced. Similarly, selling the cards and using the cash to buy gems would be better than just recycling the cards.
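To make that comparison concrete, here's a minimal sketch of the arithmetic in Python. All of the prices and gem values below are made up for illustration, and the sketch ignores Steam's market transaction fees as well as the fact that gems only sell in sacks of 1,000, so plug in whatever the market actually shows before trusting the verdict.

```python
# Sell-vs-recycle sketch for a pile of trading cards.
# Every number here is hypothetical; check the live market yourself.

cards = [
    {"name": "card A", "market_price": 0.08, "gem_value": 24},
    {"name": "card B", "market_price": 0.06, "gem_value": 1},
    {"name": "card C", "market_price": 0.10, "gem_value": 12},
]

sack_price = 1.10                  # assumed price of a 1,000-gem sack
price_per_gem = sack_price / 1000  # glosses over the sack-of-1,000 granularity

cash_from_selling = sum(c["market_price"] for c in cards)
gems_from_recycling = sum(c["gem_value"] for c in cards)
cash_from_recycling = gems_from_recycling * price_per_gem

print(f"Sell the cards outright:   ${cash_from_selling:.2f}")
print(f"Recycle and sell the gems: ${cash_from_recycling:.2f}")
print("Better to", "sell" if cash_from_selling > cash_from_recycling else "recycle")
```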
Many people are, however, selling sacks of gems on the market, so clearly there are some non-card items which are worth more after recycling. In any case, whether you care about gems or not, it's generally a good idea to craft badges during events like this, since you tend to get a special trading card in addition to the regular badge and other goodies.
As mentioned above, gems can also be used to create booster packs, but this seems particularly silly to me at the current cost of gems. To create a booster pack of three Alan Wake cards, for example, I would need 750 gems, which means I would need to recycle dozens of other cards to afford it. Meanwhile, on the user-driven community market, a three-card booster pack for any given game costs about as much as three non-foil cards for that same game. If I really wanted an Alan Wake booster pack, I would be much better off just selling a few cards and then using that money to buy a booster pack from the market. Recycling dozens of cards, or even crafting full sets of cards and recycling the items that come out, seems guaranteed to be far less efficient. The only reason I'd ever spend 750 gems on an Alan Wake booster pack is to get rid of 750 gems that I can't use for anything else, since gems can only be sold in sacks of 1,000. Some other booster packs, however, cost 1,000 gems or more. There's absolutely no reason to use gems to create these booster packs as long as the market price of a sack of gems remains higher than the market price of a booster pack.
Of course, given that the currently useless Booster Pack Creator is the only way to spend gems outside of the auction, it's safe to say the market price of gems is bound to plummet after the auction ends, and this will change everything. Even if I were an expert on the ever-changing prices and exchange rates in Steam's crazy economic microcosm, it's likely that none of the advice I could provide would be worth anything by the end of the year. So you have a big pile of cards and items and you don't know what to do with them? I'm not sure that any rule of thumb exists. Just be aware of the current market price for a sack of 1,000 gems, and do some math.
So what about the auction itself? Gems are better sold than used for booster packs, at current prices, but are they better sold than used to bid on games? Not necessarily. Again, you should be aware of the current market price of gems before you place a bid. Also be aware of how much a game actually costs on the Steam store, and how much it's likely to cost during the impending sale. Some of the top bids on popular games are way higher than they should be, because some of the people placing bids on these games are out of their stupid minds. Case in point: when I checked the auction last night, the top bid for Counter-Strike: Global Offensive was more than 20,000 gems, an amount which, at the time, could have fetched considerably more than the game's $15.00 retail price if sold on the market. The Forest — a game still in Early Access (i.e., it's not even finished) — also had a top bid around 20,000 gems and a $15.00 store price.
This insanity doesn't apply to every game in the auction, though. More obscure games had top bids below 1,000 gems last night, and most of those games have store prices well above the amount that was needed to buy a sack of 1,000 gems at the time. These low-profile auctions actually seemed pretty easy to win, since there weren't many people bidding. However, for many high-profile games, you can expect at least one ridiculous person bidding a ridiculous amount of gems worth more than the game's price. In such a case, bidding is no longer worth it for anyone except other crazy people who place high bids just for the satisfaction of winning an auction.
I'm not really sure what's going on with these high rollers. I think you get a Steam badge for winning an auction, but if that's all they want, they could bid on more obscure games whose auctions are easier to win. Instead, they choose popular games and bid so high that they'd be better off using cash. Could it be the combination of a decent game and a badge that causes them to bid higher than a game's value? Or are they just not paying attention? Could they be blinded by the fact that artificial money is standing in for real money for the purposes of bidding? I'm not sure.
Of course, I don't mean to imply that all these high bidders are using gems straight from the market. That is, I doubt anyone is actually bidding on a $15.00 game immediately after spending way more than $15.00 on the required gems. Many bidders probably bought gems when they were much cheaper, and recycled items to get a lot of their gems for free. However, my concern is not the amount of actual money these people are spending. It's the amount of money they could be getting instead. If, for whatever reason, you have a pile of gems which you could sell for $25.00, and you'd need all of them to bid successfully on a $15.00 game, shouldn't you just sell the gems and buy the game instead, making $10.00 in the process? (Sure, that $10.00 is just store credit, but it's better than nothing.) Better yet, couldn't you sell your gems and then wait until the sale that starts in a couple of days? That $15.00 game might be 50% off, and then you'll have a $17.50 surplus instead.
I guess I shouldn't be surprised that people are making the less rational decision. In a community of millions of people, there are bound to be at least a few hundred idiots. Honestly — forget gems — the fact that people are even willing to go to the market and spend real money on profile backgrounds and emoticons just to keep them is beyond me, but I guess I'm glad they do so. I've bought and sold some things on the Steam market (and, hilariously, made an occasional profit by buying an item and selling it later), but I didn't put any of my own money in. I started with $0.00 and a pile of cards. Now I have fewer cards and a few dollars. I'm happy with that, and I'm not about to spend those dollars on anything but a game. But maybe I just don't "get it" because I'm not an obsessive Steam badge collector.
Like the market itself, the auction is meaningless to me unless I can screw with the system to get free money. In order to win any of the most noteworthy games in this auction, I'd have to buy so many gems that the items in my inventory, whether sold or crafted or recycled, would never cover the cost. The net loss would be greater than if I just sold my cards on the market and bought the game directly from the Steam store. So it sounds like a bad idea. But, again, maybe I just don't get it.
Did I mention that the item with the highest top bid isn't even a game? It's a special profile background. You can't get it any other way, so I guess that makes it priceless, but it's not even permanent. It's only available until January 6th, after which I assume it disappears from the profiles of those who won it. The high bid, when I checked last night, was nearly 400,000 gems. At the time, those gems could have been sold on the market for well over $400.00, and buying that many straight from the market would have cost nearly $500.00. The high bid has been around the same amount every time I've checked since then, even though a new round starts every 45 minutes, so it's not just one person placing such a high bid. Apparently that's just how much people are willing to pay for a pretty profile background which cannot be traded, cannot be resold on the market, and disappears next month. Yeah, clearly I don't get it. I must be missing something here. Maybe the top bidders are all obscenely wealthy people who just don't care.
Anyway, if you're not obscenely rich and you're participating in the auction and you're trying to win a game, I urge you to check the price of the game on which you're bidding as well as the total amount you'd earn by selling all the gems you're planning to bid. If the latter is greater than the former, it's time to stop bidding.
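That rule is simple enough to write down. Here's the same check as a tiny Python sketch; the sack price and bid amount are invented, and it ignores market fees, but the comparison itself is the whole point.

```python
# Bid-or-sell rule of thumb: keep bidding only while the gems you'd spend
# are worth less in cash than the game's store price. Numbers are hypothetical.

def worth_bidding(bid_in_gems: int, sack_price: float, game_price: float) -> bool:
    gem_value_in_cash = bid_in_gems * (sack_price / 1000)
    return gem_value_in_cash < game_price

# e.g. a 20,000-gem bid on a $15.00 game, with sacks selling for $1.10:
print(worth_bidding(bid_in_gems=20_000, sack_price=1.10, game_price=15.00))
# prints False: those gems would fetch about $22.00, so sell them and buy the game
```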
Tuesday, December 16, 2014
Tuesday, December 2, 2014
Interactive Fiction
Whenever I write about a topic I've already covered, I'm in fear of accidentally contradicting myself. My thoughts and feelings change over time, and this is perfectly natural, but each finished piece of writing is static (unless and until I feel compelled to go through the trouble of changing it). Each and every post on this blog is a frozen snapshot of my thoughts and feelings at a particular time, and all of those snapshots are displayed simultaneously in the same place. When I do change my mind, and if I don't have the time or the patience to eradicate my outdated thoughts from wherever they were written, it might just look like I'm saying two different things at once.
For example, I might have been too kind to a certain piece of interactive fiction in this post about games as art. I dismissed the criticism that Dear Esther was bad for its lack of gameplay because it's not a game and therefore should not be judged as a game. Dear Esther, I wrote, should be judged instead as a piece of interactive fiction or as a work of art. A direct quote from the other post: "A valid criticism of Dear Esther should focus on what's there — the writing, the visuals, and the music — rather than obsessing over exactly how it's not a game." Even now, I stand by all of these claims about how Dear Esther should be judged, but if I ever implied that Dear Esther comes up smelling like roses when judged in this manner, I'm about to disagree with my former self.
The writing, the visuals, and the music in Dear Esther are all fine. The lack of traditional gameplay elements is also fine. The lack of meaningful interactivity of any kind, however, is less so. Dear Esther is a walk through a virtual landscape set to music and narration, and the act of walking (but nothing else) is left to the player. The experience is interactive in the sense that the player can choose where to walk and where to look within the confines of Dear Esther's explorable space; the problem is that those confines are limiting to the point where the interactivity becomes nothing but a nuisance.
The player can move freely, but aside from a few wide open areas and a couple of briefly diverging paths, there's only one way to go from the beginning to the end. The game has no real exploration, and the player's actions have no real effect on the story. The experience is essentially linear, and yet the player is forced to interact; one cannot get to the end without walking there, a tedious and potentially frustrating task when there's hardly anything to do along the way. Dear Esther might even be better if it would just play itself.
It's a common problem in story-driven games, so-called interactive fiction, and everything in between. The story is there, and the player input is there, but the potential for truly interactive fiction is lost when the player is unable to affect the narrative. If the story is the most important aspect of a product which insists on being interactive, shouldn't the story itself be interactive? It's no surprise that a truly interactive story is a rarity in the typical video game, which is gameplay-driven and includes a story only as a contextual backdrop no matter how obnoxiously that story shoves itself down the player's throat during unskippable cutscenes, but developers should take more care when adding game elements to a story instead of the reverse.
A couple of weeks ago, I played To the Moon, which tells a nice (albeit awkwardly written) story and contains more than enough gameplay to be considered a game. Unfortunately, while it clearly strives to be interactive fiction first and foremost, the interactivity and the fiction do not blend well. The only genuine gameplay consists of occasional puzzles and a few brief mini-games, while the bulk of the player's time is spent alternating between reading dialogue and making sure to click on all the clickable objects in a given area. The game is so heavy with dialogue and so short on meaningful player input that, per click or keystroke, the majority of one's interaction with the game consists of telling the game to continue to the next line of dialogue. Yet, for all this time spent manually moving through the story — I used the word "tedious" in reference to Dear Esther's walking and I think it applies here as well — there's no real interaction with the story itself. The ending is always the same. Player choices sometimes have an effect on a few lines of dialogue, but that's all.
I get it. The developers wanted to tell a very specific story. Not every story will have branching paths and multiple outcomes. However, if the story is to remain static and immutable, two things need to happen. First, the player needs to be able to step away from the story long enough to have a satisfying amount of control over something. For To the Moon, this would mean a lot more puzzles and mini-games, or perhaps a totally new gameplay element. Second, the player needs to be able to sit back and watch when no meaningful input is required. This would turn Dear Esther into a movie, but I guess that's the whole problem with Dear Esther.
My examples so far are pretty extreme, but the problems in Dear Esther and To the Moon are things that all story-driven games need to avoid, and not all of them do a very good job. In my old post on Alan Wake, I mentioned the frustration of needing to follow a character around or simply idle about while listening to dialogue and waiting for the next scripted event. The same thing happens in Half-Life 2, thanks to the developers' decision to forgo cutscenes for the sake of having everything (including story exposition) happen in real-time. Alan Wake and Half-Life 2 both have plenty of gameplay to keep the player entertained, but the occasional need to sit through real-time in-gameplay dialogue always left me wishing that a skippable cutscene were used instead, especially when continual but meaningless player input was required (e.g., when following another character down a linear path).
I'm actually beginning to think that the hybridization of gameplay mechanics with interactive fiction is a failed experiment, and that the game industry should stop insisting on doing it over and over again. Games with excellent gameplay don't need mediocre stories tacked on in the form of unskippable in-game sit-around-and-listen-to-people-talk scenes from which the player cannot walk away. Meanwhile, the "interactive" requirement of an interactive story cannot be adequately satisfied with badly implemented gameplay mechanics, like mini-games and puzzles that occur with the frequency of a cutscene in a traditional game.
You want to make a shooter? Don't annoy the shooter fans with superfluous dialogue and scripted action sequences. You want to make an interactive story? Don't force your customers to play through stupid shooting sections. Am I wrong? I mean, clearly, there's an acceptable balance somewhere, but not many developers find it. Why can't the video game and the interactive story be two distinct things, each with no obligation to step on the other's turf? I guess it's because mashing the two things together broadens the target demographic. Shooter fans buy it for the gunplay, people who like interactive fiction buy it for the role-playing aspects or the built-in dating simulator, and everyone is just barely happy enough to keep playing.
Or maybe it's because interactive fiction just hasn't matured enough to stand on its own without being shoehorned into a traditional game or having a traditional game shoehorned into it. After all, a sufficiently interactive story is probably hard to write. Giving the player a satisfying amount of control within a well structured story sounds pretty difficult, if we assume the player's control must be over some aspect of the story itself.
Perhaps most disappointing of all are games which seem to be built on the premise of a player-controlled branching storyline but end up being almost entirely linear anyway. Playing through the first season of Telltale's The Walking Dead was a great experience, but a lot of the magic was gone when I realized after the fact that the ending I got was the only ending. Player choices determine which characters live or die and, to some extent, the characters' feelings about each other, but the protagonist and his group go to the same places and do the same things regardless. Ultimately, the only thing that changes, as a result of character deaths and the relationships between those who remain, is the dialogue.
Obviously, not every game needs multiple endings in order to have a satisfying story. However, in The Walking Dead, the player's ability to shape the story by making (often binary) decisions is the primary mode of gameplay. The rest is just occasional puzzles, and some shooting sequences which could rightly be called mini-games. When gameplay consists almost entirely of manipulation of the game's story through dialogue and moral choices, the ability to manipulate the story in a substantial and meaningful way is pretty important. The Walking Dead provides more illusion of choice than actual choice... but I guess I wouldn't have known if I'd only played once and never looked back.
Labels:
alan wake,
dear esther,
half-life,
the walking dead,
to the moon
Sunday, July 20, 2014
On Killing Thousands of Robots
For the past few months, during my unintended (laziness-induced) hiatus from updating this blog, I've been playing a few different games. I spent some time on Planetside 2, which is fun with friends but isn't something I would ever play without them. I also tried and failed a few more times to make further progress in The Binding of Isaac, whose randomly generated dungeons and mostly unexplained item system are a great source of replayability despite all the frustration that inevitably comes with such a strongly luck-based game.
More importantly, however, I've wasted far too much time playing a game called Hard Reset, and I've also fallen back into a love/hate relationship with another time-sucker called Ultratron. I didn't expect much, since I didn't pay much — Hard Reset came as part of a $4 bundle from Bundle Stars, and Ultratron came in a $1+ bundle from Humble Bundle — but I've played almost 50 hours of the former and nearly 40 hours of the latter, so I probably got my money's worth.
Hard Reset is a gorgeous first-person shooter with some interesting weapon design, a minimalist heads-up display, and a heavy emphasis on using environmental hazards to destroy the seemingly endless waves of mechanical bad guys. The game was developed by some of the same people who made Painkiller and, like its occult-themed cousin, Hard Reset is often described as an "old-school" shooter — a throwback to the days of Doom and Quake. To some extent, this is accurate. Unlike so many modern video games, this one doesn't try very hard to be a movie. It's not filled with excessive dialogue and exploding set pieces and scripted action sequences. It's non-stop running and gunning all the way through. It's all about challenging the player. It's a real game. It does attempt to tell a story — perhaps it even tries a little too hard — but, just as in Painkiller, it's not a very good story and it should probably be ignored anyway.
Most of the exposition takes place during loading screen cut scenes, drawn and badly animated in the style of a comic book. When you get to one of these loading screens, I suggest that you go and make a sandwich or something. The plot hardly makes any sense, the acting and writing are both profoundly awful, and the protagonist (made immediately ridiculous by a hilariously typical video-game-action-hero voice) was apparently designed by some 12-year-old kid who thinks the unnecessary use of profanity is just the coolest thing ever conceived. For all intents and purposes, Hard Reset doesn't have a plot. It's far more story-driven than Painkiller, but even if you manage to decipher what's going on, you're not going to care in the end.
Where Hard Reset really strays from its old-school shooter roots is in the adoption of some modern gameplay mechanics. If you'll allow one more comparison to Painkiller, I'd like to point out that its demon-slaying protagonist could run circles around almost any enemy... or, more accurately, he could jump circles around any enemy. While running speed was rather slow in Painkiller, the player could accumulate a lot of extra speed by repeatedly jumping. Although this trick was not explained in any tutorial, it was crucial. This high mobility was a staple of Painkiller gameplay, and provided an advantage over the enormous hordes of enemies that would otherwise surround the player. Hard Reset, on the other hand, has painfully slow running speed and no "bunny hopping" ability. Instead, you just get a sprint button which allows a modest boost in speed for only a couple of seconds.
The limited sprinting ability does manage to be useful, but only marginally so. I must stress that "a couple of seconds" is no exaggeration. The protagonist runs out of breath faster than a morbidly obese pack-a-day smoker with one leg. The result is a game that feels maddeningly sluggish to fans of the same fast-paced old-school shooters to which games like Hard Reset and Painkiller draw so many comparisons. The problem is exacerbated by certain enemies whose attacks often seem unavoidable and — I'm convinced — sometimes are. Running at the speed of a tortoise, armed with only two seconds of slightly-faster-than-tortoise sprinting, can be extremely frustrating when you're being bombarded with missiles and you can't think of a way to output enough damage to kill the enemy before his carpet bombing attack kills you.
Even switching to the right gun can be a bit of a chore — a waste of time when there's no time to waste — thanks to the game's interesting but gimmicky weapon system. Most of the first-person shooters of my childhood days would allow the player to carry a totally unrealistic number of weapons (usually the game's entire arsenal) at once for the sake of fun. On the other hand, most modern shooters limit the player's holding capacity (usually to two or three guns) either for the sake of realism, or for ease of access when using a console gamepad instead of a full keyboard, or for the increased challenge that might be faced by the player when he or she is forced to drop a useful rocket launcher in order to pick up a needed sniper rifle. The weapon system in Hard Reset almost seems like a parody of these modern trends in that, technically, only two guns exist in the entire game.
The player is immediately given an assault rifle and a plasma rifle. The gimmick is that each of these guns can be upgraded with up to four additional weapon modes, and most of these weapon modes have both a primary and a secondary function. So you'll get your shotgun, your grenade launcher, your rocket launcher, your lightning gun, and your big laser cannon. You'll even be able to hold them all at once. Unfortunately, all of those functions will be crammed into two weapons, which means you often have to switch to the correct gun and then to the correct firing mode instead of just going directly to the weapon you want. This can be a problem when you need to pull the right weapon quickly, especially since the timely use of stunning weapons on fast enemies is all that compensates for the protagonist's lack of speed.
Despite these issues, Hard Reset is pretty decent. You'll need to adapt to the unusual weapon system, and you'll need to get over the fact that the main character can't run twenty feet without taking a breather, and you'll need to pretend the laughable plot doesn't exist... but none of these are reasons not to play. If there's a reason not to play, it's a lack of patience and an unwillingness to lose a lot before winning. Hard Reset is a rarity in modern times — a truly punishing game whose higher difficulty levels will be deemed impossible by many players.
As for me, it seems I'm a masochist. After starting at the easiest difficulty setting, I decided to work my way up to the hardest. Normal mode was all right, Hard mode was frustrating, and Insane mode nearly caused me to smash my computer on numerous occasions. At this level, the game becomes a nightmare. The toughest enemies are bullet sponges, practically unavoidable projectile attacks can kill you in seconds, and you're forced to adopt strategies that weren't necessary before. The use of immobilizing weapons, like stasis grenades and EMP bursts, becomes an absolute necessity, and players who fail to utilize environmental hazards like explosive red barrels will find it difficult to overcome the cruel mathematical problem of dealing enough damage to the enemies before inevitably being shredded to pieces. The infinite waves of little enemies that harass you during boss fights are suddenly a real threat, and they'll kill you as often as the bosses themselves do. Splash damage from explosive weapons will kill you even when you think you're behind cover. When playing the higher difficulty modes, you will die a lot, even if it's not your fault. The game isn't fair.
But I beat it anyway. I kept trying even when I thought it was impossible. I responded to the game's unfair tactics with some unfair tactics of my own. I had an all-out battle of patience and reflexes and wits against a computer and won.
The only problem is that Insane mode isn't actually the highest difficulty. There's one more, called Heroic mode. It's even harder. Worse yet, mid-level checkpoints are removed. You have to beat entire levels without screwing up at all, and the levels are long. Throughout most of Insane mode, I died at least once or twice in nearly every fight, and dozens of times in some of them.
I think this is where I give up. I don't think I'll ever get through Heroic mode. Even if I'm capable, I just lack the willpower. I've played through the entire game four times and it's no longer fun enough to justify that kind of frustration. And that's okay — I'm fine with that — except that my completion of the game will forever be stuck at 98%. There are just two achievements left: one for beating Heroic mode and then, stupidly, another achievement for unlocking all achievements.
It's just so close that I can't not be annoyed.
I don't usually care much about achievements. Some people can't get enough of them, and some people absolutely despise them. Most of the time, I'm indifferent. Optional challenges are great, as long as they're actually fun, and whether those optional challenges come in the form of "achievements" or some other functionally equivalent feature with a less silly name is irrelevant. If I play all the way through a game and I want to play it some more, the additional challenges are something worth considering. Sometimes I decide to ignore them.
But that's hard to do when I've already done all but a couple of them. It's not even about the pride of reaching 100% completion; it's about leaving things unfinished. It's about the fact that achievements on Steam are a permanent record that I have to see whenever I look at the game in my library, even if it's not installed. It's about something being so close to perfection and not being perfected. As far as I'm concerned, 0% is just as good as 100%, but numbers like 1% and 99% are the worst.
I have a similar issue with Ultratron, the other beautiful, chaotic, punishing, evil-robot-themed game I've been playing recently. If you understand that the title is a reference to Robotron: 2084, you probably already know that this one isn't a first-person shooter. Ultratron is a top-down arena shoot-em-up with highly stylized faux retro graphics and a multitude of ridiculous power-ups. Much like Titan Attacks, the over-the-top Space Invaders clone made by the same developer, Ultratron starts out slow and then gradually becomes an unforgiving hell of colorful projectiles and flashy explosions. And it's more addictive than heroin.
I loved Titan Attacks so much that I played it until I had all the achievements unlocked. I thought I loved Ultratron that much too, but a few of the achievements are just stupidly difficult. Once you're past a certain point, the game becomes extremely intense, and you only get a checkpoint every ten levels — after, not before, each boss. Worse yet, some of the most challenging achievements require a certain number of kills, and the counter resets to zero if you start playing from a checkpoint. If you want the game to count your 10,000 kills, you'll have to get them all in a single playthrough without ever dying. It doesn't have to be all in one sitting, since you can save at any time, but that save disappears after it's loaded. Saving is for taking breaks, not for backing up your progress.
I got fed up with Ultratron after a while, and stopped playing it for months, but I'm back at it again. I told myself I was just going to play a few rounds, but now I'm looking at those last few achievements and saying "yeah, I can totally do that." And I probably can't, but 94% is so close to 100% that it's hard not to try. I should also note that we're not really dealing with additional challenges, in this case, since Ultratron is one of those arcade-style games that go on as long as you can keep winning. The game doesn't have a real "ending," so the completion of achievements is really the only form of "winning" there is.
Sure, it's silly to be so irritated by falling short of 100% completion, but at least I have the sense to know I'm being silly. At least I'm not so determined that I'm willing to keep playing these games past the point where they stop being enjoyable. When I wanted to unlock the last few achievements in L.A. Noire, just for the sake of neatly wrapping up a fairly enjoyable game, I just used a walkthrough for those idiotic scavenger hunt achievements (Star Map, Auto Fanatic, and Hollywoodland) because doing it myself would have taken so long that I would have hated the game by the time I was done with it.
Not everything can be helped with a walkthrough, though. So should I try to beat Ultratron? Should I bother to clear that last hurdle in Hard Reset? Surely I've blown up more than enough robots this year, with a combined 85 hours in these two games alone (plus whatever uncounted hours I spent playing them with Steam in offline mode). I guess the answer is "maybe someday."
Or maybe I'll just use Steam Achievement Manager — a tool whose use I never understood before now — to unlock these practically impossible achievements just so I'm not tempted to waste endless hours actually attempting them. Trying to master everything I play is exactly why I'll never get through my backlog.
Labels:
achievements,
doom,
hard reset,
l.a. noire,
planetside,
quake,
the binding of isaac,
ultratron
Monday, April 28, 2014
Unskippable Cutscenes & Other Pure Evil
I often say that if a single-player video game isn't worth playing twice then it's not worth playing at all. That might be a slight exaggeration — surely "game worth playing exactly once" does lie somewhere between "game worth playing twice" and "game worth uninstalling" — but honestly, if I get to the end of a game and I don't want to do it again, it probably wasn't very fun. In my opinion, one should be able to enjoy a game multiple times, twice being the bare minimum.
Surely, some people will disagree, and their counterargument will most likely go something like this: "After the first time, you already know how it ends!" Well, yeah, it ends with me winning. If your experience with a game is spoiled because you already know the end of an underlying story, which probably exists only to provide context to some otherwise meaningless depictions of violence, the game isn't a very good one. It's supposed to be a game, after all, not a movie. Do I like story-driven games? Sure. But a game needs to be fun for reasons other than its plot twists. Otherwise, it should have been a movie, and anyone who likes it could have watched a movie instead. Moreover, if there's a game whose plot is the primary appeal, it should have a plot that's good enough for an eventual second run. Lots of people like to watch a good movie more than once, right?
But let's say the plot isn't so great that I want to see it and hear it again. Let's say I'm about to play a game a second time because the actual game was fun, or because I missed a few optional objectives the first time, or because my high score isn't high enough. If that's the case, I shouldn't be forced to see and hear the story a second time, and since no reasonable person would ever disagree with that statement, I don't understand why any video game developer ever thought unskippable cutscenes were a good idea.
I recently finished L.A. Noire. If you play it well the first time around, it might not warrant a second play-through, but I missed plenty of clues and botched more than a few interrogations, so I decided to go back for the five-star ratings (and a few achievements) I missed. And it would have been pretty fun if I didn't have to watch every single cutscene a second time. Why can't we skip the cutscenes in L.A. Noire? I have no idea. I've heard that the cutscenes in Max Payne 3 are unskippable because the game uses that time to load, but L.A. Noire has separate loading screens, so I doubt it's the same situation. So, disguised loading screens aside, why would a game ever have unskippable cutscenes? It turns out that there are no good reasons.
Sometimes we wonder why and how a serial killer would decide to brutally murder dozens of people. That a human could do something so senselessly awful is almost beyond comprehension. Two possible explanations come to mind: either the serial killer is very simply insane and his or her actions are the result of a mental defect, or true evil exists in the world and it lives inside that person. Unskippable cutscenes are a much lesser crime than repeated homicide but the same two explanations apply and I can't think of any others. The developers of a game must be crazy (or profoundly stupid) to think that a person playing the game for the second time won't be annoyed by the inability to opt out of re-watching something as pointless as a cutscene. On the other hand, if they do know that such a restriction annoys players, the only reason to implement unskippable cutscenes would be to annoy players and that's pure evil. Surely it's a lesser evil than murder, but the evil is still pure because the only conceivable objective is to cause suffering.
Whether it's intentional or not, the unskippable cutscene often acts as an irritating punishment for failure. In games that forbid players from opting out of having a mediocre story shoved down their throats, there always seems to be a difficult boss fight (or a difficult something) preceded by a long and boring scene that's only entertaining once. And you end up seeing it fifteen times. If that's not bad game design, I don't know what is. Fortunately, none of the cutscenes in L.A. Noire are extremely long and no particular part of the game is likely to be repeated numerous times (since nothing in the game is very challenging and there's an option that allows awful players to skip action sequences). Still, there's no reason for unskippable cutscenes, especially since the concept of skipping a cutscene has been around for approximately as long as cutscenes have existed.
I guess the developers think their story is so important that we shouldn't attempt to enjoy the game without it. They think they need to save us from ourselves by preventing us from "accidentally" missing something important. Sometimes, a story is important enough that it shouldn't be skipped; if I saw someone skipping cutscenes while playing Alan Wake for the first time, I'd probably recommend that they just play a different game. But that's no reason to take away options.
This is one of those things that would cause us to boycott a game if only we had an ounce of self control. Unfortunately, we don't. It's unforgivable, but it's still tolerable enough that the developers and publishers are unlikely to suffer financially as a result, so they won't learn any lessons no matter how much I complain.
Saturday, March 29, 2014
Neat, I'm Insignificant
For some horrible reason, I'm following Polygon on Twitter. Thanks to that mistake, I just came across this brief article about social interaction among us "video gamers" and how we aren't the "antisocial misfits, basement-dwellers and loners" that outsiders supposedly think we are. The headline and the first two paragraphs strongly contest the stereotype of the avid video game player. Loners, we're told, are actually the outliers, going against the trend.
Great, right? Sure does sound great. But it's not great, and they're not even convincingly making the case that it's true.
The article — titled Research: Loners are the exception, not the rule, among video gamers — describes a study in which researchers from North Carolina State University, in collaboration with others at York University and the University of Ontario Institute of Technology, observed thousands of video game players and surveyed hundreds more. The results of this study are published in a paper titled Public Displays of Play: Studying Online Games in Physical Settings.
Let's stop there.
Here's what you should be asking yourself: "Online games specifically? Physical settings in particular? Are we really using this study alone to make claims about 'gamers' in general?" Unfortunately. According to Polygon, the findings of the study are based on observations and surveys of MMORPG players at various "public gaming events." If either Polygon or the researchers themselves really do intend to draw conclusions about all consumers of video games based on this information, they're making just about as much sense as some guy who uses limited anecdotal evidence to argue that we are all basement-dwelling antisocial freaks.
It should be no surprise to anyone that the subjects of this study — the fans of massively multiplayer online games who meet up with other fans in person to participate in highly social activities — are highly social. Nobody would suspect they are "loners" of any kind. In other words, this isn't the shocking and counter-intuitive finding that Polygon wants it to be. It's the same kind of obvious, common-sense-confirming, almost insignificant finding that usually comes from sociological studies and experiments.
I'll admit right now that I have neither the time nor the patience to read the study in full, so I'm not sure if this not-so-amazing revelation — that the most socially active type of "gamer" you can imagine is by no means antisocial — is really the only point that the researchers wanted to make. On the other hand, I'm not sure if they really meant to draw outrageous conclusions about all "gamers" either. Judging by the title of the research paper and the abstract, I can only suspect that the conspicuous generalization of the study's findings to all "gamers" was Polygon's own invention.
In a way, I do admire what Polygon is trying to do with this article. They're trying to break down the worst negative stereotypes about "gamers" and to make video games a more socially acceptable hobby by exposing the fallacy of the antisocial, basement-dwelling, lonely game enthusiast. However, the article succeeds only in telling me (erroneously, by the way) that I'm not representative of the "gamer" demographic because, in some ways, I am a loner. I'm not a creepy loner with no friends, but I do most often play video games alone. A lot of people who make a hobby out of video games are loners, and there's nothing wrong with that. We're not necessarily antisocial, and I'd wager that very few of us actually dwell in basements, but it's not crazy to assume that a large percentage of people who might properly be called "gamers" enjoy playing video games at home by themselves and are less socially active, in the traditional sense, than people who prefer football. I'd conduct my own research if I had nothing better to do.
By making sweeping generalizations about "video gamers" based on the results of a study with no consideration of single-player or otherwise non-MMO games (i.e., almost everything) and no consideration of people who only play at home (i.e., almost everyone), Polygon is saying that those convention-going MMO fans are the only "gamers" who count. That's not really fair.
Furthermore, while I do, again, appreciate the effort to promote video games as a more socially acceptable pastime, I'd rather they didn't. Video games are already in a long transition from dorky to mainstream — in fact, that transition is almost complete — and while this does have benefits, it also causes problems. We've all seen what happens when developers and publishers of video games pander to the same demographic that used to shun video game enthusiasts as losers and nerds. People who actually like video games for reasons other than their new-found status as a trendy thing to do — people who, if you'll pardon the hipsterism, actually liked video games before they were cool — are now shunned by the video game industry, which is more interested in marketing to the casual "I've never played a video game before and I hate a good challenge" crowd.
So no, screw Polygon.
While I'm certain that most video game enthusiasts are far more socially active (and generally more "normal") than the antisocial loser stereotype suggests, I'm also certain that this stereotype is outdated and no one actually believes it anymore. We no longer need to use dubious interpretations of potentially biased studies, completely disregarding the majority of actual video game hobbyists, in order to remind people that we're people too.
Thursday, February 13, 2014
If You Can't Say Something Nice...
We all know how the grammatically incorrect saying goes: "...don't say nothing at all." It's pretty sound advice for social interaction. If you're not going to be nice to a person, sometimes it's best to leave them alone and mind your own business instead of causing unnecessary emotional pain or picking an unnecessary fight. I don't think, however, that the cute little rabbit from Bambi had criticism of video games in mind when he spoke his words of wisdom.
When I logged into Steam earlier today, I noticed a new feature: a user-generated tagging system, complete with personalized recommendations for my account based on the games in my library and the tags attached to them. I also noticed that some of the players tagging games are very openly opinionated. Some games are being tagged as "bad" and with all of its synonyms. Some games are being tagged as "overrated," and BioShock Infinite tops that list. Other examples of criticism-by-tagging are slightly more subtle. The "casual" tag is being placed not only on casual games as they are known in the traditional sense, but also on games deemed too easy by hardcore players who expect a legitimate challenge. (Notable examples include the Thief reboot, which is geared toward players who never played the original series or any difficult stealth game, and Call of Duty: Ghosts, the latest in a series of first-person shooters that has long been associated with an immature fan base who would eat garbage if the TV said it was cool.)
Kotaku writer Patricia Hernandez noticed it too, and I don't usually comment every time a Kotaku employee writes something that annoys me — I don't have that kind of time — but on this occasion it will serve as a nice excuse to mention a couple of other things that were already on my mind.
"Trolling is definitely a thing in Steam Tags right now," Hernandez writes, and maybe she's not entirely wrong. Surely some tags are being added just for the sake of annoying the fans of certain games, just for laughs. The tags to which she's referring, though, are the ones that merely express a genuine opinion, like "casual" as applied to Dark Souls and "bad" as applied to Gone Home.
I'm not really sure when the meaning of the word "trolling" shifted so drastically. It used to mean saying inflammatory things for the sole purpose of angering other people, especially when the things being said are completely disingenuous. A good troll pretends to be completely serious when he posts deliberately flawed arguments in the middle of an otherwise intelligent discussion for the sake of disrupting it. He pretends to be dumber than he is, or more ignorant than he is, because this makes his ignorant comments all the more infuriating, and he keeps this up for as long as possible because the game is essentially over when someone figures out that he's "just trolling." Trolls get attention because of the perception that they're incredibly stupid or incredibly horrible in some other way, not because of their actual opinions.
But "trolling" adopted a different meaning in mainstream media soon after the mainstream media (thought they) learned the word, and maybe it's because successful trolls on the internet are so good at hiding their true intentions. The whole "pretending" part is conspicuously missing from the outside world's common understanding of what trolling is. Almost any type of online harassment or use of unkind words is, therefore, called "trolling" even if the supposed trolls are, instead of really trolling, just being completely serious and stating their actual opinions (albeit in a rude manner). Those who use the word "trolling" in this context probably wouldn't see through the ruse if an actual troll were trolling them, so maybe I can't blame them for getting it wrong, but I miss the times when words had meaning.
The past few years, I think, have seen the bastardization of the word come to completion. People are now using the word "trolling" even to refer to the expression of just about any unpopular or unacceptable opinion. We're seeing some of it right here in this Kotaku article.
Let's say I've never played Gone Home, but I tag it with the word "bad" just because I know its fans are likely to have a big cry and act like their human rights are being violated, resulting in a ludicrous and humorous display. That's trolling. If I play it and then tag it with the word "bad" because I genuinely think it's bad and should be categorized as such, I'm merely expressing an opinion. There's an important difference that Hernandez doesn't appear to understand. In fact, I'm really not sure if she knows how opinions work at all. The following is an excerpt from the article that comes right after her "trolling" comment:
Let's start with the "bad" tag. It does have games notorious for being poor—The Walking Dead: Survival Instinct, for example. It's hard to argue the quality of that game. Gone Home, though? It's not everyone's cup of tea, but that doesn't mean it's bad!

Really? It doesn't? One doesn't get to call something bad when one doesn't like it?
Let's analyze this bizarre logic. Tagging one game as "bad" is totally fine because it was unpopular (or because fellow Kotaku writer Kirk Hamilton thought it was bad too), but tagging the other game as "bad" is incorrect because... why? Because the right people (perhaps the kind of people who read Kotaku) like it? Because its perceived relative popularity means the alternate opinion doesn't exist? Hernandez seems to think that each game has an objective amount of badness, and that a certain threshold of badness (or of agreement on its badness) must be crossed before we're allowed to call it bad. In other words, "bad" is not allowed to be an individual person's opinion. That's kind of strange because, if you ask anyone who has a firm grasp on the difference between facts and opinions, the fact that it's "not everyone's cup of tea" does mean it's bad — it's bad in the minds of people who would prefer some other tea in a different cup.
A normal person in Hernandez' position would just say "I disagree with these people," and maybe that's what she's trying to say, but if that's the case then she has a very strange way of saying it. She's not simply stating her own opinion about the game; she's suggesting that some opinions are true while other opinions are false. She's saying it's wrong for anyone to tag Gone Home as a bad game on Steam, not only because the tagging system wasn't necessarily meant for opinions, but more importantly because this particular game wasn't as universally unloved as The Walking Dead: Survival Instinct. She's calling it "trolling" and, whether she knows the actual meaning of the word or not, it makes her look like a gigantic moron.
This is just the latest example of a worrying trend in which people take Thumper's advice about saying not-so-nice things and apply it to everything said about their favorite commercial products. In addition to the misuse of the word "trolling" (as usual), Hernandez' article uses words like "unfair" to describe the negative tags placed on games that she likes. Some of the tags being used are definitely uncool — concisely written spoilers and random profanity, for example — but expressing negative (or otherwise unpopular) opinions about a game is not by any means "unfair" and, in my opinion, even using tags to express these opinions doesn't really amount to abuse of the system. It was designed to let players tag games as they see fit. I guess I shouldn't be surprised that "gaming" "journalists" disagree. After all, they're the ones who make a living pumping out reviews with inflated scores because their advertising revenue depends on it, while they push the notion that players who complain about bad games are just displaying a false sense of entitlement.
The most popular tags for Gone Home right now are "not a game," "walking simulator," and "bad." Hernandez thinks these tags aren't helpful, but they are if a player wants to know if a game is worth buying. Since tags are more visible if they're more popular, even tags like "good" and "bad" are just about as helpful as the user scores on Metacritic. Tags like "not a game" and "walking simulator" are meant to be humorous but they do give players an idea of what Gone Home is like. They're informative even if they're exaggerations. The "not a game" tag is sure to be the most controversial, but people have been accusing Gone Home of not being a game since it was released, and it's a valid criticism. We don't get to say it's unfair just because it hurts the developers' or fans' feelings.
I sincerely hope that Valve doesn't side with the crybabies at Kotaku by moderating all traces of opinion out of the tagging system. If the people running Steam didn't want opinions to show up in user-generated tags, they shouldn't have implemented the feature at all. Games on Steam are already sorted by genre, and they could have just expanded upon this if they only wanted a dry, boring and sterile categorization system devoid of anything subjective.
Update (February 15, 2014):
It looks like Valve is indeed moderating the user-generated tags, and on second thought I really can't blame them for not wanting strongly negative descriptors attached to products on their own store. (Tags were never meant to be used as mini-reviews, so I can't even call it a scandal as long as no one is tampering with the actual user review system.) Apparently tags like "bad" are no more, and even the popular "not a game" tag has vanished. As of right now, though, Gone Home is still classified as a "walking simulator" first and foremost, and I think it's pretty hilarious.
Labels:
bioshock,
call of duty,
gone home,
steam,
the walking dead,
thief
Monday, January 27, 2014
Midlife Crisis, Part 4
I won't pretend that the following is some kind of revelation brought on by my somewhat recent decision to splurge on a shiny new computer. I've known it for a long time, and I've probably mentioned it before, but right now — at the beginning of a new year and the end of a string of posts regarding my PC-building experience and all the second-guessing involved — just seems like a pretty good time to bring it up.
Playing video games doesn't seem nearly as fun as it used to be.
And it's not just because I've grown up. It certainly does have something to do with my lack of free time as an actual adult who works for a living, but it's not a change of my own personality that makes it more difficult to sit back and enjoy what used to be incredibly entertaining. I mean, let's face it, I'm still a kid on the inside. I'm in my mid-20s but I have a blog about video games and I still think playing games (in moderation) is a perfectly good use of time. That alone, according to almost anyone, probably makes me a man-child in a man-man's body. I think I've grown up enough to convince people that I've grown up, but I'd still rather play a video game than read a newspaper, and I'd rather argue about video games than argue about politics. As far as I know, this isn't so abnormal for guys my age. Incidentally, I'm told I belong to an entire generation of man-children who don't know how to be real men — that I'm the product of the downfall of society and that it's probably the internet's fault — but that's a discussion for another day.
The problem isn't that I've grown out of video games. If I really had, there would be no problem at all and I wouldn't care enough to write this. The problem, in fact, is a bunch of problems.
A lot of my friends have grown out of video games, and I have fewer people with whom to enjoy my pastime of choice. Society as a whole thinks I should grow out of video games, so unless I work in the industry (which I do not), I can't openly express my enthusiasm for the medium without inviting all sorts of assumptions about my character (only some of which are true). Have I ever mentioned that I'm using a pseudonym? For the same reason I don't just go ahead and list "gaming" as a skill on my résumé, I don't really want potential employers to find this blog when they do a quick background check on me. It's both irrelevant and potentially damaging. So is most of what I've ever posted on Facebook, so I should really double-check my privacy settings or just delete my account.
Playing video games, in some ways, has become a pretty lonely activity, and it's not just because so many of my friends have left the party. It's also because I'm interested primarily in single-player games these days, and nobody wants to watch me play them, not that I expect them to. Oddly enough, in my youth I felt that video games were very much a spectator sport. Enjoying a new single-player game with my two brothers didn't necessarily involve taking turns, but those were the days when we had nothing better to do on a weekend than watch each other get mad at the final boss in some nearly impossible 2D sidescroller. People watching people play video games isn't actually such a weird thing — search for "Let's Play" on YouTube and see for yourself — but everyone I know who still has any interest in video games only seems to like massively multiplayer online stuff anyway. So screw me and my apparently bad taste, I guess.
To some extent, my trouble with maintaining an interest in any of the recent games I've played might also be a case of unrealistic expectations. In comparing today's games to those of my apparently idyllic childhood, the verdict is always "meh." Feelings of nostalgia make objectivity impossible and I have plenty of those feelings for video games I played when I was 10 years old. Maybe I just think I'm having less fun because nothing can live up to that rose-tinted version of reality. It could also be that, having played so many games over the years, I'm less easily impressed. The first time I played a first-person shooter, it was really fantastic. After a hundred of them, it takes something really crazy to keep me interested.
This post probably wouldn't be complete if I didn't at least half-heartedly entertain the notion that games themselves are actually getting worse, but I think what's more important is that modern games are always made to cater to people who have never played a game before. The depth I crave would probably alienate new players, and the mandatory tutorial levels that new players need are boring me.
All of these are contributing factors, but my lack of free time is by far the most significant. I work 40 to 50 hours a week doing something I don't enjoy, and when I get home it's hard to do anything but sleep. I spend the majority of my free time with my girlfriend, who doesn't care that I play video games but doesn't care to play them herself, so it's not something we can do together. When I do get some time alone to waste, it's rarely more than a couple of hours, and rarely can I find the motivation to start a new game when I'm on such a tight schedule. So I just end up browsing the web or loading a save in the middle of something I've already finished. Even on the weekends, the prospect of going back to work on Monday is so distracting that it's hard to enjoy anything at all. You know, I think I'm just depressed. This isn't helping to shrink my rapidly growing backlog of things I really want to play eventually.
Maybe part of me thought that buying a new PC would somehow fix all of this. I guess it has, at least a little, since there are more games that I can actually play and I'm excited about playing them. I have to admit, I've enjoyed replaying Crysis with the graphics turned up to crazy (even though this only gets me around 40 frames per second most of the time). But I haven't had the time or the patience to dive into L.A. Noire or Mirror's Edge or Dead Space, all of which are on my Steam account with only a few minutes logged. Maybe, one of these lazy Sundays, I'll have a chance to make some progress in one of them, and maybe the experience won't be ruined by thoughts of the impending Monday.
Labels:
crysis,
dead space,
l.a. noire,
midlife crisis,
mirror's edge