Showing posts with label the walking dead. Show all posts

Tuesday, December 2, 2014

Interactive Fiction

Whenever I write about a topic I've already covered, I'm in fear of accidentally contradicting myself. My thoughts and feelings change over time, and this is perfectly natural, but each finished piece of writing is static (unless and until I feel compelled to go through the trouble of changing it). Each and every post on this blog is a frozen snapshot of my thoughts and feelings at a particular time, and all of those snapshots are displayed simultaneously in the same place. When I do change my mind, and if I don't have the time or the patience to eradicate my outdated thoughts from wherever they were written, it might just look like I'm saying two different things at once.

For example, I might have been too kind to a certain piece of interactive fiction in this post about games as art. I dismissed the criticism that Dear Esther was bad for its lack of gameplay because it's not a game and therefore should not be judged as a game. Dear Esther, I wrote, should be judged instead as a piece of interactive fiction or as a work of art. A direct quote from the other post: "A valid criticism of Dear Esther should focus on what's there — the writing, the visuals, and the music — rather than obsessing over exactly how it's not a game." Even now, I stand by all of these claims about how Dear Esther should be judged, but if I ever implied that Dear Esther comes up smelling like roses when judged in this manner, I'm about to disagree with my former self.

The writing, the visuals, and the music in Dear Esther are all fine. The lack of traditional gameplay elements is also fine. The lack of meaningful interactivity of any kind, however, is less so. Dear Esther is a walk through a virtual landscape set to music and narration, and the act of walking (but nothing else) is left to the player. The experience is interactive in the sense that the player can choose where to walk and where to look within the confines of Dear Esther's explorable space; the problem is that those confines are limiting to the point where the interactivity becomes nothing but a nuisance.

The player can move freely, but aside from a few wide open areas and a couple of briefly diverging paths, there's only one way to go from the beginning to the end. The game has no real exploration, and the player's actions have no real effect on the story. The experience is essentially linear, and yet the player is forced to interact; one cannot get to the end without walking there, a tedious and potentially frustrating task when there's hardly anything to do along the way. Dear Esther might even be better if it would just play itself.

It's a common problem in story-driven games, so-called interactive fiction, and everything in between. The story is there, and the player input is there, but the potential for truly interactive fiction is lost when the player is unable to affect the narrative. If the story is the most important aspect of a product which insists on being interactive, shouldn't the story itself be interactive? It's no surprise that a truly interactive story is a rarity in the typical video game, which is gameplay-driven and treats its story as a contextual backdrop no matter how obnoxiously that story shoves itself down the player's throat during unskippable cutscenes. But developers should take more care when adding game elements to a story instead of the reverse.

A couple of weeks ago, I played To the Moon, which tells a nice (albeit awkwardly written) story and contains more than enough gameplay to be considered a game. Unfortunately, while it clearly strives to be interactive fiction first and foremost, the interactivity and the fiction do not blend well. The only genuine gameplay consists of occasional puzzles and a few brief mini-games, while the bulk of the player's time is spent alternating between reading dialogue and making sure to click on all the clickable objects in a given area. The game is so heavy with dialogue and so short on meaningful player input that, per click or keystroke, the majority of one's interaction with the game consists of telling the game to continue to the next line of dialogue. Yet, for all this time spent manually moving through the story — I used the word "tedious" in reference to Dear Esther's walking and I think it applies here as well — there's no real interaction with the story itself. The ending is always the same. Player choices sometimes have an effect on a few lines of dialogue, but that's all.

I get it. The developers wanted to tell a very specific story. Not every story will have branching paths and multiple outcomes. However, if the story is to remain static and immutable, two things need to happen. First, the player needs to be able to step away from the story long enough to have a satisfying amount of control over something. For To the Moon, this would mean a lot more puzzles and mini-games, or perhaps a totally new gameplay element. Second, the player needs to be able to sit back and watch when no meaningful input is required. This would turn Dear Esther into a movie, but I guess that's the whole problem with Dear Esther.

My examples so far are pretty extreme, but the problems in Dear Esther and To the Moon are things that all story-driven games need to avoid, and not all of them do a very good job. In my old post on Alan Wake, I mentioned the frustration of needing to follow a character around or simply idle about while listening to dialogue and waiting for the next scripted event. The same thing happens in Half-Life 2, thanks to the developers' decision to forgo cutscenes for the sake of having everything (including story exposition) happen in real-time. Alan Wake and Half-Life 2 both have plenty of gameplay to keep the player entertained, but the occasional need to sit through real-time in-gameplay dialogue always left me wishing that a skippable cutscene were used instead, especially when continual but meaningless player input was required (e.g., when following another character down a linear path).

I'm actually beginning to think that the hybridization of gameplay mechanics with interactive fiction is a failed experiment, and that the game industry should stop insisting on doing it over and over again. Games with excellent gameplay don't need mediocre stories tacked on in the form of unskippable in-game sit-around-and-listen-to-people-talk scenes from which the player cannot walk away. Meanwhile, the "interactive" requirement of an interactive story cannot be adequately satisfied with badly implemented gameplay mechanics, like mini-games and puzzles that occur with the frequency of a cutscene in a traditional game.

You want to make a shooter? Don't annoy the shooter fans with superfluous dialogue and scripted action sequences. You want to make an interactive story? Don't force your customers to play through stupid shooting sections. Am I wrong? I mean, clearly, there's an acceptable balance somewhere, but not many developers find it. Why can't the video game and the interactive story be two distinct things, each with no obligation to step on the other's turf? I guess it's because mashing the two things together broadens the target demographic. Shooter fans buy it for the gunplay, people who like interactive fiction buy it for the role-playing aspects or the built-in dating simulator, and everyone is just barely happy enough to keep playing.

Or maybe it's because interactive fiction just hasn't matured enough to stand on its own without being shoehorned into a traditional game or having a traditional game shoehorned into it. After all, a sufficiently interactive story is probably hard to write. Giving the player a satisfying amount of control within a well structured story sounds pretty difficult, if we assume the player's control must be over some aspect of the story itself.

Perhaps most disappointing of all are games which seem to be built on the premise of a player-controlled branching storyline but end up being almost entirely linear anyway. Playing through the first season of Telltale's The Walking Dead was a great experience, but a lot of the magic was gone when I realized after the fact that the ending I got was the only ending. Player choices determine which characters live or die and, to some extent, the characters' feelings about each other, but the protagonist and his group go to the same places and do the same things regardless. Ultimately, the only thing that changes, as a result of character deaths and the relationships between those who remain, is the dialogue.

Obviously, not every game needs multiple endings in order to have a satisfying story. However, in The Walking Dead, the player's ability to shape the story by making (often binary) decisions is the primary mode of gameplay. The rest is just occasional puzzles, and some shooting sequences which could rightly be called mini-games. When gameplay consists almost entirely of manipulation of the game's story through dialogue and moral choices, the ability to manipulate the story in a substantial and meaningful way is pretty important. The Walking Dead provides more illusion of choice than actual choice... but I guess I wouldn't have known if I'd only played once and never looked back.
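The "illusion of choice" structure described above can be sketched as a toy story graph. This is purely illustrative (the node names and choices are invented, not taken from The Walking Dead): every branch the player picks converges on the same final node, so choices change the path's flavor but never its destination.

```python
# A minimal "branching" story graph in which all choices converge.
# Node names and choice labels are hypothetical, for illustration only.
story = {
    "start":  {"text": "Walkers attack the camp.",
               "choices": {"save_a": "camp", "save_b": "camp"}},
    "camp":   {"text": "The group regroups; tensions differ by who survived.",
               "choices": {"stay": "finale", "leave": "finale"}},
    "finale": {"text": "The same ending, every time.", "choices": {}},
}

def play(picks, node="start"):
    """Follow a sequence of choice keys; return the list of visited nodes."""
    picks = list(picks)
    visited = [node]
    while story[node]["choices"] and picks:
        node = story[node]["choices"][picks.pop(0)]
        visited.append(node)
    return visited

# Any combination of choices reaches the same final node.
assert play(["save_a", "stay"])[-1] == "finale"
assert play(["save_b", "leave"])[-1] == "finale"
```

A genuinely branching story would need at least one choice whose outgoing edges lead to different endings; here, as in the game described above, only the intermediate "text" would vary.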

Thursday, February 13, 2014

If You Can't Say Something Nice...

We all know how the grammatically incorrect saying goes: "...don't say nothing at all." It's pretty sound advice for social interaction. If you're not going to be nice to a person, sometimes it's best to leave them alone and mind your own business instead of causing unnecessary emotional pain or picking an unnecessary fight. I don't think, however, that the cute little rabbit from Bambi had criticism of video games in mind when he spoke his words of wisdom.

When I logged into Steam earlier today, I noticed a new feature: a user-generated tagging system complete with personalized recommendations for my account based on the games in my library and the tags attached to them. I also noticed that some of the players tagging games are very openly opinionated. Some games are being tagged as "bad" or one of its synonyms. Some games are being tagged as "overrated," and BioShock Infinite tops the list. Other examples of criticism-by-tagging are slightly more subtle. The "casual" tag is being placed not only on casual games as they are known in the traditional sense, but also on games deemed too easy by hardcore players who expect legitimate challenge in their games. (Notable examples include the Thief reboot, which is geared toward players who never played the original series or any difficult stealth game, and Call of Duty: Ghosts, the latest in a series of first-person shooters which has long been associated with an immature fan base who would eat garbage if the TV said it was cool.)

Kotaku writer Patricia Hernandez noticed it too. I don't usually comment every time a Kotaku employee writes something that annoys me — I don't have that kind of time — but on this occasion it will serve as a nice excuse to mention a couple of other things that were already on my mind.

"Trolling is definitely a thing in Steam Tags right now," Hernandez writes, and maybe she's not entirely wrong. Surely some tags are being added just for the sake of annoying the fans of certain games, just for laughs. The tags to which she's referring, though, are the ones that merely express a genuine opinion, like "casual" as applied to Dark Souls and "bad" as applied to Gone Home.

I'm not really sure when the meaning of the word "trolling" shifted so drastically. It used to mean saying inflammatory things for the sole purpose of angering other people, especially when the things being said are completely disingenuous. A good troll pretends to be completely serious when he posts deliberately flawed arguments in the middle of an otherwise intelligent discussion for the sake of disrupting it. He pretends to be dumber than he is, or more ignorant than he is, because this makes his ignorant comments all the more infuriating, and he keeps this up for as long as possible because the game is essentially over when someone figures out that he's "just trolling." Trolls get attention because of the perception that they're incredibly stupid or incredibly horrible in some other way, not because of their actual opinions.

But "trolling" adopted a different meaning in mainstream media soon after the mainstream media (thought they) learned the word, and maybe it's because successful trolls on the internet are so good at hiding their true intentions. The whole "pretending" part is conspicuously missing from the outside world's common understanding of what trolling is. Almost any type of online harassment or use of unkind words is, therefore, called "trolling" even if the supposed trolls are, instead of really trolling, just being completely serious and stating their actual opinions (albeit in a rude manner). Those who use the word "trolling" in this context probably wouldn't see through the ruse if an actual troll were trolling them, so maybe I can't blame them for getting it wrong, but I miss the times when words had meaning.

The past few years, I think, have seen the bastardization of the word come to completion. People are now using the word "trolling" even to refer to the expression of just about any unpopular or unacceptable opinion. We're seeing some of it right here in this Kotaku article.

Let's say I've never played Gone Home, but I tag it with the word "bad" just because I know its fans are likely to have a big cry and act like their human rights are being violated, resulting in a ludicrous and humorous display. That's trolling. If I play it and then tag it with the word "bad" because I genuinely think it's bad and should be categorized as such, I'm merely expressing an opinion. There's an important difference that Hernandez doesn't appear to understand. In fact, I'm really not sure if she knows how opinions work at all. The following is an excerpt from the article that comes right after her "trolling" comment:
Let's start with the "bad" tag. It does have games notorious for being poor—The Walking Dead: Survival Instinct, for example. It's hard to argue the quality of that game. Gone Home, though? It's not everyone's cup of tea, but that doesn't mean it's bad!
Really? It doesn't? One doesn't get to call something bad when one doesn't like it?

Let's analyze this bizarre logic. Tagging one game as "bad" is totally fine because it was unpopular (or because fellow Kotaku writer Kirk Hamilton thought it was bad too), but tagging the other game as "bad" is incorrect because... why? Because the right people (perhaps the kind of people who read Kotaku) like it? Because its perceived relative popularity means the alternate opinion doesn't exist? Hernandez seems to think that each game has an objective amount of badness, and that a certain threshold of badness (or of agreement on its badness) must be crossed before we're allowed to call it bad. In other words, "bad" is not allowed to be an individual person's opinion. That's kind of strange because, if you ask anyone who has a firm grasp on the difference between facts and opinions, the fact that it's "not everyone's cup of tea" does mean it's bad — it's bad in the minds of people who would prefer some other tea in a different cup.

A normal person in Hernandez's position would just say "I disagree with these people," and maybe that's what she's trying to say, but if that's the case then she has a very strange way of saying it. She's not simply stating her own opinion about the game; she's suggesting that some opinions are true while other opinions are false. She's saying it's wrong for anyone to tag Gone Home as a bad game on Steam, not only because the tagging system wasn't necessarily meant for opinions, but more importantly because this particular game wasn't as universally unloved as The Walking Dead: Survival Instinct. She's calling it "trolling" and, whether she knows the actual meaning of the word or not, it makes her look like a gigantic moron.

This is just the latest example of a worrying trend in which people take Thumper's advice about saying not-so-nice things and apply it to everything said about their favorite commercial products. In addition to the misuse of the word "trolling" (as usual), Hernandez's article uses words like "unfair" to describe the negative tags placed on games that she likes. Some of the tags being used are definitely uncool — concisely written spoilers and random profanity, for example — but expressing negative (or otherwise unpopular) opinions about a game is not by any means "unfair" and, in my opinion, even using tags to express these opinions doesn't really amount to abuse of the system. It was designed to let players tag games as they see fit. I guess I shouldn't be surprised that "gaming" "journalists" disagree. After all, they're the ones who make a living pumping out reviews with inflated scores because their advertising revenue depends on it, while they push the notion that players who complain about bad games are just displaying a false sense of entitlement.

The most popular tags for Gone Home right now are "not a game," "walking simulator," and "bad." Hernandez thinks these tags aren't helpful, but they are if a player wants to know if a game is worth buying. Since tags are more visible if they're more popular, even tags like "good" and "bad" are just about as helpful as the user scores on Metacritic. Tags like "not a game" and "walking simulator" are meant to be humorous but they do give players an idea of what Gone Home is like. They're informative even if they're exaggerations. The "not a game" tag is sure to be the most controversial, but people have been accusing Gone Home of not being a game since it was released, and it's a valid criticism. We don't get to say it's unfair just because it hurts the developers' or fans' feelings.

I sincerely hope that Valve doesn't side with the crybabies at Kotaku by moderating all traces of opinion out of the tagging system. If the people running Steam didn't want opinions to show up in user-generated tags, they shouldn't have implemented the feature at all. Games on Steam are already sorted by genre, and they could have just expanded upon this if they only wanted a dry, boring and sterile categorization system devoid of anything subjective.



Update (February 15, 2014):


It looks like Valve is indeed moderating the user-generated tags, and on second thought I really can't blame them for not wanting strongly negative descriptors attached to products on their own store. (Tags were never meant to be used as mini-reviews, so I can't even call it a scandal as long as no one is tampering with the actual user review system.) Apparently tags like "bad" are no more, and even the popular "not a game" tag has vanished. As of right now, though, Gone Home is still classified as a "walking simulator" first and foremost, and I think it's pretty hilarious.

Monday, October 29, 2012

Sandy & Steam Sale

While a deadly hurricane named Sandy lays waste to the east coast of the United States, threatening to rain all over my favorite holiday (which is now only two days away), Steam has begun its Halloween Sale. Sadly, it lasts only from now until Wednesday, and surely many affected by the storm will be without power for the entirety of the event. I'm still fortunate enough to have power where I am, but the weather has been getting steadily worse since late last night, so that might not last.

For those who can shop online this week, there are some nice discounts. I'm seeing a lot of "-75%" tags. The games currently "featured," however, don't seem to have greater discounts than the other five dozen games on sale. Perhaps the word "featured" just means new or popular, or maybe the featured games are chosen randomly and cycled throughout the sale. After all, the games on the "featured" list make up about a third of the games that are marked down for the duration of this three-day sale.

In any case, you'll want to make sure you check out the "All Halloween Games on Sale" list, located just below the "Featured Games on Sale" list on this page. Otherwise you might miss out on something good.

While I'm here, I might as well come up with my own list of noteworthy games, based on my own crazy and possibly worthless opinions. First, I'd like to point out that some of the games on sale are those I mentioned in my last post on Wednesday:
  • F.E.A.R. (with its two expansion packs included) is only $2.49, which is just painful for me to look at, since I paid $50 for the game back in 2005, and then bought the expansions separately for at least $30 each. Still, I loved the game so much that I have no buyer's remorse, not even after seeing it go for two bucks and change. Needless to say, I'd argue that F.E.A.R. is worth buying right now, if you're into paranormal first-person shooters. (The rest of the F.E.A.R. series is on sale as well, but I'm not so crazy about those sequels.)
  • The Painkiller Complete Pack is going for $7.49. (That's a bit more than I paid for the Complete Pack a year ago, but there were fewer games included at the time.) Strangely, only a couple of the games — Resurrection and Recurring Evil — are on sale individually. The result is that buying the whole pack is actually cheaper than buying the first game, Painkiller: Black Edition, alone.
    Update: Scratch that. It looks like all of the individual Painkiller games are now 75% off, which means Painkiller: Black Edition is only $2.49. I still think the bundle is a fair price, but if you're unsure of how you feel about this particular brand of first-person shooter, I'd recommend buying only the original game, since most of the sequels are mediocre at best.
  • Killing Floor is $4.99, which is normal during any Steam sale, so I wouldn't hold your breath waiting for it to get much cheaper. It's also in the middle of its Hillbilly Horror Event for Halloween, which goes until November 6, so all of the zombies are dressed up like... well, hillbillies. It's a lot of fun, especially if you have some friends with whom to team up and play.
  • Alan Wake is marked down to $14.99, and Alan Wake's American Nightmare is only $7.49. I've seen them go for cheaper, but you might have to wait until the winter sale for that to happen again.
And a few other things worth mentioning:
  • The Walking Dead is down to $14.99. It's not a huge discount, but I've only heard good things about this game, and I've been seriously thinking about adding it to my collection.
  • Amnesia: The Dark Descent is currently $4.99, while each of the Penumbra games is $2.49. (Oddly, the Penumbra Collector Pack is $4.99, which is one cent more than the combined cost of the two included Penumbra games.)
  • Magicka is $2.49. It's a hilarious game and I love it. I just wish it were better optimized. It tends to run like crap on my computer while much prettier games work perfectly.
  • Zombie Driver HD is marked down to $4.99 after a 50% discount. The original Zombie Driver, which I got for $2.49 a while ago, is a lot of fun, and I can only assume that this updated version is at least as good. Unfortunately, it really is just an updated version of the original — not a sequel — so you might want to think twice about getting it if you already have the standard edition. Owners of the original game are supposed to get a 50% discount, but that doesn't seem to stack with the Halloween Sale discount, which is really a shame.
  • I wanted to buy Rage, but even with the current discount, it's still $9.99. I'll be waiting a little longer for the price to drop below $5, but I don't expect everyone to be as stingy as I am.
  • Each of the S.T.A.L.K.E.R. games — which are fantastic if you have a decent computer and don't mind installing a couple of bug-fixing mods — is on sale as well: Shadow of Chernobyl for $9.99, Clear Sky for $4.99, and Call of Pripyat for $7.49. (As with the Penumbra series, there seems to be a bug in the pricing of the S.T.A.L.K.E.R. Bundle, which costs one cent more than the combined price of the included games, Shadow of Chernobyl and Call of Pripyat.)
  • I've been waiting for the Overlord Complete Pack to go on sale for a while, so I just might pick it up now for $4.99. (I've never played it, but it kinda reminds me of a more diabolic Pikmin.)
  • The Dead Space games are each $4.99, which seems pretty cool. I've never played them, but you can't go horribly wrong for five bucks. Just make sure you don't buy the Dead Space Pack, since, again, it costs one cent more than the combined price of the individual games. At first I thought this was a bug, but now I think it's just plain carelessness.
  • Predictably, the Left 4 Dead series is on sale, as is just about every game with the word "zombie" in the title — and there are far too many to name. Some of them look cute, others look like shovelware. Just beware the deceptive power of tempting discounts on awful products.
I should mention that there are Halloween deals on Amazon and Origin as well. I haven't checked them out in detail just yet, so I can't say whether they're better or worse than the current Steam sale, but every option is worth considering. While I might be slightly biased in favor of Steam (because my friends are on it), I encourage you all — as always — to shop around before spending any money.

Wednesday, September 12, 2012

What Makes Video Games Fun?

A lot of "hardcore gamers" (regardless of whether they identify themselves as such) will tell you that the video game industry is in sad shape. It's not just because of the past decade's unfortunate shift toward increasingly intrusive digital rights management, or the recent trend of releasing "extra" downloadable content on day one to encourage thoughtless and irresponsible pre-orders, or the deliberate efforts to use both DRM and DLC to destroy the used game market. Rather, it's because they think that too many of the games being released today are crap.

And they're not just talking about shovelware that nobody buys. This is popular crap. So what's up with all the hate? Well, it should be no surprise that the games which tend to attract the most violently negative attention are always the popular ones. After all, if you want to complain about a genre, a feature, a console, or a developer, you pick a popular game as an example, and then you claim that the chosen game means the downfall of gaming as we know it. This has been happening for a long time. But in the past few years, I've been reluctant to shrug it off as the usual fanboyism, hipsterism, and attention-seeking antics of a vocal minority. It's more likely indicative of something else.

As I see it, this backlash is due to recent changes in the industry which aren't entirely imaginary. The industry is, in fact, changing, and not just in response to the emergence of nearly ubiquitous high-speed internet service, which facilitates digital distribution and piracy alike. Video games have changed also because of their growing audience. Thanks to cell phones, social networking sites, and a few other things which should never have games on them, games have crossed farther into the mainstream than ever before. Meanwhile, those who played video games back when it was an obscure hobby reserved only for children and computer geeks have grown up, and some of them are still playing. It's only understandable that some of these old-schoolers would be a bit shocked by the current state of things.

So, what is the current state of things?

It's complicated, and there are a lot of little topics I'd like to bring up — e.g., how girls went from "eww, you play video games, you're such a nerd" to "hey, I can be a gamer too" and "tee hee, I'm such a nerd" — but most of these things are too far off-topic and will have to wait for some other week. Simply put, if I can allow myself to get to the point, casual games and social networking have taken over. It's not hard to see that this is an expected (and perhaps necessary) consequence of video games getting a slice of that mainstream pie.

Games directed at casual players get a lot of hate, particularly from the more "hardcore" gamers, many of whom grew up when video games were considerably less forgiving than the ones made today. For these players, the whole point of a game is to provide a challenge. Winning should be a struggle; that's what makes it so satisfying. This is why they fail to understand the casual audience. More importantly, this is why they're angered not only by strictly casual games but also by the perceived "casualization" of modern games as a whole.

Are the majority of today's video games a lot easier than the ones of my childhood? You bet. But is this really a terrible thing? Not necessarily. Difficult games still exist, and we should keep in mind that a lot of older games were only hard because of their lack of a save feature. (Wouldn't a lot of modern games be damn near impossible to beat if saving weren't an option?) Other old games were stupidly hard because of poor design, and still others were intentionally made difficult because they were short and would have been beaten too quickly if they weren't frustratingly hard to finish. (Truly master Super Mario Bros. and you can beat it in less than five minutes; without using warp zones, it can still be done in less than half an hour.) Thanks to the wonders of modern technology, playtime can be extended in ways that don't involve dying repeatedly, and games can be entertaining for reasons other than sheer difficulty.

So now we get to ask an interesting question. What actually makes video games fun? Some say it's the challenge, while others will say it's the story/characters/immersion (for single-player games) or the social experience (for multiplayer games). Still others, I suspect, would say they just like to blow things up. In reality, for most people, it's a combination of all of the above.

How much, in particular, should difficulty matter? From the developer's point of view, a game should be difficult enough to entertain the experienced players — to let them know that winning takes effort so that winning feels good — but easy enough to avoid alienating the casual players who might not even bother to finish a game if it frustrates them at all. Personally, I think most developers have done a pretty good job of accomplishing this. Say what you will about the harm caused by pandering to the casual audience, but most games worth playing have multiple difficulty levels, the easiest of which is usually tame enough for "casuals" and the hardest of which is usually a challenge for anyone who never played the game before. Nobody should be disappointed unless a developer makes a serious miscalculation.

This is why I was surprised to see such a negative reaction to this article on Kotaku a little more than a week ago, in which Luke Plunkett gives a fairly reasonable rebuttal to Assassin's Creed III lead designer Alex Hutchinson's (rather preposterous) claim that "easy mode often ruins games." (It's kind of funny because Assassin's Creed, a game with only one difficulty, isn't that hard, and the same is true of all the sequels I've played.) I'm not a big fan of Kotaku, nor am I a fan of Luke Plunkett, but I have to agree with him here. At least, I agree with his headline. A game can't be ruined by a difficulty setting.

I'm willing to say that the "easy mode" of a game can often be the worst version of that game, as Hutchinson claims, but the inclusion of an easy mode surely doesn't spoil the whole game unless it's the only mode available. Don't like easy mode? Play on hard. If the harder settings are still too easy, or if they do nothing but make the game more tedious, you've picked a bad game. If the harder settings are locked until the easier ones are completed, you better hope the easier settings are hard enough to keep you entertained for a single playthrough; otherwise, you've picked a bad game. Bad game design happens, but if you're blaming it solely on the inclusion of an "easy" mode, you're probably overlooking a deeper problem.

Still, I won't say I agree with Plunkett completely, since he has entirely different reasons for disagreeing with Hutchinson's argument. Specifically, he makes it abundantly clear that he doesn't care about difficulty at all, and that he plays story-driven games only for the story. He probably wouldn't mind if a game like Assassin's Creed III consisted of no interaction besides "press X to continue." And if you're like this, you probably should ask yourself why you're playing games at all, rather than watching movies or reading books. If, on the other hand, you can appreciate the unique things that games have to offer, instead of just complaining that everything is too hard, then your idea of "fun" is just as valid as that of the hardcore gamer dude who plays everything on the hardest setting and skips all the cutscenes.

So where do I stand?

Let's just say I was more than a little annoyed by the fact that it's literally impossible to lose in the 2008 version of Prince of Persia. I won't go so far as to say that the protagonist of a game needs to be able to die, and "losing" is hardly a setback in any game with a save option (assuming you use it often enough), but being automatically revived after every fall in PoP 2008 seemed like a step down from the rewind system in The Sands of Time, which actually required some minimal skill and had limits. If there's no consequence for falling off a cliff, the sense of danger and suspense is gone and the game becomes only tedious where it might otherwise have been exciting.

On the other hand, you know I'm a sucker for story-driven games, and the need for a genuine challenge can be subverted by decision-making and role-playing elements. Since Choose Your Own Adventure books were terrible and there's no equivalent in the movie world, I think it's pretty safe to say that the existing technology used for video games is the ideal medium for straight-up interactive fiction. I see no reason not to take advantage of this. The problem is that what might be described most accurately as interactive fiction, and not as a traditional video game, will nevertheless remain stuck under the category of video games. This tends to generate all the wrong expectations. Story-driven titles are often criticized for having too much "story" and not enough "game" (even if the developer's primary objective, admittedly, was to tell a story).

Regardless of how far a developer decides to take the storytelling aspect of a product, and regardless of what you call it, the fact is that difficulty (and, indeed, gameplay itself) often matters less when story matters more, and if you're looking for a serious challenge, you should probably stay away from plot-driven games, even ones like Assassin's Creed. They make it difficult enough that any given mission might take two or three attempts, but they know they're not serving the hardcore crowd exclusively. Sometimes, though, I think the hardcore crowd still hasn't caught on.

Wednesday, August 8, 2012

Alan Wake & Cinematic Games

A few days ago, I finished playing Alan Wake. I mentioned the game in an earlier post about movies based on video games; although I didn't yet own the game at the time, I had heard it was very story-driven, and perhaps, therefore, an ideal candidate for a film adaptation. Then again, as I pointed out before, such a movie serves no real purpose if the game already functions as an interactive movie by itself. Alan Wake is, in fact, what you might call a very cinematic game; while the term "cinematic" has often been used as a meaningless buzzword by the industry in recent years, it's fitting in this case. Not surprisingly, there has been some (wishful) expectation of an Alan Wake feature film. Though nothing has been announced, it almost seems bound to happen.

Much like the Assassin's Creed franchise (which spawned the short films Lineage, Ascendance, and Embers), Alan Wake has already branched out into the realm of live-action entertainment, and this is pretty easy to do when so many of the game's characters are modeled on the actors who play them. Bright Falls, the promotional web series that serves as a prequel to Alan Wake, somehow manages to be worth watching, and I have to say it's considerably more unsettling than the actual game.


For the moment, however, I'd like to forget about the predictable attempts to push the franchise into other media, such as movies and books, and focus instead on the game itself. Don't wait for a score at the end, though, since it's not my intention to write a proper review. I don't really see the point, since the game is already old enough that I'd surely be the millionth guy reviewing it. While it's still relevant enough to have its place in a more general discussion of cinematic games (and I'll get to that shortly), it's not unfair to say that Alan Wake is yesterday's news. This is usually what happens by the time I get around to playing a game, since I'm strongly opposed to paying full price for anything.

Despite the game's age, however, I'm not as far behind the times as you might assume. While the Xbox 360 version, released in May 2010, is already more than two years old, the PC version (which I recently purchased) wasn't released until February of this year. Although a PC version was originally planned at the time of the game's announcement, the game was published by Microsoft, and selfish Microsoft wanted the game to be exclusive to its own Xbox 360 console. Apparently, this changed only after lots of nagging by Alan Wake's developers at Remedy Entertainment, who still wanted to release a PC version of the game despite the juicy exclusivity deal. It took a while, but Microsoft finally agreed, and the PC version sold well even though the console version had already been around for nearly two years.

Since the personal computer is my game platform of choice — and, more importantly, since I don't even have my own Xbox 360 — I had to wait for the port. Fortunately, once the PC version was released, it didn't take long for the price to drop low enough to get my attention. During the recent "summer sale" on Steam, I picked up Alan Wake (including DLC), along with the sequel/spin-off Alan Wake's American Nightmare, for a combined $9.99. I haven't played the latter, but the first game alone was, in this writer's opinion, worth at least one crisp Alexander Hamilton, give or take a penny.

In short, the game is pretty fun. After hearing so much about its plot-driven nature and so little about its gameplay, I feared it would be disappointing as a game, and notable only as some kind of casually interactive storytelling machine. I've heard as much about several recent titles, most notably Jurassic Park: The Game and The Walking Dead, both by the (appropriately named) developer Telltale Games. To my surprise, my fears about Alan Wake were unfounded.

The combat is seemingly very simple — dodge attacks, weaken bad guys with flashlight, shoot bad guys with gun, repeat — but there is some unexpected complexity in the subtleties of managing multiple enemies at once, and in using the environment to your advantage. More importantly, there is some real challenge involved; you'll occasionally find yourself getting cornered and chopped to pieces after the simplest mistake on the easiest difficulty setting. (The gameplay isn't actually difficult, per se, once you figure out what you're doing, but you will have to learn things the hard way if you don't learn them quickly.) Additionally, whether you think this matters or not, the combat just looks so freakin' cool. It's entertaining enough, at the very least, to stave off boredom for the duration of a single play-through.

But I fear that Alan Wake's great balance of enjoyable story and exciting gameplay is an exception to the rule, and beyond that first run through the game, things can still get tedious. (I should mention, by the way, that when I say I finished Alan Wake, I mean to say I finished it completely. I beat the game on every difficulty level, found every hidden item, and unlocked every achievement. Don't ask me why I do this with every game I play; I guess I'm a masochist.) But in Alan Wake, the lack of replay value doesn't stem from repetitive combat, or even from spoiled plot twists. Playing a second time is tedious because, in its attempt to be "cinematic," Alan Wake includes a lot of dialogue and other brief but mandatory breaks in normal gameplay.

While the cutscenes can be skipped, a lot of the dialogue falls outside of these cutscenes. Characters will talk (and talk and talk) to you as you walk around and explore your surroundings during the non-combat sequences, and you're not always able to ignore them. Occasionally you'll even be instructed to follow a character as he or she slowly plods around, revealing bits of the plot via typically one-sided conversation — which, on your second or third play-through, you won't really care to hear. The story is fantastic, but hey, it's the same story every time.

I'm using Alan Wake as an example, but these are issues that plague a lot of story-driven games, to varying degrees — even first-person shooters like Half-Life 2 and more action-oriented games like Assassin's Creed. In each case, many players will praise the plot, the characters, the acting, the soundtrack, and the aesthetics, while the rest will see these things as harmful distractions from what really matters: the challenge and the complexity of the game.

Perhaps Alan Wake in particular has some immunity to this common criticism, since it's no secret that the game aims to be as much like a TV show as possible. Divided into episodes, each ending with a theme song and beginning with a recap of prior events ("previously on Alan Wake..."), the game might as well have been a television miniseries. Take out the episodic interludes and it still might as well have been a movie. If you don't like your games to be cinematic and movie-like, you probably wouldn't play a game like Alan Wake on purpose. The game is rather transparent about what it is, and players know what to expect, so you don't hear a lot of complaints that gameplay has, arguably, taken a back seat to plot and style and other cinematic silliness.

Ironically, one of the major problems with Alan Wake, and other similarly plot-driven games, is actually the result of misguided attempts to retain as much "game" in these shameless interactive movies as possible. All of the major plot and character development could have been confined to skippable cutscenes, but instead, we play through a lot of it. In Alan Wake, this accounts for a lot of lost replay value. Outside of the scary monster-shooting parts (i.e., during the day), you're left with little to do but walk from point A to point B, admire the scenery, listen to characters talk at you, and position the virtual "camera" at different angles while you wait for things to happen. It might make you feel more like a director than a player, and, unfortunately, this is fun exactly once.

There's something to be said for storytelling in games, but unless you're the type of person who can watch the same movie ten times in a row and love it every time, you probably won't find yourself playing Alan Wake repeatedly. When I just want to shoot things, I always go back to Killing Floor or something else with minimal character development and maximal carnage. That way, I won't have to sit through mushy romance stuff in between fights.

It's not that I have anything against story-driven games. As I said, Alan Wake was enjoyable to say the least. However, the best story-driven games are those which tell a story in a non-intrusive way. Sometimes this means condensing the heavy plot development into cutscenes which the player can opt out of watching, but this tends to cause a sharp separation between the game we play and the story we hear. An often better solution, if the developer wants the game and the story to meet seamlessly, is to have dialogue occur during normal gameplay without stopping the gameplay, or to show the player what's going on through subtle cues without having the protagonist's sidekick stop and explain everything. It's a classic case of "show versus tell" (or perhaps "let-the-player-find-it versus shove-it-in-the-player's-face").

The player shouldn't be forced to sit and listen to dialogue, or watch a ten-minute cutscene, or follow a character around at a snail's pace for the sake of plot development, because if a game is riddled with these kinds of tedious, non-gameplay moments, the best gameplay in the world can hardly make multiple replays worthwhile. I'm sure, however, that Alan Wake's developers were aware of this, for at least they gave us the ability to skip past cutscenes and rudely walk away from some of the less important conversations.

It would even seem that someone on the development team isn't too fond of excessive dialogue in games... that is, unless this in-game encounter between Alan Wake and a more-than-slightly crazy video game designer named Emerson is just an attempt at self-deprecating humor:


Emerson makes a good point, even if he's too insane to know it.

But characters (and toasters) who talk, talk, talk, all the time, aren't the only problem with games that attempt to provide some kind of cinematic experience. Bad camera angles, sluggish controls, and frequent breaks in gameplay are all symptoms, and Alan Wake suffers a little from all of them in its attempt to look cool. As far as the controls are concerned, I am grateful that the developers patched the game with a "Direct Aiming" option to make the game more suitable for mouse and keyboard controls, but there's still some delay when jumping and performing other actions, and I'm fairly sure it's not just a performance issue on my computer. It's just a consequence of the game's smooth character animations.

More natural character movements often necessitate less natural gameplay, and while Alan Wake was never meant to be a platformer, this does make the game somewhat frustrating. Once the novelty of playing such a realistic-looking game wears off, you'll wish Mr. Wake could just turn on a dime and jump at a moment's notice like your other video game heroes.

Eventually, you will get used to the controls, and even the awkward camera angle, but those frequent breaks in gameplay — which usually involve the camera moving to focus on some far-off object or event, often to show the player where to go — still make replaying familiar sections a snore-fest for impatient players such as myself.

There will come a time when video game developers will need to realize that video games are not movies. I hope they also realize that trying to imitate movies is not the only way to tell a good story. For decades, stories have been an integral part of video games. We've come to expect some kind of story, especially in horror/mystery games like Alan Wake. But the video game industry has long been unable to drop the habit of turning to movies as the inspiration for its storytelling techniques, and as developers strive to make games ever more "cinematic" (and otherwise more visually impressive) with every passing year, they seem to be losing sight of what actually makes games fun.