
Sunday, December 22, 2013

Midlife Crisis, Part 3

The most frustrating thing about having a hobby is that you never really have time for one unless you're unemployed and lonely. For better or for worse, I'm neither. This was the case before I bought my new PC, and it's still the case now that I've gotten most of my games installed on it. There will always be weekends, and I have a few hours of downtime after work each weekday, but it becomes more clear every time a new game is released that I'm going to die of old age before I get to finish every game that I deem worth playing. Such is the price I pay for attempting to have a life on the side.

So far, I've actually spent more time fiddling with my PC than playing games on it. Lately, this fiddling has been the enjoyable kind; I've been installing all the software I need, rearranging my desktop icons like the truly obsessive-compulsive person I am, and more generally setting things up just how I like them. For the first few weekends of my PC's existence, however, I had nothing but trouble.

First, I didn't bother getting a wireless network adapter because a stationary computer should ideally be placed where an ethernet cable can reach it. Unfortunately, I needed the computer to be in another room temporarily. To remedy the situation, I tried using something I already had in my closet — a D-Link wireless USB adapter. It worked pretty well until my network started slowing down or crashing every time I tried to use a lot of bandwidth (e.g., by downloading a Steam game). I'm still not sure what the problem was; maybe there was some kind of incompatibility with the router, or maybe something more complicated was going on. Maybe it was my computer's fault, somehow. Fortunately, I don't really need to figure it out, since I'm using a wired internet connection now and won't have any need for Wi-Fi (let alone the D-Link adapter) in the near future.

Other problems included a couple of random blue screen errors (most likely caused by an AMD video card driver, which I've since updated) and various problems with individual games. The original Assassin's Creed, for example, refused to start when I first installed it, and I'm not even sure how I fixed the problem. I'd tried a few things, given up, and turned off the computer, and when I tried launching the game again later, it worked just fine. (Actually, I had to turn on compatibility mode for Windows Vista because I was getting a black screen where the opening cut scene should have been, but that's hardly an issue. Even though compatibility mode often fails, it should always be the first thing to try when an old game does something weird.)

Compatibility mode for Windows 98 / Windows ME was also the initial solution for the Steam version of the original Max Payne, which failed to launch even though the process was visible in the task manager. However, even after the game launched, some of the music was gone and the sound effects were severely messed up. Fortunately, some nice guy created his own patch to fix the problem. It sucks that the original developers of old games like Max Payne aren't willing to invest the time and money to solve these problems themselves (especially when they're still selling these old games alongside their sequels on digital services like Steam), and the amateurs who pick up the slack are true heroes.

I'm reminded of Command & Conquer: The First Decade, a box set of a dozen games from the series. A couple of official patches were released, but not all of the bugs were fixed, so fans started patching it up themselves. The unofficial 1.03 patch, a collection of bug fixes and other features, was absolutely essential for anyone who had this particular Command & Conquer box set. But it's not just outdated games that occasionally need a third-party fix.

Now that I have a good computer, my older games don't even come close to pushing the graphics card to its limits, which means most of these games will needlessly run at a frame rate much higher than my monitor's refresh rate. Usually, this just causes screen tearing. In extreme cases, I can even hear what sounds like coil whine, an irritating whistling noise coming from inside the computer (not the speakers). This happens on the main menu screens of F.E.A.R. and some other games, presumably because the computer is able to render thousands of frames per second when there isn't much to display.

Turning on a game's Vsync feature (preferably with triple buffering enabled as well) fixes these problems, but a few of my games don't have a working Vsync feature. Each of the games in the S.T.A.L.K.E.R. trilogy, for example, has an option for Vsync in the settings, but in all three games it does nothing. It's straight-up broken. The optimal solution would be to force Vsync and triple buffering through the control panel software of one's graphics card, but AMD cards can't do this for certain games on Windows 7, and it's my understanding that both Microsoft and AMD are to blame for that. Even with Vsync set to "always on" in Catalyst Control Center, I was getting stupidly high frame rates in S.T.A.L.K.E.R.: Shadow of Chernobyl.

Then I heard about D3DOverrider, a little tool included in an old freeware program called RivaTuner. It's made to enable Vsync and triple buffering in software that's missing one or both options, and it works like a charm. Despite S.T.A.L.K.E.R.'s broken Vsync feature, and despite Catalyst Control Center's inability to fix the problem, D3DOverrider gets the job done. Now I'm getting a fairly consistent 60 frames per second, instead of hundreds of frames per second in-game and thousands on the menu. No more vertical tearing, and no more quiet-but-irritating coil whine.
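If it's not obvious why capping the frame rate (whether through Vsync or an explicit limiter) quiets everything down, the idea is simply that the game goes idle once a frame is finished instead of immediately rendering another one. Here's a toy sketch of a 60 fps limiter in Python — purely illustrative, with made-up names, and not how any of these games or tools are actually implemented:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # seconds allotted to each frame at the cap

def run_capped(render_frame, duration=1.0):
    """Call render_frame() repeatedly, but no more than TARGET_FPS times per
    second; if a frame finishes early, sleep away the leftover time."""
    end = time.monotonic() + duration
    while time.monotonic() < end:
        start = time.monotonic()
        render_frame()                                 # stand-in for the game's draw call
        leftover = FRAME_TIME - (time.monotonic() - start)
        if leftover > 0:                               # cheap frame, e.g. a menu screen
            time.sleep(leftover)                       # idle instead of rendering again

# A menu that costs almost nothing to draw gets held to ~60 fps here,
# instead of spinning thousands of times per second:
run_capped(lambda: None)
```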

That other first-person shooter set in post-apocalyptic Eastern Europe, Metro 2033, has its own share of issues: a lot of useful options don't show up in its menu and have to be toggled on or off by editing a few configuration files in Notepad, and its Vsync feature also appears to be broken. In this case, not even D3DOverrider seems to solve the problem. Fortunately, the game's poor optimization means that it doesn't always exceed 60 frames per second at the highest graphics settings anyway, making Vsync mostly unnecessary. People with more powerful systems might have to keep looking for solutions.
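Incidentally, if you find yourself doing that kind of Notepad surgery more than once, it's easy enough to script. The sketch below is generic and hedged — the file name and option name are invented for illustration, not Metro 2033's actual settings:

```python
from pathlib import Path

def set_option(cfg_path, key, value):
    """Rewrite a simple 'key value' text config, changing (or appending) one
    option. Illustrative only; a real game's config format may differ."""
    path = Path(cfg_path)
    lines = path.read_text().splitlines() if path.exists() else []
    updated = False
    for i, line in enumerate(lines):
        if line.split() and line.split()[0] == key:
            lines[i] = f"{key} {value}"
            updated = True
    if not updated:
        lines.append(f"{key} {value}")
    path.write_text("\n".join(lines) + "\n")

# Hypothetical usage, with made-up file and option names:
# set_option("user.cfg", "vsync", "on")
```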

All of this is pretty frustrating, but troubleshooting is to be expected when playing games on a PC, especially when the games are relatively old and the operating system is relatively new. I guess I should just be glad that most of the common problems can be solved.

"But if only you'd bought a console," some would say, "your games would just work." That's the favorite argument in favor of consoles. They just work. But now that the short-lived phenomenon of backwards compatibility has gone out the window with PlayStation 4 and Xbox One, I don't think it's a fair argument. Most of the problems with PC games arise when one is trying to have a nostalgic experience by playing an old game on a new system, and the other problems are usually the fault of careless developers.

I guess we should all be glad that PC games work at all, considering that our "gaming computers" are not standardized like all the millions of identical Xbox One and PlayStation 4 consoles. Since I'm not a game developer, I can only imagine how difficult it must be to ensure that a game is going to work consistently on so many hardware configurations. Maybe I shouldn't be so upset that games like S.T.A.L.K.E.R. have a few broken features, or that games like Max Payne continue to be sold without being updated for the current version of Windows. On the other hand, it's harder to forgive professional developers for an imperfect product when presumably amateur developers are able to correct the imperfections without being paid.

Update: It seems that, since I originally wrote this post, S.T.A.L.K.E.R. has been updated with a 60 fps frame rate cap. I'm shocked that such an old game got a patch, to be honest, but apparently some people with expensive computers were burning out their video cards by leaving the game paused (thereby letting it run at hundreds or thousands of frames per second for long periods of time). Terrifying.

Wednesday, October 31, 2012

Happy Halloween! Have Some Spoilers

Halloween is here, and Hurricane Sandy has passed my home, leaving it at least 99% intact. Meanwhile, I haven't much to write, since I already posted this week. So maybe I'll do what a lot of other bloggers do and comment on the news.

Where do I start? Well, Assassin's Creed III came out yesterday... for consoles. For some reason, PC players have to wait until November 20. Suddenly I feel like a second-class citizen. No doubt, if asked about this, the publishers would mutter something about piracy. But even if they were being sincere, I'd have no sympathy for them, since they've already gotten more money than they likely deserve.

Remember when I mentioned Assassin's Creed III in my tirade against the entire concept of pre-ordering video games? I can't shake the feeling that you've all let me down, since the game has broken Ubisoft's pre-order records. I can see why this might happen; it's a highly anticipated sequel in a popular franchise. Furthermore, by offering a "season pass" for upcoming downloadable content at a modest 25% discount, Ubisoft was clearly doing its best to make "buy it now and ask questions later" sound a lot less crazy than it is. But it's still crazy, especially when Ubisoft refuses to release a playable demo.

And then there's news that the game will feature microtransactions (which, in this case, seems to mean trading real-life money for in-game currency which may or may not provide players with an unfair advantage in multiplayer matches). The same people who pre-ordered the game at full price without even trying it first, and then pre-ordered a bunch of DLC packs without even knowing what they would contain, probably won't be able to resist partaking in this final moneygrab... that is, unless they very quickly become disappointed in the game after they've played it.

The critics' reviews are mostly positive, of course, but when the game's ending was (predictably) uploaded to YouTube as early as two days before the official release date, a lot of potential customers found at least one reason not to buy the game. Some fans of the series, even without taking the time to play through this final act, seem to be upset about how the story turns out, and there are plenty of comments along the lines of "wow, that sucked, I'm so glad I didn't spend $60 on this."

For the record, I haven't watched the video below in full, because I might yet decide to play this game (once all the reviews are in and a bribe-free consensus on its quality has been reached, once the price has dropped to a reasonable level, and perhaps once there's some kind of special edition for PC with all that silly DLC included). Needless to say, however, it does contain spoilers.


The conclusion of Desmond Miles' story arc has prompted not only a lot of complaints, but also some comparisons to the ending of Mass Effect 3... which, by the way, was poorly received, largely because the game's multiple endings were so similar. (The video below explains the frequent jokes about the different endings being the same except for the choice of color.)


From what I've heard, Assassin's Creed III doesn't have multiple endings, and the reasons that some people hate its singular ending are accordingly much different. Although (as I've pointed out) I haven't watched the ending myself, it sounds like part of the problem is another ridiculous plot twist out of left field. In a way, this isn't surprising. When a story like that of the Assassin's Creed series relies so heavily on plot twists and cliffhangers, two things happen. First, the story ends up feeling too random or inconsistent, usually after some "grand-finale" plot twist employed in a last-ditch effort to truly surprise players; second, the writers get so attuned to raising questions and leaving things unresolved that they forget how to answer all those questions without resorting to some stupid cop-out.

The one constant here — besides, of course, each game being the conclusion of a trilogy — is that hardcore fans of each franchise felt cheated. After sinking so much money into a series, that hurts.

A similar thing happened recently with Halo 4, whose ending wound up on YouTube a couple of weeks ago, despite the game's official release date of November 6. Microsoft has been scrambling to take the videos down for obvious reasons, and it looks like they've been somewhat successful, because I can't seem to find a high-quality version. (The videos below, which I found and embedded in a playlist, are likely to vanish soon.)


An allegedly sub-par ending, combined with the most banal and corny tagline of all time, has drawn a lot of laughs at the expense of this futuristic shooter. (An Ancient Evil Awakens? Seriously? Correct me if I'm wrong, but that was a cliché long before the Halo 4 marketing team got a hold of it.) I'm not sure if I can fully grasp just how good or bad this ending is, because I haven't played a Halo game since the second installment, but some fans were upset about it. (Perhaps too upset, considering that, with Halo 5 and Halo 6 supposedly in the works, it's not really an ending at all.)

Ultimately, none of this really matters, as long as the bulk of the gameplay is enjoyable... at least, that's the theory. But video game sequels are almost always advertised as continuations of a story, with cinematic trailers featuring minimal gameplay footage. Perhaps the assumption is that we already know what the gameplay is like, since we played the previous game, or that we shouldn't need to ask about the gameplay because any product with a sufficient amount of hype is worth pre-ordering, no questions asked. In any case, the result is a game advertised on the basis of its plot, and purchased primarily by owners of the previous titles, who want to know how the story ends. A bad ending, therefore, is pretty hard to ignore.

So how does the industry avoid disappointing fans with bad endings? Simply writing better endings might not be the answer; that's easier said than done, and the quality of an ending is ultimately a subjective thing. (A vocal minority, at least, will always complain, no matter what.) I think a better solution is to stop making so many sequels — to create more stand-alone games to be judged on their own merit rather than allowing so many new releases to ride on the hype generated by their predecessors — and, by extension, to stop making games that end in cliffhangers in anticipation of sequels that haven't been written. They should stop deliberately writing stories that span multiple games, thereby forcing players to finish an entire trilogy (tetralogy, pentalogy, hexalogy, etc.) to find out if the final "ending" to a given story arc is any good.

They won't listen to me, though; cliffhangers are a fantastic way to make money.

In other news, Painkiller: Hell & Damnation is out today. Unfortunately, given that it's just about the newest game on Steam, it hasn't joined the rest of the Painkiller series as part of their Halloween Sale. That's not really a big deal, though, since they've set the standard price at a reasonable $20 instead of jumping straight to $60 regardless of quality like most publishers/developers do.

Wednesday, October 3, 2012

Stop Pre-Ordering Games

I've mentioned before, in passing, my deep loathing — shared by many — for day-one DLC and pre-order bonuses.

I'm not going to pretend that the downloadable content of today is fundamentally different from the expansion packs of old; in theory, they're very similar. Expansion packs would either add content to an existing game, or act as a continuation of the game in the form of additional levels, but they were typically not as "big" (or as expensive) as the game itself. DLC almost always follows this example, albeit with a different delivery method and, thus, fewer constraints. With no discs and no shipping, selling everything in smaller pieces is no inconvenience to the publisher, which is why we're seeing ever smaller DLC "expansions" with (ideally) smaller prices than those of traditional expansion packs.

But there's another difference. While the traditional expansion pack was typically released some time after the base game, DLC is often available immediately. No doubt the industry believes this is a great thing, but not everyone agrees.

When DLC is released concurrently with the base game, people inevitably jump to the conclusion that this "extra" content belongs in the game itself, but that it was removed, and sold separately, for the sake of squeezing more money out of customers... like a car salesman selling you everything but the steering wheel and then demanding extra cash for the "extra" part. Of course, "day-one DLC" doesn't really mean that the publisher took a finished game from the developer and broke it up to be sold in pieces. It's likely that most games with DLC additions were meant to be sold this way from the very beginning, and were developed with this in mind. However, to many people, developing a game with DLC in mind still means that the base game is inherently incomplete. I think we can all admit that this isn't necessarily true — a lot of these games still feel "complete" even without all the (mostly useless) add-ons — but appearances and first impressions, whether or not they're accurate, are pretty important.

Personally, I don't mind if a developer or publisher wants to sell a game in pieces. I usually ignore DLC unless I'm absolutely in love with a game and feel a compulsive need to experience every bit of it. Furthermore, most DLC consists of strictly non-essential content. Sometimes, this means purely cosmetic changes to a game, such as the character packs in Killing Floor, and I think this is a pretty harmless way for the developer to earn a few extra bucks from anyone actually willing to throw away their money for such a frivolous thing. I certainly don't feel compelled to buy this stuff, so I don't feel like I'm missing out on anything.

However, the same can't be said of DLC that would, for example, add extra weapons to a first-person shooter, or extra levels to the campaign mode of a story-driven game. I suspect a lot of players — completionists especially — feel that, when they buy a game, they need to own the whole game, and this drives them to pay for half a dozen little expansions that can add up to a lot of cash.

Ultimately, each of us is responsible for how we spend our own money; nobody is shoving extra content down our throats and forcing us to buy it. But when this so-called "DLC" is available on release day — and sometimes even included on the game disc, just awaiting authorization — it's typically seen as a part of the game for which we thought we already paid, not as an optional expansion to it, and uninformed customers tend to get pretty upset when they find out. While this is the source of a lot of controversy, I think it's also exactly what the publishers want. The idea that an integral part of the original game has been taken away to be sold separately is what makes us hate day-one DLC... but it's also what makes us buy it. It's a shame that the average consumer doesn't have the willpower to boycott a product.

While it wouldn't be completely crazy for me to say that DLC itself is downright evil, I don't think that's a very constructive thing to do. First of all, DLC itself isn't the problem. We're the problem. If the game industry is doing something wrong, it's partly because we reinforced that behavior with our purchases. Second of all, it's not DLC that we should hate, but rather the host of generally evil business practices that come along with it. For example, so-called day-one DLC is often used as an incentive for pre-ordering a game, or even for pre-ordering the game from a specific retail outlet. And instead of buying a game, and then buying an expansion if we really liked the game, we're encouraged to buy a game and all of its additional content at once — before the game is even released.

Welcome to the wonderful world of pre-orders and pre-order bonuses. No, don't think, just hand over your wallets.

Some DLC was just announced for Assassin's Creed III — a game which, by the way, hasn't yet been released — and all five of the upcoming DLC packs can be purchased with a $30 season pass. Add that to the usual price tag of $60 for the base game, and you've got quite a large purchase. Yes, the Gold Edition of the game (which includes this season pass) is a whopping $90. Of course, there's a benefit to buying this season pass; it's significantly cheaper than buying each DLC pack separately, which would total $40. But I'd much rather wait until a year after release — when the game and its DLC are cheaper, and when I know whether the game is worth playing — before I spend any money.

You've probably guessed that I think pre-ordering is a horrible idea and that anyone who pre-orders anything is a mindless sheep. You guessed right. Naturally, the whole concept of a "season pass" for DLC is, to me, a bit absurd. It's essentially a pre-order for DLC which, like the game itself, might not even be good. The fact that the game is a sequel makes it all slightly less crazy — fans of the series have a pretty good idea of what the game will be like — but it doesn't seem like a great investment either way. When you pre-order not only a $60 game but also $30 worth of DLC on top of it, you're betting a whole lot of money that the game won't suck. Why not wait until after it's released so you can read some reviews and maybe get a better price? What's the benefit of pre-ordering?

In the old days, the only reason for pre-ordering was to reserve a copy of a highly anticipated game for which supply was expected to fall short of demand. It guaranteed that you'd get your game on release day instead of waiting for the next shipment while all your friends played the game without you. But the industry likes pre-ordering for another reason. It makes their sales figures look better. They get to say they sold a hundred thousand copies of their game on the first day. They get to say their game went gold before it was even released.

In the context of modern PC gaming, the word "supply" is meaningless. Just about every PC game can be downloaded; there are no shipments, and copies of a game are unlimited. So why should anyone pre-order a downloadable game? I think the industry asked itself this question and came up with an answer: pre-order bonuses. Not only do they make the absurdity of pre-purchasing a downloadable game seem a bit less absurd; they also make the foolish act of pre-ordering physical copies even more tempting.

The fact that developers would spend their time making DLC exclusively for those who pre-purchase the game — content which the rest of their fans may or may not be able to access at a later date — says a lot about the industry, namely how much value they place on those pre-orders. Could it really be all about inflating those first-day sales figures? Or could it be that they desperately want us to buy their games before anyone gets to find out if those games are worth playing? Why anyone would pay $60 for a game that hasn't even been reviewed yet is beyond me, but the industry has put a lot of effort into convincing people to do it.

Meanwhile, very few demos are being released these days, and I can't help but wonder if this is because developers are afraid that fewer people will spend money if they see what their games are like first-hand. Clearly, at the very least, they don't believe that releasing a demo has any benefit anymore, since they've already figured out how to convince millions of consumers to buy their product without even waiting for the critics to have their say.

What I'm really getting at, here, is that people who pre-purchase games are irresponsible and reckless. They're also harming the industry, and the industry is helping them do it. As consumers, we communicate with developers and publishers primarily through our purchases. No doubt the people who make video games occasionally hear our opinions, if we're loud enough, but what they really care about is where our money goes. If you hate a game after you buy it, they still have your money, and your opinion isn't going to hurt them unless you convince others not to buy the game.

So stop pre-ordering games you've never played. Stop telling developers "yes, this game is great" before you know it to be true.

I'd like to tell you all to stop buying new games entirely, since paying $60 for a new game is just a waste of money if it's going to be 75% off on Steam or Amazon less than a year after its release. Of course, there's always the argument that multiplayer games are most fun during the height of their popularity (i.e., before the community moves on to better things) and that waiting too long to play them means missing out on the fun. But if an online community dies so fast that you need to buy the game on day one to get in on the action, the game is probably terrible anyway.

Maybe if we all think a little more carefully about our purchases, developers will focus more on making games enjoyable and worthwhile, instead of coming up with a thousand other ways to get our money more quickly and more often.

Wednesday, September 12, 2012

What Makes Video Games Fun?

A lot of "hardcore gamers" (regardless of whether they identify themselves as such) will tell you that the video game industry is in sad shape. It's not just because of the past decade's unfortunate shift toward increasingly intrusive digital rights management, or the recent trend of releasing "extra" downloadable content on day one to encourage thoughtless and irresponsible pre-orders, or the deliberate efforts to use both DRM and DLC to destroy the used game market. Rather, it's because they think that too many of the games being released today are crap.

And they're not just talking about shovelware that nobody buys. This is popular crap. So what's up with all the hate? Well, it should be no surprise that the games which tend to attract the most violently negative attention are always the popular ones. After all, if you want to complain about a genre, a feature, a console, or a developer, you pick a popular game as an example, and then you claim that the chosen game means the downfall of gaming as we know it. This has been happening for a long time. But in the past few years, I've been reluctant to shrug it off as the usual fanboyism, hipsterism, and attention-seeking antics of a vocal minority. It's more likely indicative of something else.

As I see it, this backlash is due to recent changes in the industry which aren't entirely imaginary. The industry is, in fact, changing, and not just in response to the emergence of nearly ubiquitous high-speed internet service, which facilitates digital distribution and piracy alike. Video games have also changed because of their growing audience. Thanks to cell phones, social networking sites, and a few other things which should never have games on them, games have crossed farther into the mainstream than ever before. Meanwhile, those who played video games back when it was an obscure hobby reserved only for children and computer geeks have grown up, and some of them are still playing. It's only understandable that some of these old-schoolers would be a bit shocked by the current state of things.

So, what is the current state of things?

It's complicated, and there are a lot of little topics I'd like to bring up — e.g., how girls went from "eww, you play video games, you're such a nerd" to "hey, I can be a gamer too" and "tee hee, I'm such a nerd" — but most of these things are too far off-topic and will have to wait for some other week. Simply put, if I can allow myself to get to the point, casual games and social networking have taken over. It's not hard to see that this is an expected (and perhaps necessary) consequence of video games getting a slice of that mainstream pie.

Games directed at casual players get a lot of hate, particularly from the more "hardcore" gamers, many of whom grew up when video games were considerably less forgiving than the ones made today. For these players, the whole point of a game is to provide a challenge. Winning should be a struggle; that's what makes it so satisfying. This is why they fail to understand the casual audience. More importantly, this is why they're angered not only by strictly casual games but also by the perceived "casualization" of modern games as a whole.

Are the majority of today's video games a lot easier than the ones of my childhood? You bet. But is this really a terrible thing? Not necessarily. Difficult games still exist, and we should keep in mind that a lot of older games were only hard because of their lack of a save feature. (Wouldn't a lot of modern games be damn near impossible to beat if saving weren't an option?) Other old games were stupidly hard because of poor design, and still others were intentionally made difficult because they were short and would have been beaten too quickly if they weren't frustratingly hard to finish. (Truly master Super Mario Bros. and you can beat it in less than five minutes; without using warp zones, it can still be done in less than half an hour.) Thanks to the wonders of modern technology, playtime can be extended in ways that don't involve dying repeatedly, and games can be entertaining for reasons other than sheer difficulty.

So now we get to ask an interesting question. What actually makes video games fun? Some say it's the challenge, while others will say it's the story/characters/immersion (for single-player games) or the social experience (for multiplayer games). Still others, I suspect, would say they just like to blow things up. In reality, for most people, it's a combination of all of the above.

How much, in particular, should difficulty matter? From the developer's point of view, a game should be difficult enough to entertain the experienced players — to let them know that winning takes effort so that winning feels good — but easy enough to avoid alienating the casual players who might not even bother to finish a game if it frustrates them at all. Personally, I think most developers have done a pretty good job of accomplishing this. Say what you will about the harm caused by pandering to the casual audience, but most games worth playing have multiple difficulty levels, the easiest of which is usually tame enough for "casuals" and the hardest of which is usually a challenge for anyone who hasn't played the game before. Nobody should be disappointed unless a developer makes a serious miscalculation.

This is why I was surprised to see such a negative reaction to this article on Kotaku a little more than a week ago, in which Luke Plunkett gives a fairly reasonable rebuttal to Assassin's Creed III lead designer Alex Hutchinson's (rather preposterous) claim that "easy mode often ruins games." (It's kind of funny because Assassin's Creed, a game with only one difficulty, isn't that hard, and the same is true of all the sequels I've played.) I'm not a big fan of Kotaku, nor am I a fan of Luke Plunkett, but I have to agree with him here. At least, I agree with his headline. A game can't be ruined by a difficulty setting.

I'm willing to say that the "easy mode" of a game can often be the worst version of that game, as Hutchinson claims, but the inclusion of an easy mode surely doesn't spoil the whole game unless it's the only mode available. Don't like easy mode? Play on hard. If the harder settings are still too easy, or if they do nothing but make the game more tedious, you've picked a bad game. If the harder settings are locked until the easier ones are completed, you better hope the easier settings are hard enough to keep you entertained for a single playthrough; otherwise, you've picked a bad game. Bad game design happens, but if you're blaming it solely on the inclusion of an "easy" mode, you're probably overlooking a deeper problem.

Still, I won't say I agree with Plunkett completely, since he has entirely different reasons for disagreeing with Hutchinson's argument. Specifically, he makes it abundantly clear that he doesn't care about difficulty at all, and that he plays story-driven games only for the story. He probably wouldn't mind if a game like Assassin's Creed III consisted of no interaction besides "press X to continue." And if you're like this, you probably should ask yourself why you're playing games at all, rather than watching movies or reading books. If, on the other hand, you can appreciate the unique things that games have to offer, instead of just complaining that everything is too hard, then your idea of "fun" is just as valid as that of the hardcore gamer dude who plays everything on the hardest setting and skips all the cutscenes.

So where do I stand?

Let's just say I was more than a little annoyed by the fact that it's literally impossible to lose in the 2008 version of Prince of Persia. I won't go so far as to say that the protagonist of a game needs to be able to die, and "losing" is hardly a setback in any game with a save option (assuming you use it often enough), but being automatically revived after every fall in PoP 2008 seemed like a step down from the rewind system in The Sands of Time, which actually required some minimal skill and had limits. If there's no consequence for falling off a cliff, the sense of danger and suspense is gone and the game becomes only tedious where it might otherwise have been exciting.

On the other hand, you know I'm a sucker for story-driven games, and the need for a genuine challenge can be subverted by decision-making and role-playing elements. Since Choose Your Own Adventure books were terrible and there's no equivalent in the movie world, I think it's pretty safe to say that the existing technology used for video games is the ideal medium for straight-up interactive fiction. I see no reason not to take advantage of this. The problem is that what might be described most accurately as interactive fiction, and not as a traditional video game, will nevertheless remain stuck under the category of video games. This tends to generate all the wrong expectations. Story-driven titles are often criticized for having too much "story" and not enough "game" (even if the developer's primary objective, admittedly, was to tell a story).

Regardless of how far a developer decides to take the storytelling aspect of a product, and regardless of what you call it, the fact is that difficulty (and, indeed, gameplay itself) often matters less when story matters more, and if you're looking for a serious challenge, you should probably stay away from plot-driven games, even ones like Assassin's Creed. They make it difficult enough that any given mission might take two or three attempts, but they know they're not serving the hardcore crowd exclusively. Sometimes, though, I think the hardcore crowd still hasn't caught on.

Wednesday, September 5, 2012

Perfectionism: No Fun Allowed

I've always been a perfectionist. If I can't do something right, I don't like to do it; when I attempt any kind of work, I obsess over the details until it's just right.

I'm still not sure whether this is a good thing.

In the context of work and school, it translates to effort and dedication, but more often than not, it also slows me down. Sure, it helped me impress my art teacher in high school when most of the other students couldn't give less of a damn, and it earned me some nice grades elsewhere because I wasn't content to turn in half-assed work. Unfortunately, it seems to have gotten a lot worse over the years. By the time I was (briefly) studying physics in graduate school, I found myself wasting precious time writing long solutions to complex problem sets neatly instead of getting them done quickly. As a result, I slept too little and stressed too much.

In the context of video games, my perfectionist tendencies make me a so-called completionist. If I care at all about the game I'm playing, I have a burning desire to collect every item, unlock every achievement, kill every enemy, find every secret, complete every side-quest, or get the highest possible rating on every level.

The Dangers of Completionism


When I played Metroid Prime — a fantastic game, by the way — I couldn't resist picking up every single missile expansion and energy tank. Maybe I wouldn't have cared if not for the way the game kept track of these things and displayed them as a completion percentage, taunting the mildly obsessive among us. Getting to the end of the game and seeing anything less than 100% felt to me like a minor failure. Of course, missile expansions and energy tanks are pretty useful, so the satisfaction of truly "finishing" the game wasn't the only motivation for finding them. I have no reasonable excuse, however, for scanning every creature, every item, and every bit of Pirate Data and Chozo Lore to fill up the in-game logbook. My only reward for doing so, in the end, was access to a couple of unlockable art galleries. But it wasn't about concept art; it was about not leaving things unfinished.

Only afterwards did I realize that I would have enjoyed the game a lot more if I didn't fixate on finding every little secret. I can't even go back to the game now, because I made myself sick of it.

Games like Metroid Prime are a nightmare for completionists, but we play them anyway because we're all masochists. The really terrible part is that setting aside the carefree enjoyment of the game for the sake of a cruel meta-game, in which you pick up a hundred hidden items, isn't even as bad as it gets. (With the help of a good walkthrough, if you're not too proud to use it, you can complete even the most tedious item-hunting quest with relative ease.) Being a completionist becomes a real problem when the additional challenges we choose (or need) to undertake are so difficult that untold hours are swallowed up by dozens of consecutive, futile attempts with no discernible progress. In the time I wasted getting gold medals on every level of Rogue Leader and its sequel Rebel Strike, I could have played all the way through several other games. I guess the benefit here is that being a perfectionist saved me some money; I got more time out of these games than anyone ever should.

The Need to Achieve


And what of achievements? I'm no fan, and it's not just because of my wacky theory that they're partly responsible for the decline of cheat codes in single-player games. I think achievements cheapen the sense of accomplishment we're supposed to feel when we do well in a game. A lot of developers have fallen into the habit of giving the player an achievement for every little task, like finishing the first level, or killing ten bad guys, or essentially — in rare and truly embarrassing cases — starting the game. (Only sometimes is this actually meant to be amusing.)

In my opinion, anything that necessarily happens during the course of a normal play-through should never be worth an achievement, but developers so often disagree. In Portal 2, fourteen of the achievements (pictured right) are unlocked simply by playing the single-player campaign. Obviously, there are other achievements in the game, but the player shouldn't need to be periodically congratulated for making regular progress.

Achievements, when done correctly, present extra challenges to the player. But even then, achievements teach players that nothing is worth doing unless there's a prize. We're not encouraged to make our own fun and set our own goals; we're encouraged to complete an arbitrary set of tasks, which may or may not include completing the game itself, attempting the harder difficulty settings, or doing anything genuinely entertaining.

But despite my philosophical objections to the idea of achievement hunting, I can't resist, especially if I only have a few achievements left after I beat the game. Unfortunately, those last few achievements tend to be the hard ones. But hey, you can't just leave the game 99% complete. You can't just leave one achievement locked. Right? Seriously, I can't be the only person who finds this absolutely intolerable.

After beating Trine, I spent far too long attempting a flawless run through the last level on the hardest difficulty to get a surprisingly difficult achievement. (I thought this game was casual!) When I played Alan Wake, I never would have bothered collecting a hundred (useless) coffee thermoses scattered throughout the game if there weren't an achievement for doing so. I even carried that damned garden gnome all the way through Half-Life 2: Episode Two. (Please kill me.)


Too Much of a Bad Thing


But even I have limits; a few of the achievements in Torchlight, for example, are just too hard or too much of a grind. They're far from impossible to get, but the game will stop being fun long before you get them, and if you play for the achievements, you'll become suicidal in no time. (Big fans of the game might disagree; most of the achievements will be unlocked naturally if you're okay with playing the game for 150+ hours, but catching 1000 fish just isn't worth anyone's time.)

Similarly, I have no interest in finding every flag in Assassin's Creed, or every feather in Assassin's Creed II, and I don't know why anyone ever would. Even as a hopeless completionist, I can usually tell when attaining 100% completion in a game will lead to more frustration than satisfaction. There's already so much (repetitive) stuff to do in the Assassin's Creed games that I can't imagine why they thought it would be a good idea to throw in a few hundred useless collectibles as well.

Just to bother me, I'm sure.

Collectible items and other tertiary objectives can be good for replay value, but when they extend the playtime beyond the point where the game loses all appeal and becomes a chore — when even a completionist such as myself doesn't want to try — it's just bad game design.

Self-Imposed Perfection


Being a perfectionist doesn't just mean being a completionist. My first play-through of Deus Ex took twice as long as it should have taken, but only because I developed a terrible habit of loading quicksaves constantly, not to avoid dying but to avoid wasting lockpicks, multitools, medkits, and ammo. If I missed a few times while trying to shoot a guy in the face, I couldn't just roll with it and keep going. I went back and tried again. If I picked open a lock and there was nothing useful behind that door, I loaded my save. (And of course, at the end of the game, my inventory was full of stuff I never got to use, but item hoarding is another issue entirely.)

My tendency to needlessly replay sections of a game is probably worst when friendly NPCs can be killed by the enemies. Even if their survival doesn't affect me in the slightest, I often feel the need to keep them alive, and I'm more than willing to reload a save if even a single one of them dies. (This used to happen a lot when I played S.T.A.L.K.E.R.: Shadow of Chernobyl, but eventually I learned that it's sometimes best to save my ammo, let my fellow stalkers die, and scavenge their bodies afterwards. Such is life in the Zone.)

Reloading a save when you haven't lost might seem strange, depending on your play style, but some games encourage this type of behavior with optional objectives that are easily botched. Take the Hitman series, for example. You could choose to walk into nearly any mission with a big gun and simply shoot up the place, but the highest ratings are reserved for players who never get seen, fire no unnecessary shots, and kill no one but the primary targets.


This usually isn't easy, because save scumming isn't an option. The first Hitman game doesn't allow saving mid-level, and the sequels only allow a certain number of saves per mission, depending on difficulty level. This makes perfecting a mission even more painful, and in my opinion, it's another example of bad game design. While I can see why they would want to prevent players from abusing the save system (thereby adding some real difficulty and making the game more "hardcore"), this is kind of a cruel thing to do in such a slow-paced game that involves so much trial-and-error. If you don't save often enough, you might end up repeating several minutes of sneaking at a snail's pace to get back to where you were.

Somehow, I did manage to master every mission in the second and third games, but I don't recommend it. Having to kill a guy and dispose of his body on the fly because he saw you picking a lock is fun, but in the interest of earning the highest rating, I always had to start over instead. When you try to play Hitman perfectly, it's tedious and time-consuming, and essentially requires you to memorize each map. No fun allowed.

Fixing Bad Habits


As a result of all this, my extensive backlog of unfinished games is only slightly longer than the list of games I've been meaning to replay without hitting the quickload button and without going off-course to satisfy my obsessive completion disorder. (The S.T.A.L.K.E.R. games are near the top of that list, but I'd also like to replay those when I get a better computer, which isn't happening any time soon.) Games are more fun when they're played at a natural pace, and I wish it weren't so hard for me to ignore the little distractions along the way.

The best advice I can give to fellow perfectionists, after some soul searching of my own, is the following:

1) Get a screwdriver and pry the quickload button off of your keyboard. Alternatively, I suppose, you could simply go to the control settings and unmap the quickload function. If you can't unmap it, just remap it to a key on the far side of the keyboard, and then promptly forget which key that is. Quicksaving constantly is fine — I won't judge you — but you shouldn't be reloading a save unless you die.

2) Play through the game as quickly as you can; do only the bare minimum. This is normally something I'd discourage, because I believe that games should be enjoyed, not rushed. But if you're getting bored with games before you finish them because you're spending so much time trying to do every side-quest or collect all the items, stop it. Start over. Enjoy the game at its intended pace before you ruin it by attempting a frustrating scavenger hunt. These things are there for your second play-through, and if the game isn't good enough to warrant a second play-through, the optional stuff isn't worth your time.

3) Don't read the list of achievements before you play the game. If you read them, you'll try to get them. Achievement hunting is for replay value, and if it's your first priority, you need to rethink your entire outlook on life. Again, if the game isn't good enough to warrant a second play-through, the achievements aren't worth your time.

Wednesday, August 8, 2012

Alan Wake & Cinematic Games

A few days ago, I finished playing Alan Wake. I'd previously mentioned the game in an earlier post about movies based on video games; although I hadn't yet owned the game at the time, I had heard it was very story-driven, and perhaps, therefore, an ideal candidate for a film adaptation. Then again, as I pointed out before, such a movie serves no real purpose if the game already functions as an interactive movie by itself. Alan Wake is, in fact, what you might call a very cinematic game; while the term "cinematic" has often been used as a meaningless buzzword by the industry in recent years, it's fitting in this case. Not surprisingly, there has been some (wishful) expectation of an Alan Wake feature film. Though nothing has been announced, it almost seems bound to happen.

Much like the Assassin's Creed franchise (which spawned the short films Lineage, Ascendance, and Embers), Alan Wake has already branched out into the realm of live-action entertainment, and this is pretty easy to do when so many of the game's characters are modeled on the actors who play them. Bright Falls, the promotional web series that serves as a prequel to Alan Wake, somehow manages to be worth watching, and I have to say it's considerably more unsettling than the actual game.


For the moment, however, I'd like to forget about the predictable attempts to push the franchise into other media, such as movies and books, and focus instead on the game itself. Don't wait for a score at the end, though, since it's not my intention to write a proper review. I don't really see the point, since the game is already old enough that I'd surely be the millionth guy reviewing it. While it's still relevant enough to have its place in a more general discussion of cinematic games (and I'll get to that shortly), it's not unfair to say that Alan Wake is yesterday's news. This is usually what happens by the time I get around to playing a game, since I'm strongly opposed to paying full price for anything.

Despite the game's age, however, I'm not as far behind the times as you might assume. While the Xbox 360 version, released in May 2010, is already more than two years old, the PC version (which I recently purchased) wasn't released until February of this year. Although a PC version was originally planned at the time of the game's announcement, the game was published by Microsoft, and selfish Microsoft wanted the game to be exclusive to its own Xbox 360 console. Apparently, this changed only after lots of nagging by Alan Wake's developers at Remedy Entertainment, who still wanted to release a PC version of the game despite the juicy exclusivity deal. It took a while, but Microsoft finally agreed, and the PC version sold well even though the console version had already been around for nearly two years.

Since the personal computer is my game platform of choice — and, more importantly, since I don't even have my own Xbox 360 — I had to wait for the port. Fortunately, once the PC version was released, it didn't take long for the price to drop low enough to get my attention. During the recent "summer sale" on Steam, I picked up Alan Wake (including DLC), along with the sequel/spin-off Alan Wake's American Nightmare, for a combined $9.99. I haven't played the latter, but the first game alone was, in this writer's opinion, worth at least one crisp Alexander Hamilton, give or take a penny.

In short, the game is pretty fun. After hearing so much about its plot-driven nature and so little about its gameplay, I feared it would be disappointing as a game, and notable only as some kind of casually interactive storytelling machine. I've heard as much about several recent titles, most notably Jurassic Park: The Game and The Walking Dead, both by the (appropriately named) developer Telltale Games. To my surprise, my fears about Alan Wake were unfounded.

The combat is seemingly very simple — dodge attacks, weaken bad guys with flashlight, shoot bad guys with gun, repeat — but there is some unexpected complexity in the subtleties of managing multiple enemies at once, and in using the environment to your advantage. More importantly, there is some real challenge involved; you'll occasionally find yourself getting cornered and chopped to pieces after the simplest mistake on the easiest difficulty setting. (The gameplay isn't actually difficult, per se, once you figure out what you're doing, but you will have to learn things the hard way if you don't learn them quickly.) Additionally, whether you think this matters or not, the combat just looks so freakin' cool. It's entertaining enough, at the very least, to stave off boredom for the duration of a single play-through.

But I fear that Alan Wake's great balance of enjoyable story and exciting gameplay is an exception to the rule, and beyond that first run through the game, things can still get tedious. (I should mention, by the way, that when I say I finished Alan Wake, I mean to say I finished it completely. I beat the game on every difficulty level, found every hidden item, and unlocked every achievement. Don't ask me why I do this with every game I play; I guess I'm a masochist.) But in Alan Wake, the lack of replay value doesn't stem from repetitive combat, or even from spoiled plot twists. Playing a second time is tedious because, in its attempt to be "cinematic," Alan Wake includes a lot of dialogue and other brief but mandatory breaks in normal gameplay.

While the cutscenes can be skipped, a lot of the dialogue falls outside of these cutscenes. Characters will talk (and talk and talk) to you as you walk around and explore your surroundings during the non-combat sequences, and you're not always able to ignore them. Occasionally you'll even be instructed to follow a character, as he or she slowly plods around, revealing bits of the plot via typically one-sided conversation — which, on your second or third play-through, you won't really care to hear. The story is fantastic, but hey, it's the same story every time.

I'm using Alan Wake as an example, but these are issues that plague a lot of story-driven games, to varying degrees — even first-person shooters like Half-Life 2 and more action-oriented games like Assassin's Creed. In each case, many players will praise the plot, the characters, the acting, the soundtrack, and the aesthetics, while the rest will see these things as harmful distractions from what really matters: the challenge and the complexity of the game.

Perhaps Alan Wake in particular has some immunity to this common criticism, since it's no secret that the game aims to be as much like a TV show as possible. Divided into episodes, each ending with a theme song and beginning with a recap of prior events ("previously on Alan Wake..."), the game might as well have been a television miniseries. Take out the episodic interludes and it still might as well have been a movie. If you don't want your games to be cinematic and movie-like, you probably wouldn't play a game like Alan Wake on purpose. The game is rather transparent about what it is, and players know what to expect, so you don't hear a lot of complaints that gameplay has, arguably, taken a back seat to plot and style and other cinematic silliness.

Ironically, one of the major problems with Alan Wake, and other similarly plot-driven games, is actually the result of misguided attempts to retain as much "game" in these shameless interactive movies as possible. All of the major plot and character development could have been confined to skippable cutscenes, but instead, we play through a lot of it. In Alan Wake, this accounts for a lot of lost replay value. Outside of the scary monster-shooting parts (i.e., during the day), you're left with little to do but walk from point A to point B, admire the scenery, listen to characters talk at you, and position the virtual "camera" at different angles while you wait for things to happen. It might make you feel more like a director than a player, and, unfortunately, this is fun exactly once.

There's something to be said for storytelling in games, but unless you're the type of person who can watch the same movie ten times in a row and love it every time, you probably won't find yourself playing Alan Wake repeatedly. When I just want to shoot things, I always go back to Killing Floor or something else with minimal character development and maximal carnage. That way, I won't have to sit through mushy romance stuff in between fights.

It's not that I have anything against story-driven games. As I said, Alan Wake was enjoyable to say the least. However, the best story-driven games are those which tell a story in a non-intrusive way. Sometimes this means condensing the heavy plot development into cutscenes which the player can opt out of watching, but this tends to cause a sharp separation between the game we play and the story we hear. An often better solution, if the developer wants the game and the story to meet seamlessly, is to have dialogue occur during normal gameplay without stopping the gameplay, or to show the player what's going on through subtle cues without having the protagonist's sidekick stop and explain everything. It's a classic case of "show versus tell" (or perhaps "let-the-player-find-it versus shove-it-in-the-player's-face").

The player shouldn't be forced to sit and listen to dialogue, or watch a ten-minute cutscene, or follow a character around at a snail's pace for the sake of plot development, because if a game is riddled with these kinds of tedious, non-gameplay moments, the best gameplay in the world can hardly make multiple replays worthwhile. I'm sure, however, that Alan Wake's developers were aware of this, for at least they gave us the ability to skip past cutscenes and rudely walk away from some of the less important conversations.

It would even seem that someone on the development team isn't too fond of excessive dialogue in games... that is, unless this in-game encounter between Alan Wake and a more-than-slightly crazy video game designer named Emerson is just an attempt at self-deprecating humor:


Emerson makes a good point, even if he's too insane to know it.

But characters (and toasters) who talk, talk, talk, all the time, aren't the only problem with games that attempt to provide some kind of cinematic experience. Bad camera angles, sluggish controls, and frequent breaks in gameplay are all symptoms, and Alan Wake suffers a little from all of them in its attempt to look cool. As far as the controls are concerned, I am grateful that the developers patched the game with a "Direct Aiming" option to make it more suitable for mouse and keyboard, but there's still some delay when jumping and performing other actions, and I'm fairly sure it's not just a performance issue on my computer; it seems to be a consequence of the game's smooth character animations.

More natural character movements often necessitate less natural gameplay, and while Alan Wake was never meant to be a platformer, this does make the game somewhat frustrating. Once the novelty of playing such a realistic-looking game wears off, you'll wish Mr. Wake could just turn on a dime and jump at a moment's notice like your other video game heroes.

Eventually, you will get used to the controls, and even the awkward camera angle, but those frequent breaks in gameplay — which usually involve the camera moving to focus on some far-off object or event, often to show the player where to go — still make replaying familiar sections a snore-fest for impatient players such as myself.

There will come a time when video game developers will need to realize that video games are not movies. I hope they also realize that trying to imitate movies is not the only way to tell a good story. For decades, stories have been an integral part of video games. We've come to expect some kind of story, especially in horror/mystery games like Alan Wake. But the video game industry has long been unable to drop the habit of turning to movies as the inspiration for its storytelling techniques, and as developers strive to make games even more "cinematic" (and otherwise more visually impressive) with every passing year, they seem to be losing sight of what actually makes games fun.

Wednesday, July 25, 2012

Video Games & Movies

Dozens of films inspired by video games have come and gone over the years, and they're rarely worth your time. It's for this reason that I was in no hurry to start "blogging" when I heard, a couple of weeks ago, that the (supposedly) upcoming Assassin's Creed movie had become a bit too real with the casting of an actual, famous, relevant actor, Michael Fassbender, for the lead role. Needless to say, I'm a bit late to this party.

So why bring it up now? Well, it seems to me that now is as good a time as any to discuss the making-movies-based-on-video-games trend in general, since we've all had plenty of time to process the latest news of this particular game-to-film adaptation. We've gone through the initial excitement of imagining some of our favorite characters appearing in a big-budget movie, the sobering realization that nearly all game-to-film conversions are mediocre at best and that the best part of the game was actually playing it, and perhaps a resurgence of hope that this movie could be the one that makes up for all the bad ones that came before. As for me, that last part might not apply. I'm finding myself increasingly confused by the absurdity of taking a concept designed for an interactive medium and translating it to a medium which involves no interaction whatsoever. It hardly ever works, but they keep doing it.

Might the plot from Assassin's Creed make a good movie? Sure. Will it add anything of value to the franchise? Only if it's more fun than watching someone play the game, and one could argue that a lot of video-game-inspired cash-grab movies fail this test.

Part of me wants to believe that an Assassin's Creed movie could work, but the rest of me knows how unlikely this is. It's not my intention to hate on any particular franchise or developer, but things didn't go so well the last time they tried to make a movie inspired by the story from an Ubisoft video game. Not even Jake Gyllenhaal could save Prince of Persia: The Sands of Time, which might have been okay for an action movie if only they had dropped the mind-numbingly obvious (and stupid) parallels to the Iraq War... and pretty much everything else in the script.

An uninteresting plot can be ignored amidst the special effects and gratuitous violence and perhaps a smoking-hot (I mean "talented") actress, but a downright stupid plot is just too distracting and can ruin a movie entirely. Perhaps I was also a bit overly annoyed by the lack of resemblance between this movie and the video game I so enjoyed, but hey, I can't pretend to be unbiased. And why should I? Wasn't I the intended audience?

In all fairness, I suppose we should be glad that the writers of this Prince of Persia film hadn't decided to follow the storyline of its namesake with deadly precision, since that would necessitate killing all but three characters within the first five minutes, one of whom would then be absent for most of the story. In fact, like many video games (though, perhaps, mostly the older ones), Prince of Persia: The Sands of Time doesn't have very much "story" at all. What's there is very good, for a video game, but it's not enough (and not appropriate) for a full-length movie.

Sure, the game itself takes many hours to complete — there are tons of monsters to fight, some platforming puzzles to solve, and some character development via dialogue during the completion of those puzzles — but the important parts of the story are told through a few cutscenes which don't add up to a whole lot. Of course, they might have instead used the collective plot from the entire "Sands of Time trilogy" (encompassing the sequels Warrior Within and The Two Thrones) — in fact, the film they released did borrow minor elements from all three games — but the disjointed plot you would get by trying to fully combine these three stories probably wouldn't have made a very good film either.

All of this, however, makes me wonder why they ever decided to make a movie based on this game if the story would have to be changed to the point where the script hardly even resembled the source material. If not for the familiar title, as well as the fact that the time-travel-enabling device happens to be a dagger and the fact that the main character is a Persian prince (albeit an adopted one), I never would have guessed that the film was inspired by one of my favorite video games. Actually, given that the film was published by Disney and that the main character is a street rat turned royalty who receives magical powers from an ancient artifact, I might have assumed instead that it was some kind of re-imagining of Aladdin. Not even the character names would have given it away, since none of the names used in the film appeared in any of the Prince of Persia games. It almost seems ridiculous to keep the title.

But of course they're going to keep the title, because a movie based on a video game typically has no attractive qualities other than its association with a popular franchise. The only people who see these movies are fans of the respective video games, parents of those fans, other old people who don't know what they're getting themselves into, and, of course, girls who care more about the lead actor than the subject matter. ("Um, a video game? Whatever, nerd, I just want to see Jake Gyllenhaal's abs.") The first two groups are arguably the most important, so filmmakers continue to draw inspiration from video games even though these movies usually turn out to be garbage and subsequently draw ridicule upon the franchises from which they spawned. The movies don't need to be good; they just need to be good enough that you, the video game fan, purchase a non-refundable ticket. In other words, while the critics might scoff, it's a neat way to make a quick buck... that is, unless your name is Uwe Boll and you just produced and directed a film based on BloodRayne.

The fact that lots of people played a stupid video game with a sexy vampire doesn't mean a movie based on its characters and aesthetic will turn a profit, so it's all kind of risky. Unfortunately, coming up with an original idea is riskier, and more expensive. That's why so many of the movies released so far this millennium are one or more of the following:
  1. an adaptation of a novel or short story,
  2. an adaptation of a graphic novel or comic book,
  3. an adaptation of a TV show or cartoon,
  4. an adaptation of a video game,
  5. an adaptation of a theatrical play or musical,
  6. a sequel or "prequel" to a previous movie,
  7. a remake or "reimagining" of a previous movie,
  8. borderline plagiarism, or
  9. crap.
Video game adaptations are particularly problematic because most would argue that the primary function of a video game, like any game, is to provide entertaining gameplay, rather than to tell a story. As such, most video games don't have awesome storylines, and that's okay if the games are still fun. What's so often substantially less than okay is taking a plot which exists solely for player motivation and using it as the inspiration for something that can't be played.

But we might, someday, see a genuinely good video-game-to-feature-film adaptation. After all, a lot of modern games are practically interactive movies already. Some video game fans will tell you that this is the end of "gaming" as we know it, but let's not get carried away just yet. While it's true that we often get stuck with less challenging, less sophisticated gameplay in exchange for a more "cinematic" experience, the industry has churned out a decent number of story-driven games which miraculously nail the winning combination of worthwhile gameplay and an engaging narrative.

Ironically, it's usually the heavily gameplay-driven titles, fun as hell to play but often lacking in depth, that end up having films named after them (see Doom). As a result, these films are mostly the trashy action/horror sort, which only sometimes do well at the box office and almost never do well with the critics. Why not make a movie out of a strongly story-driven game like Metal Gear Solid or Alan Wake or... Assassin's Creed?

Maybe it's finally happening.

Although it's too early to tell if the film will ever be made — a famous name attached to a project does not necessarily guarantee its completion — it's an interesting possibility. A film based on Assassin's Creed could actually follow the plot of the game rather closely without sucking. Furthermore, a film based on the Assassin's Creed franchise just makes a whole lot of sense. The publisher has already branched out into every medium they can afford to exploit. In addition to the five games that make up the core of the series, and a bunch of handheld/mobile spin-offs, they've released several books, some comics, a Facebook application, and even a few short films.

The irritating part is that some of these tie-ins occupy their own space in the series' alternate history, rather than simply re-telling or expanding the story from an existing game. It gives me the sense that I'm missing part of the expansive story if I don't check out all this peripheral stuff (including the one comic which was only printed in French). The film, if they're serious about making it, could be the same way. Rather than basing it on an existing game, they might instead stick it chronologically between two existing games, or give us a new story with a new protagonist and only subtle ties to the familiar story we've all been following. All we know so far is that Ubisoft wants to retain as much creative control as possible, which means they probably won't screw up their own canon.

In other words, it might be cool. Certainly it can't be worse than the short film Ubisoft made to promote the original game's first sequel.


I'm still not getting my hopes up, however, because the very thing that makes an Assassin's Creed feature film seem so natural is exactly what makes it kind of pointless. The game is "cinematic" enough. Modern video games, for the sake of becoming more mainstream, attempt to emulate movies, so creating a movie based on one accomplishes nothing but an upgrade of visual effects and the complete removal of interactivity. Films based on video games are primarily for the video games' fans, and fans of video games don't often wish they could experience a video game without actually having to play it. Playing it is the whole point. (Perhaps somewhere out there is a screenplay based on a game with an amazing story and horrible gameplay — a film adaptation of a game that should have been a film all along — but no one is going to see a movie based on a game that basically sucked.) The only way to make it worth watching is to make the story new and original, but then the fans will say they liked it better how it was.

When fans of a game care too much about the canon, a movie based on it is almost sure to fail in their eyes. Game-to-film adaptations are often despised by the game's fans for changes to the story and, more generally, for not living up to unrealistic expectations. Meanwhile, those unfamiliar with a game often don't care enough to see the film at all. It's for this reason that I'm always surprised by the few adaptations that actually do well (see, for example, Resident Evil... although I guess you can't go wrong with zombies).

Finally, whether the filmmakers appease the hardcore fans and create a faithful adaptation, or take a risk and try to improve upon the narrative, or do something completely off-the-wall and unrelated to the game (see Final Fantasy: The Spirits Within), they also have to deal with the fact that the film's association with a video game can easily do more harm than good. It might snag all the fans, but it might also alienate everyone else, namely the older audiences who think they're too grown-up for video games and, by extension, too sophisticated for such a movie. Then there's everyone else who didn't play the game, everyone who hates the game, et cetera.

But I hope for Ubisoft's sake that I'm just being pessimistic. Maybe all they need to make it work are good actors, good writers, and a good director. It's pretty clear that a lot of game-to-film adaptations have none of these things.