
Sunday, August 25, 2019

Adventures in Linux Gaming

Last month I wrote about Playnite and GOG Galaxy 2.0. What I forgot to mention is that, for me personally, the major downside of each is the lack of Linux support.

I've been using Windows 7 for a long time. I didn't like the look of Windows 8 when it came out, and never saw the point in switching. Having used Windows 10 at work, I definitely have no interest in using it on my home PC, regardless of all the spying it supposedly does. I just don't enjoy using it. I suppose I don't particularly enjoy using Windows 7 either; I've had my share of problems with it. It just happens to be the best version of Windows which is still officially supported. Unfortunately, with that support ending at the start of next year, I would be forced to "upgrade" to a worse version of Windows if I want to continue to get security updates.

Do I really need security updates? I don't know. Is Windows 10 really worse or am I just biased? I guess it's a little of both. Ultimately, though, it probably doesn't matter. I've been somewhat unhappy with Windows in general, and... well, Linux is free.

Switching to Linux, and Why You Should, Maybe


Switching to Linux is seen by some as a daunting task, and perhaps there's good reason for those with no experience with Linux to be just a bit intimidated. To those terrified Linux beginners, I would recommend a user-friendly distribution like Linux Mint. That's what I'm using, despite several years of experience with Linux as a software engineer. It's just incredibly convenient. The Cinnamon desktop environment is fairly Windows-like, and Linux Mint comes with nearly all the features you need in order to avoid ever touching the command line if you don't want to. The installer is also very easy to use. Some casual PC users might not know what all of the options mean, but for those users, I think the defaults are probably fine.

Of course, no matter how automated the installer and how fully-featured the desktop environment, every Linux user will inevitably run into some kind of problem. The very same is true on Windows; the difference is that, when you search the internet for a solution to a problem on Windows, you'll almost certainly find a solution posted by someone who uses the same version of Windows. There are many Linux distributions, though; if you're not using one of the popular ones, you might find solutions which aren't quite what you need, or you might not even find any reliable documentation of the exact problem you're experiencing.

When I turn to the internet for help with installing something or fixing some error on Linux, the solutions I find are usually tailored to Ubuntu, which is fine because Linux Mint is based on Ubuntu, but it's not always the case. Sometimes the answers I find are all about Debian or Fedora. Sometimes it doesn't matter. Other times it does. I'm not saying that troubleshooting on Linux is a nightmare. These days, it really isn't. However, the fact that there isn't just one Linux comes with some inherent problems, such as this one, and troubleshooting does require a bit more patience.

But the fact that there are so many distributions means that, if you hit a roadblock and absolutely nothing works, you could always just try a different distribution.

How I Ended Up with Linux Mint


When I first decided to install Linux on my home computer, I tried Ubuntu. The installation process was mostly easy. I had some difficulty setting up dual-boot with Windows 7, but it wasn't Ubuntu's fault. It was because, while attempting to install Ubuntu from a USB stick, I had inadvertently booted the USB stick in UEFI mode while my Windows installation was using legacy BIOS. If you don't know what that means, don't feel bad; neither did I.

The short version of this story is that, when I booted my USB stick in legacy BIOS mode, the installation and dual-boot set-up went smoothly. However, I found Ubuntu's default desktop environment difficult to customize and annoying to use in general. Even after a bit of research, I couldn't figure out if my problems were the result of bugs or just obtuse design. Ultimately, for this reason and others, I decided that Ubuntu just wasn't for me.

That's when I tried installing Debian. The installation process seemed to work perfectly. Then I tried to boot to Debian, and nothing happened. I got a black screen. I did some research but there was really no way for me to try any of the solutions proposed to others who had similar problems, because I didn't even have a working terminal. I couldn't enter any commands. I assume that it was a video driver issue, and I could have attempted to fix it, but I decided that it wasn't worth my time. I had heard good things about Linux Mint, and I hadn't tried it yet, so I dropped Debian like a hot turd and started downloading Linux Mint.

That turned out to be the right decision; Linux Mint was easy to install, it booted up just fine, and to this day I've had no significant problems with it... except when trying to run games that were never meant to run on Linux.

Gaming On Linux


It always takes me a while to get to the point.

Linux Mint is great as a general-use desktop operating system. It comes with Firefox, LibreOffice, a media player, etc. But how is it for playing video games? Well, I'd say it's just as good for games as any other Linux distribution, and gaming on Linux today is better than ever, thanks in part to Steam.

Steam Play and Proton


A Linux version of the Steam client has been available since 2013, allowing Steam users to play any games which happened to have official Linux versions. It was only about a year ago, however, that Steam rolled out an update to the cross-platform Steam Play feature, allowing Windows games to be installed using the Linux client and providing a Wine-based compatibility tool called Proton which allows many of those games to run on Linux with very little effort from the user.

Installing Steam on Linux Mint doesn't require any command line usage, nor does it even require a web browser. You just open the Software Manager, search for Steam, select the first result, and click the install button. Meanwhile, enabling Steam Play for all games is just a matter of checking a box in the Steam settings menu. If I remember correctly, this option is disabled by default, and initially Steam Play is enabled only for officially supported games, which is no surprise; it's sensible for anything that isn't guaranteed to work to be disabled by default. But the option isn't hard to find, and often no effort is required to get games to work even if they're not officially supported.
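For those who do prefer the terminal, the same installation boils down to a couple of commands. This is just a sketch; the package name (I believe it's simply steam in Mint's Ubuntu-based repositories, but that's an assumption worth verifying) can vary between releases.

```shell
# Check what the Steam package is actually called in your repositories
apt-cache search --names-only '^steam'

# Install it (dependencies are pulled in automatically)
sudo apt install steam
```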

ProtonDB tracks how well Steam games work with Proton by aggregating user-submitted reports. Games without native Linux support are rated on a scale of Borked (meaning it won't run at all) to Platinum (meaning it runs perfectly out-of-the-box), with three ratings (Bronze, Silver, and Gold) in between. I'll let the statistics on ProtonDB speak for themselves, but they appear to indicate that the majority of games are playable.

My own personal experience with running Steam games on Linux has been better than expected. Of course, I've been using ProtonDB as a resource since the beginning, and I haven't bothered to install games which are definitively rated Borked. In general, I've gravitated more toward the games with higher ratings. Therefore, I can't claim that the games I've tried playing on Linux via Proton, of which there are about a dozen, are a random sample. However, even when I've tried to play games rated Bronze or Silver, I've been mostly successful, as I've found solutions in ProtonDB's comments to some of the minor problems I've encountered. The only Borked-rated game I've really wanted to play since installing Linux Mint is L.A. Noire, and for cases like that, I still have Windows 7 installed on my other hard drive.

I won't describe my experience with every game in detail, but the first Windows-only game I played on Linux Mint was Max Payne, and... frankly, it just worked. It worked perfectly, actually. The only difficulty I had was not with the game itself but rather with the unofficial widescreen patch, and it was only a momentary setback. The comments on ProtonDB quickly set me straight; I added WINEDLLOVERRIDES="d3d8=n,b" %command% to the game's launch options in Steam and even the widescreen patch worked perfectly.

Similarly, Max Payne 2 works perfectly in Linux, and getting the unofficial widescreen patch to work simply requires adding WINEDLLOVERRIDES="d3d9=n,b" %command% to the game's launch options. Playing Max Payne 3 was a bit more difficult; initially, it wouldn't launch. Following the advice of comments on ProtonDB, I added PROTON_USE_WINED3D11=1 %command% to the game's launch options, and it worked, but with some bugs. The minor issues with Max Payne 3 which remain can probably be fixed, perhaps by trying one of the other Proton versions offered by Steam or by further modifying the runtime configuration, but I had only installed it for testing purposes anyway. I was really more interested in playing the first two games in the series, so I didn't spend much time troubleshooting the third.
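For reference, here are the three launch options mentioned above, entered via each game's Properties > Set Launch Options dialog in Steam. These came from ProtonDB comments and worked on my system; treat them as starting points rather than guaranteed fixes.

```shell
# Max Payne: prefer the native d3d8.dll installed by the widescreen patch
WINEDLLOVERRIDES="d3d8=n,b" %command%

# Max Payne 2: same idea, but the patch replaces d3d9.dll
WINEDLLOVERRIDES="d3d9=n,b" %command%

# Max Payne 3: use Wine's own D3D11 implementation instead of the default
PROTON_USE_WINED3D11=1 %command%
```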

Non-Steam Games and Wine


Despite needing the occasional web search to find the correct configuration with which to run a certain game, Steam Play with Proton is so convenient that it's easier than ever for me to ignore the games I bought from other online stores such as GOG. While there are games on GOG with official Linux support, the GOG Galaxy client does not have a Linux version (despite a lot of GOG users wanting one). Downloading games directly from the GOG web site isn't hard, but clients such as Steam and Galaxy do offer a lot of convenience.

Furthermore, although I could probably use Wine or other tools to play many of the Windows-only games from my GOG account on Linux, it would require more effort than running Windows games via Steam, which very often just works automatically. I did install Wine with the intention of playing non-Steam Windows games, but I just haven't used it yet, because Proton — when it works, which it usually does — is just so effortless. If I'm trying to decide what to play, and I've narrowed down my choices to one GOG game and one Steam game, I'm likely to pick whichever is easiest to run on Linux, and that's going to be the Steam game nine times out of ten.

That might change a bit when I get around to trying Lutris, a game client not associated with any particular store, which also claims to reduce installation of many Windows games on Linux to a zero-effort, one-click process. It looks like a promising solution for playing some of my GOG games on Linux. For what it's worth, though, installing Lutris is one extra step. I'm going to have to be a jerk and say that Steam still makes it easier by having Proton integration built into its own client. So many of my games were already working on Linux as soon as I installed Steam that I haven't taken the time to use much of anything else.

So I don't have much to say about running Windows games with Wine, via Lutris or otherwise. I plan on experimenting with it eventually, and once I've done so, I'll probably write a sequel to this post. Right now, though, I'm looking at a huge Steam library and a high success rate with using Proton with very little tweaking, so I probably won't be straying away from Steam very often, except for the sake of experimentation. When I just want to play a game, Proton is often the best way to make it happen.

The Classics


There is one category of games for which I've already strayed outside the Steam bubble: Old-school shooters. I've got to have them.

The Steam versions of The Ultimate DOOM, DOOM II: Hell on Earth, Final DOOM, Heretic, HeXen: Beyond Heretic, and HeXen: Deathkings of the Dark Citadel all run in Proton. They were all, in fact, officially tested by Valve with specific versions of Proton, so that they show up in my Steam library with labels like "Proton [version number] selected by Valve testing" and will run with the indicated Proton version instead of the default I selected in the global Steam settings.

I find this rather amusing because the Steam versions of these games run through DOSBox, which, if I'm not mistaken, has a Linux version. However, I'm not surprised that the games' publisher, id Software, hasn't made the effort to repackage these old games with the Linux version of DOSBox for an official Linux release on Steam, especially given that most users who are computer-savvy enough to use Linux will just take the game files downloaded via Steam and run them in a source port instead of DOSBox anyway.

That's what I did with all of these games, immediately after installing them. There's a Linux version of GZDoom, which isn't a suitable source port for anyone who wants the games to run exactly as they did in the '90s, but it's good enough for me. Like any sane person, I did disable major gameplay options which were not in the original games (such as vertical freelook in the DOOM games, jumping in DOOM and Heretic, and ridiculous stuff like crouching), but the graphical upgrades don't bother me.

Getting GZDoom to launch with the correct options for each game was a bit of a hassle, but no more than it was on Windows. The only real problem is that GZDoom, on my system, encounters some kind of error when I close it, which might have something to do with the fact that it doesn't seem to save changes to the gzdoom.ini file unless I enter the writeini console command while running GZDoom. Knowing the workaround, I'm not really bothered by it.
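For the curious, launching a game through a source port is mostly a matter of pointing it at the right IWAD file. A sketch of what that looks like on my system (the Steam library path and WAD locations below are from memory and will differ depending on where Steam put your games):

```shell
# Point GZDoom at the IWAD files downloaded by Steam
STEAMAPPS="$HOME/.steam/steam/steamapps/common"

gzdoom -iwad "$STEAMAPPS/Ultimate Doom/base/DOOM.WAD"
gzdoom -iwad "$STEAMAPPS/Doom 2/base/DOOM2.WAD"
gzdoom -iwad "$STEAMAPPS/Heretic Shadow of the Serpent Riders/base/HERETIC.WAD"
```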

Satisfied with the DOOM and Heretic/HeXen games, I moved on to Wolfenstein 3D and its expansion Spear of Destiny, and installed the source port ECWolf. To my surprise, it seems to work perfectly despite being, in my estimation, less widely used than GZDoom. So then I moved on to the Marathon trilogy, available as freeware since 2005, and attempted to install the source port, Aleph One. That, unfortunately, could have gone more smoothly.

Not Everything is User-Friendly


The official Aleph One web site has some pretty basic instructions for installing the Linux version: Unpack the .tar.bz2 file and, in the unpacked directory, run ./configure && make && make install. Unpacking the file was easy, but I started having problems as soon as I ran the configure command. I won't bother going into all of the messy details, but I went through several iterations of trying the installation, which would fail due to missing dependencies, and then installing those dependencies.

Synaptic Package Manager, included in Linux Mint, makes finding and installing missing packages about as easy as it can be; doing it on the command line using apt or apt-get isn't very hard either. But I had never needed to hunt down dependencies like this. When installing software through Linux Mint's Software Manager, or even when installing a program using apt install, all dependencies are installed automatically (which I now regard as a miracle). Installing software manually, however, isn't quite so easy. Perhaps I was to blame for not knowing what all of the dependencies were beforehand, but it was getting frustrating.
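In hindsight, apt has tools that take some of the guesswork out of dependency hunting. A sketch, assuming a ./configure run complains about a missing SDL header (apt-file is a separate package and needs its index built first, and the build-dep trick assumes your distribution packages some version of the program at all):

```shell
# Find which package provides a file that ./configure says is missing
sudo apt install apt-file
sudo apt-file update
apt-file search SDL2/SDL.h    # prints the package(s) shipping that header

# If the distribution packages an older version of the same program,
# its build dependencies can be pulled in all at once
# (requires deb-src lines enabled in your sources)
sudo apt build-dep alephone
```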

When I finally got Aleph One installed, I was able to launch Marathon, but it printed some errors to the screen and there was no HUD. Searching the internet for a solution, I learned that I was still missing some "optional" dependencies, and installing those made things worse; my next attempt at installing Aleph One failed outright, seemingly due to a dependency which I had already installed.

That's when a comment on the Aleph One GitHub repository, posted only eight minutes before I saw it, clued me in to the fact that I had gone down the wrong rabbit hole. The easier and more reliable way to install Aleph One is to clone the Git repo, go into the root-level folder, and run ./autogen.sh followed by make and make install. I had to install git (which I've used extensively but not at home) and I was still missing a couple of dependencies required by the autogen.sh script, but it told me exactly what the dependencies were, so installing them was just a matter of running the correct sudo apt install command.
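Put together, the working installation looked roughly like this (the repository URL and package names are from my notes and may have changed since):

```shell
# Build Aleph One from the Git repository instead of the release tarball
sudo apt install git build-essential autoconf automake
git clone https://github.com/Aleph-One-Marathon/alephone.git
cd alephone
./autogen.sh        # names any missing dependencies outright
make
sudo make install   # installs to /usr/local by default
```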

So I've got Marathon and its sequels working on Linux now, but installing Aleph One took more than an hour after all was said and done. I've learned a lot, though, and ultimately the solution to every problem was revealed with a simple internet search. This story could scare people away from trying to play games on Linux, but it's important to keep in mind that I had this much trouble only because I was trying to play a 25-year-old game. Doing that on Windows isn't always easy either.

I still haven't installed the classic Quake games, and I think those are next. I can only hope installing their source ports will be a bit easier.

Conclusion


I guess the moral of this story is that installing and running games on Linux can take a bit of patience, but I only came close to running out of patience when I tried to do things that only an old-school game enthusiast determined to use Linux at all costs would ever try to do. With Steam, many of the games that work on Linux will "just work" and, for non-Steam games, I'm told that Wine isn't very hard to use either. However, I do know just enough about Wine to know that it requires some effort. The average consumer expects things to work with a single mouse click. They want one-click install and one-click launch of every game.

For games with native Linux support, you can have that. For the rest, Steam comes pretty close to providing that level of convenience. Until other stores like GOG implement some kind of compatibility layer like Proton into their own clients, Steam will dominate Linux gaming for the same reason that Windows dominates gaming in general: It's just easier.

Monday, April 28, 2014

Unskippable Cutscenes & Other Pure Evil

I often say that if a single-player video game isn't worth playing twice then it's not worth playing at all. That might be a slight exaggeration — surely "game worth playing exactly once" does lie somewhere between "game worth playing twice" and "game worth uninstalling" — but honestly, if I get to the end of a game and I don't want to do it again, it probably wasn't very fun. In my opinion, one should be able to enjoy a game multiple times, twice being the bare minimum.

Surely, some people will disagree, and their counterargument will most likely go something like this: "After the first time, you already know how it ends!" Well, yeah, it ends with me winning. If your experience with a game is spoiled because you already know the end of an underlying story, which probably exists only to provide context to some otherwise meaningless depictions of violence, the game isn't a very good one. It's supposed to be a game, after all, not a movie. Do I like story-driven games? Sure. But a game needs to be fun for reasons other than its plot twists. Otherwise, it should have been a movie, and anyone who likes it could have watched a movie instead. Moreover, if there's a game whose plot is the primary appeal, it should have a plot that's good enough for an eventual second run. Lots of people like to watch a good movie more than once, right?

But let's say the plot isn't so great that I want to see it and hear it again. Let's say I'm about to play a game a second time because the actual game was fun, or because I missed a few optional objectives the first time, or because my high score isn't high enough. If that's the case, I shouldn't be forced to see and hear the story a second time, and since no reasonable person would ever disagree with that statement, I don't understand why any video game developer ever thought unskippable cutscenes were a good idea.

I recently finished L.A. Noire. Played well the first time around, this game might not warrant a second play-through, but I missed plenty of clues and botched more than a few interrogations, so I decided to go back for the five-star ratings (and a few achievements) I missed. And it would have been pretty fun if I didn't have to watch every single cutscene a second time. Why can't we skip the cutscenes in L.A. Noire? I have no idea. I've heard that cutscenes in Max Payne 3 are unskippable because the game uses that time to load, but L.A. Noire has separate loading screens, so I doubt it's the same situation. So, disguised loading screens aside, why would a game ever have unskippable cutscenes? It turns out that there are no good reasons.

Sometimes we wonder why and how a serial killer would decide to brutally murder dozens of people. That a human could do something so senselessly awful is almost beyond comprehension. Two possible explanations come to mind: either the serial killer is very simply insane and his or her actions are the result of a mental defect, or true evil exists in the world and it lives inside that person. Unskippable cutscenes are a much lesser crime than repeated homicide but the same two explanations apply and I can't think of any others. The developers of a game must be crazy (or profoundly stupid) to think that a person playing the game for the second time won't be annoyed by the inability to opt out of re-watching something as pointless as a cutscene. On the other hand, if they do know that such a restriction annoys players, the only reason to implement unskippable cutscenes would be to annoy players and that's pure evil. Surely it's a lesser evil than murder, but the evil is still pure because the only conceivable objective is to cause suffering.

Whether it's intentional or not, the unskippable cutscene often acts as an irritating punishment for failure. In games that forbid players from opting out of having a mediocre story shoved down their throats, there always seems to be a difficult boss fight (or a difficult something) preceded by a long and boring scene that's only entertaining once. And you end up seeing it fifteen times. If that's not bad game design, I don't know what is. Fortunately, none of the cutscenes in L.A. Noire are extremely long and no particular part of the game is likely to be repeated numerous times (since nothing in the game is very challenging and there's an option that allows awful players to skip action sequences). Still, there's no reason for unskippable cutscenes, especially since the concept of skipping a cutscene has been around for approximately as long as cutscenes have existed.

I guess the developers think their story is so important that we shouldn't attempt to enjoy the game without it. They think they need to save us from ourselves by preventing us from "accidentally" missing something important. Sometimes, a story is important enough that it shouldn't be skipped; if I saw someone skipping cutscenes while playing Alan Wake for the first time, I'd probably recommend that they just play a different game. But that's no reason to take away options.

This is one of those things that would cause us to boycott a game if only we had an ounce of self control. Unfortunately, we don't. It's unforgivable, but it's still tolerable enough that the developers and publishers are unlikely to suffer financially as a result, so they won't learn any lessons no matter how much I complain.

Sunday, December 22, 2013

Midlife Crisis, Part 3

The most frustrating thing about having a hobby is that you never really have time for one unless you're unemployed and lonely. For better or for worse, I'm neither. This was the case before I bought my new PC, and it's still the case now that I've gotten most of my games installed on it. There will always be weekends, and I have a few hours of downtime after work each weekday, but it becomes more clear every time a new game is released that I'm going to die of old age before I get to finish every game that I deem worth playing. Such is the price I pay for attempting to have a life on the side.

So far, I've actually spent more time fiddling with my PC than playing games on it. Lately, this fiddling has been the enjoyable kind; I've been installing all the software I need, rearranging my desktop icons like the truly obsessive-compulsive person I am, and more generally setting things up just how I like them. For the first few weekends of my PC's existence, however, I had nothing but trouble.

First, I didn't bother getting a wireless network adapter because a stationary computer should ideally be placed where an ethernet cable can reach it. Unfortunately, I needed the computer to be in another room temporarily. To remedy the situation, I tried using something I already had in my closet — a D-Link wireless USB adapter. It worked pretty well until my network started slowing down or crashing every time I tried to use a lot of bandwidth (e.g., by downloading a Steam game). I'm still not sure what the problem was; maybe there was some kind of incompatibility with the router, or maybe something more complicated was going on. Maybe it was my computer's fault, somehow. Fortunately, I don't really need to figure it out, since I'm using a wired internet connection now and I don't really have any need for Wi-Fi (let alone the D-Link adapter) in the near future.

Other problems included a couple of random blue screen errors (most likely caused by an AMD video card driver which I've updated) and various problems with various games. The original Assassin's Creed, for example, refused to start when I first installed it, and I'm not even sure how I fixed the problem. I'd tried a few things, given up, and turned off the computer, and when I tried launching the game again later, it worked just fine. (Actually, I had to turn on compatibility mode for Windows Vista because I was getting a black screen where the opening cut scene should have been, but that's hardly an issue. As often as compatibility mode fails, it should always be the default first move if an old game does something weird.)

Compatibility mode for Windows 98 / Windows ME was also the initial solution for the Steam version of the original Max Payne, which failed to launch even though the process was visible in the task manager. However, even after the game launched, some of the music was gone and the sound effects were severely messed up. Fortunately, some nice guy created his own patch to fix the problem. It sucks that the original developers of old games like Max Payne aren't willing to invest the time and money to solve these problems themselves (especially when they're still selling these old games alongside their sequels on digital services like Steam), and the amateurs who pick up the slack are true heroes.

I'm reminded of Command & Conquer: The First Decade, a box set of a dozen games from the series. A couple of official patches were released, but not all of the bugs were fixed, so fans started patching it up themselves. The unofficial 1.03 patch, a collection of bug fixes and other features, was absolutely essential for anyone who had this particular Command & Conquer box set. But it's not just the occasional issue with an outdated game that often necessitates a third-party fix.

Now that I have a good computer, my older games don't even come close to pushing the graphics card to its limits, which means most of these games will needlessly run at a frame rate much higher than my monitor's refresh rate. Usually, this just causes screen tearing. In extreme cases, I can even hear what sounds like coil whine, an irritating whistling noise coming from inside the computer (not the speakers). This happens on the main menu screens of F.E.A.R. and some other games, presumably because the computer is able to render thousands of frames per second when there isn't much to display.

Turning on a game's Vsync feature (preferably with triple buffering enabled as well) fixes these problems, but a few of my games don't have a working Vsync feature. Each of the games in the S.T.A.L.K.E.R. trilogy, for example, has an option for Vsync in the settings, but in all three games it does nothing. It's straight-up broken. The optimal solution would be to force Vsync and triple buffering through the control panel software of one's graphics card, but AMD cards can't do this for certain games on Windows 7, and it's my understanding that both Microsoft and AMD are to blame for that. Even with Vsync set to "always on" in Catalyst Control Center, I was getting stupidly high frame rates in S.T.A.L.K.E.R.: Shadow of Chernobyl.

Then I heard about D3DOverrider, a little tool included in an old freeware program called RivaTuner. It's made to enable Vsync and triple buffering in software that's missing one or both options, and it works like a charm. Despite S.T.A.L.K.E.R.'s broken Vsync feature, and despite Catalyst Control Center's inability to fix the problem, D3DOverrider gets the job done. Now I'm getting a fairly consistent 60 frames per second, instead of hundreds of frames in-game and thousands of frames on the menu. No more vertical tearing and no more quiet-but-irritating coil whine.

That other first-person shooter set in a post-apocalyptic Eastern Europe, Metro 2033, has its own share of issues: a lot of useful options don't show up in its menu and have to be toggled on or off by editing a few configuration files in Notepad, and it also appears to have a broken Vsync feature. In this case, not even D3DOverrider appears to solve the problem. Fortunately, the game's poor optimization means that it doesn't always exceed 60 frames per second at the highest graphics settings anyway, making Vsync mostly unnecessary. People with more powerful systems might have to keep looking for solutions.

All of this is pretty frustrating, but troubleshooting is to be expected when playing games on a PC, especially when the games are relatively old and the operating system is relatively new. I guess I should just be glad that most of the common problems can be solved.

"But if only you'd bought a console," some would say, "your games would just work." That's the favorite argument in favor of consoles. They just work. But now that the short-lived phenomenon of backwards compatibility has gone out the window with PlayStation 4 and Xbox One, I don't think it's a fair argument. Most of the problems with PC games arise when one is trying to have a nostalgic experience by playing an old game on a new system, and the other problems are usually the fault of careless developers.

I guess we should all be glad that PC games work at all, considering that our "gaming computers" are not standardized like all the millions of identical Xbox One and PlayStation 4 consoles. Since I'm not a game developer, I can only imagine how difficult it must be to ensure that a game is going to work consistently on so many hardware configurations. Maybe I shouldn't be so upset that games like S.T.A.L.K.E.R. have a few broken features, or that games like Max Payne continue to be sold without being updated for the current version of Windows. On the other hand, it's harder to forgive professional developers for an imperfect product when presumably amateur developers are able to correct the imperfections without being paid.

Update: It seems that, since I originally wrote this post, S.T.A.L.K.E.R. was actually updated with a frame rate cap of 60 fps. I'm shocked that such an old game was actually updated, to be honest, but apparently some people with expensive computers were burning out their video cards by leaving the game paused (thereby allowing the game to run at hundreds or thousands of frames per second for long periods of time). Terrifying.

Monday, June 3, 2013

The Problem with Trading Card Games

Scrolls, the upcoming collectible-card-based strategy game from Minecraft developer Mojang, enters its open beta phase today. Essentially, this means you can buy the not-quite-finished game for less than its full price and start playing early while they work out the bugs and make improvements. Some part of me wants to partake in this, because the game looks pretty interesting (and because we all know how playing Minecraft quickly became the most popular thing since breathing), but the rest of me doesn't want to touch this game with a thirty-nine-and-a-half-foot pole. It looks fun, but I'm conflicted.

Collectible card games are an interesting thing. I want to love them, because I think they're so cool in theory. I wanted to love Magic: The Gathering, the original trading card game published by Wizards of the Coast back in 1993. The concept of such a card game is ingenious, both for its unique gameplay and — let's be honest — for its potential to rake in huge wads of cash for the owners.

Over the past 20 years, new sets of Magic cards have been released on a regular basis, and the total number of different cards seems to have surpassed 13,000, yet the game still remains accessible to newcomers. New rules are introduced all the time, and the balance is tweaked with each new expansion, but the earliest cards can still be used (outside of certain tournaments) together with the ones printed today. The number of possible combinations in a 60-card deck isn't even worth counting.

The ability of each player to create his or her own customized deck of cards, drawing from a collection unlike that of any opponent, is what makes this type of game so fun to play. Unfortunately, this makes the gameplay inherently imbalanced, unless we consider the start of the collection process to be the true beginning of any given match (and that's a stretch). Even then, a game like Magic too often requires continual monetary investment if you want to remain competitive, and this feature (while I'd like to call it a flaw) is by design. I played Magic for a brief period of time, several years ago, and my cards might have been only half-decent back then, but they're total garbage now. More powerful cards and better gameplay mechanics are created with each expansion to keep players spending their money. Of course.

There's also a certain threshold of monetary investment required in order to become competitive in the first place, and that threshold is probably going to scale in proportion to the size of your opponent's paycheck. Things might be balanced within a group if everyone involved cares enough to go on eBay and selectively buy the individual cards they need for one of a few strategies deemed viable at the expert level, but this isn't always affordable. Meanwhile, for more casual play in which most cards are obtained from random packs, the guy who wins most often is going to be the guy who spent the most money on his collection. The three pillars of succeeding in Magic: The Gathering are building a good deck, making the right in-game decisions, and (perhaps most importantly) owning better cards than the other guy (which is where the "collectible" aspect comes in).

When a video game affords even the smallest advantage to a player who spends extra money (e.g., through micro-transactions), we call it "pay-to-win" (even if this isn't literally true) and we hate it because it feels so wrong. It is wrong, because the delicate balance of the game in question is either compromised or completely destroyed. Being at a disadvantage sucks, and if you give in and buy your way to the top then the challenge is gone and the game quickly becomes pointless. (In the most extreme cases, you've essentially just paid to see the words "you win" on your screen, so congratulations on doing that.)

A lesser form of pay-to-win merely allows players to spend some extra money to skip past a seemingly endless grind, as is the case in many so-called "free-to-play" games. This doesn't necessarily destroy the game's balance of power (because the advantages being bought can also be earned through dozens of hours of play), but it does highlight the major flaws already present in the game. If a person wants to pay more money simply to get less gameplay, the game probably sucks (and the person playing it probably hasn't realized there's nothing left to do if you're not grinding).

In the video game world, all of this is positively awful, but most collectible card games are pay-to-win by nature. Sure, they're fun to play if you're up against someone whose skill level and deck quality are in the same league as yours, but if you play against a guy whose collection of cards is twice as big (and twice as expensive) then it's completely unfair.

When I first heard of Magic: The Gathering Online prior to its release in 2002, I thought it might be a little more fair (and affordable) than its tabletop equivalent. I assumed (or at least hoped) that each player would be given access to the same pool of cards, or perhaps that better cards might be unlocked by winning matches, or something. At the very least, I naively believed that players wouldn't have to buy all of their virtual cards at the same price as physical ones because... well, you know, because they're not real cards. Unfortunately, Magic: The Gathering Online is identical to the original card game except that the cards aren't made of card stock and ink.

Duels of the Planeswalkers looks like a nice alternative, even with its relatively small number of cards, until you realize that you can't even build your own deck. This is no surprise, though, since Wizards of the Coast doesn't want this game to be a viable alternative. Duels of the Planeswalkers is meant to draw in new players and get them hooked, so they become frustrated by the lack of deck-building options and graduate to buying packs of cards, be they physical or digital. The virtual cards in Magic: The Gathering Online, despite being virtual, have monetary value because Wizards of the Coast doesn't let you do whatever you want with them. Artificial scarcity makes them seem as rare as the physical cards printed in limited runs on actual paper.

Digital game distributor Steam recently unveiled its own trading card meta-game, which is still in beta, and it's proving to be a nice example of how such artificial scarcity can make something desirable even if it has no real value, no purpose, and no practical function.

Players with access to the beta test can earn virtual trading cards for their Steam Community accounts by logging play time in certain Steam games. These currently include Borderlands 2, Counter-Strike: Global Offensive, Don't Starve, Half-Life 2, and Portal 2, as well as the free-to-play games Dota 2 and Team Fortress 2 (but only if you spend money on them). You can get up to four cards per game just by playing, while eight cards from a single game comprise a complete set. The fact that you can only earn half of any set on your own means that trading (or buying from other players) is a necessity.

Once you get a complete set, those eight cards can be turned into a badge and some other items. The badge is good for nothing at all, while the other goodies that come with it are mostly vanity items, like emoticons and points to "level up" your Steam Community account. (There's also a chance of getting a coupon, but my experience with Steam coupons is that the discounts they offer are less impressive than the ones you see during a typical sale.) The whole thing seems pretty dumb, but you can already see cards for sale on the Steam marketplace, and that doesn't usually happen unless people are buying. There's also a demand for those vanity items. Apparently, some users even made a profit by buying lots of cards and then selling the goodies that come with each badge.

In general, things that were specifically made to be collected usually don't have a lot of real value to collectors. However, if you turn that collection process into a game — even if it's a stupid one — people go nuts. If people are willing to spend real money on virtual trading cards just so they can earn virtual badges and virtual emoticons and level up their Steam accounts for virtual bragging rights, it should be no surprise if the same people are willing to spend money on virtual trading cards that give them an actual advantage in an online game. I can't really blame Wizards of the Coast for taking advantage of this kind of behavior. But when the game is a competitive one, I just don't like the idea of buying victories, even if it's done in an indirect and convoluted way.

A true trading card game, even if it's entirely virtual, is going to have some level of imbalance. If each player draws cards from a unique collection, it's never going to be completely fair. All of this might be okay, however, if everything were unlockable through in-game actions and accomplishments. Naturally, I was hopeful when I first saw Scrolls; the official website tells us items at the in-game store can be bought with the gold earned by playing matches, and this presumably includes new cards (called "scrolls" because it sounds so much cooler). However, a "small selection" of items can also be bought with "shards" — a so-called "secondary currency" which you can buy with your real-life credit card.

So how significant is this "small selection" of in-game items? How much of an advantage can you gain by immediately purchasing everything that shards can buy? I can only assume the advantage is pretty significant; otherwise there would be no point. The real question is of whether a person who paid $10 more than you (and doesn't deserve the advantage) is distinguishable from someone who played 20 hours longer than you (and earned the advantage). As long as it's possible to unlock everything that matters through gameplay alone, and as long as doing so is feasible (i.e., not a 500-hour grind), there's some hope for this game.

Mojang has claimed that Scrolls won't become a pay-to-win game despite its purchasable items, but developers say a lot of things before their games are released. The only reason to believe them is that the game does in fact have an initial cost — in other words, it's not "free-to-play" so the developers don't need to rely on in-game purchases to turn a profit.

The cost of access to the open beta is $20, which isn't so bad when you consider the average cost of a modern video game, which tends to be around $50 or $60 regardless of quality. (While this high cost applies mostly to console games, high-profile PC releases tend to follow the same model with some notable exceptions. Runic Games, for example, earned some praise for selling Torchlight II at $20, which gave the action role-playing game a significant advantage over its controversial $60 competitor Diablo III.) Assuming that Scrolls turns out to be a decent game, this discounted price for early access is a pretty good deal.

Unfortunately for Mojang, I've been trained by Steam sales and Humble Bundle events to refrain from buying anything unless or until it's dirt cheap. With some patience and good timing, I could buy a handful of older games for the same $20 and I'd be sure to enjoy at least one of them. It doesn't take long for the price of a game to drop, and this is especially true of PC games now that developers are realizing they need to compete with piracy instead of trying in vain to stamp it out. As a result, people who play PC games — or the "PC gaming community" for those of you who can say such a thing with a straight face — have come to expect their games to be inexpensive. $20 is a good deal, but it's not great.

I certainly don't mean to imply, of course, that we should all wait a few years to pick up Mojang's new release. After all, we don't even know if it will ever be subjected to such brutal price-slashing. Furthermore, Scrolls is a multiplayer game which might only be fun for as long as the number of players remains high, so the time to buy is now, if you want it. The problem is that the game is a risky investment and my spending limit for such a risk is so low.

That limit — the point below which a risky investment becomes a risk worth taking and any potential buyer's remorse becomes bearable — is different for everyone. For me, it's about $5. That might seem like a ridiculously small figure, but it's what I paid for BioShock a few years ago. It's what I paid for S.T.A.L.K.E.R.: Shadow of Chernobyl. It's also what I paid for the first two Max Payne games combined. I almost bought Metro 2033 for $5, but I waited and got it for even less. I got Killing Floor for $5, a few years ago, and I've put more hours into that game than anything else I can remember. None of these games were new when I bought them, but I still enjoyed each of them at least as much as any $20 game I ever bought.

None of this is really a complaint about Scrolls or the open beta price tag in particular. But I might be more willing to spend four times what I paid for Killing Floor if I actually knew Scrolls would be a worthwhile purchase. Isn't there some way of trying out a game before its release without paying $20 for access to a beta version? Oh, yes, a free demo certainly would be nice. Maybe we'll get one of those later on... but we probably won't.

Wednesday, May 22, 2013

No Sequel for Alan Wake (Yet)

This article was also published on Gather Your Party on May 23, 2013. Read it here.



Today brings good news and bad news for Alan Wake fans.


The good news is that, for the next week, you'll be able to get both Alan Wake games for very cheap, along with some previously unseen goodies. The bad news is that it's the last we'll be seeing of this series for a while.

After a five-week break, Humble Bundle has gone back to its weekly sales, a practice which had initially lasted for less than a month before being put on hiatus. The flavor of the week, as you might have guessed, is Remedy Entertainment's third-person, horror-themed, psychological thriller Alan Wake. The bundle includes the original game alongside its ambiguously canonical and unnumbered follow-up, Alan Wake's American Nightmare, as well as a whole bunch of bonus content, some of which has never been released.


As with most other Humble Bundle sales, you can pay whatever you want (down to one cent), but you'll need to pay at least a dollar to get Steam copies of the games. Oddly, unlike most other bundles, there doesn't seem to be any material incentive to pay more than the average contribution. Surely, you could just do that out of the kindness of your heart... but if you're not that kind, these games will never be cheaper. The bundle is a pretty good deal no matter how you look at it, even if you prefer the Steam keys, because a similar Steam bundle (minus the previously unreleased bonus content) has only gone as low as $9.99 in previous Steam sales.

With this bundle also comes an announcement video from Remedy Entertainment creative director Sam Lake, which might have Alan Wake lovers crying themselves to sleep like Half-Life enthusiasts have been doing for the past five-and-a-half years.

The video, addressed apologetically to Alan Wake fans, reveals that Remedy is working on "something new, something big" — but the key word here is "new" which means, of course, that we're not talking about the continuation of an existing franchise. There's no way around it, so Lake comes out and tells it like it is: Alan Wake 2 will not be released in the foreseeable future.


This might come as a surprise to fans who were paying attention when Remedy seemingly dropped a few clues about an upcoming sequel. Most notably, Sam Lake tweeted a cryptic quote, the same quote heard as a backwards message in a song performed by Poets of the Fall as the fictional heavy metal band Old Gods of Asgard. It seemed promising, but maybe we shouldn't have been so excited.

The "town called Ordinary" might have been the setting for the next game in the series, but it might also have been a simple reference to the setting of American Nightmare, the game in which the song is featured. That game, after all, does take place in an unnamed town. (The narrator calls it Night Springs, but this is the Alan Wake universe's parody of The Twilight Zone, so if any part of the game takes place outside of Alan's mind then this disembodied voice can't be trusted.)

Regardless, it looks like Alan Wake 2 is on the back burner. Remedy "worked hard to make the sequel happen," says Lake, but the project apparently suffered from a lack of sufficient funding. Although total sales of the original game have exceeded 3 million copies, he notes, it was by no means an instant success upon its release. This makes throwing money at a sequel a risky investment, especially when the success of a modern game is judged so heavily on pre-order sales. One might expect a sequel to do better in that department, given the existing fanbase, but it would seem the franchise is cursed by its initial sales performance.


Lake remarks that Remedy could have done something "less ambitious" (i.e., less expensive) with the next Alan Wake, but explains that such a compromise "wouldn't have done justice to you... to us... and certainly wouldn't have done justice to Alan Wake." Perhaps, while we lament the indefinite postponement of what might have been a great game, we should be glad that Remedy Entertainment was unwilling to spoil the franchise with a mediocre cash-grab sequel. Lake is careful not to suggest that another Alan Wake will never be made; he only makes it very clear that now is not the right time.

Fortunately for Remedy, money isn't an issue for their new project, Quantum Break. The trailer, first shown at the Xbox One reveal, has live-action footage, a creepy girl with special powers, a bridge disaster, and some poor guy dying in slow motion. Sam Lake calls it the "ultimate Remedy experience" drawing on everything the development studio learned from Max Payne and Alan Wake.


Aside from these vague hints, not much is known about the upcoming release. The game is said to be an Xbox One exclusive, but only time will tell if this is set in stone; Alan Wake was an Xbox 360 exclusive before it was released for Windows a couple of years later, so we can afford to be optimistically skeptical. Maybe the same thing will happen again.

Until then, the Alan Wake Humble Bundle sale, along with an accompanying sale on Xbox Live, is Remedy's big thank-you to all of its fans. Sadly, it seems a bit too much like a tentative funeral for the franchise. The last minute of the announcement video is a live-action clip of various Alan Wake paraphernalia being locked in a crate and wheeled off into the back of a warehouse. The crate is labeled "Do Not Open Until..." but the date, of course, is hidden and will likely remain unknown until the Quantum Break franchise has run its course.

Wednesday, February 6, 2013

Dear Esther & Video Games as Art

Over the weekend, I played through Dear Esther with my girlfriend. I'd bought it from Steam for $2.49 because I was curious, and I'd sent it to her account because I'd (mistakenly) assumed she might ultimately appreciate it more than I would. Alas, while I was pleasantly surprised, she was bewildered and irritated by the apparent pointlessness of the trek across the virtual island that lay before her.

A bit of background (with links) for the uninformed: Dear Esther was originally released in 2008 as a mod for Half-Life 2. Like so many popular mods these days, it was then remade as a stand-alone release, which was completed in 2012. Unlike most things sold on Steam and discussed on blogs like mine, however, Dear Esther doesn't have any goals or challenges and you can't win or lose. It doesn't really have any "gameplay" at all because it's not a game in any traditional sense. It's more like virtual reality made love to an audiobook, had an illegitimate child, and left that poor bastard on video games' doorstep because no one else would take him. But I mean that in the nicest possible way.

At its core, Dear Esther isn't much more than an interactive story, and the word "interactive" is being used here rather loosely. What you do is walk around in a virtual world, admire the virtual environment, and listen to sporadic snippets of metaphor-laden monologue over a calming soundtrack and the sounds of the ocean waves.


Given the nature of this particular piece of software, which may or may not be considered a "video game" depending on whom you ask, you might say I'd be just as well off if I'd watched a playthrough on YouTube instead of buying it. But even with such minimal interactivity and such a linear experience overall, I think watching someone else play is a poor substitute and misses the point entirely. Even though there's no real "gameplay" here, I'd still be denying myself nearly everything that does make Dear Esther a unique experience if I were simply to watch some other person decide where to go and what to look at. The prosaic monologues are wonderful, but the modest potential for free exploration by the player is likely the sole reason that Dear Esther was turned into a "video game" instead of a short film or a short story.

You might also say I could have played the free Half-Life 2 mod instead of buying the stand-alone product, but the newer version just looks so much nicer, and enjoying some nice-looking stuff is where at least half of the enjoyment lies.


Despite all of this, paying for Dear Esther might seem like a waste. There's a rather stiff limit to the number of hours of enjoyment you can possibly squeeze out of this product. Although each playthrough is supposedly a bit different, due to some randomization in the playback of monologue passages, this only gives it a little more replay value than a movie, and a single playthrough is considerably shorter than average movie length. (The first playthrough should take no more than 90 minutes. Mine clocked in at exactly 90 minutes, but that included some aimless wandering, graphical tweaking, and even pausing to get drinks.) While I'm guilty of impulsively buying Dear Esther at 75% off, and while I'm content with that decision, I wouldn't be so enthusiastic about paying the full price of $9.99 and I can't honestly recommend doing so.

Missing the Point


That being said, I think there's some unwarranted hostility toward Dear Esther that stems not from its quality or from any of its own merits, but from a misunderstanding of its purpose, and from a rejection of the concept of video games as interactive fiction. "That's the dumbest thing ever" was the response of one friend when he was told what Dear Esther is like. Opinions are opinions, so I can't really debate that point, but I do think the context matters: When this conversation took place, my girlfriend had just mentioned a new "video game" that we'd played. This guy was expecting to hear about a game, but then he heard there was no objective, no challenge, and no real gameplay at all. So, yeah, of course that sounds dumb.

The whole problem, I think, is that Dear Esther is considered and treated as a video game, but this is only for lack of a (commonly used) better term. You could call it "interactive fiction" but that might not be sufficient to fully describe such a product, and I don't see the term catching on as a way to describe these things anyway. Instead, I'm tempted to call it a "video non-game" because it really is, precisely, a video game with the game element removed. Actually, I think this might be the best way to describe it. The strong connection to video games is there, but it doesn't leave us expecting something we're not going to get.

When judged as a video game, Dear Esther might be called a failure, but let's be fair: the same thing happens when you judge Lord of the Rings as a romantic comedy. A valid criticism of Dear Esther should focus on what's there — the writing, the visuals, and the music — rather than obsessing over exactly how it's not a game. Unfortunately, so much of the criticism I've encountered takes the latter route and fails to make a relevant point. I can't say I'm surprised that everyone gets stuck on the non-game aspect, though; after all, we're still pressing buttons to make things happen on a screen. It feels like a game, and that's what makes it feel so wrong.

Experimental and atypical releases such as Passage, Flower, The Graveyard, Universe Sandbox, and Dear Esther seem to be expanding the definition of "video game" by really pushing the boundaries that separate this medium from others, and this seems to be happening regardless of whether the creators of these products even choose to refer to them as games at all. The result is that, while video games used to be a subset of games, they now occupy another space entirely. Dear Esther is, arguably, a "video game" — and most people will probably call it one — but it certainly isn't a game at all. Consequently, if people install it expecting a game, they're in for a disappointment. However, this doesn't make it a bad product. It just makes it something other than a game.

The Newest Art Form


But regardless of whether we choose to call them games, Passage and Dear Esther seem to be at the forefront of the movement to have video games recognized as an art form. It seems good enough, for most people, that these video non-games attempt to be something resembling art while existing in a video game-like format. Just as often as they are criticized for not being game-like enough, they are cited as examples in arguments and discussions over the elevation of video games to the status of art — arguments and discussions which, for better or worse, tend to revolve around those artistically driven (but, importantly, secondary) aspects of the medium: story, graphics, music, et cetera.

Bringing this up on a video game forum is like bringing up politics at Thanksgiving dinner; that is, it's a good way to upset everyone. The idea that a video game, of all things, can actually be art isn't just a problem for video game haters; it's also enough to offend some "hardcore gamers" who reject the very notion that story, graphics, music, and intangible things like atmosphere can add anything of value to the medium. Any attempt to create a video game explicitly as a work of art, which unfortunately is most often done at the expense of the quality or amount of traditional gameplay, is obviously going to upset these people, and — referring again to Dear Esther in particular — the outright and total abandonment of the "game" in "video game" is obviously going to drive them crazy. The existence of Dear Esther itself isn't really such a problem, but the paradoxical notion that video non-games are actually the future of the medium is anathema to "hardcore gamers" everywhere.

To be honest, though, I don't think it should be a surprise that we're moving in this direction after so many years of video games with increasingly more emphasis on story, character development, visual effects, and other non-essential, movie-like qualities, often with less focus on conventional gameplay and player freedom. (I think I've discussed such things once or twice before.) Even where core gameplay mechanics have been preserved, video games have already become more like movies (presumably in order to grab larger audiences who might be bored with playing just a game), and maybe we've already passed the point where gameplay mechanics truly become the secondary attraction to the mainstream audience.

Is all of this good or bad? (Does such a distinction exist?) What does the concept of video games as an art form mean for the future of video games? But wait; if we're going to ask that question, we first have to answer a couple of others: Is it even possible for a video game to be a work of art? And should video game developers attempt to be artists? Perhaps these are silly questions — no doubt the idea of treating a video game as a work of art sounds downright ridiculous to a lot of people — but this debate seems to be happening whether we like it or not, so I think it's worth discussing.

To these last two questions, respectively, I'd give a tentative yes and a maybe. Whether a video game created specifically and intentionally as a "work of art" can be good, as a game, is certainly questionable, but if music and literature and acting and photography and, most importantly, film can be treated as art, then... well, I need to be honest: I can't think of a good (objective) reason that video games in general should be excluded. That video games, as a medium, should be considered an art form simply because of how a game can imitate and appropriate other forms of art (i.e., music and acting and writing and film) is a dubious argument at best, but I do believe that a good film would not automatically stop being a work of art simply if interactive (game-like) elements were added to it. Perhaps the new generation of video games, which are often more movie-like than game-like, should be analyzed this way instead. And if video games, at least in theory, have the potential to be works of art, then perhaps developers should strive for this... right? I guess. Whether they know how is another question entirely, but more on that later.

Comparisons and Analogies


The opposition to the idea of video games as art is largely (but not entirely) from those who don't believe that expensive electronic toys are deserving of whatever respect or elevated status comes along with inclusion in the invisible list of which things are allowed to be considered art. You might similarly argue that Picasso's paintings are not art just because you dislike them. Beyond personal tastes, however, I have to wonder if there's an actual reason for excluding video games when everything else that claims to be art seems to be accepted without much fuss. You can carefully arrange a bunch of garbage and call it art, and other people will call it art as well, as long as you can say with a straight face that the garbage arrangement means something. Or maybe it means nothing, and that's deep. Who cares? It's art if people say it's art.

It's clear, however, that video games are fundamentally different from all other things which are commonly considered art. The whole point of a video game is player interaction. Most art, meanwhile, is meant to be enjoyed passively, and one might even call this a requirement. Such a rule remains unwritten, however, since no one ever had a reason to include the words "passive" and "non-interactive" in the definition of art before video games tried to nudge their way in. Attempts to redefine the word "art" just for the sake of snubbing video games are confusing and unhelpful.

Other arguments against the notion of "video games as art" come from a comparison of video games to more traditional games. Chess is not art, and neither is football. On the other hand, a great amount of creative work, including visual art, often goes into the creation of many tabletop games, notably those of the collectible card variety. Furthermore, the entire analogy is rather fallacious; I've already pointed out that video games are, perhaps unfortunately, no longer strictly a subset of games, and moreover they can do things that traditional games cannot.

Some even try to argue that video games cannot be art because they're most often created for profit, or because they're most often created by large development teams in large companies. Obviously, though, these arguments allow indie games to slip through the cracks.

Ultimately, these debates never go anywhere because the definition of art is notoriously fuzzy, subjective, and ever-changing. It all boils down to opinion, and that's okay. Words aren't invented by dictionaries; their definitions come from their usage, not the reverse. Arguing semantics in this case is effectively a dead end, and once you get past all that nonsense, the most commonly cited reason for excluding video games in particular from the art world is simply that we haven't yet seen a video game worthy of the same praise as a Shakespeare play or a Rembrandt painting. The implication is often that we never will, even if no specified rules would exclude video games on principle, because the quality of creative work that goes into the most critically acclaimed video games is still supposedly mediocre at best in comparison to, say, the most critically acclaimed films.

Again, the opinion that video games will never be art doesn't just come from old men like Roger Ebert who never played a video game. It comes from within the "gaming" community as well, mostly from those "hardcore gamers" who would argue (perhaps correctly) that the industry needs to return to a strong focus on complex and challenging gameplay, and to stop pandering to casual "gamers" with artsy/cinematic nonsense without even a lose state or a hint of any meaningful challenge. Games shouldn't be movies, the hardcore audience likes to say. If you've perceived a significant decline in the quality of video games over the years — that is, I should clarify, a decline in the quality of everything in video games except for graphics — then you'd probably say this is a compelling argument, and I would strongly agree. However, if we want to push for better gameplay via an end to the game industry's distracting infatuation with film, then we should just do exactly that. The argument about the video game's status as an art form is a separate one entirely.

Even arguing successfully that video games should not be art doesn't exactly prove that they are not or cannot be art, and even arguing successfully that they are not or cannot be art wouldn't keep them from trying to be art. More importantly, the notion of "art" being discussed here might be the wrong one for this context. It is possible, after all, for games to be a kind of "art" without relying on the imitation or appropriation of the various aspects of other art forms.

Pixels and Squares


It's with some reluctance that I place myself on the pro-art side of the fence, for a number of reasons. First, regarding the more dubious but more common notion of "games as art" by virtue of their essentially movie-like qualities, I must admit that such a definition of art is valid whether or not it's good for the video game industry.

Although I don't think the potential for the inclusion of non-game-like qualities should be the justification for broadly treating the video game medium as an art form, I do think it's fair to treat an individual video game as a work of art based on whatever kind of arguably artistic work was involved in its creation. That is, although I don't think video games should necessarily be praised for how they simply imitate film and other media, the typical modern video game (like a typical film) is the product of many kinds of creative work — music, writing, acting, and of course the visuals which might be hand-drawn or computer-generated — and regardless of the average quality of all this creative work, it's still there. Picasso is still an artist even if you don't like him.

So how can one say that the culmination of all the artistic work that goes into a video game isn't art? I can't think of a non-feelings-based argument to support such a claim. Short of declaring that none of that work is currently done at a level that qualifies as true art (which leaves the door open for better games to qualify as art in the future), the only way out is to say that it ceases being art once it becomes a game — that even though it contains art in various forms, the finished product is not art because its primary function is to provide the player with a challenge or some entertainment. And I think that's a pretty bizarre thing to say.

But let's just go with it. Let's say it's true: video games cannot be art because they're games. Now we get to ask the really interesting question. What happens when the video game evolves to the point where it's no longer a game, as is the case with Dear Esther? Are we then allowed to call it art? And if so, is there really no point along the continuum from Tetris (pure gameplay) to Dear Esther (pure "art") at which games and art do intersect?

Perhaps the right course of action is to reject everything I just wrote and say that Tetris itself is a work of art already. So far, I've followed the typical course of these "video games as art" debates by analyzing the controversial (and perhaps misguided) idea that video games should be considered art by virtue of the way they incorporate other forms of art, e.g., the writing of the story that provides context to the gameplay, the drawing and modeling that result in the game's graphics, and the production of the soundtrack — but you also could argue (and should argue) that a well-designed game is a work of art by virtue of its design, be it elegant or complex or somewhere in between.

That's probably how the concept of video games as art should be understood in the academic sense. I think games can, and should, be recognized as art for the qualities that actually make them games. The defining feature of the video game as a medium — gameplay — needs to be considered, and perhaps nothing else. If architecture is an art form, then it's not because architects like to hang paintings on the walls of the buildings they design. It's because the talented architect will bring a unique kind of excellence to the actual design of the building itself. The same should be true of video games if they are to achieve that same status.

In truth, regardless of what we might say on occasion about an individual game which incidentally borders on "work of art" territory according to someone's opinion, I think the video game as a medium can never be accepted as an art form unless it is recognized as such for the qualities which make video games what they are. For the video game to be accepted as a form of art, game developers need to do more than paste some audiovisual art on top of some game mechanics. The game design itself — not just the graphics, or the music, or the story — needs to be done at a level that deserves to be called art. If you can remove the interactive elements from a particular game without sacrificing any of what makes that game a work of art, then that game isn't doing anything to promote video games as an art form. It could have done just as well as a movie.

In the colloquial sense, however, most people accept a game as a work of art only if it conveys some meaning and evokes some emotion, and thus pasting audiovisual art on top of game mechanics is perfectly fine. Most video games attempt art status by telling a story, and maybe that's totally legitimate as well. I wouldn't object to classifying Max Payne as a work of art for its narration alone, even though Max Payne achieving "art status" merely by way of its writing does nothing for video games as a medium. In any case, on the subject of video games as art via narrative, I only wish it were more often done much better. The typical story-driven game is an alternating sequence of meaningless challenges and non-interactive cut scenes. They could very often be separated from one another and each do better on their own. If developers want their games to be art — or, perhaps more accurately, if they want their art to be games — they should at least incorporate interactive elements in a way that supplements the supposedly artistic value. Too often, these two aspects of a game just end up sitting side-by-side. Ideally, game developers who want to be artists should just study the art of good design instead of stapling a half-decent game design to a half-decent movie.

All of this is just food for thought, obviously. The question at the heart of all this thought is too subjective (and currently too controversial) for a satisfying answer. If you want an objective definition of the word "art" then I have one observation to share: with few exceptions, a thing is considered art if and only if it was meant to be art, created with artistic intentions by one who fancies oneself an artist. (Whether it's good art is another question entirely.) In other words, the creator does have some say in the matter. In 1915, a Russian guy named Kazimir Malevich painted a black square and called it art, and that black square ended up in an art museum, but that doesn't mean every other black square is also art. And of course, the "consumers" of art also have some say in the matter, because that black square wouldn't have ended up in a museum if nobody else had thought it was worth displaying.

And hey, look, there are video games in an art museum now. It's worth noting that the games featured at MoMA were selected for their ingenious design. They are being appreciated for the qualities that make video games a unique medium, and nothing else. That's a step in the right direction both for gameplay purists and for those who want video games to be taken seriously as an art form. After all, how are video games ever going to get this recognition if the way we're trying to make them more like art is by making them less like games and more like movies? The video game itself cannot be art if individual games only become art by branching out into other established art forms. Indeed the game design itself needs to be recognized as an art form on a fundamental level, with or without all the fancy toppings.

In any case, as with black squares, I would hesitate to hail a video game as a "work of art" if it's known that the developers never had this in mind, but if the developers are passionate about their work and if consumers are passionate about enjoying it, the label fits well enough to elicit no complaints from me. The relevant point, I suppose, is that a video game can be art — the art museum has spoken — and, more importantly, it can still be a good game, too. However, with regard to digital entertainment in which the basic elements typically defining the traditional game are drastically demoted or abandoned entirely in favor of other types of artistic expression, I really think we need to update our terminology. In other words, if Dear Esther isn't a game, it shouldn't be called a "video game" either.

Wrong Direction


The fact that Dear Esther and similar releases are considered by many to be video games is terrifying to the rest of us because it amplifies the perception that these overly cinematic, overly linear, sometimes pretentiously artsy experiences, devoid of any challenge or depth in gameplay, are the future of our hobby.

There are those who really would argue, instead, that Dear Esther is an extreme example of where video games should be headed. Some say that video games should do more than simply challenge the player — that they should convey a deeper meaning and tell a better story — and that's totally fine, as long as we're talking about supplementing the gameplay, not removing it. Otherwise, the argument is really just a roundabout way of saying "I've realized that I don't even like video games and I need something else to do with this controller I bought" — something else like interactive fiction, perhaps. So why don't we make that, and call it that, instead of pushing to change video games into that? Apparently because, even when people realize that all they care about is the storyline, they still seem unusually desperate to call themselves "gamers" despite the fact that their ideal "video game" is hardly a video game at all. They just really want their nerd cred or something.

Perhaps this is what the industry gets for having attempted for so many years to fit deep story and deep gameplay into the same product. The prospect of an interactive story inevitably attracts people who — let's be honest — just aren't interested in playing real video games. I'm referring, of course, to the "casual gamers" who really do see challenge as an unnecessary obstacle that should be removed so that people who aren't any good at video games can still enjoy what's left of them. To be honest, this worries me. If players see challenging gameplay itself as a nuisance, and developers cater to them by making challenging gameplay optional, we're coming awfully close to throwing out one of the most fundamental properties of the video game as we once knew it.

I think we'd all be better off if we just allowed interactive fiction to become its own thing, with its own audience, instead of allowing the entire industry to be dragged in the wrong direction. It seems to be going in that direction either way, in its attempts to hook that casual (non-)gamer audience, but we shouldn't legitimize this by expanding the definition of "video game" to such an extent that people who buy interactive movies get to call themselves gamers.

Wednesday, January 16, 2013

Shooting Inanimate Targets is Disgusting

I wasn't really planning to write more about the aftermath of the Sandy Hook shooting. My other post, written out of frustration when I had too much time on my hands, was more than enough — too much, in fact, since everything worth saying was already said before I got mad enough to say it again. (My apologies for being redundant.)

But, once again, this is just too stupid to ignore.

It seems the country is in an uproar over an iPhone game called NRA: Practice Range, allegedly funded by the National Rifle Association itself (though it was developed by MEDL Mobile). In the game, you shoot at targets in both indoor and outdoor shooting ranges. All of the targets are inanimate objects. Some of them move, some of them don't, but as far as I can tell without buying the app (and according to what I've heard from those who have), none of them bleed. Even so, everyone is pretty angry about it, either because of the timing of the application's release or simply because so many people have never seen a gun in a cheaply made video game before.

Anti-gun activists and half the journalists in the nation have been attacking the NRA relentlessly for a whole month, and this game only made it worse. Personally, I don't care about the NRA, because I don't own any rifles, and I don't care much about this game, because I don't own an iPhone. But it's hard not to notice when the media collectively goes off the deep end and drowns in its own frantic panic-mongering.

I'll just post a nice example that caught my attention earlier today. An opinion piece on NY Daily News, creatively titled Simply app-alling, calls the game "truly sickening" and — echoing the words of NRA executive vice president Wayne LaPierre as he criticized the developers of violent video games — "callous, corrupt and corrupting." It is slightly ironic that an NRA-branded shooting game has appeared so soon after LaPierre uttered those words in his ill-informed attempt to redirect blame for shootings from guns themselves to violence in media (i.e., from one scapegoat to another)... but only slightly.

LaPierre, it would seem, has a problem with violent games, and NRA: Practice Range isn't violent. Literally no violence occurs. So while the untimely release of this game is somewhat ironic and even a bit distasteful, given the context, it certainly isn't an example of "disgusting hypocrisy, profiteering and irresponsibility."

I don't see how it's disgusting at all. It might actually be one of the least violent shooting games ever released. It makes Duck Hunt look like a nightmarish glimpse into the mind of a mass-murdering psychopath.

cute little ducky... BANG

In fact, Duck Hunt would serve as a much better training tool for an aspiring spree killer, since it's played by aiming a plastic gun at the television. Meanwhile, NRA: Practice Range is played by making smudgy thumb prints on a touch screen. If we're going to write about disgusting, appalling, "truly sickening" games, we should probably start with something that actually involves killing, even something as mild as this wholesome Nintendo classic, rather than railing against the one game in which nothing gets hurt.

Somehow, though, I don't think Duck Hunt is soon to be the target of any political poo-slinging. I'd have mentioned instead a few of the many games in which shooting virtual people is the goal, but if today's journalists are so horrified and offended by fake target practice in a fake shooting range, then learning about a game in which you blow fake people's brains out might send the lot of them into a catatonic state. On second thought, that would be pretty nice. I should send them some gameplay footage of Max Payne.

Sunday, January 13, 2013

Jumping in Slow Motion: The Sequel

Recently, I wrote a little about Max Payne, which I'd neglected to play until this winter. Now it's a week later and I've gotten through Max Payne 2: The Fall of Max Payne.

Again, very enjoyable, and again, occasionally hilarious. In my opinion, it wasn't quite as difficult as the first — or, at least, it seemed to require a bit less of that dreaded trial-and-error save scumming in most levels. This is because fewer of the fights are set up in such a way that you'll be blasted as soon as you enter a room and subsequently die for no good reason other than your failure to predict the future (which, of course, was perfectly fine in the first Max Payne as long as you didn't mind saving often and learning things the hard way).

Regardless, I still progressed through Max Payne 2 at a snail's pace, retrying some fights over and over again before moving on, but that was usually by choice rather than by way of humiliating failure. After getting past a particular room, I'd often decide to see if I could do it again without getting shot, or with as few bullets as possible, or just for the sake of using some grenades (which I have a tendency to hoard until I can't carry any more).

The combat itself seemed a lot more forgiving. Either that, or I've just gotten that much less terrible at third-person shooters in general, and Max Payne in particular. So maybe I'm completely wrong, but I did get the impression that the enemies' reaction times were increased, and that the protagonist is a bit more bulletproof than he was before. Bullet time, also, seems considerably more useful this time around.

Maybe I'd be able to offer a better analysis of the differences between the two games if I were to play them both in the same day, perhaps side-by-side for a careful comparison, but I'm less interested in the details than in the overall experience of playing the game. In short, it was a hell of a good time.

The story, though I'd be lying if I said it were even 10% of my motivation for finishing the game, was a lot better than in the original. Spending some time playing as Max Payne's sidekick, nemesis, and love interest Mona Sax was a nice change, too, even if it was only a superficial one. What I really didn't like about the game was the occasional escort mission, the first of which has a player-controlled Mona Sax defending a nearly-helpless computer-controlled Max Payne.

Escort and defense missions are generally pretty terrible in any game, and I failed plenty of times in this particular level — partly because it was such a drastic change of pace, and partly because I vastly overestimated the protagonist's ability to survive when I wasn't controlling him. With Mona Sax high up in a building and Max Payne down with the bad guys on the ground, this part of the game is essentially an exercise in spotting those bad guys before they can start shooting. If Max is getting shot and you don't see the shooter right away, his health is going to drop very quickly.

By contrast, whenever the player is controlling Max Payne and Mona Sax is nearby, she's invincible and doesn't need to be defended. In these particular fights, if you're feeling lazy, you can even hang back and let her wipe out every enemy in the room. I can't help but wonder if the developers sought intentionally to turn the so-called "damsel in distress" cliché on its head with this rather unusual discrepancy. I almost wonder what Anita Sarkeesian thinks of it, and perhaps we'll find out when she finally releases the long-overdue premiere video in her crowd-funded series on sexism in video games, but if she does analyze this game then she'll probably be more interested in the fact that the game's heroine is seen nude or partially undressed on multiple occasions, and might be regarded as little more than an excuse for sex appeal if you can manage to ignore most of the game's plot.

Whatever her purpose in the game, I really started to enjoy Max Payne 2 during the first level in which Mona Sax is the playable character, though I can't say exactly why. Maybe it was the level design. Or maybe it's just that I had finally reached the point where practice pays off and started getting a lot of headshots with the Desert Eagle... which, by the way, might just be the most satisfying weapon in the game. Even blowing up three guys with a grenade isn't nearly as gratifying as efficiently popping each of them in the head as you jump out from cover in slow motion, and even though painting the entire room with bullet holes while wielding dual Ingram machine pistols might be more effective if you're still a beginner, it also gets dull pretty quickly. Part of making an enjoyable shooter is including a selection of weapons that feel powerful (without actually making them so powerful that they break the balance of the game), and this means good sound effects. The Ingram, unfortunately, sounds awful, but the Desert Eagle sounds very nice.

It only does the job when you can manage to hit the bad guys in the face, but that's where you should always be aiming regardless of your weapon of choice. Headshots are everything in Max Payne (which is why I find it so strange that the developer, Remedy Entertainment, went on to create Alan Wake, in which shooting a bad guy in the head is no different from shooting him in the toe... but I guess that's just because Alan Wake really isn't a shooter). Max Payne 2, like its predecessor, very often becomes a game of activating bullet time and then clicking on heads as quickly as you can. Anything else is a waste of bullets.

Now that I've gotten both of Remedy Entertainment's Max Payne games out of my backlog, I can look forward to (eventually) playing Rockstar Games' Max Payne 3. But, as I've mentioned before, I'm in need of a better computer, and I'd rather wait until after I make that purchase so that I don't have to play the game on minimum graphical settings, or at a mediocre frame rate, or both.

Sunday, January 6, 2013

Jumping in Slow Motion: The Game

Last night, I finished Max Payne. (The first one.) It took me a while, because my playing time has been cut drastically by that irritating thing called employment, but I finally finished it.

I'd previously played both games in Remedy Entertainment's newer series, Alan Wake, so I was confident in the developers' ability to create an entertaining third-person shooter. Obviously, though, the two franchises don't really have much in common aside from the presence of firearms, the fact that each game is named after its protagonist, and the amount of time you'll spend looking at the back of each protagonist's head.

Max Payne, for example, is insanely, ludicrously, hilariously violent. It was even a bit awkward, I must admit, to start playing this game less than a week after a particular mass murder which called into question (for some) the effects of violent video games on young people. Personally, I don't think games like this one are harmful at all, but the subject was still on my mind. This is exactly the type of game, I thought to myself, that would start a misinformed media frenzy if it were ever discovered in the bedroom of some kid who had gone and shot a bunch of people.

If someone told me that Max Payne, of all games, might desensitize players to violence and make them more likely to commit violent acts, I'd respond that the game is just too damn silly to have that kind of effect on people. The game doesn't take itself very seriously at all, and while some might say that's part of the problem, there's just no way that anyone could get real ideas about harming people after playing a game in which most of the shooting is done while jumping in slow motion. If a killer in the making were going to get ideas about how to commit a mass murder from a video game (which is unlikely), or if such a person actually wanted to train for such an act using video games (which is extremely unlikely), he or she would probably be found playing a more realistic shooter.

Max Payne has absolutely no respect for the concept of realism, and everything (including the violence and the motivation for violence) is consequently so tongue-in-cheek that, to me, this particular shooter almost seems like a parody of its own genre. (In some parts it even becomes a parody of itself.) Like the gameplay, the story is pretty grim, in a lot of ways — it involves drug use, corruption, government conspiracy, and a ton of murder — but there's enough humor and absurdity in the telling of that story to lighten the mood, just a little, if you're paying attention. More importantly, the ridiculous, contorted expression on Max Payne's unmoving face destroys the serious tone of the opening cinematic. I couldn't hold it in; I had to laugh. In fact, that happened a lot throughout the game, not necessarily because it was meant to be funny but because the stuff happening on my screen was so insane.

The type of carnage that occurs throughout Max Payne is only a couple of steps beyond cartoon violence. There's a bit of blood spray and some flailing when the bad guys fall down, but that's it. They even stay in one piece when they're blown up with grenades, which makes it more hilarious than tragic when a thug tries to toss a grenade at you and ends up killing himself. I should also mention that Max Payne isn't the only one with a goofy face. All of this is due in part to the outdated graphical capabilities of the game, but the developers clearly weren't aiming for realism in any case. The only hint of realism throughout the entire experience is that the protagonist is by no means invincible. If you're playing for the first time, prepare to watch Max Payne die over and over again, sometimes from a single gunshot without any warning.

Part of what made the game interesting for me is that the main character, though certainly a bad ass, is pretty fragile for an action hero. I played through the game on the easiest difficulty setting (because the others are initially locked), and the game still killed me plenty of times. Unlike some other video game protagonists, Max Payne isn't exactly a bullet sponge... and unlike some other video game villains, the ones here aren't always terrible at shooting, so you won't be feeling so great if one of them manages to shoot first. You can fill up your health bar by finding painkillers and eating them like candy — another aspect of the game which is so absurdly unrealistic that it's hard to see it as anything but lighthearted humor — but the painkillers don't work instantaneously. If you start getting hit with bullets, and the guy who fired them isn't already dead, you're in big trouble, and you'll probably be watching your own slow-motion death before you can head for cover.

The reason Max can survive to the end of the game, taking down hundreds of heavily armed bad guys along the way, isn't because he's tougher. It's because of the advantages that you, as a player, have over the game — actual intelligence, the ability to quicksave and try again when you die, and the ability to know what's coming when you do so. The game is very scripted, so it's rather predictable. If you immediately die when you walk into a room because some thug with a shotgun was waiting for you on the left, you'll know to turn left the next time you enter that room. If you get blown up by a grenade that comes flying around the corner as you walk down a hallway, you know that the same grenade will be thrown in the same spot the next time around.

Since playing the game for the first time without dying over and over again is virtually impossible, you'll have to rely heavily on quicksaving. In some places, you'll only succeed through trial and error. You can always improve your reflexes and practice your aiming, but you'll win by knowing where the bad guys are and by turning to shoot them before they appear.

The player also has the ability to jump in slow motion, shooting in mid-air, and this is the game's big gimmick. Having trouble with a particular gun fight? Try using bullet time. While it certainly isn't the end-all game-winning move, as the slow-motion feature in F.E.A.R. was, it's still pretty useful. Then again, the game was designed around it, so it's really all but necessary if you don't want to be hopelessly outgunned. Like quicksaving, it's a crutch, but it's a crutch you'll probably need to lean on in order to beat the game. If you refuse to use those crutches, you'll be putting yourself through a lot of unnecessary punishment.

In any case, despite the occasional frustration, Max Payne was a pretty interesting experience. To be perfectly honest, I didn't expect to like it very much, for some of the game's most noticeable attributes are associated today with bad game design — the scripted enemy behavior, the excessive reliance on a single gameplay gimmick, and what might be called artificial difficulty (in the sense that the player will die frequently through no fault of his or her own and must rely on save scumming to progress). But the game is still fun, in its own way. Furthermore, while much of the game is a thoroughly tongue-in-cheek embrace of senseless ultra-violence, Max Payne is still known for its great writing.

The plot isn't the most imaginative, but the protagonist's monologues — though a bit too heavy on the metaphors — are very well done. They affect the mood and the atmosphere of the game in a way not often seen in shooters, and the neo-noir graphic-novel style storytelling makes the game truly unique.

Obviously, the game is pretty outdated now, so I'm not sure how strongly I should recommend it to the average player. You certainly won't see the beauty in this 2001 game's ugly graphics if you can't compare them to the uglier graphics of the late '90s and earlier. But I hope new and future fans of the series will keep returning to this game despite its visual shortcomings. The franchise is still alive with the past year's release of Max Payne 3, and it's always best to play an entire series in order rather than simply skipping to the newest game.

Then again, I haven't yet played either of the Max Payne sequels, and I honestly can't say whether the latest installment bears any resemblance to the original.