Tuesday, February 26, 2013

The Death of Used Games (Or Not)

I've long speculated that digital rights management doesn't really exist for the sake of stopping software piracy, but rather to stop people from buying used games. After all, DRM almost always fails to stop piracy (especially of single-player games), while the potential second-hand market of any game is completely destroyed as soon as any form of DRM is implemented.

The PC gaming community has largely gotten over this, because we've had to deal with it for so many years. Many of us have even embraced digital distributors like Steam (and, to a lesser extent, Origin), whose downloadable games, forever tied to online accounts, cannot be resold or even returned. Surely we're all aware that this isn't the natural state of things — that when we pay for something, we should own it — but we're quickly moving toward an all-digital era in which the things we buy aren't really "things" at all. Perhaps video games are actually behind the times, in that regard. You can't resell the songs you buy on iTunes, and people stopped complaining about that a long time ago.

However, while people who buy PC games have had plenty of time to digest the prospect of a non-existent used game market and sales that feel more like indefinite rentals (which might need to be "returned" if the company running the authentication server goes bankrupt), console owners have not. They're ignorant of much of this, and from what I've heard, they're about to get a rude awakening.

It's not a new rumor, but it's been stirred up anew thanks to fresh quotes from Eidos co-founder Ian Livingstone, who says (according to Destructoid):
With the next Xbox, you supposedly have to have an internet connection, and the discs are watermarked, whereby once played on one console it won’t play on another. So I think the generation after that will be digital-only.
Whether the games for the next Xbox will each be married to an Xbox Live account, or to the physical hardware inside an Xbox, I'm not really sure. Either way, this is bad news for GameStop, GameFly, and anyone else who wants to make money by selling or renting out used games. If DRM has been the industry's attempt to destroy the used game market, this will be the final nail in the coffin. Certainly, we'll still be able to privately resell some of our non-Xbox games — DRM-free PC games, for example — but with an entire next-generation console taken out of the equation, there won't be enough business to justify having a shelf for pre-owned games at any game store.

I suppose I should point out that I don't really care if GameStop goes out of business or resorts to selling new merchandise exclusively — their prices for used games are awful anyway — but the whole idea of eliminating used games on an entire platform is bad for players as well. Sure, those of us playing PC games have gotten used to this, and we've learned to compromise with the current state of things, but nobody wants another step in the wrong direction.

Nobody except greedy developers and delusional fanboys, I mean.

There are those who will claim that used games are actually bad for the game industry, despite the fact that workers in no other industry complain about second-hand sales of their products. (When was the last time someone told you that buying used furniture is bad for the furniture industry?) They'll even say that buying a used game is just as bad as piracy, because it puts no money in the hands of developers, despite the fact that it does put money back in the hands of people who buy new games... and despite the fact that, while piracy is morally questionable, buying a thing from a person who owns that thing is not. (When was the last time someone told you that buying used furniture is equivalent to robbing a furniture store?)

But I guess we all know which way the industry is going despite what people think. Or... maybe not. Only a few days ago, Eurogamer reported that Sony has taken a stance opposite that of Microsoft, and that the next PlayStation console will not be blocking used games. I can only wonder if this means the PlayStation 4 will sell better than Microsoft's next console. If consumers want to make a statement with their purchases, it likely will. On the other hand, consumers are pretty dumb.

Tuesday, February 12, 2013

So Now Demos Are Bad for the Industry

It looks like Jesse Schell's keynote speech at the D.I.C.E. Summit is causing some minor controversy.

You can watch the whole thing (which runs just over 22 minutes) right here, and I recommend doing so, since the speech is pretty entertaining and informative overall. Schell makes some jokes, discusses the psychology of getting customers to spend money, talks about why Facebook games and microtransactions are terrible, and makes some less-than-optimistic predictions about the future of 3D games and augmented reality.

But most of that is being ignored because of something that occurred about halfway through the speech. In the context of discussing how "plans" affect consumer behavior, Schell presented a graph with data from EEDAR showing that games with a demo (and no trailer) sell better in the first six months than games with neither a demo nor a trailer... and that games with both a demo and a trailer sell even better... and that games with a trailer and no demo sell even better still. In fact, among games with a trailer, those without a demo appear to be selling twice as well. Schell comments:
"Wait, you mean we spent all this money making a demo, and getting it out there, and it cut our sales in half?" Yes, that's exactly what happened to you. Because when you put the demo out — people had seen the trailer and they're like, "that's cool," and they made a plan: "I gotta try that game!" And then when they played the demo: "all right, I've tried that game. That was okay. All right, I'm done." But the things with no demo — you've got to buy it if you want to try it.
This all starts at about 10:24 in the video below.

This minute-long part of the speech, far more than any other, is being picked out for commentary by video game news sites (see articles here and here and here and here and here). While it's regrettable that most of the speech is being overlooked, it should be no surprise that this particular section is a point of controversy. Schell is basically implying that sales can be increased by not releasing a demo, since a player might not want to buy the game anymore after he or she tries it.

A lot of us, understandably, are upset about the idea of game developers forgoing the playable demo for the express purpose of keeping customers uninformed. Such business practices are generally regarded as evil (and I've said as much before when discussing pre-order bonuses and other incentives to invest in games without even knowing how good they are). The reasoning is that players should be allowed to know what they're paying for, and this often means a bit of hands-on experience with the game, especially when a teaser trailer is insufficient. Trailers work for the film industry, but the most important aspect of a typical video game is — not surprisingly — the gameplay, and a trailer tells you little or nothing of that.

We care about how the game is played, how well the difficulty is balanced, and how the controls feel; unfortunately, you can't sample any of those things by watching a pre-rendered, plot-focused, movie-style preview. (This argument for the necessity of the playable demo isn't quite as strong in the case of a sequel, for which the continuation of a storyline is the major selling point, and in which gameplay might be expected to remain largely the same as in the previous release. However, even die-hard fans of a series will want to verify that the gameplay in the upcoming installment is up to par with that of its predecessors. And if it's not, this happens.) By failing to release a demo, a developer denies its customers the chance to accurately gauge the quality of the product first-hand, leaving them to rely on the opinions of professional reviewers and "gaming journalists" whose integrity is questionable, to say the least.

If a developer refuses to release a demo, shouldn't that lack of transparency be troubling? Furthermore, if a developer actually fears that releasing a demo will result in a significant loss of sales, what does that say about their confidence in their own product? Finally, if the release of a demo does in fact lead to fewer sales, doesn't that mean the game wasn't very good? Maybe, just maybe, the release of a demo hurts sales on average because the average game just isn't very good. I'd like to think a fantastic game would only benefit from the release of a demo because people would see how great it is.

Schell, however, thinks it's less about the quality of the game, and more about the player's curiosity being satisfied — that once the player tries the game, even in the form of a demo, he or she will no longer need to buy it. To me, this is pretty unbelievable. When I download a demo, it's because I need to see the gameplay in order to decide whether to buy the game. The demo will make or break that purchase based on the apparent quality of the game alone. If I decide against the purchase, there's no "lost sale" as a result of the demo because, in the absence of a demo, I would never have bought the game at full price just to "try" it. But maybe most consumers just aren't as cautious as I am, and maybe Schell is right that a huge portion of sales is due to nothing but blind impulse.

The numbers, after all, seem to suggest as much. Still, it is exceedingly important, as always, to note that correlation does not necessarily imply causation. Moreover, with nothing to analyze but the data itself, we can't really say which way any existing causation goes. Maybe some developers didn't bother to release demos because they knew their games — perhaps sequels riding on reputation alone — were going to sell regardless. Maybe some other developers released demos to generate interest because they knew their games weren't going to sell as well as those blockbuster sequels. Or maybe that's a stretch.

In any case, while I think it's plainly obvious that Schell is at least partially right, he's being rather dramatic, ignoring a lot of possible contributing factors. To say that the release of a demo is solely responsible for a 50% reduction in sales is clearly an exaggeration. Furthermore, if Schell's statement is to be taken as advice for game developers, it's pretty horrid advice. If a developer does manage to increase the sales of a certain game by declining to release an hour-long playable demo, the only sales gained will be from players who would have gotten tired of the game after an hour. An increase in sales will mean an equal increase in disappointed customers. But the developer doesn't care, right? Hey, kid, no refunds.

So what should we take from all this? Dubious advice and questionable arguments aside, one fact still remains: If a developer releases a trailer and no demo, their game can still sell. We know this not because of the statistics Schell cited, but because we see it happening every day. Regardless of whether the release of a demo can hurt a game's sales, we're certainly not teaching the industry that the release of a demo is a prerequisite for record-breaking pre-orders. Too many of us are buying games without knowing what they're like, and eating up whatever the industry throws at us, probably because some of us don't know any better.

Worse yet, if Schell is right about how much the release of a demo can negatively affect sales — or, more accurately, how the absence of a demo can positively affect sales — it would mean that even "smart" consumers can be pretty dumb. It would mean that even those of us who like to make informed decisions, and try games before we buy them, will buy on impulse when such informed decisions are not possible. I don't doubt that some consumers do this on occasion, but could it really be happening to such an extent that the overall effect of not releasing a demo — of keeping us ignorant — is more profit for the publishers?

I'd like to think not, but if it's true, we only have ourselves to blame.

Wednesday, February 6, 2013

Dear Esther & Video Games as Art

Over the weekend, I played through Dear Esther with my girlfriend. I'd bought it from Steam for $2.49 because I was curious, and I'd sent it to her account because I'd (mistakenly) assumed she might ultimately appreciate it more than I would. Alas, while I was pleasantly surprised, she was bewildered and irritated by the apparent pointlessness of the trek across the virtual island that lay before her.

A bit of background (with links) for the uninformed: Dear Esther was originally released in 2008 as a mod for Half-Life 2. Like so many popular mods these days, it was then remade as a stand-alone release, which was completed in 2012. Unlike most things sold on Steam and discussed on blogs like mine, however, Dear Esther doesn't have any goals or challenges, and you can't win or lose. It doesn't really have any "gameplay" at all because it's not a game in any traditional sense. It's more like virtual reality made love to an audiobook, had an illegitimate child, and left that poor bastard on video games' doorstep because no one else would take him. But I mean that in the nicest possible way.

At its core, Dear Esther isn't much more than an interactive story, and the word "interactive" is being used here rather loosely. What you do is walk around in a virtual world, admire the virtual environment, and listen to sporadic snippets of metaphor-laden monologue over a calming soundtrack and the sounds of the ocean waves.

Given the nature of this particular piece of software, which may or may not be considered a "video game" depending on whom you ask, you might say I'd be just as well off if I'd watched a playthrough on YouTube instead of buying it. But even with such minimal interactivity and such a linear experience overall, I think watching someone else play is a poor substitute and misses the point entirely. Even though there's no real "gameplay" here, I'd still be denying myself nearly everything that does make Dear Esther a unique experience if I were simply to watch some other person decide where to go and what to look at. The monologues are wonderfully written, but the modest potential for free exploration by the player is likely the sole reason that Dear Esther was turned into a "video game" instead of a short film or a short story.

You might also say I could have played the free Half-Life 2 mod instead of buying the stand-alone product, but the newer version just looks so much nicer, and the nice-looking stuff accounts for at least half of the enjoyment.

Despite all of this, paying for Dear Esther might seem like a waste. There's a rather stiff limit to the number of hours of enjoyment you can possibly squeeze out of this product. Although each playthrough is supposedly a bit different, due to some randomization in the playback of monologue passages, this only gives it a little more replay value than a movie, and a single playthrough is considerably shorter than the average movie. (The first playthrough should take no more than 90 minutes. Mine clocked in at exactly 90 minutes, but that included some aimless wandering, graphical tweaking, and even pausing to get drinks.) While I'm guilty of impulsively buying Dear Esther at 75% off, and while I'm content with that decision, I wouldn't be so enthusiastic about paying the full price of $9.99, and I can't honestly recommend doing so.

Missing the Point

That being said, I think there's some unwarranted hostility toward Dear Esther that stems not from its quality or from any of its own merits, but from a misunderstanding of its purpose, and from a rejection of the concept of video games as interactive fiction. "That's the dumbest thing ever" was the response of one friend when he was told what Dear Esther is like. Opinions are opinions, so I can't really debate that point, but I do think the context matters: When this conversation took place, my girlfriend had just mentioned a new "video game" that we'd played. This guy was expecting to hear about a game, but then he heard there was no objective, no challenge, and no real gameplay at all. So, yeah, of course that sounds dumb.

The whole problem, I think, is that Dear Esther is considered and treated as a video game, but this is only for lack of a (commonly used) better term. You could call it "interactive fiction" but that might not be sufficient to fully describe such a product, and I don't see the term catching on as a way to describe these things anyway. Instead, I'm tempted to call it a "video non-game" because it really is, precisely, a video game with the game element removed. Actually, I think this might be the best way to describe it. The strong connection to video games is there, but it doesn't leave us expecting something we're not going to get.

When judged as a video game, Dear Esther might be called a failure, but let's be fair: the same thing happens when you judge Lord of the Rings as a romantic comedy. A valid criticism of Dear Esther should focus on what's there — the writing, the visuals, and the music — rather than obsessing over exactly how it's not a game. Unfortunately, so much of the criticism I've encountered takes the latter route and fails to make a relevant point. I can't say I'm surprised that everyone gets stuck on the non-game aspect, though; after all, we're still pressing buttons to make things happen on a screen. It feels like a game, and that's what makes it feel so wrong.

Experimental and atypical releases such as Passage, Flower, The Graveyard, Universe Sandbox, and Dear Esther seem to be expanding the definition of "video game" by really pushing the boundaries that separate this medium from others, and this seems to be happening regardless of whether the creators of these products even choose to refer to them as games at all. The result is that, while video games used to be a subset of games, they now occupy another space entirely. Dear Esther is, arguably, a "video game" — and most people will probably call it one — but it certainly isn't a game at all. Consequently, if people install it expecting a game, they're in for a disappointment. However, this doesn't make it a bad product. It just makes it something other than a game.

The Newest Art Form

But regardless of whether we choose to call them games, Passage and Dear Esther seem to be at the forefront of the movement to have video games recognized as an art form. It seems good enough, for most people, that these video non-games attempt to be something resembling art while existing in a video game-like format. Just as often as they are criticized for not being game-like enough, they are cited as examples in arguments and discussions over the elevation of video games to the status of art — arguments and discussions which, for better or worse, tend to revolve around those artistically driven (but, importantly, secondary) aspects of the medium: story, graphics, music, et cetera.

Bringing this up on a video game forum is like bringing up politics at Thanksgiving dinner; that is, it's a good way to upset everyone. The idea that a video game, of all things, can actually be art isn't just a problem for video game haters; it's also enough to offend some "hardcore gamers" who reject the very notion that story, graphics, music, and intangible things like atmosphere can add anything of value to the medium. Any attempt to create a video game explicitly as a work of art, which unfortunately is most often done at the expense of the quality or amount of traditional gameplay, is obviously going to upset these people, and — referring again to Dear Esther in particular — the outright and total abandonment of the "game" in "video game" is obviously going to drive them crazy. The existence of Dear Esther itself isn't really such a problem, but the paradoxical notion that video non-games are actually the future of the medium is anathema to "hardcore gamers" everywhere.

To be honest, though, I don't think it should be a surprise that we're moving in this direction after so many years of video games placing ever more emphasis on story, character development, visual effects, and other non-essential, movie-like qualities, often with less focus on conventional gameplay and player freedom. (I think I've discussed such things once or twice before.) Even where core gameplay mechanics have been preserved, video games have already become more like movies (presumably in order to grab larger audiences who might be bored with playing just a game), and maybe we've already passed the point where gameplay mechanics truly become the secondary attraction to the mainstream audience.

Is all of this good or bad? (Does such a distinction exist?) What does the concept of video games as an art form mean for the future of video games? But wait; if we're going to ask that question, we first have to answer a couple of others: Is it even possible for a video game to be a work of art? And should video game developers attempt to be artists? Perhaps these are silly questions — no doubt the idea of treating a video game as a work of art sounds downright ridiculous to a lot of people — but this debate seems to be happening whether we like it or not, so I think it's worth discussing.

To these last two questions, respectively, I'd give a tentative yes and a maybe. Whether a video game created specifically and intentionally as a "work of art" can be good, as a game, is certainly questionable, but if music and literature and acting and photography and, most importantly, film can be treated as art, then... well, I need to be honest: I can't think of a good (objective) reason that video games in general should be excluded. That video games, as a medium, should be considered an art form simply because of how a game can imitate and appropriate other forms of art (i.e., music and acting and writing and film) is a dubious argument at best, but I do believe that a good film would not automatically stop being a work of art if interactive (game-like) elements were added to it. Perhaps the new generation of video games, which are often more movie-like than game-like, should be analyzed this way instead. And if video games, at least in theory, have the potential to be works of art, then perhaps developers should strive for this... right? I guess. Whether they know how is another question entirely, but more on that later.

Comparisons and Analogies

The opposition to the idea of video games as art is largely (but not entirely) from those who don't believe that expensive electronic toys are deserving of whatever respect or elevated status comes along with inclusion in the invisible list of which things are allowed to be considered art. You might similarly argue that Picasso's paintings are not art just because you dislike them. Beyond personal tastes, however, I have to wonder if there's an actual reason for excluding video games when everything else that claims to be art seems to be accepted without much fuss. You can carefully arrange a bunch of garbage and call it art, and other people will call it art as well, as long as you can say with a straight face that the garbage arrangement means something. Or maybe it means nothing, and that's deep. Who cares? It's art if people say it's art.

It's clear, however, that video games are fundamentally different from all other things which are commonly considered art. The whole point of a video game is player interaction. Most art, meanwhile, is meant to be enjoyed passively, and one might even call this a requirement. Such a rule remains unwritten, however, since no one ever had a reason to include the words "passive" and "non-interactive" in the definition of art before video games tried to nudge their way in. Attempts to redefine the word "art" just for the sake of snubbing video games are confusing and unhelpful.

Other arguments against the notion of "video games as art" come from a comparison of video games to more traditional games. Chess is not art, and neither is football. On the other hand, a great amount of creative work, including visual art, goes into the creation of many tabletop games, notably those of the collectible card variety. Furthermore, the entire analogy is rather fallacious; I've already pointed out that video games are, perhaps unfortunately, no longer strictly a subset of games, and they can do things that traditional games cannot.

Some even try to argue that video games cannot be art because they're most often created for profit, or because they're most often created by large development teams in large companies. Obviously, though, these arguments allow indie games to slip through the cracks.

Ultimately, these debates never go anywhere because the definition of art is notoriously fuzzy, subjective, and ever-changing. It all boils down to opinion, and that's okay. Words aren't invented by dictionaries; their definitions come from their usage, not the reverse. Arguing semantics in this case is effectively a dead end, and once you get past all that nonsense, the most commonly cited reason for excluding video games in particular from the art world is simply that we haven't yet seen a video game worthy of the same praise as a Shakespeare play or a Rembrandt painting. The implication is often that we never will, even if no specified rules would exclude video games on principle, because the quality of creative work that goes into the most critically acclaimed video games is still supposedly mediocre at best in comparison to, say, the most critically acclaimed films.

Again, the opinion that video games will never be art doesn't just come from old men like Roger Ebert who have never played a video game. It comes from within the "gaming" community as well, mostly from those "hardcore gamers" who would argue (perhaps correctly) that the industry needs to return to a strong focus on complex and challenging gameplay, and to stop pandering to casual "gamers" with artsy/cinematic nonsense without even a lose state or a hint of any meaningful challenge. Games shouldn't be movies, the hardcore audience likes to say. If you've perceived a significant decline in the quality of video games over the years — that is, I should clarify, a decline in the quality of everything in video games except for graphics — then you'd probably find this a compelling argument, and I would strongly agree. However, if we want to push for better gameplay via an end to the game industry's distracting infatuation with film, then we should just do exactly that. The argument about the video game's status as an art form is a separate one entirely.

Even arguing successfully that video games should not be art doesn't exactly prove that they are not or cannot be art, and even arguing successfully that they are not or cannot be art wouldn't keep them from trying to be art. More importantly, the notion of "art" being discussed here might be the wrong one for this context. It is possible, after all, for games to be a kind of "art" without relying on the imitation or appropriation of the various aspects of other art forms.

Pixels and Squares

It's with some reluctance that I place myself on the pro-art side of the fence, for a number of reasons. First, regarding the more dubious but more common notion of "games as art" by virtue of their essentially movie-like qualities, I must admit that such a definition of art is valid whether or not it's good for the video game industry.

Although I don't think the potential for the inclusion of non-game-like qualities should be the justification for broadly treating the video game medium as an art form, I do think it's fair to treat an individual video game as a work of art based on whatever kind of arguably artistic work was involved in its creation. That is, although I don't think video games should necessarily be praised for how they simply imitate film and other media, the typical modern video game (like a typical film) is the product of many kinds of creative work — music, writing, acting, and of course the visuals which might be hand-drawn or computer-generated — and regardless of the average quality of all this creative work, it's still there. Picasso is still an artist even if you don't like him.

So how can one say that the culmination of all the artistic work that goes into a video game isn't art? I can't think of a non-feelings-based argument to support such a claim. Short of declaring that none of that work is currently done at a level that qualifies as true art (which leaves the door open for better games to qualify as art in the future), the only way out is to say that it ceases being art once it becomes a game — that even though it contains art in various forms, the finished product is not art because its primary function is to provide the player with a challenge or some entertainment. And I think that's a pretty bizarre thing to say.

But let's just go with it. Let's say it's true: video games cannot be art because they're games. Now we get to ask the really interesting question. What happens when the video game evolves to the point where it's no longer a game, as is the case with Dear Esther? Are we then allowed to call it art? And if so, is there really no point along the continuum from Tetris (pure gameplay) to Dear Esther (pure "art") at which games and art do intersect?

Perhaps the right course of action is to reject everything I just wrote and say that Tetris itself is a work of art already. So far, I've followed the typical course of these "video games as art" debates by analyzing the controversial (and perhaps misguided) idea that video games should be considered art by virtue of the way they incorporate other forms of art (e.g., the writing of the story that provides context to the gameplay, the drawing and modeling that result in the game's graphics, and the production of the soundtrack), but you could also argue (and should argue) that a well-designed game is a work of art by virtue of its design, be it elegant or complex or somewhere in between.

That's probably how the concept of the video game as art should be understood in the academic sense. I think games can, and should, be recognized as art for the qualities that actually make them games. The defining feature of the video game as a medium — gameplay — needs to be considered, and perhaps nothing else. If architecture is an art form, then it's not because architects like to hang paintings on the walls of the buildings they design. It's because the talented architect will bring a unique kind of excellence to the actual design of the building itself. The same should be true of video games if they are to achieve that same status.

In truth, regardless of what we might say on occasion about an individual game which incidentally borders on "work of art" territory according to someone's opinion, I think the video game as a medium can never be accepted as an art form unless it is recognized as such for the qualities which make video games what they are. For the video game to be accepted as a form of art, game developers need to do more than paste some audiovisual art on top of some game mechanics. The game design itself — not just the graphics, or the music, or the story — needs to be done at a level that deserves to be called art. If you can remove the interactive elements from a particular game without sacrificing any of what makes that game a work of art, then that game isn't doing anything to promote video games as an art form. It could have done just as well as a movie.

In the colloquial sense, however, most people accept a game as a work of art only if it conveys some meaning and evokes some emotion, and thus pasting audiovisual art on top of game mechanics is perfectly fine. Most video games attempt art status by telling a story, and maybe that's totally legitimate as well. I wouldn't object to classifying Max Payne as a work of art for its narration alone, even though Max Payne achieving "art status" merely by way of its writing does nothing for video games as a medium. In any case, on the subject of video games as art via narrative, I only wish it were done better more often. The typical story-driven game is an alternating sequence of meaningless challenges and non-interactive cut scenes; very often, the two could be separated and each would do better on its own. If developers want their games to be art — or, perhaps more accurately, if they want their art to be games — they should at least incorporate interactive elements in a way that supplements the supposedly artistic value. Too often, these two aspects of a game just end up sitting side by side. Ideally, game developers who want to be artists should just study the art of good design instead of stapling a half-decent game design to a half-decent movie.

All of this is just food for thought, obviously. The question at the heart of all this thought is too subjective (and currently too controversial) for a satisfying answer. If you want an objective definition of the word "art" then I have one observation to share: with few exceptions, a thing is considered art if and only if it was meant to be art, created with artistic intentions by one who fancies oneself an artist. (Whether it's good art is another question entirely.) In other words, the creator does have some say in the matter. In 1915, a Russian guy named Kazimir Malevich painted a black square and called it art, and that black square ended up in an art museum, but that doesn't mean every other black square is also art. And of course, the "consumers" of art also have some say in the matter, because that black square wouldn't have ended up in a museum if nobody else had thought it was worth displaying.

And hey, look, there are video games in an art museum now. It's worth noting that the games featured at MoMA were selected for their ingenious design. They are being appreciated for the qualities that make video games a unique medium, and nothing else. That's a step in the right direction both for gameplay purists and for those who want video games to be taken seriously as an art form. After all, how are video games ever going to get this recognition if the way we're trying to make them more like art is by making them less like games and more like movies? The video game itself cannot be art if individual games only become art by branching out into other established art forms. Indeed the game design itself needs to be recognized as an art form on a fundamental level, with or without all the fancy toppings.

In any case, as with black squares, I would hesitate to hail a video game as a "work of art" if it's known that the developers never had this in mind, but if the developers are passionate about their work and if consumers are passionate about enjoying it, the label fits well enough to elicit no complaints from me. The relevant point, I suppose, is that a video game can be art — the art museum has spoken — and, more importantly, it can still be a good game, too. However, with regard to digital entertainment in which the basic elements typically defining the traditional game are drastically demoted or abandoned entirely in favor of other types of artistic expression, I really think we need to update our terminology. In other words, if Dear Esther isn't a game, it shouldn't be called a "video game" either.

Wrong Direction

The fact that Dear Esther and similar releases are considered video games by many is terrifying to the rest of us because it amplifies the perception that these overly cinematic, overly linear, sometimes pretentiously artsy experiences, devoid of any challenge or depth in gameplay, are the future of our hobby.

There are those who really would argue, instead, that Dear Esther is an extreme example of where video games should be headed. Some say that video games should do more than simply challenge the player — that they should convey a deeper meaning and tell a better story — and that's totally fine, as long as we're talking about supplementing the gameplay, not removing it. Otherwise, the argument is really just a roundabout way of saying "I've realized that I don't even like video games and I need something else to do with this controller I bought" — something else like interactive fiction, perhaps. So why don't we make that, and call it that, instead of pushing to change video games into that? Apparently because, even when people realize that all they care about is the storyline, they still seem unusually desperate to call themselves "gamers" despite the fact that their ideal "video game" is hardly a video game at all. They just really want their nerd cred or something.

Perhaps this is what the industry gets for having attempted for so many years to fit deep story and deep gameplay into the same product. The prospect of an interactive story inevitably attracts people who — let's be honest — just aren't interested in playing real video games. I'm referring, of course, to the "casual gamers" who really do see challenge as an unnecessary obstacle that should be removed so that people who aren't any good at video games can still enjoy what's left of them. To be honest, this worries me. If players see challenging gameplay itself as a nuisance, and developers cater to them by making challenging gameplay optional, we're coming awfully close to throwing out one of the most fundamental properties of the video game as we once knew it.

I think we'd all be better off if we just allowed interactive fiction to become its own thing, with its own audience, instead of allowing the entire industry to be dragged in the wrong direction. It seems to be going in that direction either way, in its attempts to hook that casual (non-)gamer audience, but we shouldn't legitimize this by expanding the definition of "video game" to such an extent that people who buy interactive movies get to call themselves gamers.