Saturday, March 29, 2014

Neat, I'm Insignificant

For some horrible reason, I'm following Polygon on Twitter. Thanks to that mistake, I just came across this brief article about social interaction among us "video gamers" and how we aren't the "antisocial misfits, basement-dwellers and loners" that outsiders supposedly think we are. The headline and the first two paragraphs strongly contest the stereotype about avid video game players. Loners, we're told, are actually the outliers, going against the trend.

Great, right? Sure does sound great. But it's not great, and they're not even convincingly making the case that it's true.

The article — titled Research: Loners are the exception, not the rule, among video gamers — describes a study in which researchers from North Carolina State University, in collaboration with others at York University and the University of Ontario Institute of Technology, observed thousands of video game players and surveyed hundreds more. The results of this study are published in a paper titled Public Displays of Play: Studying Online Games in Physical Settings.

Let's stop there.

Here's what you should be asking yourself: "Online games specifically? Physical settings in particular? Are we really using this study alone to make claims about 'gamers' in general?" Unfortunately, yes. According to Polygon, the findings of the study are based on observations and surveys of MMORPG players at various "public gaming events." If either Polygon or the researchers themselves really do intend to draw conclusions about all consumers of video games based on this information, they're making just about as much sense as some guy who uses limited anecdotal evidence to argue that we are all basement-dwelling antisocial freaks.

It should be no surprise to anyone that the subjects of this study — the fans of massively multiplayer online games who meet up with other fans in person to participate in highly social activities — are highly social. Nobody would suspect they are "loners" of any kind. In other words, this isn't the shocking and counter-intuitive finding that Polygon wants it to be. It's the same kind of obvious, common-sense-confirming, almost insignificant finding that usually comes from sociological studies and experiments.

I'll admit right now that I have neither the time nor the patience to read the study in full, so I'm not sure if this not-so-amazing revelation — that the most socially active type of "gamer" you can imagine is by no means antisocial — is really the only point that the researchers wanted to make. On the other hand, I'm not sure if they really meant to draw outrageous conclusions about all "gamers" either. Judging by the title of the research paper and the abstract, I can only suspect that the conspicuous generalization of the study's findings to all "gamers" was Polygon's own invention.

In a way, I do admire what Polygon is trying to do with this article. They're trying to break down the worst negative stereotypes about "gamers" and to make video games a more socially acceptable hobby by exposing the fallacy of the antisocial, basement-dwelling, lonely game enthusiast. However, the article succeeds only in telling me (erroneously, by the way) that I'm not representative of the "gamer" demographic because, in some ways, I am a loner. I'm not a creepy loner with no friends, but I do most often play video games alone. A lot of people who make a hobby out of video games are loners, and there's nothing wrong with that. We're not necessarily antisocial, and I'd wager that very few of us actually dwell in basements, but it's not crazy to assume that a large percentage of people who might properly be called "gamers" enjoy playing video games at home by themselves and are less socially active, in the traditional sense, than people who prefer football. I'd conduct my own research if I had nothing better to do.

By making sweeping generalizations about "video gamers" based on the results of a study with no consideration of single-player or otherwise non-MMO games (i.e., almost everything) and no consideration of people who only play at home (i.e., almost everyone), Polygon is saying that those convention-going MMO fans are the only "gamers" who count. That's not really fair.

Furthermore, while I do, again, appreciate the effort to promote video games as a more socially acceptable pastime, I'd rather they didn't. Video games are already in a long transition from dorky to mainstream — in fact, that transition is almost complete — and while this does have benefits, it also causes problems. We've all seen what happens when developers and publishers of video games pander to the same demographic that used to shun video game enthusiasts as losers and nerds. People who actually like video games for reasons other than their new-found status as a trendy thing to do — people who, if you'll pardon the hipsterism, actually liked video games before they were cool — are now shunned by the video game industry, which is more interested in marketing to the casual "I've never played a video game before and I hate a good challenge" crowd.

So no, screw Polygon.

While I'm certain that most video game enthusiasts are far more socially active (and generally more "normal") than the antisocial loser stereotype, I'm also certain that this stereotype is outdated and no one actually believes it anymore. We no longer need to use dubious interpretations of potentially biased studies, completely disregarding the majority of actual video game hobbyists, in order to remind people that we're people too.

Thursday, February 13, 2014

If You Can't Say Something Nice...

We all know how the grammatically incorrect saying goes: "...don't say nothing at all." It's pretty sound advice for social interaction. If you're not going to be nice to a person, sometimes it's best to leave them alone and mind your own business instead of causing unnecessary emotional pain or picking an unnecessary fight. I don't think, however, that the cute little rabbit from Bambi had criticism of video games in mind when he spoke his words of wisdom.

When I logged into Steam earlier today, I noticed a new feature: a user-generated tagging system complete with personalized recommendations for my account based on the games in my library and the tags attached to them. I also noticed that some of the players tagging games are very openly opinionated. Some games are being tagged with "bad" and all of its synonyms. Some games are being tagged as "overrated," and BioShock Infinite tops the list. Other examples of criticism-by-tagging are slightly more subtle. The "casual" tag is being placed not only on casual games as they are known in the traditional sense, but also on games deemed too easy by hardcore players who expect legitimate challenge in their games. (Notable examples include the Thief reboot, which is geared toward players who never played the original series or any difficult stealth game, and Call of Duty: Ghosts, the latest in a series of first-person shooters that has long been associated with an immature fan base who would eat garbage if the TV said it was cool.)

Kotaku writer Patricia Hernandez noticed it too, and I don't usually comment every time a Kotaku employee writes something that annoys me — I don't have that kind of time — but on this occasion it will serve as a nice excuse to mention a couple of other things that were already on my mind.

"Trolling is definitely a thing in Steam Tags right now," Hernandez writes, and maybe she's not entirely wrong. Surely some tags are being added just for the sake of annoying the fans of certain games, just for laughs. The tags to which she's referring, though, are the ones that merely express a genuine opinion, like "casual" as applied to Dark Souls and "bad" as applied to Gone Home.

I'm not really sure when the meaning of the word "trolling" shifted so drastically. It used to mean saying inflammatory things for the sole purpose of angering other people, especially when the things being said are completely disingenuous. A good troll pretends to be completely serious when he posts deliberately flawed arguments in the middle of an otherwise intelligent discussion for the sake of disrupting it. He pretends to be dumber than he is, or more ignorant than he is, because this makes his ignorant comments all the more infuriating, and he keeps this up for as long as possible because the game is essentially over when someone figures out that he's "just trolling." Trolls get attention because of the perception that they're incredibly stupid or incredibly horrible in some other way, not because of their actual opinions.

But "trolling" adopted a different meaning in mainstream media soon after the mainstream media (thought they) learned the word, and maybe it's because successful trolls on the internet are so good at hiding their true intentions. The whole "pretending" part is conspicuously missing from the outside world's common understanding of what trolling is. Almost any type of online harassment or use of unkind words is, therefore, called "trolling" even if the supposed trolls are, instead of really trolling, just being completely serious and stating their actual opinions (albeit in a rude manner). Those who use the word "trolling" in this context probably wouldn't see through the ruse if an actual troll were trolling them, so maybe I can't blame them for getting it wrong, but I miss the times when words had meaning.

The past few years, I think, have seen the bastardization of the word come to completion. People are now using the word "trolling" even to refer to the expression of just about any unpopular or unacceptable opinion. We're seeing some of it right here in this Kotaku article.

Let's say I've never played Gone Home, but I tag it with the word "bad" just because I know its fans are likely to have a big cry and act like their human rights are being violated, resulting in a ludicrous and humorous display. That's trolling. If I play it and then tag it with the word "bad" because I genuinely think it's bad and should be categorized as such, I'm merely expressing an opinion. There's an important difference that Hernandez doesn't appear to understand. In fact, I'm really not sure if she knows how opinions work at all. The following is an excerpt from the article that comes right after her "trolling" comment:
Let's start with the "bad" tag. It does have games notorious for being poor—The Walking Dead: Survival Instinct, for example. It's hard to argue the quality of that game. Gone Home, though? It's not everyone's cup of tea, but that doesn't mean it's bad!
Really? It doesn't? One doesn't get to call something bad when one doesn't like it?

Let's analyze this bizarre logic. Tagging one game as "bad" is totally fine because it was unpopular (or because fellow Kotaku writer Kirk Hamilton thought it was bad too), but tagging the other game as "bad" is incorrect because... why? Because the right people (perhaps the kind of people who read Kotaku) like it? Because its perceived relative popularity means the alternate opinion doesn't exist? Hernandez seems to think that each game has an objective amount of badness, and that a certain threshold of badness (or of agreement on its badness) must be crossed before we're allowed to call it bad. In other words, "bad" is not allowed to be an individual person's opinion. That's kind of strange because, if you ask anyone who has a firm grasp on the difference between facts and opinions, the fact that it's "not everyone's cup of tea" does mean it's bad — it's bad in the minds of people who would prefer some other tea in a different cup.

A normal person in Hernandez' position would just say "I disagree with these people," and maybe that's what she's trying to say, but if that's the case then she has a very strange way of saying it. She's not simply stating her own opinion about the game; she's suggesting that some opinions are true while other opinions are false. She's saying it's wrong for anyone to tag Gone Home as a bad game on Steam, not only because the tagging system wasn't necessarily meant for opinions, but more importantly because this particular game wasn't as universally unloved as The Walking Dead: Survival Instinct. She's calling it "trolling" and, whether she knows the actual meaning of the word or not, it makes her look like a gigantic moron.

This is just the latest example of a worrying trend in which people take Thumper's advice about saying not-so-nice things and apply it to everything said about their favorite commercial products. In addition to the misuse of the word "trolling" (as usual), Hernandez' article uses words like "unfair" to describe the negative tags placed on games that she likes. Some of the tags being used are definitely uncool — concisely written spoilers and random profanity, for example — but expressing negative (or otherwise unpopular) opinions about a game is not by any means "unfair" and, in my opinion, even using tags to express these opinions doesn't really amount to abuse of the system. It was designed to let players tag games as they see fit. I guess I shouldn't be surprised that "gaming" "journalists" disagree. After all, they're the ones who make a living pumping out reviews with inflated scores because their advertising revenue depends on it, while they push the notion that players who complain about bad games are just displaying a false sense of entitlement.

The most popular tags for Gone Home right now are "not a game," "walking simulator," and "bad." Hernandez thinks these tags aren't helpful, but they are if a player wants to know if a game is worth buying. Since tags are more visible if they're more popular, even tags like "good" and "bad" are just about as helpful as the user scores on Metacritic. Tags like "not a game" and "walking simulator" are meant to be humorous but they do give players an idea of what Gone Home is like. They're informative even if they're exaggerations. The "not a game" tag is sure to be the most controversial, but people have been accusing Gone Home of not being a game since it was released, and it's a valid criticism. We don't get to say it's unfair just because it hurts the developers' or fans' feelings.

I sincerely hope that Valve doesn't side with the crybabies at Kotaku by moderating all traces of opinion out of the tagging system. If the people running Steam didn't want opinions to show up in user-generated tags, they shouldn't have implemented the feature at all. Games on Steam are already sorted by genre, and they could have just expanded upon this if they only wanted a dry, boring and sterile categorization system devoid of anything subjective.



Update (February 15, 2014):


It looks like Valve is indeed moderating the user-generated tags, and on second thought I really can't blame them for not wanting strongly negative descriptors attached to products on their own store. (Tags were never meant to be used as mini-reviews, so I can't even call it a scandal as long as no one is tampering with the actual user review system.) Apparently tags like "bad" are no more, and even the popular "not a game" tag has vanished. As of right now, though, Gone Home is still classified as a "walking simulator" first and foremost, and I think it's pretty hilarious.

Monday, January 27, 2014

Midlife Crisis, Part 4

I won't pretend that the following is some kind of revelation brought on by my somewhat recent decision to splurge on a shiny new computer. I've known it for a long time, and I've probably mentioned it before, but right now — at the beginning of a new year and the end of a string of posts regarding my PC-building experience and all the second-guessing involved — just seems like a pretty good time to bring it up.

Playing video games doesn't seem nearly as fun as it used to be.

And it's not just because I've grown up. It certainly does have something to do with my lack of free time as an actual adult who works for a living, but it's not a change of my own personality that makes it more difficult to sit back and enjoy what used to be incredibly entertaining. I mean, let's face it, I'm still a kid on the inside. I'm in my mid-20s but I have a blog about video games and I still think playing games (in moderation) is a perfectly good use of time. That alone, according to almost anyone, probably makes me a man-child in a man-man's body. I think I've grown up enough to convince people that I've grown up, but I'd still rather play a video game than read a newspaper, and I'd rather argue about video games than argue about politics. As far as I know, this isn't so abnormal for guys my age. Incidentally, I'm told I belong to an entire generation of man-children who don't know how to be real men — that I'm the product of the downfall of society and that it's probably the internet's fault — but that's a discussion for another day.

The problem isn't that I've grown out of video games. If I really had, there would be no problem at all and I wouldn't care enough to write this. The problem, in fact, is a bunch of problems.

A lot of my friends have grown out of video games, and I have fewer people with whom to enjoy my pastime of choice. Society as a whole thinks I should grow out of video games, so unless I work in the industry (which I do not), I can't openly express my enthusiasm for the medium without inviting all sorts of assumptions about my character (only some of which are true). Have I ever mentioned that I'm using a pseudonym? For the same reason I don't just go ahead and list "gaming" as a skill on my résumé, I don't really want potential employers to find this blog when they do a quick background check on me. It's both irrelevant and potentially damaging. So is most of what I've ever posted on Facebook, so I should really double-check my privacy settings or just delete my account.

Playing video games, in some ways, has become a pretty lonely activity, and it's not just because so many of my friends have left the party. It's also because I'm interested primarily in single-player games these days, and nobody wants to watch me play them, not that I expect them to. Oddly enough, in my youth I felt that video games were very much a spectator sport. Enjoying a new single-player game with my two brothers didn't necessarily involve taking turns, but those were the days when we had nothing better to do on a weekend than watch each other get mad at the final boss in some nearly impossible 2D sidescroller. People watching people play video games isn't actually such a weird thing — search for "Let's Play" on YouTube and see for yourself — but everyone I know who still has any interest in video games only seems to like massively multiplayer online stuff anyway. So screw me and my apparently bad taste, I guess.

To some extent, my trouble with maintaining an interest in any of the recent games I've played might also be a case of unrealistic expectations. When I compare today's games to those of my apparently idyllic childhood, the verdict is always "meh." Feelings of nostalgia make objectivity impossible, and I have plenty of those feelings for video games I played when I was 10 years old. Maybe I just think I'm having less fun because nothing can live up to that rose-tinted version of reality. It could also be that, having played so many games over the years, I'm less easily impressed. The first time I played a first-person shooter, it was really fantastic. After a hundred of them, it takes something really crazy to keep me interested.

This post probably wouldn't be complete if I didn't at least half-heartedly entertain the notion that games themselves are actually getting worse, but I think what's more important is that modern games are always made to cater to people who have never played a game before. The depth I crave would probably alienate new players, and the mandatory tutorial levels that new players need are boring me.

All of these are contributing factors, but my lack of free time is by far the most significant. I work 40 to 50 hours a week doing something I don't enjoy, and when I get home it's hard to do anything but sleep. I spend the majority of my free time with my girlfriend, who doesn't care that I play video games but doesn't care to play them herself, so it's not something we can do together. When I do get some time alone to waste, it's rarely more than a couple of hours, and rarely can I find the motivation to start a new game when I'm on such a tight schedule. So I just end up browsing the web or loading a save in the middle of something I've already finished. Even on the weekends, the prospect of going back to work on Monday is so distracting that it's hard to enjoy anything at all. You know, I think I'm just depressed. This isn't helping to shrink my rapidly growing backlog of things I really want to play eventually.

Maybe part of me thought that buying a new PC would somehow fix all of this. I guess it has, at least a little, since there are more games that I can actually play and I'm excited about playing them. I have to admit, I've enjoyed replaying Crysis with the graphics turned up to crazy (even though this only gets me around 40 frames per second most of the time). But I haven't had the time or the patience to dive into L.A. Noire or Mirror's Edge or Dead Space, all of which are on my Steam account with only a few minutes logged. Maybe, one of these lazy Sundays, I'll have a chance to make some progress in one of them, and maybe the experience won't be ruined by thoughts of the impending Monday.

Sunday, December 22, 2013

Midlife Crisis, Part 3

The most frustrating thing about having a hobby is that you never really have time for one unless you're unemployed and lonely. For better or for worse, I'm neither. This was the case before I bought my new PC, and it's still the case now that I've gotten most of my games installed on it. There will always be weekends, and I have a few hours of downtime after work each weekday, but it becomes more clear every time a new game is released that I'm going to die of old age before I get to finish every game that I deem worth playing. Such is the price I pay for attempting to have a life on the side.

So far, I've actually spent more time fiddling with my PC than playing games on it. Lately, this fiddling has been the enjoyable kind; I've been installing all the software I need, rearranging my desktop icons like the truly obsessive-compulsive person I am, and more generally setting things up just how I like them. For the first few weekends of my PC's existence, however, I had nothing but trouble.

First, I didn't bother getting a wireless network adapter because a stationary computer should ideally be placed where an ethernet cable can reach it. Unfortunately, I needed the computer to be in another room temporarily. To remedy the situation, I tried using something I already had in my closet — a D-Link wireless USB adapter. It worked pretty well until my network started slowing down or crashing every time I tried to use a lot of bandwidth (e.g., by downloading a Steam game). I'm still not sure what the problem was; maybe there was some kind of incompatibility with the router, or maybe something more complicated was going on. Maybe it was my computer's fault, somehow. Fortunately, I don't really need to figure it out, since I'm using a wired internet connection now and I don't really have any need for Wi-Fi (let alone the D-Link adapter) in the near future.

Other problems included a couple of random blue screen errors (most likely caused by an AMD video card driver which I've since updated) and various problems with various games. The original Assassin's Creed, for example, refused to start when I first installed it, and I'm not even sure how I fixed the problem. I'd tried a few things, given up, and turned off the computer, and when I tried launching the game again later, it worked just fine. (Actually, I had to turn on compatibility mode for Windows Vista because I was getting a black screen where the opening cut scene should have been, but that's hardly an issue. Even though compatibility mode often fails, it should always be the first thing to try when an old game does something weird.)
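For anyone who'd rather script that than click through the Properties dialog: compatibility mode is just a per-user registry value under the hood, and you can write it yourself. Below is a minimal sketch, assuming Python 3 on Windows; the game path is a made-up example, so point it at whichever executable is actually misbehaving.

```python
# Minimal sketch: set the same per-user registry value that the
# Properties > Compatibility dialog sets for a given executable.
# The path below is a made-up example, not my actual install location.
import winreg

GAME_EXE = r"C:\Games\AssassinsCreed\AssassinsCreed_Dx9.exe"  # hypothetical path
LAYERS_KEY = r"Software\Microsoft\Windows NT\CurrentVersion\AppCompatFlags\Layers"

with winreg.CreateKey(winreg.HKEY_CURRENT_USER, LAYERS_KEY) as key:
    # "VISTARTM" requests Windows Vista compatibility; "WIN98" would request
    # Windows 98 / ME, which comes up again with Max Payne below.
    winreg.SetValueEx(key, GAME_EXE, 0, winreg.REG_SZ, "VISTARTM")
```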

Compatibility mode for Windows 98 / Windows ME was also the initial solution for the Steam version of the original Max Payne, which failed to launch even though the process was visible in the task manager. However, even after the game launched, some of the music was gone and the sound effects were severely messed up. Fortunately, some nice guy created his own patch to fix the problem. It sucks that the original developers of old games like Max Payne aren't willing to invest the time and money to solve these problems themselves (especially when they're still selling these old games alongside their sequels on digital services like Steam), and the amateurs who pick up the slack are true heroes.

I'm reminded of Command & Conquer: The First Decade, a box set of a dozen games from the series. A couple of official patches were released, but not all of the bugs were fixed, so fans started patching it up themselves. The unofficial 1.03 patch, a collection of bug fixes and other features, was absolutely essential for anyone who had this particular Command & Conquer box set. But it's not just the occasional issue with an outdated game that often necessitates a third-party fix.

Now that I have a good computer, my older games don't even come close to pushing the graphics card to its limits, which means most of these games will needlessly run at a frame rate much higher than my monitor's refresh rate. Usually, this just causes screen tearing. In extreme cases, I can even hear what sounds like coil whine, an irritating whistling noise coming from inside the computer (not the speakers). This happens on the main menu screens of F.E.A.R. and some other games, presumably because the computer is able to render thousands of frames per second when there isn't much to display.

Turning on a game's Vsync feature (preferably with triple buffering enabled as well) fixes these problems, but a few of my games don't have a working Vsync feature. Each of the games in the S.T.A.L.K.E.R. trilogy, for example, has an option for Vsync in the settings, but in all three games it does nothing. It's straight-up broken. The optimal solution would be to force Vsync and triple buffering through the control panel software of one's graphics card, but AMD cards can't do this for certain games on Windows 7, and it's my understanding that both Microsoft and AMD are to blame for that. Even with Vsync set to "always on" in Catalyst Control Center, I was getting stupidly high frame rates in S.T.A.L.K.E.R.: Shadow of Chernobyl.

Then I heard about D3DOverrider, a little tool included in an old freeware program called RivaTuner. It's made to enable Vsync and triple buffering in software that's missing one or both options, and it works like a charm. Despite S.T.A.L.K.E.R.'s broken Vsync feature, and despite Catalyst Control Center's inability to fix the problem, D3DOverrider gets the job done. Now I'm getting a fairly consistent 60 frames per second, instead of hundreds of frames in-game and thousands of frames on the menu. No more vertical tearing, and no more quiet-but-irritating coil whine.

That other first-person shooter set in a post-apocalyptic Eastern Europe, Metro 2033, has its own share of issues: a lot of useful options don't show up in its menu and have to be toggled on or off by editing a few configuration files in Notepad, and its Vsync feature also appears to be broken. In this case, not even D3DOverrider seems to solve the problem. Fortunately, the game's poor optimization means that it doesn't always exceed 60 frames per second at the highest graphics settings anyway, making Vsync mostly unnecessary. People with more powerful systems might have to keep looking for solutions.
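If editing those files in Notepad gets old, the same toggle is easy to script. Here's a rough sketch; the config location and the "r_vsync" option name are guesses for the sake of illustration, so check the actual file before trusting either one.

```python
# Rough sketch: flip a key/value option in a plain-text config file, the same
# edit you'd otherwise make by hand in Notepad. The file path and the option
# name ("r_vsync") are assumptions for illustration, not verified against
# Metro 2033's real config.
import os

def set_option(cfg_path, key, value):
    with open(cfg_path, "r") as f:
        lines = f.read().splitlines()
    new_line = key + " " + value
    for i, line in enumerate(lines):
        if line.split() and line.split()[0] == key:
            lines[i] = new_line          # overwrite the existing setting
            break
    else:
        lines.append(new_line)           # or append it if the game omitted it
    with open(cfg_path, "w") as f:
        f.write("\n".join(lines) + "\n")

cfg = os.path.expanduser(r"~\AppData\Local\4A Games\Metro 2033\user.cfg")
set_option(cfg, "r_vsync", "on")
```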

All of this is pretty frustrating, but troubleshooting is to be expected when playing games on a PC, especially when the games are relatively old and the operating system is relatively new. I guess I should just be glad that most of the common problems can be solved.

"But if only you'd bought a console," some would say, "your games would just work." That's the favorite argument in favor of consoles. They just work. But now that the short-lived phenomenon of backwards compatibility has gone out the window with PlayStation 4 and Xbox One, I don't think it's a fair argument. Most of the problems with PC games arise when one is trying to have a nostalgic experience by playing an old game on a new system, and the other problems are usually the fault of careless developers.

I guess we should all be glad that PC games work at all, considering that our "gaming computers" are not standardized like all the millions of identical Xbox One and PlayStation 4 consoles. Since I'm not a game developer, I can only imagine how difficult it must be to ensure that a game is going to work consistently on so many hardware configurations. Maybe I shouldn't be so upset that games like S.T.A.L.K.E.R. have a few broken features, or that games like Max Payne continue to be sold without being updated for the current version of Windows. On the other hand, it's harder to forgive professional developers for an imperfect product when presumably amateur developers are able to correct the imperfections without being paid.

Update: It seems that, since I originally wrote this post, S.T.A.L.K.E.R. was actually updated with a frame rate cap of 60 fps. I'm shocked that such an old game was actually updated, to be honest, but apparently some people with expensive computers were burning out their video cards by leaving the game paused (thereby allowing the game to run at hundreds or thousands of frames per second for long periods of time). Terrifying.
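For anyone curious what a cap like that actually does: after each frame, the engine just waits out whatever is left of the roughly 16.7-millisecond budget instead of immediately starting the next frame. Here's a toy sketch of the idea; render_frame() is a stand-in for the real rendering work, not anything from an actual engine.

```python
# Toy illustration of a 60 fps cap: sleep off the remainder of each frame's
# time budget so the loop can't spin thousands of times per second on a
# pause menu. render_frame() is a placeholder, not a real engine call.
import time

TARGET_FPS = 60
FRAME_BUDGET = 1.0 / TARGET_FPS          # roughly 16.7 milliseconds

def render_frame():
    pass                                 # stand-in for the actual rendering work

for _ in range(TARGET_FPS * 5):          # run the demo loop for about 5 seconds
    start = time.perf_counter()
    render_frame()
    elapsed = time.perf_counter() - start
    if elapsed < FRAME_BUDGET:
        # On a pause screen, render_frame() returns almost instantly, so without
        # this sleep the hardware would be asked for hundreds or thousands of
        # frames per second, which is exactly the coil-whine scenario above.
        time.sleep(FRAME_BUDGET - elapsed)
```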