Thursday, February 13, 2014

If You Can't Say Something Nice...

We all know how the grammatically incorrect saying goes: "...don't say nothing at all." It's pretty sound advice for social interaction. If you're not going to be nice to a person, sometimes it's best to leave them alone and mind your own business instead of causing unnecessary emotional pain or picking an unnecessary fight. I don't think, however, that the cute little rabbit from Bambi had criticism of video games in mind when he spoke his words of wisdom.

When I logged into Steam earlier today, I noticed a new feature: a user-generated tagging system complete with personalized recommendations for my account based on the games in my library and the tags attached to them. I also noticed that some of the players tagging games are very openly opinionated. Some games are being tagged as "bad" or one of its many synonyms. Some games are being tagged as "overrated" and BioShock Infinite tops the list. Other examples of criticism-by-tagging are slightly more subtle. The "casual" tag is being placed not only on casual games as they are known in the traditional sense, but also on games deemed too easy by hardcore players who expect legitimate challenge in their games. (Notable examples include the Thief reboot, which is geared toward players who never played the original series or any difficult stealth game, and Call of Duty: Ghosts, the latest in a series of first-person shooters which has long been associated with an immature fan base who would eat garbage if the TV said it was cool.)

Kotaku writer Patricia Hernandez noticed it too, and I don't usually comment every time a Kotaku employee writes something that annoys me — I don't have that kind of time — but on this occasion it will serve as a nice excuse to mention a couple of other things that were already on my mind.

"Trolling is definitely a thing in Steam Tags right now," Hernandez writes, and maybe she's not entirely wrong. Surely some tags are being added just for the sake of annoying the fans of certain games, just for laughs. The tags to which she's referring, though, are the ones that merely express a genuine opinion, like "casual" as applied to Dark Souls and "bad" as applied to Gone Home.

I'm not really sure when the meaning of the word "trolling" shifted so drastically. It used to mean saying inflammatory things for the sole purpose of angering other people, especially when the things being said are completely disingenuous. A good troll pretends to be completely serious when he posts deliberately flawed arguments in the middle of an otherwise intelligent discussion for the sake of disrupting it. He pretends to be dumber than he is, or more ignorant than he is, because this makes his ignorant comments all the more infuriating, and he keeps this up for as long as possible because the game is essentially over when someone figures out that he's "just trolling." Trolls get attention because of the perception that they're incredibly stupid or incredibly horrible in some other way, not because of their actual opinions.

But "trolling" adopted a different meaning in mainstream media soon after the mainstream media (thought they) learned the word, and maybe it's because successful trolls on the internet are so good at hiding their true intentions. The whole "pretending" part is conspicuously missing from the outside world's common understanding of what trolling is. Almost any type of online harassment or use of unkind words is, therefore, called "trolling" even if the supposed trolls are, instead of really trolling, just being completely serious and stating their actual opinions (albeit in a rude manner). Those who use the word "trolling" in this context probably wouldn't see through the ruse if an actual troll were trolling them, so maybe I can't blame them for getting it wrong, but I miss the times when words had meaning.

The past few years, I think, have seen the bastardization of the word come to completion. People are now using the word "trolling" even to refer to the expression of just about any unpopular or unacceptable opinion. We're seeing some of it right here in this Kotaku article.

Let's say I've never played Gone Home, but I tag it with the word "bad" just because I know its fans are likely to have a big cry and act like their human rights are being violated, resulting in a ludicrous and humorous display. That's trolling. If I play it and then tag it with the word "bad" because I genuinely think it's bad and should be categorized as such, I'm merely expressing an opinion. There's an important difference that Hernandez doesn't appear to understand. In fact, I'm really not sure if she knows how opinions work at all. The following is an excerpt from the article that comes right after her "trolling" comment:
Let's start with the "bad" tag. It does have games notorious for being poor—The Walking Dead: Survival Instinct, for example. It's hard to argue the quality of that game. Gone Home, though? It's not everyone's cup of tea, but that doesn't mean it's bad!
Really? It doesn't? One doesn't get to call something bad when one doesn't like it?

Let's analyze this bizarre logic. Tagging one game as "bad" is totally fine because it was unpopular (or because fellow Kotaku writer Kirk Hamilton thought it was bad too), but tagging the other game as "bad" is incorrect because... why? Because the right people (perhaps the kind of people who read Kotaku) like it? Because its perceived relative popularity means the alternate opinion doesn't exist? Hernandez seems to think that each game has an objective amount of badness, and that a certain threshold of badness (or of agreement on its badness) must be crossed before we're allowed to call it bad. In other words, "bad" is not allowed to be an individual person's opinion. That's kind of strange because, if you ask anyone who has a firm grasp on the difference between facts and opinions, the fact that it's "not everyone's cup of tea" does mean it's bad — it's bad in the minds of people who would prefer some other tea in a different cup.

A normal person in Hernandez' position would just say "I disagree with these people," and maybe that's what she's trying to say, but if that's the case then she has a very strange way of saying it. She's not simply stating her own opinion about the game; she's suggesting that some opinions are true while other opinions are false. She's saying it's wrong for anyone to tag Gone Home as a bad game on Steam, not only because the tagging system wasn't necessarily meant for opinions, but more importantly because this particular game wasn't as universally unloved as The Walking Dead: Survival Instinct. She's calling it "trolling" and, whether she knows the actual meaning of the word or not, it makes her look like a gigantic moron.

This is just the latest example of a worrying trend in which people take Thumper's advice about saying not-so-nice things and apply it to everything said about their favorite commercial products. In addition to the misuse of the word "trolling" (as usual), Hernandez' article uses words like "unfair" to describe the negative tags placed on games that she likes. Some of the tags being used are definitely uncool — concisely written spoilers and random profanity, for example — but expressing negative (or otherwise unpopular) opinions about a game is not by any means "unfair" and, in my opinion, even using tags to express these opinions doesn't really amount to abuse of the system. It was designed to let players tag games as they see fit. I guess I shouldn't be surprised that "gaming" "journalists" disagree. After all, they're the ones who make a living pumping out reviews with inflated scores because their advertising revenue depends on it, while they push the notion that players who complain about bad games are just displaying a false sense of entitlement.

The most popular tags for Gone Home right now are "not a game," "walking simulator," and "bad." Hernandez thinks these tags aren't helpful, but they are if a player wants to know if a game is worth buying. Since tags are more visible if they're more popular, even tags like "good" and "bad" are just about as helpful as the user scores on Metacritic. Tags like "not a game" and "walking simulator" are meant to be humorous but they do give players an idea of what Gone Home is like. They're informative even if they're exaggerations. The "not a game" tag is sure to be the most controversial, but people have been accusing Gone Home of not being a game since it was released, and it's a valid criticism. We don't get to say it's unfair just because it hurts the developers' or fans' feelings.

I sincerely hope that Valve doesn't side with the crybabies at Kotaku by moderating all traces of opinion out of the tagging system. If the people running Steam didn't want opinions to show up in user-generated tags, they shouldn't have implemented the feature at all. Games on Steam are already sorted by genre, and they could have just expanded upon this if they only wanted a dry, boring and sterile categorization system devoid of anything subjective.



Update (February 15, 2014):


It looks like Valve is indeed moderating the user-generated tags, and on second thought I really can't blame them for not wanting strongly negative descriptors attached to products on their own store. (Tags were never meant to be used as mini-reviews, so I can't even call it a scandal as long as no one is tampering with the actual user review system.) Apparently tags like "bad" are no more, and even the popular "not a game" tag has vanished. As of right now, though, Gone Home is still classified as a "walking simulator" first and foremost, and I think it's pretty hilarious.

Monday, January 27, 2014

Midlife Crisis, Part 4

I won't pretend that the following is some kind of revelation brought on by my somewhat recent decision to splurge on a shiny new computer. I've known it for a long time, and I've probably mentioned it before, but right now — at the beginning of a new year and the end of a string of posts regarding my PC-building experience and all the second-guessing involved — just seems like a pretty good time to bring it up.

Playing video games doesn't seem nearly as fun as it used to be.

And it's not just because I've grown up. It certainly does have something to do with my lack of free time as an actual adult who works for a living, but it's not a change of my own personality that makes it more difficult to sit back and enjoy what used to be incredibly entertaining. I mean, let's face it, I'm still a kid on the inside. I'm in my mid-20s but I have a blog about video games and I still think playing games (in moderation) is a perfectly good use of time. That alone, according to almost anyone, probably makes me a man-child in a man-man's body. I think I've grown up enough to convince people that I've grown up, but I'd still rather play a video game than read a newspaper, and I'd rather argue about video games than argue about politics. As far as I know, this isn't so abnormal for guys my age. Incidentally, I'm told I belong to an entire generation of man-children who don't know how to be real men — that I'm the product of the downfall of society and that it's probably the internet's fault — but that's a discussion for another day.

The problem isn't that I've grown out of video games. If I really had, there would be no problem at all and I wouldn't care enough to write this. The problem, in fact, is a bunch of problems.

A lot of my friends have grown out of video games, and I have fewer people with whom to enjoy my pastime of choice. Society as a whole thinks I should grow out of video games, so unless I work in the industry (which I do not), I can't openly express my enthusiasm for the medium without inviting all sorts of assumptions about my character (only some of which are true). Have I ever mentioned that I'm using a pseudonym? For the same reason I don't just go ahead and list "gaming" as a skill on my résumé, I don't really want potential employers to find this blog when they do a quick background check on me. It's both irrelevant and potentially damaging. So is most of what I've ever posted on Facebook, so I should really double-check my privacy settings or just delete my account.

Playing video games, in some ways, has become a pretty lonely activity, and it's not just because so many of my friends have left the party. It's also because I'm interested primarily in single-player games these days, and nobody wants to watch me play them, not that I expect them to. Oddly enough, in my youth I felt that video games were very much a spectator sport. Enjoying a new single-player game with my two brothers didn't necessarily involve taking turns, but those were the days when we had nothing better to do on a weekend than watch each other get mad at the final boss in some nearly impossible 2D sidescroller. People watching people play video games isn't actually such a weird thing — search for "Let's Play" on YouTube and see for yourself — but everyone I know who still has any interest in video games only seems to like massively multiplayer online stuff anyway. So screw me and my apparently bad taste, I guess.

To some extent, my trouble with maintaining an interest in any of the recent games I've played might also be a case of unrealistic expectations. In comparing today's games to those of my apparently idyllic childhood, the verdict is always "meh." Feelings of nostalgia make objectivity impossible and I have plenty of those feelings for video games I played when I was 10 years old. Maybe I just think I'm having less fun because nothing can live up to that rose-tinted version of reality. It could also be that, having played so many games over the years, I'm less easily impressed. The first time I played a first-person shooter, it was really fantastic. After a hundred of them, it takes something really crazy to keep me interested.

This post probably wouldn't be complete if I didn't at least half-heartedly entertain the notion that games themselves are actually getting worse, but I think what's more important is that modern games are always made to cater to people who never played a game before. The depth I crave would probably alienate new players, and the mandatory tutorial levels that new players need are boring me.

All of these are contributing factors, but my lack of free time is by far the most significant. I work 40 to 50 hours a week doing something I don't enjoy, and when I get home it's hard to do anything but sleep. I spend the majority of my free time with my girlfriend, who doesn't care that I play video games but doesn't care to play them herself, so it's not something we can do together. When I do get some time alone to waste, it's rarely more than a couple of hours, and rarely can I find the motivation to start a new game when I'm on such a tight schedule. So I just end up browsing the web or loading a save in the middle of something I've already finished. Even on the weekends, the prospect of going back to work on Monday is so distracting that it's hard to enjoy anything at all. You know, I think I'm just depressed. This isn't helping to shrink my rapidly growing backlog of things I really want to play eventually.

Maybe part of me thought that buying a new PC would somehow fix all of this. I guess it has, at least a little, since there are more games that I can actually play and I'm excited about playing them. I have to admit, I've enjoyed replaying Crysis with the graphics turned up to crazy (even though this only gets me around 40 frames per second most of the time). But I haven't had the time or the patience to dive into L.A. Noire or Mirror's Edge or Dead Space, all of which are on my Steam account with only a few minutes logged. Maybe, one of these lazy Sundays, I'll have a chance to make some progress in one of them, and maybe the experience won't be ruined by thoughts of the impending Monday.

Sunday, December 22, 2013

Midlife Crisis, Part 3

The most frustrating thing about having a hobby is that you never really have time for one unless you're unemployed and lonely. For better or for worse, I'm neither. This was the case before I bought my new PC, and it's still the case now that I've gotten most of my games installed on it. There will always be weekends, and I have a few hours of downtime after work each weekday, but it becomes more clear every time a new game is released that I'm going to die of old age before I get to finish every game that I deem worth playing. Such is the price I pay for attempting to have a life on the side.

So far, I've actually spent more time fiddling with my PC than playing games on it. Lately, this fiddling has been the enjoyable kind; I've been installing all the software I need, rearranging my desktop icons like the truly obsessive-compulsive person I am, and more generally setting things up just how I like them. For the first few weekends of my PC's existence, however, I had nothing but trouble.

First, I didn't bother getting a wireless network adapter because a stationary computer should ideally be placed where an ethernet cable can reach it. Unfortunately, I needed the computer to be in another room temporarily. To remedy the situation, I tried using something I already had in my closet — a D-Link wireless USB adapter. It worked pretty well until my network started slowing down or crashing every time I tried to use a lot of bandwidth (e.g., by downloading a Steam game). I'm still not sure what the problem was; maybe there was some kind of incompatibility with the router, or maybe something more complicated was going on. Maybe it was my computer's fault, somehow. Fortunately, I don't really need to figure it out, since I'm using a wired internet connection now and I don't really have any need for Wi-Fi (let alone the D-Link adapter) in the near future.

Other problems included a couple of random blue screen errors (most likely caused by an AMD video card driver which I've updated) and various problems with various games. The original Assassin's Creed, for example, refused to start when I first installed it, and I'm not even sure how I fixed the problem. I'd tried a few things, given up, and turned off the computer, and when I tried launching the game again later, it worked just fine. (Actually, I had to turn on compatibility mode for Windows Vista because I was getting a black screen where the opening cut scene should have been, but that's hardly an issue. As often as compatibility mode fails, it should always be the default first move if an old game does something weird.)

Compatibility mode for Windows 98 / Windows ME was also the initial solution for the Steam version of the original Max Payne, which failed to launch even though the process was visible in the task manager. However, even after the game launched, some of the music was gone and the sound effects were severely messed up. Fortunately, some nice guy created his own patch to fix the problem. It sucks that the original developers of old games like Max Payne aren't willing to invest the time and money to solve these problems themselves (especially when they're still selling these old games alongside their sequels on digital services like Steam), and the amateurs who pick up the slack are true heroes.

I'm reminded of Command & Conquer: The First Decade, a box set of a dozen games from the series. A couple of official patches were released, but not all of the bugs were fixed, so fans started patching it up themselves. The unofficial 1.03 patch, a collection of bug fixes and other features, was absolutely essential for anyone who had this particular Command & Conquer box set. But it's not just the occasional issue with an outdated game that often necessitates a third-party fix.

Now that I have a good computer, my older games don't even come close to pushing the graphics card to its limits, which means most of these games will needlessly run at a frame rate much higher than my monitor's refresh rate. Usually, this just causes screen tearing. In extreme cases, I can even hear what sounds like coil whine, an irritating whistling noise coming from inside the computer (not the speakers). This happens on the main menu screens of F.E.A.R. and some other games, presumably because the computer is able to render thousands of frames per second when there isn't much to display.

Turning on a game's Vsync feature (preferably with triple buffering enabled as well) fixes these problems, but a few of my games don't have a working Vsync feature. Each of the games in the S.T.A.L.K.E.R. trilogy, for example, has an option for Vsync in the settings, but in all three games it does nothing. It's straight-up broken. The optimal solution would be to force Vsync and triple buffering through the control panel software of one's graphics card, but AMD cards can't do this for certain games on Windows 7, and it's my understanding that both Microsoft and AMD are to blame for that. Even with Vsync set to "always on" in Catalyst Control Center, I was getting stupidly high frame rates in S.T.A.L.K.E.R.: Shadow of Chernobyl.

Then I heard about D3DOverrider, a little tool included in an old freeware program called RivaTuner. It's made to enable Vsync and triple buffering in software that's missing one or both options, and it works like a charm. Despite S.T.A.L.K.E.R.'s broken Vsync feature, and despite Catalyst Control Center's inability to fix the problem, D3DOverrider gets the job done. Now I'm getting a fairly consistent 60 frames per second, instead of hundreds of frames in-game and thousands of frames on the menu. No more vertical tearing, and no more quiet-but-irritating coil whine.
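Whether it's done by Vsync, a driver override, or a tool like D3DOverrider, the underlying idea is the same: don't let the render loop run flat out. As a rough sketch (in Python purely for illustration — real games do this inside the engine, usually by waiting on the display's refresh rather than sleeping), a capped loop just sleeps away whatever time is left over each frame:

```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS  # ~16.7 ms per frame

def run_capped_loop(frames_to_run):
    """Render loop that sleeps away the leftover time each frame,
    so the GPU is never asked to draw thousands of frames per second
    (the thing that causes tearing and, on menus, coil whine)."""
    rendered = 0
    while rendered < frames_to_run:
        start = time.perf_counter()
        # render_frame() would go here; a static menu renders almost instantly
        rendered += 1
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_TIME:
            time.sleep(FRAME_TIME - elapsed)  # cap the frame rate
    return rendered
```

This is why an uncapped menu screen can hit thousands of frames per second: with almost nothing to draw, `elapsed` is nearly zero, and without the sleep the loop spins as fast as the hardware allows.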

That other first-person shooter set in a post-apocalyptic Eastern Europe, Metro 2033, has its own share of issues, namely that a lot of useful options don't show up in its menu and have to be toggled on or off by editing a few configuration files in Notepad, and it also appears to have a broken Vsync feature. In this case, not even D3DOverrider appears to be solving the problem. Fortunately, the game's poor optimization means that it doesn't always exceed 60 frames per second at the highest graphics settings anyway, making Vsync mostly unnecessary. People with more powerful systems might have to keep on looking for solutions.
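Editing those configuration files by hand usually means hunting for "key = value" lines in a plain-text file and flipping them. As a small illustration of the kind of script that can automate this — note that the option names below are entirely made up, not Metro 2033's actual settings — a helper might look like:

```python
def set_option(config_text, key, value):
    """Return config_text with `key = value` set: replace the existing
    line for that key if present, otherwise append a new one.
    (Key names used with this helper are hypothetical examples.)"""
    out = []
    found = False
    for line in config_text.splitlines():
        name = line.split("=", 1)[0].strip()
        if name == key:
            out.append(f"{key} = {value}")  # overwrite the existing setting
            found = True
        else:
            out.append(line)
    if not found:
        out.append(f"{key} = {value}")  # setting wasn't there; add it
    return "\n".join(out)
```

Reading the file, passing its text through a few `set_option` calls, and writing it back achieves the same thing as the Notepad routine, minus the squinting.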

All of this is pretty frustrating, but troubleshooting is to be expected when playing games on a PC, especially when the games are relatively old and the operating system is relatively new. I guess I should just be glad that most of the common problems can be solved.

"But if only you'd bought a console," some would say, "your games would just work." That's the favorite argument in favor of consoles. They just work. But now that the short-lived phenomenon of backwards compatibility has gone out the window with PlayStation 4 and Xbox One, I don't think it's a fair argument. Most of the problems with PC games arise when one is trying to have a nostalgic experience by playing an old game on a new system, and the other problems are usually the fault of careless developers.

I guess we should all be glad that PC games work at all, considering that our "gaming computers" are not standardized like all the millions of identical Xbox One and PlayStation 4 consoles. Since I'm not a game developer, I can only imagine how difficult it must be to ensure that a game is going to work consistently on so many hardware configurations. Maybe I shouldn't be so upset that games like S.T.A.L.K.E.R. have a few broken features, or that games like Max Payne continue to be sold without being updated for the current version of Windows. On the other hand, it's harder to forgive professional developers for an imperfect product when presumably amateur developers are able to correct the imperfections without being paid.

Update: It seems that, since I originally wrote this post, S.T.A.L.K.E.R. was actually updated with a frame rate cap of 60 fps. I'm shocked that such an old game was actually updated, to be honest, but apparently some people with expensive computers were burning out their video cards by leaving the game paused (thereby allowing the game to run at hundreds or thousands of frames per second for long periods of time). Terrifying.

Sunday, November 10, 2013

Midlife Crisis, Part 2

My new PC is up and running. All of the parts arrived about a week before Halloween, I put everything together on a Friday night, and I started installing drivers over the weekend. Since then, I've installed and tested a few somewhat-high-performance games, namely Crysis, Alan Wake, Deus Ex: Human Revolution, L.A. Noire, and S.T.A.L.K.E.R.: Shadow of Chernobyl. They all run rather well on the highest graphics settings. I've also played a bit of Metro 2033, which I got for practically nothing from the Humble THQ Bundle last November, and it performs well enough on maximum settings as well. There's some stuttering, but that's probably the result of poor optimization and there might be a fix somewhere.

For obvious reasons, I don't own any truly "next-generation" games at the moment, so I'm not sure what kind of performance I'll get out of those. In any case, however, I'm better off with this new rig than without it. My old PC worked surprisingly well with some games (running the Metro 2033 demo at a playable frame rate on low settings), but it totally failed to work with others (namely L.A. Noire which, for whatever reason, was getting about two frames per second). Games ported to Windows from the upcoming generation of consoles can certainly be expected to work my new PC much harder than anything I've played so far, and I'm looking forward to seeing how it performs. On the other hand, I can't really say I'm looking forward to seeing what my new favorite toy can't do. After all the time spent on this thing, from finding the parts to powering it on, I want to believe it's perfect.

I breathed a sigh of relief when the final parts arrived — with any luck, I wouldn't have to shop for computer parts again for a few years — but there was still plenty of stress ahead of me. The first hiccup was having to return my supposedly new Gigabyte motherboard to Amazon, since the retail box was not sealed and had some rips in the corners. In other words, it looked like it had already been opened, though the parts inside were still in plastic. Despite my complaints, however, the replacement's box was in roughly the same condition, perhaps slightly worse. Again, however, the inner parts were still in plastic.

I don't know if Amazon was trying to screw me by selling me returned hardware as new, or if Gigabyte was to blame, but I figured I could just get it replaced if it was indeed broken or damaged so I decided to use the motherboard anyway. This might prove to be a mistake, but I was getting impatient. Besides, if Amazon couldn't send me a box that looked shiny and new, I'd have to buy it from elsewhere, and I wasn't confident that other sellers would be more trustworthy than one of the biggest online retailers in existence.

So I started building the computer. Long story short, the motherboard was not dead on arrival, and I've been careful to keep all the paperwork I received for warranty purposes in case something happens later. All of the parts, in fact, seem to be working nicely, even the cheap optical drive. The process of actually assembling the computer was quite an experience, though, since I'd never done it before.

Now that I have done it, building another would probably take less than an hour, but this first build took several. Most of that time was spent reading instructions, looking up computer-building tips, and wondering how hard I needed to push to get one part to slide into another. Getting the stock CPU cooler onto the motherboard was particularly terrifying, because there's no way to accomplish this without pushing harder than I ever thought delicate electronics should be pushed. The same was true of installing the processor itself. I was afraid I'd break it, but those fears were unfounded, since I was doing it correctly and there was no other way.

After getting all the parts into the case, I experienced another momentary freak-out when I thought the fans on the case were totally incompatible with the motherboard. (The motherboard had four-pin headers and the fans had three-pin connectors.) I was wrong — they can, in fact, be plugged in — but it doesn't really matter now anyway, because I opted to plug the case fans directly into the power supply instead. My only concern now is that I might have created air bubbles in the thermal paste when installing that troublesome CPU cooler, since I picked it up again after letting it make contact with the top of the processor. So far, however, the temperatures don't seem to be reaching dangerous levels.

Given all the minor difficulties I encountered — all of which could have been much worse with a little bit of bad luck — I completely understand why the path I chose is less traveled than others. Most people buy consoles or pre-built computers instead, and I don't blame them. Consoles, in particular, are super easy; they plug in and work. You don't have to worry about whether a game is compatible as long as it has the right logo on the box. Moreover, they're affordable, and while performance might only be "good enough" instead of great, it's hard to tell when you're sitting on a couch ten feet from the screen.

People who choose PCs over consoles are sometimes seen as elitists in the so-called "gaming" community, and it's probably because some PC users feel the need to participate in the embarrassingly pathetic "console wars" that break out between fans of competing systems. Xbox fans and PlayStation fans like to argue amongst themselves about which console is best, letting their brand loyalty metamorphose into some kind of vendetta against everyone who bought the other product as they collectively provide Microsoft and Sony with all the free advertising they could ever want. But the PC user, whose system is built from various parts by different manufacturers, doesn't necessarily have any brand loyalty unless he has an affinity for AMD over Intel, or vice versa. The stereotypically elitist "PC gamer" thinks he's above the petty squabbling of console owners, but he stoops to their level nonetheless when he proclaims that his PC is better than any console and says not-so-nice things about everybody who bought one. So I'm not going to do that.

It's true that a good computer can outperform any console, because a console is just a specialized computer and it's never made of the best hardware available. For the right price, a PC can surpass a brand new console on the day of release. Even a cheap PC can beat a console in mid-generation, since PC parts continue to improve while consoles stagnate for up to eight years. The PC user, in a way, is right about his system's superiority. That's why console fans who brag about graphics will usually turn around and claim that graphics don't matter once the PC guy joins the discussion. Either that, or they'll pretend it costs over $2000 to build a PC that plays console games at console-equivalent settings, or they'll insist that the only games worth playing are console exclusives.

But there's really no need to grasp at straws so desperately, because consoles do have their purpose. While a PC is good for the hardcore game enthusiast, a console is a much easier solution for casual play, most often for a lower price. A console is a hassle-free, plug-and-play, guaranteed-compatible alternative for the living room. Let's just leave it at that. I might have considered buying a console myself if I weren't in need of a new computer anyway. It was a choice between a console plus a cheap computer, or one good computer, and I chose the latter.

The worst thing about choosing a personal computer over a console is all the second-guessing that comes naturally with an abundance of choice. Now that I have my PC, I won't be buying another for a few years unless something goes terribly wrong, so I won't get to try all the other hardware presently on the market. I guess that's why some people get paid to review this hardware, but there's nothing like first-hand experience, and I'll never be able to make my own comparisons unless I go and buy more parts than I can afford. Console users have fewer decisions to make when buying their hardware, but people are generally happier this way because they don't have to worry as much about making the wrong choice.

As for me, I'll just have to clear my mind of all those what-ifs, and be content with what I have. That is, unless it breaks.