Sunday, December 22, 2013
Midlife Crisis, Part 3
The most frustrating thing about having a hobby is that you never really have time for one unless you're unemployed and lonely. For better or for worse, I'm neither. This was the case before I bought my new PC, and it's still the case now that I've gotten most of my games installed on it. There will always be weekends, and I have a few hours of downtime after work each weekday, but it becomes clearer every time a new game is released that I'm going to die of old age before I get to finish every game that I deem worth playing. Such is the price I pay for attempting to have a life on the side.
So far, I've actually spent more time fiddling with my PC than playing games on it. Lately, this fiddling has been the enjoyable kind; I've been installing all the software I need, rearranging my desktop icons like the truly obsessive-compulsive person I am, and more generally setting things up just how I like them. For the first few weekends of my PC's existence, however, I had nothing but trouble.
First, I didn't bother getting a wireless network adapter because a stationary computer should ideally be placed where an ethernet cable can reach it. Unfortunately, I needed the computer to be in another room temporarily. To remedy the situation, I tried using something I already had in my closet — a D-Link wireless USB adapter. It worked pretty well until my network started slowing down or crashing every time I tried to use a lot of bandwidth (e.g., by downloading a Steam game). I'm still not sure what the problem was; maybe there was some kind of incompatibility with the router, or maybe something more complicated was going on. Maybe it was my computer's fault, somehow. Fortunately, I don't really need to figure it out, since I'm using a wired internet connection now and I don't really have any need for Wi-Fi (let alone the D-Link adapter) in the near future.
Other problems included a couple of random blue screen errors (most likely caused by an AMD video card driver, which I've since updated) and various problems with various games. The original Assassin's Creed, for example, refused to start when I first installed it, and I'm not even sure how I fixed the problem. I'd tried a few things, given up, and turned off the computer, and when I tried launching the game again later, it worked just fine. (Actually, I had to turn on compatibility mode for Windows Vista because I was getting a black screen where the opening cut scene should have been, but that's hardly an issue. As often as compatibility mode fails, it should always be the default first move if an old game does something weird.)
Compatibility mode for Windows 98 / Windows ME was also the initial solution for the Steam version of the original Max Payne, which failed to launch even though the process was visible in the task manager. However, even after the game launched, some of the music was gone and the sound effects were severely messed up. Fortunately, some nice guy created his own patch to fix the problem. It sucks that the original developers of old games like Max Payne aren't willing to invest the time and money to solve these problems themselves (especially when they're still selling these old games alongside their sequels on digital services like Steam), and the amateurs who pick up the slack are true heroes.
I'm reminded of Command & Conquer: The First Decade, a box set of a dozen games from the series. A couple of official patches were released, but not all of the bugs were fixed, so fans started patching it up themselves. The unofficial 1.03 patch, a collection of bug fixes and other features, was absolutely essential for anyone who had this particular Command & Conquer box set. But it's not just the occasional issue with an outdated game that often necessitates a third-party fix.
Now that I have a good computer, my older games don't even come close to pushing the graphics card to its limits, which means most of these games will needlessly run at a frame rate much higher than my monitor's refresh rate. Usually, this just causes screen tearing. In extreme cases, I can even hear what sounds like coil whine, an irritating whistling noise coming from inside the computer (not the speakers). This happens on the main menu screens of F.E.A.R. and some other games, presumably because the computer is able to render thousands of frames per second when there isn't much to display.
Turning on a game's Vsync feature (preferably with triple buffering enabled as well) fixes these problems, but a few of my games don't have a working Vsync feature. Each of the games in the S.T.A.L.K.E.R. trilogy, for example, has an option for Vsync in the settings, but in all three games it does nothing. It's straight-up broken. The optimal solution would be to force Vsync and triple buffering through the control panel software of one's graphics card, but AMD cards can't do this for certain games on Windows 7, and it's my understanding that both Microsoft and AMD are to blame for that. Even with Vsync set to "always on" in Catalyst Control Center, I was getting stupidly high frame rates in S.T.A.L.K.E.R.: Shadow of Chernobyl.
Then I heard about D3DOverrider, a little tool included in an old freeware program called RivaTuner. It's made to enable Vsync and triple buffering in software that's missing one or both options, and it works like a charm. Despite S.T.A.L.K.E.R.'s broken Vsync feature, and despite Catalyst Control Center's inability to fix the problem, D3DOverrider gets the job done. Now I'm getting a fairly consistent 60 frames per second, instead of hundreds of frames in-game and thousands of frames on the menu. No more vertical tearing, and no more quiet-but-irritating coil whine.
That other first-person shooter set in a post-apocalyptic Eastern Europe, Metro 2033, has its own share of issues. A lot of useful options don't show up in its menu and have to be toggled on or off by editing a few configuration files in Notepad, and the game also appears to have a broken Vsync feature. In this case, not even D3DOverrider seems to solve the problem. Fortunately, the game's poor optimization means that it doesn't always exceed 60 frames per second at the highest graphics settings anyway, making Vsync mostly unnecessary. People with more powerful systems might have to keep on looking for solutions.
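For anyone wondering what that config editing actually involves: the file is plain text, and as I recall it's just a list of key-and-value lines, so the typical fix is a one-line change you can make in Notepad or script in a few lines. Below is a rough sketch of that edit as a tiny script, purely for illustration. I'm going from memory here, so treat the file path and the field-of-view key as assumptions that may not match your version of the game, and check a current tweak guide before trusting either of them.
# A minimal sketch, not a definitive fix: the user.cfg location and the
# "sick_fov" key (the hidden field-of-view setting, as I remember it) are
# assumptions from memory and may differ in your install.
import os

cfg_path = os.path.expandvars(r"%LOCALAPPDATA%\4A Games\Metro 2033\user.cfg")

with open(cfg_path) as f:
    lines = f.read().splitlines()

# Change the value if the key is already there; append it otherwise.
key, new_value = "sick_fov", "65"
found = False
for i, line in enumerate(lines):
    if line.strip().startswith(key):
        lines[i] = f"{key} {new_value}"
        found = True

if not found:
    lines.append(f"{key} {new_value}")

with open(cfg_path, "w") as f:
    f.write("\n".join(lines) + "\n")
Doing it by hand is the same thing without the ceremony: open the file, find the line, change the value, save.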
All of this is pretty frustrating, but troubleshooting is to be expected when playing games on a PC, especially when the games are relatively old and the operating system is relatively new. I guess I should just be glad that most of the common problems can be solved.
"But if only you'd bought a console," some would say, "your games would just work." That's the favorite argument in favor of consoles. They just work. But now that the short-lived phenomenon of backwards compatibility has gone out the window with PlayStation 4 and Xbox One, I don't think it's a fair argument. Most of the problems with PC games arise when one is trying to have a nostalgic experience by playing an old game on a new system, and the other problems are usually the fault of careless developers.
I guess we should all be glad that PC games work at all, considering that our "gaming computers" are not standardized like all the millions of identical Xbox One and PlayStation 4 consoles. Since I'm not a game developer, I can only imagine how difficult it must be to ensure that a game is going to work consistently on so many hardware configurations. Maybe I shouldn't be so upset that games like S.T.A.L.K.E.R. have a few broken features, or that games like Max Payne continue to be sold without being updated for the current version of Windows. On the other hand, it's harder to forgive professional developers for an imperfect product when presumably amateur developers are able to correct the imperfections without being paid.
Update: It seems that, since I originally wrote this post, S.T.A.L.K.E.R. was actually updated with a frame rate cap of 60 fps. I'm shocked that such an old game was actually updated, to be honest, but apparently some people with expensive computers were burning out their video cards by leaving the game paused (thereby allowing the game to run at hundreds or thousands of frames per second for long periods of time). Terrifying.
Labels: assassin's creed, command and conquer, f.e.a.r., max payne, metro, s.t.a.l.k.e.r.
Sunday, November 10, 2013
Midlife Crisis, Part 2
My new PC is up and running. All of the parts arrived about a week before Halloween, I put everything together on a Friday night, and I started installing drivers over the weekend. Since then, I've installed and tested a few somewhat-high-performance games, namely Crysis, Alan Wake, Deus Ex: Human Revolution, L.A. Noire, and S.T.A.L.K.E.R.: Shadow of Chernobyl. They all run rather well on the highest graphics settings. I've also played a bit of Metro 2033, which I got for practically nothing from the Humble THQ Bundle last November, and it performs well enough on maximum settings as well. There's some stuttering, but that's probably the result of poor optimization and there might be a fix somewhere.
For obvious reasons, I don't own any truly "next-generation" games at the moment, so I'm not sure what kind of performance I'll get out of those. In any case, however, I'm better off with this new rig than without it. My old PC worked surprisingly well with some games (running the Metro 2033 demo at a playable frame rate on low settings), but it totally failed to work with others (namely L.A. Noire which, for whatever reason, was getting about two frames per second). Games ported to Windows from the upcoming generation of consoles can certainly be expected to work my new PC much harder than anything I've played so far, and I'm looking forward to seeing how it performs. On the other hand, I can't really say I'm looking forward to seeing what my new favorite toy can't do. After all the time spent on this thing, from finding the parts to powering it on, I want to believe it's perfect.
I breathed a sigh of relief when the final parts arrived — with any luck, I wouldn't have to shop for computer parts again for a few years — but there was still plenty of stress ahead of me. The first hiccup was having to return my supposedly new Gigabyte motherboard to Amazon, since the retail box was not sealed and had some rips in the corners. In other words, it looked like it had already been opened, though the parts inside were still in plastic. Despite my complaints, the replacement's box was in roughly the same condition, perhaps slightly worse, though once again the inner parts were still in plastic.
I don't know if Amazon was trying to screw me by selling me returned hardware as new, or if Gigabyte was to blame, but I figured I could just get it replaced if it was indeed broken or damaged, so I decided to use the motherboard anyway. This might prove to be a mistake, but I was getting impatient. Besides, if Amazon couldn't send me a box that looked shiny and new, I'd have to buy it from elsewhere, and I wasn't confident that other sellers would be more trustworthy than one of the biggest online retailers in existence.
So I started building the computer. Long story short, the motherboard was not dead on arrival, and I've been careful to keep all the paperwork I received for warranty purposes in case something happens later. All of the parts, in fact, seem to be working nicely, even the cheap optical drive. The process of actually assembling the computer was quite an experience, though, since I'd never done it before.
Now that I have done it, building another would probably take less than an hour, but this first build took several. Most of that time was spent reading instructions, looking up computer-building tips, and wondering how hard I needed to push to get one part to slide into another. Getting the stock CPU cooler onto the motherboard was particularly terrifying, because there's no way to accomplish this without pushing harder than I ever thought delicate electronics should be pushed. The same was true of installing the processor itself. I was afraid I'd break it, but those fears were unfounded, since I was doing it correctly and there was no other way.
After getting all the parts into the case, I experienced another momentary freak-out when I thought the fans on the case were totally incompatible with the motherboard. (The motherboard had four-pin headers and the fans had three-pin connectors.) I was wrong — they can, in fact, be plugged in — but it doesn't really matter now anyway, because I opted to plug the case fans directly into the power supply instead. My only concern now is that I might have created air bubbles in the thermal paste when installing that troublesome CPU cooler, since I picked it up again after letting it make contact with the top of the processor. So far, however, the temperatures don't seem to be reaching dangerous levels.
Given all the minor difficulties I encountered — all of which could have been much worse with a little bit of bad luck — I completely understand why the path I chose is less traveled than others. Most people buy consoles or pre-built computers instead, and I don't blame them. Consoles, in particular, are super easy; they plug in and work. You don't have to worry about whether a game is compatible as long as it has the right logo on the box. Moreover, they're affordable, and while performance might only be "good enough" instead of great, it's hard to tell when you're sitting on a couch ten feet from the screen.
People who choose PCs over consoles are sometimes seen as elitists in the so-called "gaming" community, and it's probably because some PC users feel the need to participate in the embarrassingly pathetic "console wars" that break out between fans of competing systems. Xbox fans and PlayStation fans like to argue amongst themselves about which console is best, letting their brand loyalty metamorphose into some kind of vendetta against everyone who bought the other product as they collectively provide Microsoft and Sony with all the free advertising they could ever want. But the PC user, whose system is built from various parts by different manufacturers, doesn't necessarily have any brand loyalty unless he has an affinity for AMD over Intel, or vice versa. The stereotypically elitist "PC gamer" thinks he's above the petty squabbling of console owners, but he stoops to their level nonetheless when he proclaims that his PC is better than any console and says not-so-nice things about everybody who bought one. So I'm not going to do that.
It's true that a good computer can outperform any console, because a console is just a specialized computer and it's never made of the best hardware available. For the right price, a PC can surpass a brand new console on the day of release. Even a cheap PC can beat a console in mid-generation, since PC parts continue to improve while consoles stagnate for up to eight years. The PC user, in a way, is right about his system's superiority. That's why console fans who brag about graphics will usually turn around and claim that graphics don't matter once the PC guy joins the discussion. Either that, or they'll pretend it costs over $2000 to build a PC that plays console games at console-equivalent settings, or they'll insist that the only games worth playing are console exclusives.
But there's really no need to grasp at straws so desperately, because consoles do have their purpose. While a PC is good for the hardcore game enthusiast, a console is a much easier solution for casual play, most often for a lower price. A console is a hassle-free, plug-and-play, guaranteed-compatible alternative for the living room. Let's just leave it at that. I might have considered buying a console myself if I weren't in need of a new computer anyway. It was a choice between a console plus a cheap computer, or one good computer, and I chose the latter.
The worst thing about choosing a personal computer over a console is all the second-guessing that comes naturally with an abundance of choice. Now that I have my PC, I won't be buying another for a few years unless something goes terribly wrong, so I won't get to try all the other hardware presently on the market. I guess that's why some people get paid to review this hardware, but there's nothing like first-hand experience, and I'll never be able to make my own comparisons unless I go and buy more parts than I can afford. Console users have fewer decisions to make when buying their hardware, but people are generally happier this way because they don't have to worry as much about making the wrong choice.
As for me, I'll just have to clear my mind of all those what-ifs, and be content with what I have. That is, unless it breaks.
Labels: alan wake, crysis, deus ex, l.a. noire, metro, midlife crisis, s.t.a.l.k.e.r.
Monday, June 3, 2013
The Problem with Trading Card Games
Scrolls, the upcoming collectible-card-based strategy game from Minecraft developer Mojang, enters its open beta phase today. Essentially, this means you can buy the not-quite-finished game for less than its full price and start playing early while they work out the bugs and make improvements. Some part of me wants to partake in this, because the game looks pretty interesting (and because we all know how playing Minecraft quickly became the most popular thing since breathing), but the rest of me doesn't want to touch this game with a thirty-nine-and-a-half-foot pole. It looks fun, but I'm conflicted.
Collectible card games are an interesting thing. I want to love them, because I think they're so cool in theory. I wanted to love Magic: The Gathering, the original trading card game published by Wizards of the Coast back in 1993. The concept of such a card game is ingenious, both for its unique gameplay and — let's be honest — for its potential to rake in huge wads of cash for the owners.
Over the past 20 years, new sets of Magic cards have been released on a regular basis, and the total number of different cards seems to have surpassed 13,000, yet the game still remains accessible to newcomers. New rules are introduced all the time, and the balance is tweaked with each new expansion, but the earliest cards can still be used (outside of certain tournaments) together with the ones printed today. The number of possible combinations in a 60-card deck isn't even worth counting.
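Just to put a rough number on that, strictly as a back-of-the-envelope estimate: even counting only decks made of 60 different cards, and ignoring duplicates, the four-copy rule, and format legality, a pool of 13,000 cards allows on the order of

\binom{13000}{60} = \frac{13000!}{60! \, 12940!} \approx 10^{165}

possible decks, which is comfortably more than the square of the commonly cited 10^80 atoms in the observable universe. "Not worth counting" indeed.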
The ability of each player to create his or her own customized deck of cards, drawing from a collection unlike that of any opponent, is what makes this type of game so fun to play. Unfortunately, this makes the gameplay inherently imbalanced, unless we consider the start of the collection process to be the true beginning of any given match (and that's a stretch). Even then, a game like Magic too often requires continual monetary investment if you want to remain competitive, and this feature (while I'd like to call it a flaw) is by design. I played Magic for a brief period of time, several years ago, and my cards might have been only half-decent back then, but they're total garbage now. More powerful cards and better gameplay mechanics are created with each expansion to keep players spending their money. Of course.
There's also a certain threshold of monetary investment required in order to become competitive in the first place, and that threshold is probably going to scale in proportion to the size of your opponent's paycheck. Things might be balanced within a group if everyone involved cares enough to go on eBay and selectively buy the individual cards they need for one of a few strategies deemed viable at the expert level, but this isn't always affordable. Meanwhile, for more casual play in which most cards are obtained from random packs, the guy who wins most often is going to be the guy who spent the most money on his collection. The three pillars of succeeding in Magic: The Gathering are building a good deck, making the right in-game decisions, and (perhaps most importantly) owning better cards than the other guy (which is where the "collectible" aspect comes in).
When a video game affords even the smallest advantage to a player who spends extra money (e.g., through micro-transactions), we call it "pay-to-win" (even if this isn't literally true) and we hate it because it feels so wrong. It is wrong, because the delicate balance of the game in question is either compromised or completely destroyed. Being at a disadvantage sucks, and if you give in and buy your way to the top then the challenge is gone and the game quickly becomes pointless. (In the most extreme cases, you've essentially just paid to see the words "you win" on your screen, so congratulations on doing that.)
A lesser form of pay-to-win merely allows players to spend some extra money to skip past a seemingly endless grind, as is the case in many so-called "free-to-play" games. This doesn't necessarily destroy the game's balance of power (because the advantages being bought can also be earned through dozens of hours of play), but it does highlight the major flaws already present in the game. If a person wants to pay more money simply to get less gameplay, the game probably sucks (and the person playing it probably hasn't realized there's nothing left to do if you're not grinding).
In the video game world, all of this is positively awful, but most collectible card games are pay-to-win by nature. Sure, they're fun to play if you're up against someone whose skill level and deck quality are in the same league as yours, but if you play against a guy whose collection of cards is twice as big (and twice as expensive) then it's completely unfair.
When I first heard of Magic: The Gathering Online prior to its release in 2002, I thought it might be a little more fair (and affordable) than its tabletop equivalent. I assumed (or at least hoped) that each player would be given access to the same pool of cards, or perhaps that better cards might be unlocked by winning matches, or something. At the very least, I naively believed that players wouldn't have to buy all of their virtual cards at the same price as physical ones because... well, you know, because they're not real cards. Unfortunately, Magic: The Gathering Online is identical to the original card game except that the cards aren't made of card stock and ink.
Duels of the Planeswalkers looks like a nice alternative, even with its relatively small number of cards, until you realize that you can't even build your own deck. This is no surprise, though, since Wizards of the Coast doesn't want this game to be a viable alternative. Duels of the Planeswalkers is meant to draw in new players and get them hooked, so they become frustrated by the lack of deck-building options and graduate to buying packs of cards, be they physical or digital. The virtual cards in Magic: The Gathering Online, despite being virtual, have monetary value because Wizards of the Coast doesn't let you do whatever you want with them. Artificial scarcity makes them seem as rare as the physical cards printed in limited runs on actual paper.
Digital game distributor Steam recently unveiled its own trading card meta-game, which is still in beta, and it's proving to be a nice example of how such artificial scarcity can make something desirable even if it has no real value, no purpose, and no practical function.
Players with access to the beta test can earn virtual trading cards for their Steam Community accounts by logging play time in certain Steam games. These currently include Borderlands 2, Counter-Strike: Global Offensive, Don't Starve, Half-Life 2, and Portal 2, as well as the free-to-play games Dota 2 and Team Fortress 2 (but only if you spend money on them). You can get up to four cards per game just by playing, while eight cards from a single game comprise a complete set. The fact that you can only earn half of any set on your own means that trading (or buying from other players) is a necessity.
Once you get a complete set, those eight cards can be turned into a badge and some other items. The badge is good for nothing at all, while the other goodies that come with it are mostly vanity items, like emoticons and points to "level up" your Steam Community account. (There's also a chance of getting a coupon, but my experience with Steam coupons is that the discounts they offer are less impressive than the ones you see during a typical sale.) The whole thing seems pretty dumb, but you can already see cards for sale on the Steam marketplace, and that doesn't usually happen unless people are buying. There's also a demand for those vanity items. Apparently, some users even made a profit by buying lots of cards and then selling the goodies that come with each badge.
In general, things that were specifically made to be collected usually don't have a lot of real value to collectors. However, if you turn that collection process into a game — even if it's a stupid one — people go nuts. If people are willing to spend real money on virtual trading cards just so they can earn virtual badges and virtual emoticons and level up their Steam accounts for virtual bragging rights, it should be no surprise if the same people are willing to spend money on virtual trading cards that give them an actual advantage in an online game. I can't really blame Wizards of the Coast for taking advantage of this kind of behavior. But when the game is a competitive one, I just don't like the idea of buying victories, even if it's done in an indirect and convoluted way.
A true trading card game, even if it's entirely virtual, is going to have some level of imbalance. If each player draws cards from a unique collection, it's never going to be completely fair. All of this might be okay, however, if everything were unlockable through in-game actions and accomplishments. Naturally, I was hopeful when I first saw Scrolls; the official website tells us items at the in-game store can be bought with the gold earned by playing matches, and this presumably includes new cards (called "scrolls" because it sounds so much cooler). However, a "small selection" of items can also be bought with "shards" — a so-called "secondary currency" which you can buy with your real-life credit card.
So how significant is this "small selection" of in-game items? How much of an advantage can you gain by immediately purchasing everything that shards can buy? I can only assume the advantage is pretty significant; otherwise there would be no point. The real question is whether a person who paid $10 more than you (and doesn't deserve the advantage) is distinguishable from someone who played 20 hours longer than you (and earned the advantage). As long as it's possible to unlock everything that matters through gameplay alone, and as long as doing so is feasible (i.e., not a 500-hour grind), there's some hope for this game.
Mojang has claimed that Scrolls won't become a pay-to-win game despite its purchasable items, but developers say a lot of things before their games are released. The only reason to believe them is that the game does in fact have an initial cost — in other words, it's not "free-to-play" so the developers don't need to rely on in-game purchases to turn a profit.
The cost of access to the open beta is $20, which isn't so bad when you consider the average cost of a modern video game, which tends to be around $50 or $60 regardless of quality. (While this high cost applies mostly to console games, high-profile PC releases tend to follow the same model with some notable exceptions. Runic Games, for example, earned some praise for selling Torchlight II at $20, which gave the action role-playing game a significant advantage over its controversial $60 competitor Diablo III.) Assuming that Scrolls turns out to be a decent game, this discounted price for early access is a pretty good deal.
Unfortunately for Mojang, I've been trained by Steam sales and Humble Bundle events to refrain from buying anything unless or until it's dirt cheap. With some patience and good timing, I could buy a handful of older games for the same $20 and I'd be sure to enjoy at least one of them. It doesn't take long for the price of a game to drop, and this is especially true of PC games now that developers are realizing they need to compete with piracy instead of trying in vain to stamp it out. As a result, people who play PC games — or the "PC gaming community" for those of you who can say such a thing with a straight face — have come to expect their games to be inexpensive. $20 is a good deal, but it's not great.
I certainly don't mean to imply, of course, that we should all wait a few years to pick up Mojang's new release. After all, we don't even know if it will ever be subjected to such brutal price-slashing. Furthermore, Scrolls is a multiplayer game which might only be fun for as long as the number of players remains high, so the time to buy is now, if you want it. The problem is that the game is a risky investment and my spending limit for such a risk is so low.
That limit — the point below which a risky investment becomes a risk worth taking and any potential buyer's remorse becomes bearable — is different for everyone. For me, it's about $5. That might seem like a ridiculously small figure, but it's what I paid for BioShock a few years ago. It's what I paid for S.T.A.L.K.E.R.: Shadow of Chernobyl. It's also what I paid for the first two Max Payne games combined. I almost bought Metro 2033 for $5, but I waited and got it for even less. I got Killing Floor for $5, a few years ago, and I've put more hours into that game than anything else I can remember. None of these games were new when I bought them, but I still enjoyed each of them at least as much as any $20 game I ever bought.
None of this is really a complaint about Scrolls or the open beta price tag in particular. But I might be more willing to spend four times what I paid for Killing Floor if I actually knew Scrolls would be a worthwhile purchase. Isn't there some way of trying out a game before its release without paying $20 for access to a beta version? Oh, yes, a free demo certainly would be nice. Maybe we'll get one of those later on... but we probably won't.
Labels: alan wake, bioshock, diablo, killing floor, magic the gathering, max payne, metro, minecraft, scrolls, stalker, steam, torchlight