Tobold's Blog
Monday, September 01, 2014
 
Hasta la Vista!

I bought a new PC recently. The 3-year-old PC I used before went to my wife, which meant her old computer, a bit over 5 years old, was to be given away. Now a computer tends to accumulate a lot of personal data over the years. I don't like to just uninstall stuff and then find out later that somewhere hidden in the bowels of the operating system there were still a bunch of my passwords stored that are now available to the new owner of the computer (even if we tend to give the old computer to friends or family). So what I like to do is to format the hard drive and give the computer away with a fresh install. A "factory reset", so to say, although as I don't buy brand computers they tend not to have that as an actual option.

The only problem was that I didn't want to give away the operating system and installation disks of the computers I am still using. So I reinstalled the operating system that was on that computer: Windows Vista. Now Microsoft has a strange policy of alternating okay versions of Windows with really, really bad ones, and Vista is one of the bad ones. Plus it is now completely outdated.

The first problem was that Vista freshly installed didn't have any default drivers that would make the network card work. Fortunately I found the disk with all the drivers for the motherboard, including audio and network, so after installing that I could connect to the internet. Then I wanted to download the Nvidia graphics card drivers, but that required downloading a lot of other stuff, like Java and Visual C++.

Then I thought I'd just run Windows Update and that would put Vista in a decent state. No luck! There is a major bug in the original Vista which makes Windows Update freeze when you run it. I found out that I first needed to download and install Service Pack 1 to fix that bug, and while I was at it I also installed Service Pack 2. That wasn't all that obvious, because the pre-installed Internet Explorer 7 was so old that even the Microsoft website refused to work with it. And the IE7 update function didn't work either. So I had to install a new browser, download and install the service packs, and only then could I finally get Windows Update to run. Which promptly downloaded 150 urgent updates, taking hours to download and install.

Overall it took me all afternoon and evening to get Windows Vista installed in a state where I could give the computer away with a good conscience. I found that while the 5-year old hardware was still perfectly adequate, the 5-year old operating system was a huge problem. I'm glad to be finally rid of Windows Vista for good. Now all of the PCs in my house run Windows 7. Even the new one, as I didn't want Windows 8. I'd rather wait for the next decent OS from Microsoft, which on past form should be Windows 9.

Friday, August 29, 2014
 
Ethical game journalism requires the journalist not to play games

I tend to see the world not in black & white, but in shades of grey. So I can't give you a clear yes or no answer on the question of whether I consider myself a game journalist. Obviously my activity, writing opinions about games on my blog, resembles game journalism. I once ran around a Blizzard convention with a press pass around my neck. On the other hand it is not my job, but just a hobby. So I am in a grey area where I am in part a game journalist, and in part I am not. The part of me that is somewhat a game journalist is interested in the issues of game journalism, and the ethics thereof. For example I do have a strict disclosure policy, where I disclose if the product I am reviewing was a free review copy.

Lately the ethical questions about game journalism have been somewhat reversed: before, the question was usually whether a game company gave money or things of value to a game journalist. Today the question points in the other direction: does the game journalist give money to the game designer? Because if he does, he could be said to have a special interest in the success of that game designer, and thus not be objective. This sort of consideration caused Kotaku to post a new policy prohibiting their game journalists from supporting game designers on Patreon.

Now people point out that Patreon is just a single platform on which a game journalist could financially support a game designer. What about other platforms, like Kickstarter, or Steam Early Access? And ultimately, what about a game journalist buying a game, in which case part of his money also goes to the game designer?

So if you are a game journalist and you get a game for free, you can't be objective. And if you buy the game, you can't be objective either. I assume stealing the game isn't part of an ethics policy either. Which means that an ethical game journalist cannot play the game he is reviewing. He has to rely on YouTube or Twitch to see other people play it (now that explains the recent interest of internet giants in Twitch). I must say that there are game journalists around who are apparently far more ethically advanced than I am: I've read a lot of game reviews that made it quite plausible that the author writing the review never played the game in question.

I'm afraid that my blog has an unethical policy: While I do sometimes comment on games that I haven't played (for example because they don't exist yet), I don't put the word "review" on a post unless I have played the game. And in the large majority of cases that means that I have bought the game in one form or another. I do accept donations from readers to finance buying those games. I wonder when that will be considered unethical.

 
DM techniques for running D&D encounters faster

I talked this week about the dual role of the dungeon master (DM) in a game of Dungeons & Dragons or a similar tabletop role-playing game: prepare and improvise. In this post I'm going to talk exclusively about the preparation part. Advance warning: if you aren't planning to play a pen & paper role-playing game anytime soon, this post isn't going to be very interesting to you.

My players and I love the tactical combat encounters of D&D 4th edition. We love having lots of options in each round of combat, and not just announcing basic attacks. And we love the tactical options that come from using figurines on a battlemap. For a combat to be tactical, it must last several rounds, so that the effect of tactics has more impact on the fight than lucky dice rolls. All that means that tactical combat takes a certain amount of time. But how much time it takes depends very much on DM preparation. If you hear from people who say it took them hours for a simple fight, you know that encounter was badly prepared. If you don't bring the tools to run tactical combat quickly, it is like digging a tunnel with a spoon. I recently watched Wizards of the Coast playing the first combat encounter of the 5th edition Starter Set on YouTube, and it took them 1 hour. I can play a 4E encounter of the same size in the same time or less, with the preparation I will list in this post.

So what is my secret weapon? Sorry, it isn't something fancy like a 3D printer. I am using a regular color laser printer. I prefer laser because the toner doesn't smudge when handling the paper, and the stuff I print for games gets handled a lot. For much of the printed game material I use thin cardboard, 210 g/m², which is thin enough to work with my printer, but thick enough to be a lot more resistant than regular paper.

The first tool for running encounters faster is printing all the powers and magic items the players have on little cards, the size of playing cards. I have to print those because I play in French, but at one point in time one could also buy power cards from WotC. What I also use is deck sleeves, the kind that players of Magic: The Gathering or other trading card games use. The at-will powers go into green sleeves, the encounter powers into red sleeves, and the daily powers into black sleeves, making it easier to find the power you need. I also have cards for action points and magic items, and each player gets a Deck Protector box with all the cards of his powers and items. The result is that nobody at my table needs to look up the details of his powers during combat; we basically never use the Player's Handbook during play unless there is a rules question we aren't sure about.

On the DM side I pack everything I need for one encounter into one clear sheet protector: battlemap, monster stats, tokens, and initiative riders. I use Campaign Cartographer / Dungeon Designer software to print my battlemaps, unless I have a poster battlemap from a published adventure. For the player characters, one of my players provided painted metal figurines. But for monsters I use 2-dimensional tokens. Some tokens I get from boxed adventures or the Monster Vault. But if I need my own I print them on 1" cardboard squares, which I stick on 1" square self-adhesive felt pads, the sort you can buy to stick under the legs of your chairs so they don't scratch your floor. Printed tokens have one advantage over figurines in that you can print numbers on them, which makes it easier to keep track of which orc got hit for how much damage or is suffering from which status effect. Speaking of which, I printed little cardboard rings with status effects like ongoing damage, and use them for both figurines and tokens on the battlemap. Finally I print 1" x 2" cardboard initiative riders, which I fold in half and place on the top of my DM screen, showing the order of initiative to both my players and me. By having the monster stats printed on paper I don't need to refer to pages of the Monster Manual or the printed adventure, and can also track health and status effects on that paper. With all that neatly packed together in one clear sheet protector, I can set up an encounter in a very short time without causing a huge pause in the narrative.

Outside encounters I use much less prepared material. I have Paizo Face Cards to represent my NPCs, because NPCs are more memorable if they have a face. I have the occasional handout, for example for quests, or to show images of a location. But most of the adventure information I just store in my brain, because things like NPC motivations and likely courses of action are just the basis for improvised role-playing, and not something you print and hand out.

All this preparation obviously takes some time. I don't mind, because while I prepare those encounters I can think about how to play them, which then helps me to run them better. Ultimately the goal is to make encounters interesting and memorable, and good preparation helps a lot there. You get a lot better immersion if your encounter isn't interrupted by organizational chaos or the DM having to look up stuff. Preparation not only cuts down the time spent on combat encounters, it also creates a smoother flow and better narrative.

Thursday, August 28, 2014
 
Investigative adventures in Dungeons & Dragons

I was reading this article on investigative adventures in D&D on Sly Flourish. Very interesting, especially to me right now, since in my campaign we will start an adventure like that next Monday. In the past, and with a different DM, we had adventures in which the players were supposed to investigate that went wrong and stalled, so this is kind of a danger zone for us. I think it helps to consider some human aspects here, starting with expectations.

We've all read or seen detective stories, from Agatha Christie's Hercule Poirot to Inspector Columbo. Being familiar with the format evokes a certain set of expectations when you try to play through something similar. But the detective in such stories cheats: there is only one author who controls both the murderer and the detective, so the detective can't fail to find all the clues, in the right order, and to put the pieces together to come to the right conclusion. The moment you turn that into an actual multi-player game, with the DM having set the scene, knowing who the killer is, and having set up the clues, while the players need to discover all that, there is a significant chance that the players won't end up as successful as a Hercule Poirot.

My first piece of advice here, based on my own experience, is that a played murder mystery has to be significantly less complicated than one from a book or TV show. There need to be fewer locations to investigate, and fewer witnesses to question. That is especially important for a group like ours, as we play twice per month at most. If it takes us 6 sessions to investigate all the locations and speak to all the witnesses, then by the time we finish with the last one, we have already forgotten the clues from the first one, 3 months earlier.

The advice from Sly Flourish is related to that: the players don't usually know where the clues are, and might well investigate a location that you as the DM didn't foresee, or talk to an NPC that you hadn't considered in your murder mystery. If the adventure doesn't limit the number of locations and NPCs somehow (murder in an isolated location like the Orient Express, a boat on the Nile, a lone manor, etc.), but happens in the middle of a city, you could end up with way too many locations and people to handle. So the trick is to *not* place all the clues first and then hope that the players find them. Instead, just make a list of the clues as bits of information, and be flexible about where those bits of information can be found. If the players have an idea to search a place or talk to somebody, and the idea is somewhat reasonable, just decide that the clue is there. That might feel a bit like cheating, but it produces a flow that corresponds to expectations: the TV detective doesn't lose endless time searching the wrong places and talking to the wrong people either.
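
To make that concrete, here is a minimal sketch in Python of how such a floating clue list works; the clue texts and the function are my own invention for illustration, not from Sly Flourish or any published adventure:

    # Clues are bits of information, not fixed to locations in advance.
    clues = [
        "the victim owed money to a moneylender",
        "a torn scrap of expensive red cloth",
        "muddy boot prints that smell of the harbor",
    ]
    found = []

    def investigate(place_or_npc, idea_is_reasonable):
        # Wherever the players have a somewhat reasonable idea to look,
        # the next undiscovered clue is simply declared to be there.
        if idea_is_reasonable and clues:
            clue = clues.pop(0)
            found.append((place_or_npc, clue))
            return clue
        return None

    print(investigate("the harbormaster", idea_is_reasonable=True))

The DM only needs to track which bits of information are still outstanding; where they actually turn up gets decided at the table.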

My final piece of advice is in disagreement with the Sly Flourish article: yes, "players want to feel like their decisions matter and their actions lead somewhere". But that doesn't mean that the game world and the villain NPC have to be passive and sit and wait for the players to work through all the clues. Instead the villain NPCs have to be handled like characters with their own motivations, goals, and means. The villain should react to the investigation of the players. Again that conforms to expectations: detective stories frequently have the murderer kill another victim because the detective came close to getting a vital clue from that person. And because this is D&D and not Agatha Christie, the villain NPC has far more possibilities, up to and including attacks on the players.

I have this concept in mind of the "turn-based" approach to role-playing. Basically the risk in D&D sessions that are heavy on role-playing and light on combat is that certain players take the lead and go off on long solo performances, while the other players fall asleep and the story isn't moving forward. Thus I try to gently nudge the role-playing into a structure where I give turns to the other players, and to the NPCs. So if one player goes on endlessly negotiating with a merchant, I say to the next player "Okay, so while Bob's character is negotiating with the merchant, what do you do?". And once I've given every player the chance to act, I think about what a reasonable response or action from the NPCs, especially the villain, would be. That concept is explained beautifully in the recent WotC adventures Murder in Baldur's Gate and Legacy of the Crystal Shard. The main advantage is that it kind of puts the adventure on a clock: the game world is alive and stuff happens, even if the players dawdle. Once the players realize that, it creates better drama, because they KNOW that they don't have endless time to find the solution.
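
As a rough sketch of that structure (with invented names, obviously not a real tool, just the loop I run in my head):

    players = ["Alice", "Bob", "Carine"]
    npcs = ["the merchant", "the villain"]

    def roleplay_round():
        # Every player gets an explicit moment in the spotlight...
        for player in players:
            print(f"DM: While the others are busy, {player}, what do you do?")
        # ...and then the world takes its own turn, so the story moves
        # forward even while the players dawdle.
        for npc in npcs:
            print(f"DM decides how {npc} reacts to what just happened.")

    roleplay_round()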

So the next adventure will be an experiment in how successfully my players and I can handle an investigative adventure in a city. If that doesn't work at all, I will have to rethink the idea for my next campaign, because the adventure path I have chosen has a lot of investigative parts as well. Dungeon crawls are comparatively easy, but I hope that we can do more than that.

Wednesday, August 27, 2014
 
A gender-neutral thought

I totally get where this article on sexist video gamers being terrorists is coming from. Nearly everything in that article is true. But I feel that there are two issues here, and mixing them up that way isn't all that helpful. One is sexism, which most certainly exists, and the other is video gamers behaving extremely badly under the cover of internet anonymity, which also most certainly exists. But if you drew a Venn diagram of this, you would find that while there is a substantial overlap, the overlap isn't total.

For example, the terrorism accusation cites the bomb threat called in on a plane carrying SOE's John Smedley. That is certainly an example of extreme video-gamer behavior, but not one motivated by sexism. So is the example of the gamer calling a SWAT team to the house of his opponent after losing at Call of Duty. I mean in no way to excuse the abominable behavior recently shown by gamers that *is* based on sexism. But I think that it would be better to separate those two issues. If we magically ended sexism tomorrow, the problem of video gamers calling in bomb threats on video game executives would still remain.

Feminism is a broad church that does not speak with one voice, but with millions of them. Many of those voices speak out against actual discrimination and are totally right to do so. But some other voices are fueled by hate against anybody with a Y-chromosome. And just like you can be a true Christian without supporting everything the extreme Christian Right says, you can be for gender equality without supporting everything the extreme feminists say. In the above case it becomes very hard to stand up against video gamer hate if that means having to subscribe to feminist hate to do so. We could get much broader support, especially from men feeling uncomfortable with some parts of some feminists' agenda, if we considered the two issues separately. That doesn't mean you can't fight for both issues, but we should accept the two issues as different, and quote sexism as one example rather than as the underlying motivation for all video gamer hate. The kind of video gamers we are talking about hate just about everything, not just women or feminists.

That brings me to the gender of the video gamer spewing hate on Twitter. Twitter has 271 million monthly active users. And increasingly the tweets are hateful in nature. There is something about the format that makes it easier to fire off a short hateful remark than a balanced, reasoned opinion. And sorry, but that isn't limited to male users of Twitter. Even about video games you can find extremely nasty tweets written by women. While I am pretty much convinced that the majority of video gamers spewing hate are male, again it wouldn't be correct to paint that 100% as a gender issue. I am also pretty sure that the majority of the video gamers spewing hate are under 35 years of age, but it wouldn't be helpful to dress this discussion up as a generational issue either.

We live in a civilization based on laws and certain rules of civilized behavior. Some people have discovered how internet anonymity can sometimes allow them to act outside of these laws and rules without consequences. The long-term effect of this will most certainly be that we will lose our right to remain anonymous on the internet. Everybody who uses that anonymity for a fake bomb threat or similar illegal activity makes it harder for the rest of us to insist on our right to privacy on the internet. As video gamers, regardless of gender, we need to speak out against the lawless sub-culture of video gamer hate. Because we don't want to mention at the water-cooler that we play video games and get a reply "Video gamers? Isn't that this terrorist outfit I hear so much about in the news?".

 
On rose-tinted glasses

Telwyn is discussing his notion that most people in the MMO blogosphere have rose-tinted glasses and are "idolising the past". I'd like to point out that many of the "classic" MMORPGs like Ultima Online or the first Everquest are still around. The fact that not many people play them any more tells me that they don't compare that well to modern games. Having said that, everybody has his first MMORPG, and that one is likely to have a profound impact on the thinking of that player. Because every MMORPG after your first is a mix of new stuff with features you already know, and thus is somewhat less impressive.

Old MMORPGs serve one important purpose in the context of blog discussion: they tried out a lot of ideas that ultimately didn't work. The experience players and game companies had with these classic games had a strong influence on how later games were designed. If you played Ultima Online early on, you will have a very different understanding of why PvP is often so limited in modern games. If you played Everquest 1, you will have a very different understanding of why modern games have flight paths, teleports, and other forms of fast travel. Everquest 1 is also fundamental to understanding the quest-driven gameplay of World of Warcraft and beyond. So it is not so much "idolising" past games as being able to quote them when a "brilliant new idea" turns out to have already been tried and discarded a decade ago. If we don't remember the past, history repeats itself, "first as a tragedy, second as a farce".

But of course those rose-tinted glasses exist. People say they "remember" those old games, when in fact they have a curiously selective memory that filters out anything that doesn't fit their world view. Thus instead of remembering how, after the split of UO, nobody wanted to play in Felucca any more and Trammel was overcrowded, they choose to remember how "great" unlimited player-killing was before the split. If only the devs hadn't allowed all the potential victims to escape to safety! Ignoring that if the devs hadn't done that, the game would have died, because those victims were already running away by quitting the game.

Curiously, people also sometimes forget the things that did work. How often have you heard that "forced grouping" doesn't work? The developers of several quite successful games before WoW would beg to differ; it worked quite well at the time. The negative effect of lone wolves not wanting to play such a game is compensated by the positive effect of people enjoying playing with others and making friends. Social bonds are stronger if you actually *need* other people to progress. You might get fewer players on day 1, but then you don't have two thirds of your players quitting the game by day 30, which overall might be healthier for the game.

Games can serve as huge social experiments, but that only works if you compare the game with itself, before and after a change. You can't take the fact that people tend to flock to a new game as proof that a specific feature of the new game is better than a specific feature of the old game. Even the fact that World of Warcraft had a peak subscriber number 30 times higher than the previous games doesn't mean that *every* feature and design decision of WoW was better than the equivalent in the older games. People tend to like a game for the overall impression that game makes on them; it rarely boils down to one specific feature.

 
D&D is only as good as the DM

I recently argued that pen & paper roleplaying had fallen out of favor because it is so much harder to organize a tabletop session than to organize some other game online. But the 5th edition Starter Set has resulted in a lot of podcasts and YouTube videos of different groups recording their sessions playing the same adventure with the same rules. And one can't help but notice that the quality varies widely. So if you think of a hypothetical group of teenagers trying to get into D&D without outside help, armed just with the Starter Set and the Basic Rules pdf, there is an obvious pitfall: a DM who is new to both playing and running a game is quite likely to be bad at it. And that might turn the whole group away from the hobby.

Now the good news is that D&D, even if some people would like you to think otherwise, is not *one* game but a million different ones. There is no such thing as the one true way to play Dungeons & Dragons, however much some people might preach their way. You can run a game with an adventure that has a predefined story with a beginning, middle, and end. You can also run a game which is more or less pure sandbox, with no story at all. And everything in between.

Those two extremes point towards the two main qualities that a DM must have. To run an adventure with a fixed story and fixed encounters, he must know the adventure very well, know the rules, and come to the session well prepared. Especially if you play tactical encounters with figurines/tokens on a map, preparation makes a huge difference in how smoothly and quickly the game is going to run. The second quality comes from the sandbox aspect of D&D: a DM must be good at improvising. Even if the players are supposed to follow a story, it is always possible that they make some unexpected decision that leads events in a different direction. And the DM must be able to come up with a believable response of the game world to whatever action the players perform. You probably hadn't thought the wizard would use a fireball in the bar room brawl, so how does your city react to the tavern being on fire?

Every DM needs both of those qualities. Being good at improvisation doesn't absolve you from having to know the rules and your game world. Whatever you improvise today will be canon lore tomorrow, so you will have to remember what you told your players about some NPC or location. And if you make an improvised rules decision, it had better fit with the existing rules. Otherwise the overly generous bonus you gave a player for throwing sand in his enemy's eyes will become a new house rule that leads to every player carrying a bag of sand around.

In my eyes a computer usually does not make a great DM. A computer is good at consistency and speedy delivery of prepared rules and story. But a computer is lousy at improvisation. I'm currently playing Divinity: Original Sin, which makes a great effort to have the game world react in different ways to the different approaches you can take in any given situation. But you can't help noticing that things like destructible environments are frequently limited: you throw a fireball into a room and the chair gets destroyed, but the tapestry doesn't; the chair was programmed as a destructible object, while the tapestry is just a texture on the wall and can't really be interacted with. Thus the typical computer game problem of world-saving fantasy heroes being stopped by a knee-high fence.

But if you compare a computer game with a tabletop game, it is perfectly possible for the DM of a tabletop game to be worse than the computer. A human DM can be bad at *both* improvising and prepared content. In 30 years of tabletop roleplaying I have certainly met my fair share of bad DMs who would have made me choose a computer instead, had I been given the option. A computer sets some sort of mediocre baseline for running a game, and many human DMs can do a lot better, which is why I prefer pen & paper roleplaying to the computer version. But I can just as well imagine a group of teenagers trying out D&D for the first time with a DM who is badly prepared and bad at improvising, and concluding that their computer games are better than that.

Tuesday, August 26, 2014
 
Speed!

Strictly speaking a computer doesn't have any speed at all, as speed is measured in meters per second, and a desktop computer tends to be rather stationary. But of course you can measure the speed of a computer in many other ways, by setting it a task and timing how long it takes. There are units of measurement like megaFLOPS, but such units are more useful for scientific calculation speed than for the speed of a gaming PC.

Thus when I ordered a new computer, I invested some money in 3DMark, which is now available on Steam, making it a lot more user-friendly to install and handle. The result was that on my old computer the DirectX 11 Fire Strike benchmark gave a score of just under 4,000. Today I received my new computer, and ran 3DMark again for comparison: 7,500 in the Fire Strike benchmark. Which means that my new computer is nearly twice as fast as the old one, if it is graphics speed that concerns you most.
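
For those who want the arithmetic spelled out, a trivial check in Python (scores rounded as quoted above, higher is better):

    old_score = 4000   # just under 4,000 on the old PC
    new_score = 7500   # the new PC
    print(f"Fire Strike speedup: {new_score / old_score:.2f}x")  # 1.88x

hence "nearly twice as fast".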

I have a sneaking suspicion that what will make more of a difference is that I now have a much larger SSD. The previous computer had a 256 GB SSD, which was enough to have Windows and some favored applications run from that drive. But I couldn't put my whole Steam library on it, so some games ran from the slower, regular hard drive. On the new computer the SSD is twice as big, at 512 GB, which means that I can install most of my games on it. And that should cut down loading times a lot. Ultimately a few seconds saved on each loading screen feels a lot faster than a higher framerate.

Monday, August 25, 2014
 
Still playing Divinity Original Sin

I spent most of this weekend playing Divinity: Original Sin, and I'm still only halfway through. This is a really epic game, and that suits me just fine. In fact I find myself continually making plans for a different build and party setup for a second playthrough. I've been playing this first game with a relatively simple and efficient build, based on the talents Lone Wolf, Zombie, and Leech. What that means is that I'm playing with a party of 2 and can't use additional companions (the more "usual" game would have you controlling 4 characters), I don't use regular healing, and I heal instead of taking damage from two of the more common sources of damage. Even after the recent nerf to Leech that is still on the overpowered side, with some undead simply unable to damage me at all.

While efficient, I can't help but ask myself how the game would play if I used a more "normal" setup: not being immune to poison and bleeding, using regular healing, and playing with 4 characters instead of 2. I'd also like to try a dexterity-based character, using ranged weapons and backstabs instead of my classic sword-and-board melee fighter. I'm looking forward to trying all that out, but first I'd like to finish the first game. While the "normal" setup is probably less easy, I like the idea of having to approach the fights very differently. I figure the combat experience will be much different if I play through the game with a build without those three talents.

Having said that, I'm not sure I'll manage a complete second playthrough. Curiously enough, combat in Divinity: Original Sin is relatively rare. This is not like Diablo, with monsters behind every corner. You spend a lot of time exploring, clicking through various containers for loot, dealing with traps, crafting, or making decisions in dialogue. While in a second game I would take the talent that allows me to talk to animals, thus opening more dialogue options, I am afraid that the replayability of the exploration part of the game isn't as good as the replayability of the combat part. The sense of discovery is much diminished by experiencing the same story in the same environment a second time, even if you make some different choices and some random outcomes are different.

In Dungeons & Dragons there are a few adventures (Ravenloft, Madness at Gardmore Abbey) in which major aspects of the story are determined randomly at the start. A player who plays through the adventure twice might be surprised to find that the story is not the same the second time around. I haven't seen anything like that in a computer role-playing game, although there are some examples where the ending of the story is determined by the actions of the player, which is already something. Until that changes, we need to live with the disadvantage of story-heavy role-playing games having diminished replayability.
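
The underlying mechanism would be simple enough to put into a computer RPG: roll on a few story tables once, when a new game starts. A minimal sketch in Python, with tables invented for illustration:

    import random

    villains = ["the court wizard", "the smuggler queen", "the high priest"]
    motives = ["revenge", "forbidden love", "a debt owed to a devil"]
    hideouts = ["the crypt", "the lighthouse", "the sewers"]

    # Determined once at the start, like Ravenloft's card reading, so a
    # second playthrough gets a genuinely different mystery.
    story = {
        "villain": random.choice(villains),
        "motive": random.choice(motives),
        "hideout": random.choice(hideouts),
    }
    print(story)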

Friday, August 22, 2014
 
Playing for challenge vs. playing to win

In the "real gamer" discussion the proponents of the term linked it to challenge. Quote: "A real gamer then would be someone who sees games in general or even only a specific game not as something to just have fun with but as an actual challenge.". They see people who play for the challenge as real gamers, and those who play for fun, for the story, for exploration, for social contacts, or for a myriad of other reasons as not real gamers. But is that a useful distinction, players who care for the challenge and players who don't? One other commenter asked: "Would sombody who uses cheats on their games ... be considered a true gamer?". And that question reveals a whole other dimension of player behavior.

Obviously the player who cheats cares for the challenge. A casual player who just plays for fun, for the story, etc., doesn't cheat because that wouldn't align with his goals. But while the player who cheats thinks the challenge is important, he doesn't actually want to beat it. He just wants to win, have the status of a winner who beat the challenge, without actually having to go through all of the effort.

Google the name of your favorite game together with "cheat", and you will find tons of offers helping you to cheat at the game. Many game companies running competitive multi-player games spend a substantial part of their operating expenses on anti-cheating measures. There is a constant arms race between the people who program cheat software and the people who program anti-cheat software. Video game cheating is a multi-million dollar business.

But in other games the distinction between people who play for the challenge and people who just want to win is a lot more subtle. Take MMORPGs for example: you would assume that somebody who plays for the challenge will try to increase the challenge. But the most frequently observed behavior is one of trying to diminish the challenge: players want the best possible gear, they want to play with others only if those others are highly competent, and they want to raid only in dungeons where everybody is well prepared and well trained for every encounter. Apart from Gevlon there aren't many people who say "I raid for the challenge, so I'll raid in blue gear". Nobody says "I raid for the challenge, so I am grateful for the other players in my raid that don't play so well and thus increase my challenge". Few people who raid for the challenge go into the raid dungeon without having studied internet sites telling them how to beat the bosses. You will find guilds boasting about their "server first" raid achievement, without mentioning that this server first was carefully orchestrated and made easier by a month of training the raid on the test servers. It is very clear that all of these people play to win, and not because they enjoy an actual challenge.

People really just wanting to be seen as winners are also behind many of the social conflicts in MMORPGs, for example the endless discussion about welfare epics or easy mode dungeons. Playing for the challenge is a very personal thing, nobody else but yourself can tell you whether you deserve to be proud of having beaten a challenge. If you play for the challenge, you don't care what gear somebody else is wearing or what places he is allowed to visit. Playing for winning status symbols is a social thing: Epics are not just making the next win easier, they also serve as a social status symbol distinguishing the "winners" from the "losers". So other people being able to get those status symbols in a different manner is a big thing if you play to win, and not just for the challenge.

I believe that many of those who attach the silly label of "real gamer" to themselves are not actually playing for the challenge. They play for the status that comes with beating a challenge, even if they have to cheat or manipulate the circumstances in their favor to get the win without much of a challenge. Challenge is just a euphemism, and not a widely shared real value.

Thursday, August 21, 2014
 
An ailing hobby

In many ways a tabletop role-playing game is very social. You sit around a table with friends and interact a lot with each other for hours. In other ways however the hobby is somewhat insular: your table is the virtual world, and that world does not necessarily have much connection with other virtual worlds or players out there. Even the companies making those pen & paper role-playing games aren't quite sure how many people are actually out there playing, as any given sold rulebook could either be long lost in the garbage, or be the centerpiece of a group of several people. Having myself played tabletop RPGs, mostly various editions of Dungeons & Dragons, for over 30 years, I always considered this to be an active hobby with many other players out there, even if I didn't see them. I might have been wrong.

In a recent market study, the North American "hobby game market" was found to have hit $700 million at retail in 2013. Of those $700 million, collectible games made $450 million, miniatures $125 million, board games $75 million, and non-collectible card and dice games $35 million. What about tabletop role-playing games? Only $15 million. Wow! That is nothing! There are single Facebook games that earn more money than that!

While it is theoretically possible that people play on forever with old books, such a low sales volume is indicative of an ailing hobby. With a game like World of Warcraft making over 100 times more money per year than all pen & paper role-playing games together, it appears obvious that people interested in fantasy role-playing today are online, and not sitting around a table with friends. And if you look around, for example for role-playing material on YouTube, you'll find that the people there don't exactly look like teenagers; this is a hobby with not much fresh blood and a lot of 40+ year old players.

Obviously Wizards of the Coast hopes to revive the hobby with the 5th edition of Dungeons & Dragons. I've seen several game stores report that the new Player's Handbook sold out on the first day. I went to a local games store yesterday and could only get hold of a Starter Set. There are a lot of things that make 5th edition quite suitable for people new to the tabletop role-playing hobby: the Starter Set is affordable, the Basic Rules are free, and while 110 pages of rules might still seem daunting to some people, that is already a lot less than previous editions of Dungeons & Dragons or Pathfinder (and many of those pages are actually spell lists).

The biggest obstacle to playing a tabletop role-playing game is organization. Even in MMORPGs it is only a small fraction of the players who meet online regularly for a continuous block of several hours to play together. A pen & paper game not only requires that block of hours, but also requires people to physically travel to the same location, and you'll probably want some food and drink there as well. But as a reward you get a game which feels a lot less restrained by the limits of technology and the imagination of some game designer. Instead of meeting to kill the same boss mob for the tenth time, you get a fresh story every session, limited only by the collective imagination of all the players around the table. That is well worth the organizational effort. I hope that the role-playing hobby can recover from its current low.

Wednesday, August 20, 2014
 
Real gamers

Advance warning: If you consider yourself a "real gamer", you might not want to read this post.

Apparently there has been a heated discussion on Twitter and the games blogosphere about what defines a "real gamer". Basically there is a group of people out there who would like that to be some sort of exclusive label, some sort of badge of honor, some sort of true achievement. The discussion then starts because anybody who wants to be included in the definition of "real gamer" wants his own level of skill/expertise/hardcoreness/dedication/whatever you want to call it to still be included in the definition, while anybody who is slightly less skilled/expert/hardcore/dedicated/whatever should definitely be excluded and be branded a "fucking n00b" instead.

The whole exercise is so pathetic, it kind of makes one sad. Just imagine it: there is somebody out there who is extremely proud that he beat some game at a higher difficulty level than you did. THAT is his greatest achievement in life, the thing he is most proud of, the defining feature of his self-worth and how he sees himself. What kind of a loser does one have to be if the greatest thing one has achieved in life is being good at a video game?

Social Identity Theory is full of this sort of behavior: A) we want to belong to a group, but B) we want the group to be exclusive and to see it as being better than any other group. That already causes enough problems when the group is well defined, when by your passport, origin, or religion you can say without doubt to which group you belong. But it gets completely silly if you need to apply fuzzy adjectives like "real" in your definition. It reminds me of an episode years ago where somebody in chat was looking for a group, but only wanted "serious" players with a gear score of at least 6,700. Guess what gear score he had. If everybody defines "real" or "serious" as "me, and everybody better than me", we never even get two people to agree on one definition of who is a member of that group and who isn't.

Defining yourself as a "gamer" in the most general and most inclusive sense of the word can actually serve a purpose. Market research is quite interested in the question of how many people would be willing to spend at least part of their disposable time playing games. The overall number of "gamers", if you define it as people who are willing to buy a game or otherwise spend money on one, is growing; and that has consequences: if there are more potential "gamer" customers, more games get produced. And yes, you can sub-divide that group of "gamers" into sub-groups that also make sense from a market point of view. How many "console gamers" are there? How many "mobile gamers"? How many "PC gamers"? Or even, how many "first person shooter gamers"? If you have an answer to these questions and could track the evolution of these numbers somehow, you would have information useful for deciding what kind of game to develop.

In comparison to all that, a definition of what a "real gamer" is serves no purpose at all other than stroking the ego of the person who twisted the definition to include himself in it. What kind of sensible game design or marketing decision can you make based on that definition? Sell T-shirts that say "I'm a real gamer, but you're a n00b!"? Being marginally better than somebody else at playing a specific videogame under specific conditions serves no useful purpose in life. Everybody who sees you in your "real gamer" T-shirt will only translate the term into "basement-dwelling no-life loser", even if that is obviously a crude simplification as well. The very idea that anybody could possibly look up to you because you are a "real gamer" and they are not is completely idiotic. On any scale, people tend to despise the people above them at least as much as the people below them. "Real gamers" don't impress anybody.

