There was a time (which I now find hard to believe) when I did not own my own PC. At the tender age of 13 I made the discovery that certain video games would not run on my family computer, which, at 32MB of RAM, was even at that time horribly outdated. Until that point I had not really thought much about playing games on the computer; I had got along quite happily with my PS2. Once I had saved the pennies and bought my first gaming computer, the metaphorical slippery slope became a sheer cliff face comprised of banana skins.
I sit here, nine years later, in front of three monitors hooked up to something not unlike HAL 9000 from 2001: A Space Odyssey. I’ve built and customised three separate gaming computers, and have spent countless hours (and pounds) researching parts, combinations, framerates and benchmarks. Yet, at no point do I feel part of what, as a collective (and only half jokingly), the PC gaming community calls itself: “The Gaming Master Race”. Originating from a joke made by Ben “Yahtzee” Croshaw in his review of The Witcher in 2008, it has come to be used as a term for the PC community and its general dislike or disdain for “dirty console peasants”.
The evidence for superiority often cited by PC gamers is that of seamless online community interaction, enhanced graphical capability and, of course, the staying power of the PC as a machine. In some ways the arguments can be upheld. A large majority of PC games have modding communities that improve or fix things in games that otherwise may be overlooked by console developers or their patches. The thriving PC modding communities of the last three Elder Scrolls games are a great example of this. The keyboard and mouse can also be argued to be a better control method for some types of games (namely FPS and RTS). Moreover, the techno-race between AMD, Nvidia and Intel is light-years ahead of the one between Sony and Microsoft's consoles.
This year, though, I have begun to notice a change. The console-exclusivity policies of Microsoft and Sony rarely bothered the PC gaming community. Halo was, after all, a Microsoft-produced series, and it stood to reason it would be available on PC. Likewise, Sony produced few games that were of great interest to PC gamers. One game released in 2013 has made this issue rear its ugly head: The Last of Us. As if suddenly stung by the fact that the for-certain game of the year would never grace their machines, PC gamers took to forums and webpages to vent their ire at other examples – Red Dead Redemption and GTAV being prominent. Suddenly they were not being considered the pinnacle of gaming any more, and this scared the PC "Master Race".
There is much to be said about the often abrasive attitude PC gamers have towards console gamers, but there is also a case to be made about the opposite. At times I have been mocked as being part of the "Master Race" – usually when my friends are destroying me at console-based FPS games. The term has been embraced so fully by both sides that the fact that a gamer can possess both a console and a PC never really enters people's minds. I own a gaming PC but also play my old PS2 and occasionally fire up the GameCube for parties. My parents, predictably, own a Wii, so I'm always having to play MiiGolf whenever I'm visiting home. Even on my mobile, I play Temple Run and Angry Birds.
I suppose what I'm trying to say is this: I am a PC gamer, but I am not a member of the "Master Race". I am a console gamer, but not a fanboy. I play PS2 and GameCube, but I'm not a hipster. I play on my phone, but I'm not a filthy casual. We're all gamers; wouldn't it be nice if we just embraced that fact and left something as ugly as segregation behind us?