Remember when a game’s worth was determined not by how many polygons it pushed around the screen at any one time, or how many pixels it rendered, but by how much fun it was?
This ridiculous snobbery - manifesting most keenly in message board arguments over who has the better console - often makes me wonder if people have forgotten why they enjoyed playing games in the first place. It shocks me, and to say the least annoys me, to see such an emphasis on graphics in this day and age. Every time I open my Twitter feed or my gaming newsfeed, all I see is “Xbox One 1080p” here and “omg y isnt ac unitee 4k on ps4” there. Have graphics really become that important?
Let’s go back a few years to the mighty and triumphant start of a little game we like to call Minecraft. Hopefully, you’ve all heard of it. If you haven’t, then a) have you been stranded on a desert island for the last five years? and b) go and download the free version right now and educate yourself. Minecraft took the internet by storm thanks to its massive replayability and its retro, almost-beautiful visuals - born of both necessity and the game’s origins as Markus “Notch” Persson’s personal experiment in game design, and now so iconic that you can look at any screenshot of the game and instantly know what it is. But hang on a second: how could a cubic game with 8-bit textures be considered beautiful in this modern era of video game graphics?

Minecraft’s first Alpha release
For the first time in years, people actually began to consider graphics a complement to the other elements of a game and an artistic choice, rather than a driving force. Some would argue that Minecraft’s graphics were so passé for the time that they ended up being considered good. Others argued that the surge in pixel-art indie games was down to simple nostalgia for our retro 8-bit games. And there’s certainly an argument to be had about whether Minecraft is responsible for the meteoric rise in popularity of retro graphics in modern games - whether Minecraft made it acceptable to adopt a lo-fi approach to visual design. So if Minecraft can become one of the best-selling, highest-grossing and most-played games of all time with what are considered less-than-ideal graphics for today’s technology, why aren’t other games given the same consideration?
Most of these recent complaints and news stories about graphics have emerged following the release of the latest generation of consoles, the Xbox One and PlayStation 4. With them came shiny new hardware for developers to make the most of, and so their games’ visuals and frame rates became far more agreeable to the current state of the industry - certainly from a PC gamer’s point of view.
Having played Assassin’s Creed 4: Black Flag on both Xbox 360 and Xbox One, the moment I tried it on the latter console I was blown away by the difference in visual fidelity. The crispness of the palm trees blowing in the wind and the vivid reflections and refractions of the light on the vast, almost endless oceans took me by surprise throughout the opening sequence. I was immediately pleased with my decision to buy the game again for the more powerful console.
But in recent months, the emphasis has been on next-gen-only games releasing on both Sony’s and Microsoft’s latest consoles, and on the differences between the two versions.
Let’s get the facts straight first: both of the latest consoles are capable of running games at up to 1080p and 60fps - but that doesn’t mean every game will run at that resolution and frame rate.
Let me turn your attention to the PC industry. You’ve just got your fancy new gaming PC and you start up a game that has four graphics presets: Low, Medium, High and Ultra. Now, you’ve played Crysis 3 on Ultra settings at a gaming expo, with some of the nicest-looking visuals the industry has to offer, but despite having set you back the best part of two months’ wages, your shiny new system still can’t maintain a consistent frame rate on the Ultra preset. Yet you don’t complain. You simply drop the graphics to High instead. Crysis 3 still looks gorgeous on High; there’s just a small difference in a few effects like lighting and draw distance. It’s still a beautiful game. So why is it that on consoles, just because a game may not run natively at 1080p and 60fps, people feel the need to make a fuss?
I for one couldn’t care less whether I’m playing a game at 30fps or 60fps, or at 1920×1080 pixels or 1440×920 pixels. As long as the game runs to a decent standard, the gameplay is satisfying, the narrative is good, the sound design is immersive and the acting is decent, I’m fine! Notice all the separate elements, however. I would rather have an across-the-board mediocre game than one with outstanding graphics but bland gameplay, dodgy controls, a voice actor who makes you hate your ability to hear and a narrative so bad it comes across as though it was written on the back of a cigarette packet by the neighbour’s dog.
Let me now take you back to Half-Life. At the time, its graphics were remarkable, even revolutionary. Now, not so much. The textures are low-resolution and pixelated compared to what we play today, the skybox is repetitive, and the crowbar’s noise is almost as annoying as that of the neighbour mowing his lawn on a Sunday - and yet I utterly love it and still return to it every now and then. Its narrative is gripping and the gameplay was, for the time, a game-changer (get it?!) for the industry. Returning to Black Mesa in all its late-nineties visual glory is as enticing a prospect to me now as it was when Valve’s masterpiece was first released back in 1998.
But we do have to consider the other side of the argument.

CD Projekt RED’s upcoming The Witcher 3: Wild Hunt is taking visual fidelity to a whole new level.
Brand-spanking-new consoles cost a heck of a lot for what they offer. If you’ve just upgraded from the trusty Xbox 360 to this mysterious wonder that is the Xbox One, or taken the plunge from a PlayStation 3 to a shiny new PS4, then you expect decent results. I understand why people become frustrated when upcoming games won’t run at the full resolution and frame rate the new consoles can accommodate - especially with the lingering feeling that you chose the wrong console when your shiny new game looks ever-so-slightly better on the one you didn’t pick. Assuming you didn’t buy both, of course.
But if people really want to complain to that extent about upcoming video games, they should at least go beyond resolution and frame rate. Look at how the anti-aliasing behaves during fast movement. Examine the motion blur and whether it’s used too sparingly or too often. Consider the camera position and field of view. Scrutinise how environments react to the lighting in a room and whether the reflections, light dissipation and ambient occlusion are satisfactory. Are you seriously so bothered by resolution when you wouldn’t even notice the lack of effective bump mapping on a surface? Your game still plays fine; it just might have a few fewer pixels on screen at any one time.
I find it difficult to understand how we have gone from adoring little pixelated Pokémon jumping from one position to another in a split second before using Flamethrower, to suddenly becoming pedantic about the standard of graphics in our industry. In all honesty, if graphics bother you that much then by all means buy a PC, wait the two or more years for the PC release and play it to its fullest capacity. Or, at the very least, criticise all the other aspects of the game in the same level of detail.
So far I’ve discussed the PC, Xbox One and PlayStation 4 in this visual arms race. But there’s another player to consider: Nintendo. Upon its release, the Wii U was roundly mocked for Nintendo’s failure to make use of the latest technology available to it. I’m not sure why people were so surprised; the company has long been vocal about its lack of interest in competing with rivals intent on pushing system specs to the limit, instead taking a more cost-effective and pragmatic approach better suited to the kind of games it wants to make, and the costs involved in creating them.
But look at some of the games available on the Wii U and you’d be hard-pressed to say that its relative lack of grunt has been to the detriment of its games. Super Mario 3D World, Mario Kart 8 and the gorgeous Bayonetta 2 are among the most visually striking games released for any system in the last 12 months. Sure, they may not be pushing around as many polygons as something like Destiny, or feature all the snazzy lighting effects and detailed foliage of The Witcher 3: Wild Hunt, but when something looks this good, does it matter? Mario Kart 8 looks fantastic. When I’m tearing around Yoshi Valley in all its burnt-orange glory, powersliding around a corner only to be dazzled as the glaring sun bounces its rays off the screen, I’m not thinking “this would look better if only the Wii U had a more powerful GPU”. I’m thinking “fuck me, this is fantastic!”
We need to start appreciating graphics as a complement to the other elements of a game, rather than as the deciding factor in the eternal question that is “to buy, or not to buy?”
And if more developers and publishers start taking the same approach and focus more on fun than fidelity, who knows? We might actually start to get more interesting games. Who can possibly say that would be a bad thing?