We need to stop being obsessed with resolution and framerate.


Remember when a game’s worth was determined not by how many polygons it pushed around the screen at any point in time, or how many pixels there were, but by whether it was fun?

This ridiculous snobbery - manifesting most keenly in message board arguments over who has the better console - often makes me wonder if people have forgotten just why they enjoyed playing games in the first place. It shocks me, and annoys me to say the least, to see such an emphasis on graphics in this day and age. Every time I go onto my Twitter feed or my gaming news feed, all I see is “Xbox One 1080p” here and “omg y isnt ac unitee 4k on ps4” there. Have graphics really become that important?

Let’s go back a few years to the mighty and triumphant start of a little game we like to call Minecraft. Hopefully, you’ve all heard of it. If you haven’t, then a) have you been stranded on a desert island somewhere for the last five years? and b) go and download the free version right now and educate yourself. Minecraft took the internet by storm thanks to its massive replayability and its retro, almost-beautiful visuals - born from both necessity and the game’s origins as Markus “Notch” Persson’s personal experiment in game design - visuals that have since become so iconic that you can look at any screenshot of the game and instantly know what it is. But hang on a second: how on earth could a cubic game with 8-bit textures be considered beautiful in this modern era of video game graphics?


Minecraft’s first Alpha release

For the first time in years, people actually began to consider graphics as a complement to the other elements of a game and an artistic choice, rather than a driving force. Some would argue that Minecraft’s graphics were so passé for the time that they ended up being considered good. Others argued that the surge in pixel-art indie games was down to a mere bit of nostalgia for our retro 8-bit games. And there’s certainly an argument to be had about whether Minecraft is responsible for the meteoric rise in popularity of retro graphics in modern games - whether Minecraft made it acceptable to adopt a lo-fi approach to visual design. So if Minecraft can become one of the top ten highest-selling, highest-grossing and most-played games of all time with what are considered to be less-than-ideal graphics for the technology of today, why aren’t other games given the same amount of consideration?

Most of these recent complaints and news stories about graphics have emerged following the release of the latest generation of consoles, the Xbox One and PlayStation 4. With them came shiny new hardware for developers to make the most of, and so their games’ visuals and frame rates became far more in keeping with the current state of the industry, certainly from a PC gamer’s point of view.

Having played Assassin’s Creed IV: Black Flag on both Xbox 360 and Xbox One, I was blown away by the difference in visual fidelity the moment I tried it on the newer console. The crispness of the palm trees blowing in the wind and the vivid reflections and refractions of light on the vast, almost endless oceans took me by surprise throughout the opening sequence. I was immediately pleased with my decision to buy the game again for the more powerful console.

But in recent months, the emphasis has been on games releasing exclusively on Sony and Microsoft’s latest consoles, and on the differences between the two platforms.

Let’s get the facts straight first: the latest consoles can both run games at up to 1080p and 60fps, but that doesn’t mean every game will actually hit those figures.
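
To put rough numbers on why that gap exists, here’s a back-of-the-envelope sketch in Python. It only multiplies resolution by framerate to compare rendering workloads; the resolutions and framerates are illustrative, not benchmarks of any particular console or game:

```python
# Back-of-the-envelope pixel throughput: resolution multiplied by framerate.
# Purely illustrative arithmetic, not a benchmark of any console or game.

def pixels_per_second(width: int, height: int, fps: int) -> int:
    """Pixels the renderer has to produce every second at a given output."""
    return width * height * fps

targets = {
    "720p @ 30fps":  (1280, 720, 30),
    "900p @ 30fps":  (1600, 900, 30),
    "1080p @ 30fps": (1920, 1080, 30),
    "1080p @ 60fps": (1920, 1080, 60),
}

baseline = pixels_per_second(1280, 720, 30)
for name, (w, h, fps) in targets.items():
    total = pixels_per_second(w, h, fps)
    print(f"{name}: {total:>11,} pixels/s ({total / baseline:.1f}x the 720p/30 baseline)")

# 1080p @ 60fps works out to roughly 124 million pixels per second,
# about 4.5 times the work of 720p @ 30fps - which is why "up to 1080p/60"
# never meant "every game at 1080p/60".
```

The exact figures don’t matter; the point is that the jump from “can run at” to “always runs at” is a multiplier on the rendering workload, not a checkbox.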


Arma 3, from Bohemia Interactive, is stunning to look at on higher settings.

Let me turn your attention now to the PC industry. You’ve just got your fancy new gaming PC and you start up a game that has four graphics presets: Low, Medium, High and Ultra. Now, you’ve played Crysis 3 on Ultra settings at a gaming expo, with some of the nicest-looking visuals the industry has to offer, but despite having set you back the best part of two months’ wages, your shiny new system still can’t maintain a consistent framerate on the Ultra preset. Yet you don’t complain. You simply drop the graphics to High instead. Crysis 3 still looks gorgeous on High; there’s just a small difference in a few effects such as lighting and draw distance. It’s still a beautiful game. So why is it that on consoles, just because a game may not run natively at 1080p and 60fps, people feel the need to make a fuss?
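
That “drop from Ultra to High until the framerate holds” routine is simple enough to write down. Below is a minimal sketch of the idea in Python; the preset list, the framerate figures and the measure_average_fps() helper are all hypothetical stand-ins, since a real game would benchmark its own renderer here:

```python
# Minimal sketch: pick the best-looking preset that still holds a target framerate.
# Preset names and framerate figures are made up for illustration.

PRESETS = ["Ultra", "High", "Medium", "Low"]  # best-looking first

def measure_average_fps(preset: str) -> float:
    """Pretend benchmark; a real game would render a test scene and time it."""
    fake_results = {"Ultra": 41.0, "High": 62.0, "Medium": 78.0, "Low": 95.0}
    return fake_results[preset]

def choose_preset(target_fps: float = 60.0) -> str:
    for preset in PRESETS:
        if measure_average_fps(preset) >= target_fps:
            return preset
    return PRESETS[-1]  # nothing holds the target, so settle for Low

print(choose_preset())  # with the fake numbers above: "High"
```

Console developers effectively make that call once, on everyone’s behalf, when they settle on a resolution and framerate target for their game.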

I for one couldn’t care less whether I’m playing a game at 30fps or 60fps, or at 1920×1080 or 1440×920. As long as the game runs to a decent standard, the gameplay is satisfying, the narrative is good, the sound design is immersive and the acting is decent, I’m fine! Notice all the separate elements, however. I would rather have an across-the-board mediocre game than a game with outstanding graphics but bland gameplay, dodgy controls, a voice actor who makes you hate your ability to hear and a narrative so bad that it comes across as though it was written on the back of a cigarette packet by the neighbour’s dog.

Let me now take you back to Half-Life. At the time, its graphics were remarkable and revolutionary. Now, not so much. The textures are low-resolution and pixelated compared to what we play today, the skybox is repetitive, the crowbar’s noise is almost as annoying as that of the neighbour mowing his lawn on a Sunday - and yet I utterly love it and still return to it every now and then. Its narrative is gripping and the gameplay was, for the time, a game-changer (get it?!) for the industry. Returning to Black Mesa in all its late-nineties visual glory is as enticing a prospect to me now as it was when Valve’s masterpiece was first released back in 1998.

But we do have to consider the other side of the argument.


CD Projekt RED’s upcoming The Witcher 3: Wild Hunt is taking visual fidelity to a whole new level.

Brand-spanking-new consoles cost a heck of a lot for what they offer. If you’ve just upgraded from the trusty Xbox 360 to this mysterious wonder that is an Xbox One, or taken the plunge from a PlayStation 3 to a shiny new PS4, then you expect decent results. I understand why people become frustrated when upcoming games don’t run at the full resolution and frame rate the new consoles can accommodate - especially with the lingering feeling that you chose the wrong console when your shiny new game looks ever-so-slightly better on the one you didn’t pick - assuming you didn’t buy both, of course.

But if people really want to complain to that extent about upcoming video games, at least go beyond the simple measures of resolution and frame rate. Look at how the anti-aliasing behaves with fast movement, examine whether motion blur is used too sparingly or too often, consider the camera position and field of view, and scrutinise how environments react to the lighting in a room - whether the reflections, the light dissipation and the ambient occlusion are satisfactory. Are you seriously so bothered by resolution that the lack of an effective bump-mapping system would trouble you just as much? Your game still plays fine; it just might have a few fewer pixels on screen at any point in time.

I find it difficult to understand how we have managed to go from adoring little pixelated Pokémon snapping from one position to a completely different one in a split second before using Flamethrower, to suddenly becoming pedantic about the standard of graphics in our industry. In all honesty, if the graphics bother you that much then by all means buy a PC, wait the two or more years until the PC release arrives and play the game to its fullest capacity. Or, at the very least, criticise all the other aspects of the game in the same detail.

So far, I’ve only mentioned the PC, Xbox One and PlayStation 4 in discussing this visual arms race. But there’s another player to consider: Nintendo. Upon its release, the Wii U was roundly mocked for Nintendo’s failure to make use of the latest technology available to it. I’m not sure why people were so surprised; the company has long been vocal about its disinterest in competing with rivals more interested in pushing system specs to the limit, instead taking a more cost-effective and pragmatic approach better suited to the kind of games it wants to make, and the costs involved in creating them.

But look at some of the games available on the Wii U and you’d be hard-pressed to say that the system’s relative lack of grunt has been to the detriment of its games. Super Mario 3D World, Mario Kart 8 and the gorgeous Bayonetta 2 are among the most visually striking games released for any system in the last 12 months. Sure, they may not be pushing around as many polygons as something like Destiny, or feature the snazzy lighting effects and detailed foliage of The Witcher 3: Wild Hunt, but when something looks this good, does it matter? Mario Kart 8 looks fantastic. When I’m tearing around Yoshi Valley in all its burnt-orange glory, powersliding around a corner only to be dazzled as the glaring sun bounces its rays off the screen, I’m not thinking “this would look better if only the Wii U had a more powerful GPU”. I’m thinking “fuck me, this is fantastic!”

We need to start appreciating graphics as a complement to the other elements in a game, rather than as the deciding factor in the eternal question that is “to buy, or not to buy?”

And if more developers and publishers start taking the same approach and focus more on fun than fidelity, who knows? We might actually start to get more interesting games. Who can possibly say that would be a bad thing?

Oliver McQuitty
A student with a passion for video games, cats and music-making, Oliver has always adored anything that involves sitting down, relaxing and enjoying good entertainment. He writes about anything from gaming news to opinions on the state of the industry.
@oliverfrenchie

Comments

  • Fetch > Delsin

    I agree. The only real advantage the PS4 has is slightly higher resolution but I don’t think it’s that much of a difference to be honest. XB1 is actually the better choice right now.

    • drjonesjnr

      Lmfao the xbone is irrelevant mate, wii u has sold more. Better choice? Why? I believe that console can’t even do screen snaps yet? With all the apparently great system updates I thought that would be one of the first things on the list. It’ll be years till they have something like Sony’s Share Play

      • Fetch > Delsin

        No, it still can’t do screen shots yet. They have better games though. We just have indies.

        • drjonesjnr

          Well that’s not true is it? The highest rated game on both consoles is the last of us remastered on ps4

          • Fetch > Delsin

            Which was also on PS3.

          • drjonesjnr

            And your point is ? You can play titan fall on 360 forza h2 ? Halo ?

    • demderp

      http://www.metacritic.com/browse/games/score/metascore/all/ps4?sort=desc

      http://www.listwar.com/released-games/

      PS4 has good games already with more on the way. They both have good games not on the other console, it depends on your gaming tastes.

      Retail metacritic 75+ not on Xbox: The Last of Us: Remastered - 95, Final Fantasy XIV - 86, MLB 14: The Show 14 - 83, Injustice: Gods Among Us - 80, Infamous: Second Son - 80, Samurai Warriors 4 - 78

      Littlebigplanet 3 and Guilty Gear Xrd in 2014 should both rate highly in reviews.

      PS4/Sony exclusives 2013-2014: Killzone: Shadowfall, Knack, Resogun and Heroes DLC, Flower, Sound Shapes, Flow, Doki Doki Universe, Infamous: 2nd Son and First Light, MLB: The Show 14, TLOU Remastered, Dead Nation, Entwined, Hohokum, Counterspy, Velocity 2X, Driveclub, LittleBigPlanet 3, Natural Doctrine, Samurai Warriors 4, Guilty Gear Xrd, Akiba’s Trip, Singstar: Ultimate Party, #KillAllZombies, Pix the Cat, more digital games, etc.

      2015: The Order: 1886, Bloodborne, Ratchet & Clank, Uncharted 4, Tearaway Unfolded, Until Dawn, Persona 5, Everybody’s Gone to the Rapture, Rime, Let it Die, Deep Down, more.

      PS4 is selling on game price/performance, 1st party studios, more developer support including indies and Japanese developers, PS+ value, and better PR.

      Naughty Dog, Santa Monica Studios, Media Molecule, Japan Studio, Polyphony Digital, etc. will put great games on PS4. Infamous and MLB The Show are 80+ on PS4 already.

      1st/2nd party studios include Japan Studio, Polyphony Digital, Naughty Dog, SCE Bend, SCE San Diego, SCE Santa Monica, Sucker Punch, Pixel Opus, Evolution Studios, Guerrilla Games, Guerrilla Cambridge, SCE London, Media Molecule, Ready at Dawn, Quantic Dream, etc.

      The gaming division is well funded, in a net profit, and making quarterly profits. PSN is being invested in to handle the doubled userbase since Destiny launched.

      • drjonesjnr

        Wow, well said, these xbots are delusional

  • Dikan45

    If the consoles were capable of the optimum output for playing games on a tv(1080/60p) this issue wouldn’t exist! The fact that they aren’t will always be a problem

    • drjonesjnr

      The lack of pc triple A exclusives games is a problem Imo

      • John Iyney Iye Slade

        so true - if games were made for PC then ported to console things might be better…. Pcars/Star Citizen etc will show if thats the case i guess…

  • Jan Compaf

    yes, we should use resolution and framerate only when it suits our fanboi side, yes

  • Dennis Crosby

    Frame rate only matters if it’s not stable; we need stable frame rates to keep the experience complete. It doesn’t matter if the developers are aiming for 30 or 60, it has to be stable. Also, art style and textures trump resolution all day long

  • drjonesjnr

    What a boring article

  • Sheldon Prescott

    These “new gamers” are ruining this generation with this resolution obsession. Instead of focusing on making a good, fun game, devs wanna hit that magic 1080p mark. Just focus on making good games!!!!!!!!!!!!!!!!!!!!!

    • John Iyney Iye Slade

      i have been gaming since the 80’s and i am entirely unsatisfied with 1080 - so as an ‘old gamer’ i want something to run at my monitor’s (higher than 1080) native resolution. It’s nearly 2015 so i don’t think that’s too much to ask.

      Sadly devs keep pumping out the same crap year after year with shinier graphics but until people refuse to put up with it, they’ll just continue…

  • Matt

    Control is affected by framerate though, and games before PS1, before 3DO, when they were on sprite-handling machines… those just ran at the refresh rate of the TV, didn’t they? I don’t think they had a “framerate,” but it was smooth like 60. It’s part of what made them enjoyable.

    It’s also a fact that many PS2 and Dreamcast games ran at 60fps. I think it has to do somewhat with the balance of the hardware components at the time. For PS2 and DC the games were usually about 640×480 in resolution, about the max of CRTs at the time, the DVD 4:3 standard, and processing power for 3D graphics (polygons) had caught up enough that it was easy, almost a given, to get high framerates, so even a lot of the launch titles for PS2 were that way. It was a struggle though when the resolution standard was bumped up, as it has been again for this new gen.

    N64 games often ran slicker than PS1 and Saturn and Saturn ran slower frames than PS1 with a lot of titles.

  • Nettrick Nowan

    When it comes to multiplats, as a multiconsole owner, I do care about res/fps. It makes sense to me to buy the version of the game with better graphic detail and that takes advantage of the hardware’s power. For exclusives, I’ll take what I can get-res/fps don’t matter then.

  • John Iyney Iye Slade

    What we need to do is boycott developers who punt out shoddy unfinished titles with sub par graphics having previously released misleading demo footage to unsuspecting console owners. Luckily on PC we have modders who actually care and end up fixing the shameful broken ports which devs can’t even be bothered to send to QA first.
    I feel sorry for the people conned into getting the ‘next-gen’ con-soles as they make a big fuss of 60fps and 1080p. whoop-de-do, 1080, yay. If a PC game isn’t even able to be made in DX11 with the ability to accommodate 2560*1440 and up because it’s just another ported waste of space, then I won’t be giving my money away to play some boring, repetitive, 5-minutes-and-I’m-done rubbish. It’s hard enough to find a game that doesn’t bore the sh** out of me, let alone having to put up with it looking like it was released 10 years ago as well. It’s usually worth waiting for the PC release if it’s after the console one, as generally it might actually work by then too. Just look at the Ubifail of late - one after another borked release, yet people still throw their money away.

    For example: Cloud Imperium can have my money. Ubisoft cannot. Although they can do their best to have people play down the fact that it’s nearly 2015 and some people actually expect better from these companies.

    Mediocre is frankly not good enough - in all aspects of a game - yet sadly we seem to be getting a lot of that of late. Visually and in terms of gameplay many games are simply poor. Talking of Half-Life, I’m having fun playing the FF Cinematic mod of Half-Life 2 at the moment - it brings the graphics up a notch and is great fun, more so than most of the titles I have looked at this year.
    If you don’t care about or don’t recognize the difference between 30/60/120 fps then you will probably love the new consoles. If however you are used to a higher-res monitor and are sensitive not only to fps but also to Hz or mouse acceleration etc, then you might not be so happy (you probably have a PC in that case, I’d guess) - and I’d rather be a snob about it, as without people causing a stink about shoddy developers, games are only going to get worse and worse. Case in point: the Watch Dogs dumbed-down graphics scam - if a PC modder hadn’t got in and found that the ‘e3’ graphic settings were still there but turned off, nobody would know that they were dumbing things down so that console versions don’t look bad compared to PC.

  • Wall BS Betel

    I wholeheartedly agree, sir 😀
