One would be tempted to describe the commentary on innovation in gaming as a near-liturgical refrain denying the very existence of any such thing. Listening to the hacks leaves the impression of an industry almost entirely barren of creativity, exploited for all it is worth. An industry, moreover, that marches stoically to its doom, despite the solution to all its problems being readily available.
Let us, once and for all, put this malignant myth to rest.
Before we begin, let me just say that I am not claiming that everything is perfect. There are issues, serious issues, but they are not as all-pervasive as some would have us believe.
The safe harbor offered by innovation, or at least that non-corporeal apparition which many name ‘innovation’, is a siren call to the naïve, and I suspect it is just as treacherous as the Sirens’ song was to Odysseus. For a start, ‘innovating’ is both incredibly expensive - in so far as it often involves developing new technologies and a degree of trial and error - and difficult, in that it demands an abundance of technical ability as well as creativity.
The hack may be quick to jump on my choice of words: “Ha! So you admit that there is not enough creativity to innovate!” No. I’m saying that, whilst many studios can produce incredibly polished and competent games, few can conceive of truly new ideas in an age where almost everything has been done at least once before. It’s not a question of creativity; it’s a question of what has or hasn’t been done before.
Think about the last console that was heralded as the savior of gaming: the Nintendo Wii. To call the Wii innovative would be a misnomer; it was less a case of innovation and more an instance of pushing the boundaries of an existing technology and then building a console around it. That was really the only inherently innovative part of the Wii: the decision to base an entire console around motion controls. As it happens, it was probably that decision which doomed the Wii to the position of ‘whipping boy’ for every snide remark that passed a journalist’s lips for years. Yes, the console sold well (I suspect on the back of the brand name and the attraction of Wii Sports), but the software attach rate was atrocious.
Another example of innovation failing to guarantee success: Nintendo’s Virtual Boy, marketed as the world’s first virtual reality console. The Virtual Boy was a complete commercial failure; it didn’t sell well, was widely reported to cause headaches and received almost no support from developers of any stripe.
Now consider innovation in games like Mirror’s Edge, the very definition of a ‘flawed gem’ and unlike any game that came before it. Whilst it did meet with some commercial success (and it divided critics like few other games have managed), the game is widely accepted to have several fundamental flaws. The odd thing is, nobody can agree on what those flaws are: some point to the level design, while others argue that the level design is fine and the real problem is signposting the path the player should take; there are yet more arguments for and against that.
It may be tempting to point to games such as, most notably, Portal, originally released in the now-legendary Orange Box compilation. This example is a red herring: intended as a bonus game, an experiment, Portal’s success was completely unintended. Portal was an accident, as were Minecraft and Journey. Just as deliberate attempts to manufacture memes almost always fail, so do most deliberate attempts at creating innovative games. It’s a tricky business, striking out with something genuinely new; you have no idea, and no basis for believing, that it will catch on with gamers. Remember: a lot of gaming conventions have become conventions because they are the best way of doing things, whether that means convenience for the gamer or making the impossible possible for developers.
The expense of innovation makes it a difficult proposition to put to an investor; the investment may not pay off. Why risk millions on a venture with only a small chance of paying off, when you could invest those same millions in a company that is all but guaranteed to produce a high-quality product and a good return? An answer of ‘on principle’ displays a lack of understanding of the raison d’être of investment. The argument from principle is doubly flawed in that it ignores the impossibility of its implication: that every new game, even within a series, should be entirely innovative. This is simply not possible. If you then retreat to the argument that only a limited number of innovations need be made, you open the door to an over-reliance on gimmicks and we end up exactly where we are now: innovation has become a code-word for gimmicky mechanics that receive far too much attention and damn the game.
Most of the innovation these days, and it does exist, goes unnoticed by gamers and the press at large. The reasons for this are simple: how could you notice an innovative development method, one that cuts costs whilst maintaining, or even improving, quality? Why does no-one notice the constant increase in graphical fidelity without an improvement in hardware? The current generation of consoles is five to seven years old, yet developers keep raising the level of graphical fidelity on display without any increase in resources.
Many will be tempted to argue that “graphics don’t matter! It’s all about the gameplay”. This is a position that I both understand and have little time for. I understand it because I, personally, am willing to look past a lower level of graphical beauty; I am of the opinion that Final Fantasy VII is a much better game than most released since, yet it is not a pretty game by any standard. I have little time for the argument because it ignores the fact that most gamers - and indeed the press - will overlook, mark down or outright ignore games that do not look as good as their peers. This is readily observed. In short, we are all a little bit to blame for our current predicament. This is not to say we are solely responsible, but we must look at, and understand, the underlying causes of the apparent lack of creativity before we can begin to find a solution.
The other reason I have little time for the argument about ‘graphics’ is that it often overlooks the importance of aesthetic style and, less importantly, dismisses the technical innovation that is occurring and pushing the boundaries of a medium that is incredibly image-driven - arguably second only to film in that regard. The emphasis on images is understandable to a degree: what else can we use to form a basic opinion of a game that has yet to have a demo released? We are too reliant on trailers and hands-off demonstrations, or the opinions of the few who have managed a few hours of hands-on play.
It’s not just in the graphical department that we see almost constant innovation. Physics and the handling of multiple on-screen objects is another area of major improvement. For a demonstration of what I am referring to, go back to the original Half-Life and count the number of background items lying around - the litter, the objects on tables in a given area. Now play Half-Life 2 and conduct the same count. You should find many times the number of objects in the environment.
With the same focus, go to any of the Assassin’s Creed games and count the number of NPCs on screen at once, the number of scenery objects. The question now is “what does this have to do with innovative gameplay?” A good question, if a little simple in its construction.
Take the example of Assassin’s Creed: the number of objects you can climb, the number of citizen groups to hide in, the number of environmental weapons to trap enemies with and the overall vibrancy of the world all depend on the ability of the programmers to stuff the game full of goodies. Without a constant improvement in hardware, this essentially translates to doing more with the same resources.
Think about the work that your console’s processor is doing at any given moment: it is processing an ever-increasing number of NPC behavioral patterns, working out the physical interactions of a growing number of objects flying around the screen, and it is doubtless doing more that I have neglected to mention. Similarly, the GPU is rendering ever more environmental detail (including NPCs, textures and lighting). If the programming weren’t advancing, none of this would be possible.
Someone may ask, again but in a different way: “what does this all have to do with innovative mechanics?” The above? Not a huge amount. It enables the mechanics of games like Assassin’s Creed - free-running and hiding in crowds - to work, by providing more crowds to hide in and more features on walls for the player to scramble up, but beyond that, very little.
“Aha! So you admit that the pursuit of graphics is holding gameplay back!” Hardly - consider how much mechanical ground games have already covered. We have had magic (under a slew of names and with a slew of attached mechanics - Materia, stealing magical power from enemies, ‘equipping’ magic to stats to provide boosts, spells produced by combining elements and so many more), technology (doing everything from providing anti-gravity to summoning land sharks to devour our enemies), close combat (dozens of different martial arts and fighting styles), puzzle mechanics (everything from fetch quests to pictographic mechanics, timed and time-based puzzles, perspective-based mechanics…), quick-time events, mini-game-based lock picking and hacking, and probability-based and deterministic lock picking and hacking. We have even had wrathful and homicidal limbless birds flung at pigs, themselves guilty of grand larceny and egg-napping!
Games where you can rewind, pause or fast-forward time; RTS games where you can go back to the beginning of your build order, in-game, make a change and watch as it propagates through the rest of the time you had played; FPS games where you can age enemies and parts of the environment out of existence, or rewind time to bring back something that had long since submitted to the inevitable march of time and the elements, crumbling to dust. We have had games where you evolve entire species from their very first moments as bacteria.
On settings: we have explored underground, the land, the sea, the skies, space and different dimensions. Games have been set in every era from pre-Biblical times to the impossibly far future, and everywhere in between. Virtual worlds and realistic worlds; insane worlds; demonic worlds; virtuous worlds; impossible worlds; mundane worlds; the afterlife, and pre-life. We have had games that incorporate character deaths into the story, and so many more besides.
Please tell me: what stone has been left unturned?
Independent developers are a section of the industry with which I have a love-hate relationship. Love for their character, their creativity, their boldness. Hate for reasons that are often no fault of their own.
Many say that the future of innovation in gaming will come almost entirely from ‘indie’ developers. I actually understand this argument; an independent game is often cheaper to produce, so many new mechanics are born there, where investors have no say and profit margins are an afterthought. In bedrooms and basements, after hours - so many brilliant games have been conceived in these places and at these times.
It may come as a shock to some that there are many, many more independent developers than publisher-supported developers. Sometimes they are reasonably large studios like Insomniac, and sometimes they are small one- or two-man teams, such as the creators of Dwarf Fortress.
Keep that fact in mind as you now ask: how many really good indie games are out there? And I mean really good - the type of game that will leave your mouth agape and your mind frozen in awe. Now ask yourself the same question about games from larger, publisher-supported teams. People’s tastes differ wildly, but I suspect you will develop a creeping suspicion that there are a lot more of the latter than the former.
Let us, for a moment, imagine that there are more awe-inspiring indie games than publisher-supported titles. You could say: well, that’s proof that indie developers are better than non-indie developers. But consider the ratio of good games to bad games in the indie scene. Remembering, of course, that there are literally thousands of indie games released every year, that ratio will no doubt be quite low; I think independent developers probably outnumber publisher-owned developers by at least one order of magnitude, maybe two. Now consider the same ratio, good games to bad, for publisher-owned developers. How does it compare? Even if indies produced more gems in absolute terms, spread across ten or a hundred times as many releases their hit rate could still be far lower - as the toy arithmetic below illustrates.
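To make the exercise concrete, here is a minimal sketch of the arithmetic. Every figure in it is invented purely for illustration - these are not real release statistics - but it shows how a larger absolute number of great indie games is entirely compatible with a much lower hit rate.

```python
# Toy illustration of the ratio argument above.
# All figures are hypothetical, chosen only to make the point.

indie_releases = 5000       # invented: indie games released in a year
indie_gems = 50             # invented: "awe-inspiring" indie games

publisher_releases = 300    # invented: publisher-backed releases in a year
publisher_gems = 30         # invented: "awe-inspiring" publisher games

indie_hit_rate = indie_gems / indie_releases              # 0.01 -> 1%
publisher_hit_rate = publisher_gems / publisher_releases  # 0.10 -> 10%

print(f"Indie hit rate:     {indie_hit_rate:.1%}")        # 1.0%
print(f"Publisher hit rate: {publisher_hit_rate:.1%}")    # 10.0%

# In this invented scenario the indies produce more gems outright
# (50 vs 30), yet their good-to-bad ratio is ten times worse.
```

None of which proves anything either way; it only shows why raw counts and ratios must be kept separate when making the comparison.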
The above is a little mental exercise. It’s not proof of anything; it’s designed to get you thinking about the implications of claiming that indie studios are the only ones who produce anything worthwhile. Should the facts - and there has not been, to my knowledge, any rigorous research conducted on the above criteria - turn out to support my opinion, then wonderful. If the facts disprove my opinion, I will be very surprised (as if anyone is ever anything other than surprised when they discover that their belief is incorrect).
“But quality is not the same thing as innovation.” True, but if our games are of poor quality, however ‘innovative’, then we are wasting our time. It does not matter how innovative a game is; if it isn’t fun, or at least interesting to play, then why play it? Why bother? This is like saying that a food which provides no nutritional value and doesn’t even taste nice is better than those which do, simply because it is ‘innovative’.
To pre-empt the pedant, who will no doubt raise the point of objective quality, I would argue that the quality of a game should depend on the amount of ‘fun’ that can be derived from it. This is subjective and will therefore change from person to person, but who really cares? Perhaps it’s time that score-based review systems died out completely: they are also subjective, as is every comment made about a game in a review. The only exception which comes to mind is the reporting of bugs. Either a bug exists or it does not; there is no middle ground.
To be clear, I’m not saying that independent developers don’t make valuable contributions to gaming, because they do. They really do. But they are neither infallible nor the only source of innovation. Publisher-backed studios may innovate at a slower rate than independent developers, who constantly struggle to differentiate themselves in the hope of garnering some attention, but they do innovate - often in areas where independent studios simply cannot. An example is the integration of motion capture into games: the animation quality is vastly improved, but only the games with the biggest budgets can afford it, because it is an incredibly expensive and relatively new technology - new, at least, in its inclusion in games.
The gaming industry is doing just fine. It’s up to you to look for the games that you like. The variety of games on offer is simply astounding, and I think it beyond the ability of anyone, no matter how well versed in gaming lore, to recite from memory even a tenth of the many options available to the gamer.