How videogame culture is changing to promote better, smaller, less expensive titles.
The picture above is from Forbes’ Journey review. Go read it to get a grasp of how much we embraced the game last year.
The videogame industry is one bound by technology. Studios can't overstep what's available to work with, so as technology evolves, so do the games it helps create. That limitation has led to some of the greatest works ever produced in the medium: Mario 64 began development as an SNES title, and Nintendo conceptualized Super Mario Galaxy from a GameCube tech demo.
That evolutionary push has, in some ways, helped promote the graphical leap between generations. The clarity with which a game is presented is a matter of importance, up there with exclusivity and review scores. Game reviews often cover graphics (or the more modern term "visuals") as a factor in determining those scores. And readers, insofar as comments reflect how people feel, generally mention graphics as a concern.
But technology has improved to a level where graphics aren't as advertisable. Games abound with realism, in some cases hyperrealism, with feverishly detailed scenes. As a result, realism is plateauing. Most modern combat shooters look, feel, and play similarly because the genre has long focused on realism. During the World War II frenzy before Call of Duty 4 came along, better encapsulating the grittiness of 1940s warfare was what set a game apart. That was partly due to the inundation of titles set in the period, but the ideology never let up.
This next generation will be the first where a game's visual quality isn't as dependable a selling point, or what the publishers would call "marketable". Gazing at gorgeous digital landscapes will still have that resounding awesomeness, but most games for PlayStation 4 and Xbox One will share it. Any ensuing mayhem will be game loyalists screaming about specifics: frame rate, definition, etc. (Of course, most armchair knights will argue to their last breath on forums regardless. It's the nature of the Internet.)
For most people, those specifics don't matter. Personally, I find visuals take a backseat to more important aspects like story and character development. Those elements are better remembered (and easier to write about as a blogger) than how beautiful things look. Crysis is maybe the only series where graphics became a focal point, because it was designed to showcase Crytek's new baby, CryEngine.
Our collective insistence on ever-greater visual quality needs to change. Game budgets are notoriously high nowadays and expected to grow as new consoles roll in. Square Enix noted financial troubles in March following the "disappointing" sales of Tomb Raider, even though Lara Croft's rebooted origin story sold 3.6 million copies in that month alone. Any higher and it would have been among the fastest-selling games in history. But the number didn't meet sales projections, and Square may have just broken even on the game.
Obviously, that shouldn't happen. The formula of publishers losing millions over a two-year development period is unsustainable. Quality titles like Tomb Raider become unaffordable because publishers can't risk the financial gamble. We may hate Call of Duty with a passion, but Activision bucks the trend by releasing a game annually and keeping it hugely profitable. Grand Theft Auto is the other major example.
With publishers looking to lower budgets, the shift has begun. Though, oddly, not with the publishers themselves. Recent successes in indie development, some through Steam's Greenlight program, show that industry focus is shifting. Five years ago, a game like Gone Home may never have been reviewed. In that same period, thatgamecompany's Journey would certainly never have been mentioned in the Game of the Year conversation, a category most often reserved for big-budget successes.
Journey, though, is one of only a handful of console arcade titles to receive that sort of recognition. Gone Home was for PC, where shorter, more experimental titles have a better chance at succeeding. It may take some years before the console market gravitates that heavily to an arcade title again. If Microsoft continues its crusade of discouraging people from indie games (by making the menu impossible to find), the wait will be longer.
In the meantime, as a new console generation approaches, the industry must move in this direction. Exploding costs and frequent cases of overworked, underpaid developers work against sustaining the industry. Personnel should be happy, costs should be mitigated, and the games should be profitable. Smaller and smarter games are the answer.
If you like this post, please share it. You help grow the blog! As well, follow Holygrenade on Facebook and Twitter, and be sure to spread the good word. Enjoy the rest of your Friday and the weekend.
© 2014 Holygrenade. All images are copyrighted by their respective owners.