Denis Dyack, head of game development studio Silicon Knights, may know his way around code to craft virtual experiences, but once on the consumer side, he seems lost. He's one of the many developers and publishers speaking out against used game sales, his latest chat with Industry Gamers sparking a number of questionable and baffling bullet points.
Dyack states that the industry is moving away from single player experiences, and that developers are being forced into creating multiplayer modes in an attempt to extend the longevity of their products... but that doesn't seem to be the case. There are a few major players in the world of multiplayer gaming: Activision's Call of Duty, Bungie's Halo, and EA's Battlefield. That's it. Everyone else is secondary to those behemoths. Multiplayer doesn't create communities unless you have one of those names slapped on the box.
So, why are developers switching to multiplayer? It's cheaper. It's a simpler process to play-test a handful of versus-oriented maps for a first-person shooter like Call of Duty than it is to bring in writers, scripters, voice actors, and various designers to make a story-driven solo outing. What developer charging $60 for a product wouldn't want to extend their profit margins via multiplayer, then use it as an excuse for a shorter single player campaign?
"I think there's a statistic I saw that most of the boutique retailers are making more money and more sales off of used games than they are off of new games."
Well, of course. In the used market, the store sets both the price it pays for a product and the profit margin. With new product coming from a publisher or distributor, stores have little control over either. Who's at fault there? Not the store. If anything, stores have to sell used product to stay afloat, since the margins on new software and video game hardware are thin on their best day. The money flows into that second hand market as a ripple effect caused by outrageous software prices: gamers are eager for new releases but reluctant to spend the full $60, so they trade up, those used games get recycled into the market, and in turn find new homes where someone else can trade up later on. If anything, without those used game profits, stores probably wouldn't be so swift to stock their shelves (or, for that matter, be able to) with Dyack's latest magnum opus.
Also, keep in mind, every used game was new at some point.
Dyack's final point praises the idea of cloud computing and its potential for gaming, citing Netflix as an example of how it can change the industry. Setting aside for a second that cloud computing would have screwed any consumers who chose to save their games on Sony's cloud service during the recent PlayStation outage, how could the Netflix model even come close to what this industry is doing? If anything, the idea of a flat fee to experience all the content one can consume in a month should be terrifying: far less profitable, and far more restrictive. Plus, storing data centrally in an industry filled with people technologically aware enough to take down entire networks? No, thank you.
The video game industry is in total upheaval, desperate to find its sales identity, and instead of looking for solutions, everyone gives the consumer a slap on the wrist for doing something they've been doing since the industry came into being. The outspoken nature of developers and studio heads is unbelievable: they continue to berate their customer base, blaming the people who fork over $60 for a single product for their own failures.
They're the ones who put themselves here, creating an industry that thrives on hype that simply disappears after the first week of sales, bloating development costs to deliver a minimum of ten-hour experiences to justify their increased sticker prices, and failing to understand that funds are limited on the consumer front. Maybe we're just tired of paying $15 for expansions, and games that rely on those tactics are merely being traded in faster?