Game Developers as Chefs
Estebon had an interesting comment on my prior Entitlement Culture post, in defense of the experts:
There is, unfortunately, a general zeitgeist of mistrust toward expertise in the world today, which has bled over to gaming. Gamers, particularly of the self-identified variety, make for an especially fertile ground for that sort of thing, for cultural reasons.
Game devs are supposed to be the experts in their field. They’re the ones who, at least in theory, beat the hiring/funding gauntlet on their merits. That their opinion on how to make a good game ought to carry greater weight than that of the person in the street used to be… more or less self-evident, as with any other profession.
It’s difficult to imagine a set of statements that I disagree with more strongly on a fundamental level.
First, suggesting game developers are “experts in their field” because… they’re game developers… is a tautology. We might assume that these bigger game companies have some kind of hiring standards, but that never really seems to be the case. Instead, it’s often more recursive like “previously sold a popular game” or “already worked for us in QA” or “nobody else applied.”
Remember Greg Street (aka Ghostcrawler) of WoW (in)fame(y)? From his Wikipedia article:
Street graduated from McDaniel College in 1991 with Bachelor of Arts degrees in Biology and Philosophy, later earning a PhD in marine science. Between 1996 and 1998, Street worked as a Research Assistant Professor at the University of South Carolina. […]
Game Design career
Ensemble Studios, the team behind the real-time strategy series Age of Empires, employed Street as a designer in 1998. With no education or experience in the game industry, Street suspects he was accepted due to his “writing and teaching experience, historical breadth, personal hygiene, gudd speling [sic], creativity, [and] my talent at capturing live alligators”, as well as the user-created scenario for Age of Empires he submitted with his application, which later appeared in Age of Empires: The Rise of Rome. Street helped develop every Age of Empires game from Rise of Rome on, until his departure from the company. At first he designed in-game scenarios and maps, and later graduated to being the team’s lead designer.
Street was hired by Blizzard Entertainment in February 2008, and was the lead systems designer on the MMORPG World of Warcraft until November 2013.
Now, you can hate Ghostcrawler’s philosophy during his WoW tenure – I personally thought it was fine overall – but the fact remains that this marine biologist worked for like two years, wrote an Age of Empires scenario, and then a decade later became a billion-dollar franchise game developer (or a prominent cog in the machine thereof). Twice! We have to either assume that Ghostcrawler is a hidden genius, or there are no particular standards that apply to game designers generally.
There is a third option too: the M. Night Shyamalan effect. You know, the writer-director of the 1999 cultural touchstone film, The Sixth Sense? He followed up with Unbreakable and Signs, which were whatever. After that, it was a solid decade of unremitting garbage films. Shyamalan is a supposed expert in his field, as evidenced by movie companies continuing to hire him, but clearly he lost whatever magic he had. Or perhaps more likely, the seam of magic he just happened to tap into shifted, and he wasn’t able to find another.
I bring a lot of this up because I find the hero worship of brands or developers (or anyone) to be… misguided, at best. For one thing, if these people were “experts in their field,” one would expect fewer game studios to be closing down or laying off staff. As I pointed out a few years ago, most of the same people have been working on WoW this whole time, so any declines in perceived quality can be attributed to the Shyamalan effect.
The only measure that matters for an expert (game developer) is continued, consistent results. Did they make your favorite game back in the early 2000s? Good for you… but why are you still waiting for them? It boggles my mind whenever someone talks about Bethesda and Morrowind, for example. That game came out in 2002. It can still be great, but you knew after Oblivion that something changed. How many new Shyamalan films are you going to sit through before you give up?
From the player side, Estebon pointed out:
J. Allen Brack got memed for his “you think you do but you don’t” line, and devs and customer relations reps have long been trained to pay lip service to the idea that the untutored mob knows best, but people routinely say and demand things that are not remotely reflected in their behaviour or proclivities as reflected in the internal metrics available to game developers. Elsewhere, insane fortunes have been built by paying attention to what people do, not say, and giving us things we never asked for or imagined we needed.
I actually agree with that. Players are generally bad at coming up with solutions to their problems, even when the solutions aren’t inherently contradictory. What players are exceptionally good at, though, is identifying that a problem exists in the first place. The problem might only be impacting them, specifically, but that’s all that really should matter to them or anyone.
All of this is to set up my title analogy.
Game developers are chefs. You don’t need to go to culinary school to be a good chef, and having a degree doesn’t mean you always cook tasty food. Being the best chef in the world will not stop a dish tasting like shit if there is too much salt/it’s burnt/etc. We might expect a master chef to avoid rookie mistakes, but there is another integral component to the dish: the tastes of the person eating it.
In a restaurant, we can assume the customer is choosing a dish they think they will like. If it comes out too salty for their taste, no one bats an eye at said customer complaining about it. “Entitled diners not wanting their food caked in salt!” The relationship is inherently transactional, and there is an expectation of quality. There are limits, of course; no one should expect Chick-fil-A or KFC to sell burgers, for example. It is also unreasonable for ten chefs to cater to the individual palates of ten million individuals.
Is that going to stop you from complaining when you get served a salty steak, or if the french fries are limp at a chain restaurant? Should that stop you? No. I couldn’t cook a restaurant-quality meal, but I sure as shit can criticize one if it comes out poorly. Gaming today is no different.
Granted, it used to be different. The last bit of Estebon’s comment was:
I struggle to think of any other form of entertainment where the audience claims the right to meddle in the details of the creation process quite to the same extent, as opposed to just letting the product succeed or fail as a whole, in a binary way.
Back in the day, games were done. Cartridges were manufactured, CDs were pressed, and physical media was sent to stores. If there were still game-breaking bugs or exploits that got past QA, well, hopefully they weren’t bad enough to sink an entire $10+ million investment. Games in that era were more akin to traditional entertainment like movies or books in the sense that fans could only possibly influence future decisions. Once it was out, it was out.
As we are abundantly aware today though, games are now a service. Something like a Day 1 patch clocking in at 40 GB is not uncommon. No one expects to unwrap a PS4 on Christmas and immediately start playing anything. Moreover, game developers want us to know that development is an ongoing process. A game in maintenance mode is “dead,” and one which is no longer receiving updates is “abandoned.” We barely even have the language to describe a finished product anymore.
Gamer entitlement didn’t get us here. Game makers leveraging social media for free PR and turning “lip service” into a competitive advantage got us here.
Which is just as well, because I’m not especially convinced anyone knows what they are doing. Did Notch know he created a $2 billion game when he released Minecraft? The original dev team for WoW certainly didn’t know they would have 8 million subscriptions by the end of 2008, nor have they been able to do much to stem the bleeding over the last decade. We can’t attribute all of this to corporate malice, because that doesn’t explain why these rockstar developers can’t recapture lightning in a bottle when they move elsewhere.
If you can’t reproduce results, what does that say about your expert game development science?
I think the important thing is to not put game developers on a pedestal. They aren’t scientists (anymore) doing peer-reviewed studies changing the way we understand the world. They’re just people who have eaten food before and think they could come up with something better. Occasionally they do, and even more occasionally they do it on purpose. But can they do it again?
Posted on December 17, 2018, in Commentary, Philosophy and tagged Developers, Entitlement, Game Design, Ghostcrawler, M. Night Shamamamalan, Notch.
Software development is almost more art than science most days out of the week, so it is easy for me to accept that programming what is essentially an entertainment medium trends far more in that direction.
Customer feedback is excellent for refining an idea. That doesn’t mean you should fix everything people complain about, but if people complain about something you should probably examine what it means to the game. The answer can be “no,” and probably should be a lot of the time. Customers often complain about a symptom rather than an actual issue, and sometimes it is a symptom of them actually wanting some other game not an actual problem with the game in question.
But customer feedback for big ideas, for the meta theme of a game… that isn’t so reliable. We’ve been through the era where everybody wanted more WoW… usually via the roundabout method of complaining about how every new MMORPG wasn’t WoW… and we all saw how that played out.
Yeah, I think the term “Game *developer*” is the misleading term if you try to reference software developers in general. The general populace thinks “Game developer = someone who works on a game” and if it’s not, say, a concept artist or QA person, then all other roles seem to be smushed together as “developer”.
It’s a lot more conflated than in other fields – sure, there are people who could quit somewhere else today and work in games tomorrow – but it’s not a given. I surely shouldn’t start trying to develop an indie game; it requires an overlap of skills I don’t have. The bigger the studio, the more niches I could easily fill, I guess – just keep me away from anything artsy :P
Glad to see the comment got you writing. I thought the ‘will warrant its own post’ bit was just a gentle way of pointing out how wordy I got. And I like the chef analogy, not least because if you really wanted to take an axe to my argument, you’d have gone with vignerons. That’s where the real bullshit hides.
It is true that many chefs are not formally trained, but by the time they’ve worked their way up the stages to head chef, they’ve certainly received an education. A Cordon Bleu imprimatur may shave years off that journey, but the end result is expertise. As you acknowledge, something like drowning a steak in salt is not a mistake a professional chef would make; most would be horrified and grateful if you pointed it out. It’s almost the equivalent of a game’s executable not launching at all. On the other hand, to borrow an old trope, is the customer who demands their Wagyu steak (or, perish the thought, venison) be ‘well-done’ actually right? Or ignorant and doing a disservice to their taste buds?
I agree that the question turns on the validity of what you called my tautology, i.e. the developers are experts because they’re there, past the barriers to entry, doing the thing. Maybe those barriers aren’t as lofty as I suggested, but unless you really believe that success is stochastic and having worked on a project for years gives you no additional insight into it, I think that statement continues to hold. Sure, devs may come from a variety of backgrounds and sometimes rise from obscurity within the team, or unexpectedly impress investors or hiring managers… and yes, sometimes just get purely lucky… but by the time the product has launched and they’re out there enduring the attitudes of their customers, they’re pretty well informed about game development, especially the particular game they’re developing. Even those who eventually fail tend to gain a lot of experience and knowledge. The batting averages, outside of reliably churning out FIFAs, CoDs and Candy Crushes, may not be great, but you could just as easily conclude that it’s simply a tough industry as opposed to no one really knows what they’re doing.
I don’t think this is hero worship, really. Just a certain amount of respect for having earned one’s chops.
Now, the bit that gives me, in turn, the ‘fundamental disagreement’ willies:
That’s not a premise I’m willing to grant and, in my mind, the two sentences contradict each other. A problem that only impacts a single individual is not really a problem identified. It’s a useless datapoint, and the customer base as a whole is a never-ending scattergun of such problems. And none of us should meander about our lives, gaming or otherwise, solely as vectors of our own self-interest, wilfully blind to the bigger picture.
In fact, that’s the set of goalposts here, as laid out by the original MMOBro piece: we’re talking about the customer who wants to pwn harder or experience a happier ending, or whatever it is, and blames the devs instead of considering him or herself as the major factor. More than that, and more than merely voicing an opinion, this customer feels entitled to have the grievance addressed, and will typically rage about betrayal until it is. No regard to unintended consequences of the change, or the interests of other constituencies within the user base. Such feedback is obnoxious and useless.
Compounding this uselessness is the fact that the Shyamalan effect works both ways, kind of. Even well-behaved gamers often don rose-coloured nostalgia glasses, or other forms of warping eye-wear, and try various ways to articulate to a studio what is basically an impossible demand, to recapture a lost feeling of mind-blown from the past, be it SW:Galaxies or Morrowind or Deus Ex. The most common example, probably, of passionately identifying a problem that does not exist in the first place.
As I mentioned in the original comment, there are structured ways of consulting, or simulating, the base. Since you brought up Greg Street as an example, he mentioned on his blog that playtesting teams were his innovation and strategic staple in all three game franchises he’s worked on. EVE Online’s CSM is an excellent idea in theory, less so in practice for reasons beyond our scope. Some allowance should definitely also be made for outsiders who have studied the game, or games in general, beyond what the ordinary whinging gamer can claim, and at least try to be objective and comprehensive. Your major theorycrafters, for example. An experienced peer. The game journalist or blogger who is at least somewhat accountable to their reputation if nothing else. But I’m sorry, Azuriel, not the free-range berks.
Last thing, before the length of this starts making me look entitled to a megaphone, myself: absolutely agree that game companies have encouraged all this and are complicit in fostering what I will continue to think of as gamer entitlement. However, once the race to the bottom started, what choice did they have?
I’m partly torn. On the one hand, I have an appreciation of elegant (game) design, when systems seem to come together to reinforce each other naturally, when things simply work. These are aspects of game design that can exist independent of an audience that can appreciate the nuance and the art. Someone who can (consistently!) create such systems is indeed an expert in their field, and I applaud them.
In all other cases, what meaning is there in considering someone an “expert” of a commercial failure? Or one that cannot consistently repeat their past success?
Or consider the case in which a game succeeds in spite of the ministrations of a bull-headed developer. Was vanilla WoW made better by fewer graveyard locations, forcing instanced dungeon wipes to march across entire zones to get back to where they were? Was that intended? Negligence? Technical limitations? There are aspects that clearly (or perhaps not so clearly) had no positive contribution to a game’s overall success, that could have been improved or changed, and that a more expert game developer would hypothetically have fixed.
It is important to me, philosophically, that no authority is beyond reproach. For one, it’s a bit of an existential threat – there wouldn’t be a lot of posts on the blog if I were not able to critique games. For another, are we ready to state that a given experience is impossible to improve further? I would much rather inhabit a culture of refinement and iteration than one of unquestioned experts who hand down clay tablets to the masses in their beneficence.
Having said that, let me be clear: game developers have no moral obligation to listen to anyone.
It is fully right and proper that a consumer do everything in their power to try and influence a producer to improve (to them) a product. It is also entirely within reason for said producer to ignore the request. Or address it, or eviscerate it on Twitter, or let it hang in the wind for all to see and judge its silliness on its own merits. Do some people “expect” developers to bend to their every whim? Sure. Calling said people “entitled” might describe the sentiment, but the phrasing itself is corrosive and splashes everywhere. More importantly, labeling those people in the first place is irrelevant because they are not the ones making any decisions regarding the outcome. If their complaints and rage are ultimately ineffectual, why are we talking about them at all? Are we scoffing at their audacity to criticize their betters?
I suppose there is a legitimate fear the “race to the bottom” is going to contain a curve that includes democratic game design. While it may be a horrifying prospect, I do believe that it will largely be self-correcting. Supercell tried this out already, for example, and held a poll to decide which of their Clash Royale cards would be fixed in a particular month. It was precisely as dumb as it sounds, at every level.
Regardless, all of the noise and static is worth it to me if we arrive at a place in which one well-reasoned argument can sway the design direction of a game for the better (for given amounts of “better”). A world in which blind deference to “experts” is accepted practice is essentially one in which ad hominem attacks successfully counter rational thinking. It’s important to me that, however unlikely, it is not impossible for a free-range berk to have a point, and have that point acknowledged.
I think that “long walk from graveyards” was actually a PvP consideration.
People did hunt the other faction at dungeon entrances on PvP realms, and an instant res nearby could skew fight dynamics a lot – and even on PvE realms, “flagging” rules sometimes meant entire raids could get flagged from helping one PvP-enabled member (who could come to the raid straight from a battleground/arena) – and then fight raid-to-raid right at the entrance.
That became less relevant as the game matured and moved away from Alliance vs Horde fights – perhaps because other designers weren’t so keen on considering PvP aspects while building the world, or perhaps because this particular angle was no longer seen as the game’s “selling point”.
And then some dungeons were actually on enemy territory, which meant instant flagging for the other side (and level 60 Horde camping Scarlet Monastery to kill lowbie Alliance)…
I was just about to post on this very topic, following on from the “release” of Atlas and a particularly egregious error in GW2’s WvW, and then I came here and realized you’d already covered most of what I was going to address. I might still post but this is a really excellent discussion and I’m not sure there’s much I can add.
I can see merit on both sides but on balance I very much favor Azuriel’s position. There’s a huge difference between being experienced at something, which developers who have worked in gaming for years self-evidently are, and being “expert” at it. Expertise is not an automatic consequence of experience. I’m sure we’ve all worked with people who’ve done the same job for years and still are neither very good at it nor very knowledgeable about what they do. Organizations foster ignorance and incompetence just as much as they foster excellence and expertise, and the bigger the organization, the easier it is to hide.
As for “none of us should meander about our lives, gaming or otherwise, solely as vectors of our own self-interest,” well of course we should! Vectors of our own self-interest is precisely what human beings are. Anything that happens is due to self-interest. Even altruism is self-interest in that people are altruistic because of the positive way it makes them feel about themselves – or at least the mitigating effects altruism has on how bad they feel when they are not altruistic.
In gaming terms, if a player is unhappy with a process or an outcome, expressing that dissatisfaction, driven by self-interest as it must be, is also a social act. By raising the issue in a public forum – be that the official website, reddit, twitter, a blog, a youtube video or a comment – said player is doing more than opening a discussion, valuable enough in itself though that would be. They are creating a doorway through which others can pass. It’s that much easier to acknowledge a problem when you know others share it.
This is not “entitlement”; it’s discourse. Terms like “entitlement” exist to close down conversation. They are never justified and usually applied with wilful intent to obfuscate the argument and drive away the arguer. Developers don’t need to respond but they do need to be aware and any attempt, even with good intent, to mute the feedback, however ill-expressed, of their customers will not ultimately enhance their ability to improve the services or products they seek to sell.
All of which doesn’t excuse impoliteness, far less abuse. Discourse needs to be civil. If civility is achieved, however, it doesn’t need to be moderated or controlled. Invalid arguments will burn themselves out. Valid ones will find a listening ear. Or so we hope. It’s certainly better than the alternative, which is silence.