How video game difficulty became a cultural battleground
Hard truths.
In the 1990s a group of Japanese video game designers were faced with a curious problem. Most games at the time came with three difficulty options, escalating in arduousness from "Easy" through "Normal" up to "Hard." In this way, a player could match the game's challenge to their skill and the potential audience for the game broadened from the talented to the talentless, and all of us who muddle away betwixt. The shoot 'em up designers at Toaplan, Cave and Psikyo, however, wanted to work with a finer, wider scale. Their games began to come in six or more shades of difficulty. The problem: what to call these new modes?
So began a lively if short-lived literary tradition in which designers would compete to find the funniest and, often, most disparaging terms for the lightest difficulties. DonPachi's designers opted for the cutesy "Little Easy" for their game's most accommodating set-up, while Battle Garegga's crack game-makers offered the dignified "Training" for theirs. Psikyo was far crueller. Gunbird 2's options descend, in hurtful steps, from "Easy" to "Very Easy" to "Child" to "Baby." Strikers 1945, the World War II-themed shooter recently re-released for the Switch, is even coarser: its easiest difficulty is named, abusively, "Monkey".
These terms are loaded with a witty scorn that obscures their raison d'être. The range of difficulties was not principally included to flatter or shame players, but to give arcade operators options that could be tweaked in order to maximise their profits in the wild. With arcade games, as the novelist David Mitchell once wrote, you pay to delay the inevitable. In other words: failure is certain. But an arcade game that is too challenging produces players who feel short-changed and resentful. A spread of secret difficulty levels enables an arcade operator to calibrate a game's challenge behind the scenes, and, having monitored the effects on his public, maximise profits. For this reason every Neo Geo game comes with no fewer than eight difficulty levels.
Video game difficulty was, then, a commercial elaboration, not an artistic one. For many developers, it was a requirement that distracted from their ideal vision for their game. After all, it is not difficult to make a difficult game. You simply weight the numbers (how quickly enemy bullets travel through time and space; how little health the player's avatar is given; how much damage is inflicted by a fireball) and stack the odds against the player. The much harder task is to create a perfectly calibrated piece of work, one that is, to lean on this medium's pet cliché, easy to learn but hard to master.
Yet, the terminology used to describe these difficulty options had already forged a firm link in players' minds between challenge and pride. Games that did not present much impediment, induce much perplexity, or require much perseverance were seen as somehow lesser works, made, in Psikyo's language, for babies or monkeys. Difficulty was fast becoming a term that could be used to exclude, to erect border walls.
More recently the imprecise term "video game" has come to house a far broader church of styles, modes and, for the designers behind them, artistic intents. Dear Esther, progenitor of the disparagingly termed "walking simulator", simply asks players to wander an exquisite island while listening to snippets of spoken word poetry. It essentially strips out the challenge of a Call of Duty, or an Ocarina of Time, keeping only the parts of those games that encourage meandering and intrigue.
Others followed, each weirder and more specific than the last. 2007's Coolest Girl in School, for example, presents its player with the dilemma of how to make it through the day if you end up with a period stain on your skirt. 2011's The Cat and the Coup puts us in the paws of the pet cat of Dr. Mohammed Mossadegh, Iran's first democratically elected Prime Minister. Coming Out Simulator 2014 is a simple, autobiographical game about a young man's experiences trying to tell his parents about his sexual orientation.
This broadening of the definition in no way detracted from the sports-like tradition of the arcade games, which typically seek to find the fastest, the strongest, the most skilful players and report the results via the billboard glory of the high score table. Nevertheless, some felt that their personal idea of what a video game should be (some kind of elaborately engineered impediment that demands practice and pain in order to sort the men from the monkeys) was under threat. Moreover, the video games that neglected challenge in search of other artistic effects were principally being made by new kinds of artists, often without computer science degrees, who, thanks to democratising tools like Flash, Unity and GameMaker, were able to express hyper-specific ideas and interests without the anachronistic commercial hang-ups of the arcades.
All the while, difficulty in games was becoming a kind of shibboleth: challenging games were made by experts for 'true gamers'; non-challenging games were made by amateurs for who-knows-whom. The border walls were crumbling. Action prompted reaction. Online movements soon sprang up with a Trumpian cry to put them up again.
Video game reviewers, that most simultaneously scorned and, by a few naïve youngsters at least, envied group, have been caught in the crossfire. Those writers who, through the advent of video, have revealed their ineptitude at challenging games on camera have faced ridicule, calls for resignation, and, in the most extreme cases, harassment. Their critics argue that reviewers should be, not insightful thinkers, but principally brilliant players. It's not an entirely unreasonable demand: a book reviewer who is unable to make it to the end of all but the most simply written book is clearly in the wrong job. But the movement against some game reviewers based on their perceived lack of skill has become a proxy war staged by those who want critics to play the role of guardians of a particular tradition, rather than interrogators of a richly evolving medium.
This battle is founded on a misunderstanding not only of video game history, but also of the role of the critic in the jostle and dance of a maturing form. John Updike, the late American novelist and book critic, once laid out his rules for constructive criticism. It's an essential list that is relevant to all kinds of artistic endeavour. The first of these rules is applicable not only to critics who want to be better, but also to players who want to be better: "1. Try to understand what the author wished to do, and do not blame him for not achieving what he did not attempt." In other words, the creator of a bullet hell shooter should not be criticised for not making her game more accessible for those unable to tuck and weave through the rolling curtain of danger. Likewise, the creator of a ponderous game about death or flowers, bureaucracy or race hate should not be criticised for not making his game an arena in which players can demonstrate their dexterity or quick-wittedness.
Updike's final comment echoes clearly for us today. "Do not imagine yourself a caretaker of any tradition, an enforcer of any party standards, a warrior in an ideological battle, or a corrections officer of any kind," he wrote. To riff on Sony's current advertising slogan: video games are for everyone, even if some video games are specifically for someone.