5:54 pm Jun. 20, 2012
At the E3 video game expo in Los Angeles earlier this month, a crowd of gamers cheered on a camo-clad tough guy as he went room to room beating people to death and setting them on fire.
It was typical third-person-shooter action, except that the realism was stunning. A nearsighted person without glasses could have mistaken it for a movie scene.
I wasn’t at the event in person. I just watched the game demo play on YouTube, with the “live” audience response mixed in. But the crowd seemed to be thinking the same thing.
Just when it seemed a brawnier guy with a two-by-four would overpower our hero, a petite female sidekick stabbed the assailant in the back. The hero kicked his attacker to the ground and leveled a shotgun at his face. “No, no, NO!” the brawny guy shrieked, a split second before the hero unloaded. After the man’s jaw shattered in a spray of blood and bone—smash cut—the title THE LAST OF US, white-on-black, filled the screen. The crowd roared.
The video game industry is currently in a war that the movie industry fought and decided last decade. It’s a struggle between loud, assaultive, photorealistic game design that rewards wispy attention spans while demanding minimal problem-solving skills of its players and … games where shotguns to the face and chainsaws to the jugular are not so essential.
The American film industry settled on high-resolution ultraviolence as the default multiplex experience sometime after 9/11 and sometime before its superheroic screen response, The Dark Knight. The violence is not necessarily a matter of content but of the graceless way shots jam up against one another now, keeping us invested through a constant state of agitation where narrative suspense used to do the trick.
During that decade, many viewers retreated from mainstream blockbuster cinema into the bosom of what critics call a television renaissance. So many smart, adult, spellbinding, hilarious TV shows, the story goes. Any stragglers still hoping for an immersive experience at the multiplex were suckers and masochists.
Meanwhile, games for the major home entertainment consoles, Xbox 360, PlayStation 3 and the upcoming Nintendo Wii U, went to HD resolution, after the fashion of home video, and employed increasingly cinematic techniques. The cut scenes that punctuate game play began to weave in plots as intricate as any Hollywood screenplay. And the resurgence of 3-D at the movies was concurrent with new 3-D gaming technologies.
2009’s Call of Duty: Modern Warfare 2, the second highest-selling video game of all time in the U.S., was in many ways the Dark Knight of gaming. It solidified an ongoing franchise while exploring themes of terrorism and reprisal from the point of view of American protectors working Dick Cheney’s “dark side” to produce results. As Seth Schiesel wrote in a New York Times review of the game, “Basically, the player, in the guise of an American commando, can participate in a massacre of unarmed civilians. ‘It will cost you a piece of yourself,’ your commander says of the mission. ‘It will cost nothing compared to everything you’ll save.’”
A slew of movies and TV shows like "24" and "Alias" worked that dark side of the street all last decade, alternating between discussions about torture and civilian casualties as unfortunate necessities in a new kind of war, and scenes of the brutality in question. But a fidgeting camera and disjunctive editing became the standard for representing both action and the most casual dialogue. Panic and disorientation were built into every frame.
Games didn’t want to be left out. The Call of Duty: Modern Warfare titles established a standard of documentary war-zone faux realism and moral handwringing that dozens of other games soon emulated.
In this climate, game developers who seek to captivate the player-viewer through more sensual and less violent means are odd men out. Game reviewers almost unanimously praised two grisly action games as highlights of the E3 Expo, the aforementioned The Last of Us and yet another spin on espionage, hacking and surveillance, Watch Dogs. (In Watch Dogs, the hero wields technology that identifies any passerby’s vital information, from date of birth to HIV status.)
The odd men out tend to be Japanese game developers. Japan, formerly the epicenter of world game culture, is generally perceived as lagging in appeal behind blockbuster North American and European games, with their emphasis on shock and awe. In Japan, certain game creators have attained auteur status largely through works that encourage the players' creativity rather than their fight-or-flight response.
Shigeru Miyamoto, the legendary creator of Nintendo’s Super Mario, Donkey Kong and The Legend of Zelda, provided a striking contrast to other corporate reps straining to come off as hip and friendly gamers (rather than anxious middle-aged shareholders) as they forecast world domination for their products. Boyish and beaming energetically at 60, Miyamoto described what sets his company apart in the Call of Duty era: “At a show like this, it’s my job to show we’re all having fun. People come to E3 and they want to talk about competition and who won the show, and all these companies combating one another. But what we’re meant to be doing is bringing fun to the world. So rather than focusing on competition, I feel it’s my job to go up on stage and show how I can bring fun to the world by having fun myself.”
Miyamoto’s classic games aim for the simple problem-solving pleasures and sensory delights of childhood. A combination business titan and eternal boy wonder, like the late Jim Henson, comes to mind.
Wonderment and exploration also characterize a designer whose work was curiously absent from E3, Fumito Ueda (Ico, Shadow of the Colossus). A 2009 teaser for his long-gestating project The Last Guardian is a breathtaking fantasy, with elegant visual choreography worthy of Steven Spielberg and animator Hayao Miyazaki. A set of claws descends out of the shadows into a pool of light on what looks like a dungeon floor. Some kind of dragon-hyena winged beast chases a little boy down a castle corridor until the child reaches a precipice. Nowhere to run. The beast follows him out into the blaring sunlight, nearly tripping over itself to avoid falling into the chasm. Instead of devouring the boy, the creature plays around with him like a devoted puppy. A montage synced to a musical cue from Carter Burwell’s lovely Miller’s Crossing film score advertises the game as some kind of friendship odyssey in the spirit of Old Yeller-through-War Horse. But the grace of its execution, told in clean, sweeping gestures that flow according to very subtle emotions (rather than jumbled fragments that merely goose the viewer), betrays the humanist, poetic influence of French animator Paul Grimault upon Japanese animators and game designers.
So it's with a great deal of unintended irony that film critics use the phrase “just like a video game” as a putdown for CGI-choked superhero movies. The standard retort from gamers is to accuse such critics of never having actually played a video game at any length. A more relevant accusation is that many of those Westerners who persist in belittling games as mere "games" are responding to the Call of Duty: Modern Warfare culture of loud and homicidal pageantry.
This isn’t the nerds versus the jocks. This is the killers versus the poets.