Are Video Games Evil?

Chris Suellentrop

Violent video games teach our kids to point and shoot, say their critics. The truth may be every bit as frightening to members of a generation raised to believe they’re thinking outside the box.

On a Monday evening last fall, in the Crystal Gateway Marriott a few blocks from the Pentagon, a group of academics, journalists, and software developers gathered to play with the U.S. military’s newest toys. In one corner of the hotel’s ballroom, two men climbed into something resembling a jeep. One clutched a pistol and positioned himself behind the steering wheel, while the other manned the vehicle’s turret. In front of them, a huge, three-paneled television displayed moving images of an urban combat zone. Nearby, another man shot invisible infrared beams from his rifle at a video-screen target. In the middle of the room a player knelt, lifted a large, bazooka-like device to his shoulder, and began launching imaginary antitank missiles.

The reception was hosted by the Army Game Project, best known for creating America’s Army, the official video game of the U.S. Army, and was intended to demonstrate how the military’s use of video games has changed in just a few years. America’s Army was released in 2002 as a recruiting tool, the video-game version of those “Be All You Can Be” (now “An Army of One”) television ads. But the game has evolved beyond mere propaganda for the PlayStation crowd into a training platform for the modern soldier.

If you have absorbed the familiar critique of video games as a mindless, dehumanizing pastime for a nihilistic Columbine generation, the affinity between gaming and soldiering may seem nightmarishly logical: Of course the military wants to condition its recruits on these Skinner boxes, as foreshadowed by science fiction produced when video games were little more than fuzzy blips on the American screen. The film The Last Starfighter (1984) and the novel Ender’s Game (1985) depict futuristic militaries that use video games to train and track the progress of unknowing children, with the objective of creating a pool of recruits. (The code name for America’s Army when it was in development was “Operation Star Fighter,” an homage to its cinematic predecessor.)

Some members of today’s military do view video games as a means of honing fighting skills. The director of the technology division at Quantico Marine Base told The Washington Post last year that today’s young recruits, the majority of whom are experienced video-game players, “probably feel less inhibited, down in their primal level, pointing their weapons at somebody.” In the same article, a retired Marine colonel speculated that the gaming generation has been conditioned to be militaristic: “Remember the days of the old Sparta, when everything they did was towards war?” The experiences of some soldiers seem to bear out his words. A combat engineer interviewed by the Post compared his tour in Iraq to Halo, a popular video game that simulates the point of view of a futuristic soldier battling an alien army.

To view video games merely as mock battlegrounds, however, is to ignore the many pacific uses to which they are being put. The U.S. military itself is developing games that “train soldiers, in effect, how not to shoot,” according to a New York Times Magazine article of a few years ago. Rather than use video games to turn out mindless killers, the armed forces are fashioning games that impart specific skills, such as parachuting and critical thinking. Even games such as those displayed at the Marriott that teach weapons handling don’t reward indiscriminate slaughter, the shoot-first-ask-questions-later bluster that hardcore gamers deride as “button mashing.” Players of America’s Army participate in small units with other players connected via the Internet to foster teamwork and leadership.

Nor is the U.S. military alone in recognizing the training potential of video games. The Army’s display was only one exhibit at the Serious Games Summit, “serious” being the industry’s label for those games that are created to do more than entertain. Games have been devised to train emergency first-responders, to recreate ancient civilizations, to promote world peace. The Swedish Defense College has developed a game to teach UN peacekeepers how to interact with and pacify civilian populations without killing them. Food Force, an America’s Army imitator, educates players about how the United Nations World Food Program fights global hunger. A group of Carnegie Mellon University students, among them a former Israeli intelligence officer, is developing PeaceMaker, a game in which players take the role of either the Israeli prime minister or the Palestinian president and work within political constraints toward a two-state solution to the Israeli-Palestinian conflict.

The very phrase “serious games,” however, suggests that unserious games may well be the societal blight that many believe them to be. It’s easier to vilify games such as those in the Grand Theft Auto series, in which the player’s goal is to rise to power in various criminal organizations by carjacking vehicles and killing their owners with a variety of weapons—a baseball bat, a Molotov cocktail, an AK-47. But Grand Theft Auto and its sequels are popular not just because of their transgressive content, but also because they are designed to allow players to roam freely across a gigantic three-dimensional cityscape. (With their combination of technical accomplishment and controversial subject matter, the Grand Theft Auto titles might be the video-game analogues of movies such as Bonnie and Clyde or, more recently, Pulp Fiction.)

As far back as 1982, when video games consisted of simple fare like Space Invaders—a two-dimensional arcade game—a rabbi warned on The MacNeil/Lehrer NewsHour about their dehumanizing effects: “When children spend hours in front of a screen playing some of these games that are inherently violent, they will tend to look at people as they look at these little blips on the screen that must be zapped—that must be killed before they are killed. And it is my concern that 10, 20 years down the line we’re going to see a group of children who then become adults who don’t view people as human beings, but rather view them as other blips to be destroyed—as things.”

The rabbi articulated an objection that has been heard repeatedly as video games have grown from a pastime for awkward, outdoors-fearing children into a form of mass entertainment enjoyed mostly by adults. Last year, Americans spent a total of $7 billion on almost 230 million computer and video games, according to the Entertainment Software Association, an industry group. Both of those numbers—sales revenues and units sold—have roughly tripled over the past 10 years. Defining who is a “gamer” can be tricky, as the definition can include everyone who has played Minesweeper on a personal computer or who kills time at the office with computer mahjong, but studies conducted by the ESA and others estimate that roughly half of all Americans play computer and video games. According to a study released in May by the ESA, the average American gamer is 33 years old. A full quarter of gamers are over 50, while only 31 percent are younger than 18. Playing video games is still a predominantly male pastime, but almost 40 percent of gamers are women; more adult women play video games than do boys 17 and under.

Those who assume that video-game players are a bloodthirsty lot might be surprised to learn that of last year’s 10 best-selling games for the PlayStation and Xbox consoles, not one was a shoot-’em-up. Six of the most popular games were sports titles—including Madden NFL, a cultural juggernaut among athletes and young men—and the other four were Star Wars games. The bestselling PC game last year was World of Warcraft, a multiplayer swords-and-sorcery game that millions of subscribers pay a monthly fee to play. World of Warcraft is the latest and most popular in the genre of massively multiplayer online role-playing games, commonly called “virtual worlds.” In these games, thousands of players can interact with each other by connecting simultaneously over the Internet. (There’s a debate among specialists whether some of these worlds, such as Second Life, which offers its “residents” no competitions or quests, even qualify as games.)

Despite their popularity, video games remain, in the opinion of many (particularly those who don’t play them), brainless or, worse, brain-destroying candy. But for as long as critics have decried video games as the latest permutation in a long line of nefarious, dehumanizing technologies, others have offered a competing, more optimistic vision of their role in shaping American society. Opposite the rabbi on that MacNeil/Lehrer broadcast a quarter-century ago was Paul Trachtman, an editor for Smithsonian magazine, who argued that video games provide a form of mental exercise. Ignore the dubious content, the “surface or the imagery or the story line,” he suggested, and you will see that games teach not merely how best to go about “zapping a ship or a monster.” Underneath the juvenilia is “a test of your facility for understanding the logic design that the programmer wrote into the game.” Games, in short, are teachers. And electronic games are uniquely suited to training individuals how to navigate our modern information society.

As the gaming generation has matured, it has advanced this idea with increasing vigor. Last year, Steven Johnson published Everything Bad Is Good for You: How Today’s Popular Culture Is Actually Making Us Smarter, which included a brief for an idea that has been gaining currency among academics and game developers: All video games, even the ones that allow you to kill prostitutes, are a form of education, or at least edutainment. Games can do more than make you a better soldier, or improve your hand-eye coordination or your spatial orientation skills. They can make you more intelligent.

On one level, this argument isn’t very surprising. Games of all kinds are a part of almost every human society, and they have long been used to inculcate the next generation with desirable virtues and skills. We enroll our kids in Little League not only so they will have a good time, but also to teach them about sportsmanship, teamwork, and the importance of practice and hard work. The Dutch historian Johan Huizinga argued in Homo Ludens, his 1938 ur-text of game studies, that the concept of “play” should be considered a “third function” for humanity, one that is “just as important as reasoning and making.”

In the case of video games, even their critics acknowledge that they are instructing our children. The critics just don’t like the form and the sometimes violent and sexually explicit content of the instruction, which they believe teaches children aggressive behaviors. Yet if such games are nothing more than “murder simulators,” as one critic has called them, why is it—as gaming enthusiasts never tire of pointing out—that the murder rate has declined in recent years, when there are more video games, and more violent ones, than ever? Why do IQ scores continue their slight but perceptible rise if an entire generation of children, the oldest of whom are now in their thirties—a cohort to which I belong—stunted its development with electronic pap? The important thing to find out about video games isn’t whether they are teachers. “The question is,” as game designer Raph Koster writes in A Theory of Fun for Game Design (2004), “what do they teach?”

The generally uncredited grandfather of video games was William A. Higinbotham, who, while working as a government physicist, invented a game of electronic Ping-Pong and displayed it during a visitors’ day for the Brookhaven National Laboratory on Long Island in October 1958. By the next year, the game had been dismantled because its computer and oscilloscope components were needed for other jobs. Higinbotham’s game might have been forgotten—except by readers of the Brookhaven Bulletin, which published a 1981 story speculating that he had invented the first video game—were it not for the fact that one of the lab’s visitors that day was high school student David Ahl, who would write the 1978 book Basic Computer Games and become the editor of Creative Computing. From the pages of this magazine for computer hobbyists, Ahl proclaimed Higinbotham the grandfather of the phenomenon in 1982.

The more influential and more commonly acknowledged grandfather was Steve Russell. As a Massachusetts Institute of Technology student in 1961, Russell created a rocket-ship duel called Spacewar! that could be played on one of MIT’s handful of computers, the PDP-1. Then, in the same way that Microsoft packages its Windows operating system with solitaire and other games, Digital Equipment Corporation, the manufacturer of the PDP-1, began shipping it with the game preloaded in memory, influencing computer science students around the country.

In 1972, Magnavox introduced the Odyssey, the first home console for video gaming, which, like Higinbotham’s game, was an adaptation of Ping-Pong. (For whatever reason, table tennis was the game of choice for early video-game creators.) The next 30 years saw the introduction of Atari, Nintendo, Sony’s PlayStation, and Microsoft’s Xbox, not to mention the many games designed for the growing numbers of personal computers. Higinbotham’s black-and-white blips have, over the past half-century, morphed into sophisticated displays of computer animation that increasingly resemble films, with original scripts, music, and often-breathtaking visual beauty. The King Kong video game released last year to coincide with Peter Jackson’s film remake featured an arresting parade of apatosauruses marching through a valley on Kong’s home of Skull Island. The sequence was so gorgeous that I set down my controller and just marveled at it for a while.

As was true of games before the digital age, there’s a remarkable array of video games. Chess and bowling aren’t very similar, but we intuitively understand that both are games, if different species of the genus. Likewise, video games encompass everything from simple online puzzles to simulated football games and professional wrestling matches to the “God game,” in which the player adopts an omniscient view to influence the development of entire societies. In The Sims, the best-selling PC game of all time, players control the lives of individual humans as they go about their mundane lives. (It may sound unappealing, but The Sims comes from a long tradition. It is, in effect, another way to play house.) New genres frequently emerge. A “music” genre has arisen in response to the popularity of Dance Dance Revolution, a game in which players must move their feet in time to music on different areas of a dance pad. It’s basically a fast-moving, musical, single-player version of Twister.

Exactly what is new about video games, other than their electronic nature, can be difficult to pin down. In the 21st century, almost all children’s toys have an electronic component, but that doesn’t make them all video games. In The Ultimate History of Video Games (2001), game journalist Steven Kent cites pinball as a mechanical ancestor of today’s digital games. Pinball created a panic in some quarters—no pun intended—as a new and dangerous influence on society. Foreshadowing the antics of today’s antigaming politicians was New York mayor Fiorello La Guardia, who smashed pinball machines with a sledgehammer and banned them from his city in the early 1940s, a prohibition that was not lifted until the 1970s. (To be fair to La Guardia, governments have long perceived societal threats from new games. In the 1400s Scotland banned golf, now its proud national pastime, because too many young men were neglecting archery to practice their swings.)

Nowadays you can play pinball on your PC, as every Windows XP machine comes packaged with a video-game version. The difference between this digital pinball and its mechanical predecessor is, at root, aesthetic. The rules of the game are the same, just as the rules and gameplay of computer solitaire and chess are identical to those of their analog forebears. (Beyond the translation of playing cards and chess pieces into pixels, there are some key differences, of course. For one thing, the computer doesn’t let you cheat—or, in pinball, “tilt.”) Jesper Juul, a Danish video-game theorist, defines games such as pinball, solitaire, and chess as “emergence” games, by which he means that the gameplay emerges from a relatively simple set of rules. Football and basketball—whether played online or off—are also emergence games, as are backgammon, Othello, and board games such as Risk and Monopoly. All those games can now be played using computers, but that doesn’t make them new, exactly.

The first game that diverged from this 5,000-year-old emergence model was a 1976 computer game called Adventure, which combined elements of narrative with gameplay. Adventure was essentially an interactive text, somewhat similar to the books in the Choose Your Own Adventure series. While reading the story, the player typed in commands to tell the character what to do and to learn what happened next. Juul calls Adventure the first “progression” game, a new model that inspired most of today’s video games, from Grand Theft Auto to Halo.

Nongamers who watch their slack-jawed, twitchy-thumbed children and conclude that they are brain dead are making the mistake of observing the spectator rather than the game itself. Research has shown that playing video games can help people improve their ability to manipulate spatial information, and that as little as 10 hours of play can improve a person’s ability to process visual information. (These studies were approvingly cited by the deputy director of the Army Game Project last fall.) But focusing on how video games improve coordination and memory misses the point. In a recent issue of Wired, well-known game designer Will Wright compares this mistake to studying film by watching the audience rather than what’s on the screen: “You would conclude that movies induce lethargy and junk food binges. That may be true, but you’re missing the big picture.”

Wright proposes that video games teach “the essence of the scientific method,” that “through trial and error, players build a model of the underlying game.” To succeed, a player must establish a hypothesis about some aspect of the game, test it, and evaluate the results of the experiment. The organizer of a playground game explains the rules in advance, but a video game often hides its rules, revealing them only as the player figures out how to unlock the game’s secrets. And when that happens, a game player can experience an ecstatic Archimedes moment.

Perhaps most important of all, the game adapts itself to the player’s ability. “The secret of a video game as a teaching machine isn’t its immersive 3-D graphics, but its underlying architecture,” writes James Paul Gee, an education professor at the University of Wisconsin, Madison, and author of What Video Games Have to Teach Us About Learning and Literacy (2004). “Each level dances around the outer limits of the player’s abilities, seeking at every point to be hard enough to be just doable. In cognitive science, this is referred to as the regime of competence principle, which results in a feeling of simultaneous pleasure and frustration—a sensation as familiar to gamers as sore thumbs.” It is in that spirit that Atari founder Nolan Bushnell has said, in a statement that probably best distills the gamer ethos, “The way to have an interesting life is to stay on the steep part of the learning curve.”

Despite the omnipresence of video games—on our computers, our televisions, our phones, and now the back seats of our cars in handheld units—most people who don’t play them still fundamentally misunderstand them. Nongamers often assume that video games, like so many electronic media, are designed to deliver instant, electronic gratification. The opposite is the case, Johnson insists in Everything Bad Is Good for You. The best video games are brilliantly designed puzzles. The Grand Theft Auto titles can take as long as 60 hours to complete. Finishing them requires discipline, problem solving, decision making, and repeated trial and error.

In a recent New York Times column, David Brooks suggested that delayed gratification is the key to success in school, work, and life, and that it is a learned trait. If that’s true, and if the mental gymnasium of video games teaches delayed gratification, then gamers should be, on average, more successful than nongamers. No researcher has proffered that comprehensive a thesis yet, but the authors of Got Game: How the Gamer Generation Is Reshaping Business Forever suggest that gamers do come out ahead in the world of business. John C. Beck and Mitchell Wade surveyed 2,500 Americans, mostly business professionals, and came to the provocative conclusion that having played video games as a teenager explains the entire generation gap between those under 34 years of age and those older (the book was published in 2004, so presumably the benchmark is now 36).

Beck and Wade argue that the gamers somehow intuitively acquired traits that many more-senior managers took years to develop and that their nongaming contemporaries still lack. According to their survey, video game players are more likely than nongamers to consider themselves knowledgeable, even expert, in their fields. They are more likely to want pay for performance in the workplace rather than a flat scale. They are more likely to describe themselves as sociable. They’re mildly bossy. Among these traits, perhaps the most important is that gamers, who are well acquainted with the reset button, understand that repeated failure is the road to success.

The very purpose of every game is to become boring, as the player develops successful strategies to defeat it, the game designer Raph Koster observes. The best video games are designed to assist players in figuring out those strategies. The video games that are the most like the real world are often the least fun to play, because they don’t do a good job of communicating to the player what is important and what isn’t—which paths should be taken and which can be safely ignored, which items need to be collected and which can be safely left behind. But the real world doesn’t come with big blue arrows pointing toward the next door you need to open. The real world doesn’t always let you hit the reset button and start over. In the real world, there isn’t always a way to win.

As games become better at adapting to the talent and skill levels of their players, more video games will be decoding the players as much as players are decoding the games. “Soon games will start to build simple models of us, the players,” Wright predicts. “They will learn what we like to do, what we’re good at, what interests and challenges us. They will observe us. They will record the decisions we make, consider how we solve problems, and evaluate how skilled we are in various circumstances. Over time, these games will become able to modify themselves to better ‘fit’ each individual. They will adjust their difficulty on the fly, bring in new content, and create story lines. Much of this original material will be created by other players, and the system will move it to those it determines will enjoy it most.”

It feels preposterous and yet believable to suggest that the adaptive nature of video games might be one reason for the rise of the Organization Kid, a term coined by David Brooks when he visited with Princeton students for a 2001 story in The Atlantic Monthly. “They’re not trying to buck the system; they’re trying to climb it,” Brooks wrote of the respectful, deferential students he met. A Princeton sociology professor Brooks interviewed could have been describing ideal soldiers when he said of his students, “They’re eager to please, eager to jump through whatever hoops the faculty puts in front of them, eager to conform.” Brooks summarized the love-the-power worldview of the Organization Kid like this: “There is a fundamental order to the universe, and it works. If you play by its rules and defer to its requirements, you will lead a pretty fantastic life.” That’s a winner’s ideology: Follow orders, and you’ll be just fine.

Whether you find the content of video games inoffensive or grotesque, their structure teaches players that the best course of action is always to accept the system and work to succeed within it. “Games do not permit innovation,” Koster writes. “They present a pattern. Innovating out of a pattern is by definition outside the magic circle. You don’t get to change the physics of a game.” Nor, when a computer is the referee, do you get to challenge the rules or to argue about their merits. That isn’t to say that there aren’t ways to innovate from within the system. Gamers are famous for coming up with creative approaches to the problems a game presents. But devising a new, unexpected strategy to succeed under the existing rules isn’t the same thing as proposing new rules, new systems, new patterns.

Our video-game brains, trained on success machines, may be undergoing a Mr. Universe workout, one that leaves us stronger but less flexible. So don’t worry that video games are teaching us to be killers. Worry instead that they’re teaching us to salute.
