Why Video Games are Great…and Scary

Coming soon to your local video-game store: beautiful, brainless robot women that your 14-year-old son can train and manipulate in the family room. N.U.D.E.@ (Natural Ultimate Digital Experiment), developed by RED Entertainment and Microsoft, will launch in Japan in the first half of 2003. U.S. release may soon follow.

The previews are, uh, interesting. This, for instance, comes from a game description at www.xboxsolution.com:

The goal of [N.U.D.E.@] is to take a robot that starts with virtually no intellect and nurture her to a point where she can communicate with a rich range of emotions. Players use their microphones for voice input and nurture the robot through constant communication. At first, the robot will only repeat words, but through the repetition of words and phrases she will begin to speak independently, and will eventually express emotions (e.g., smile, get angry, get sad) and engage in conversation. In this title, players [develop] a cohabiting communication doll that they turn into their very own beautiful robot.

Ironically, N.U.D.E.@ has no nudity, and it's certainly one of the most arresting game concepts to come along in years, a tribute to the rapidly increasing power of computer-game artificial intelligence.

But for some, it sounds just a little too creepily like an interactive Stepford Wives. And as for those legions of under-30 males bathed in the luminescent glow of electrically charged hormones and blood-sport video games—well, judging by the Internet discussion boards, very few plan to raise their “very own beautiful robot” to be a philosophy major.

Entertainment software has traveled light years since 1962, when MIT students launched Spacewar, one of the first computer games, on a DEC PDP-1 the size of a couple of large refrigerators. If money talks, the computer and video-game industry now has both hands firmly on a bullhorn. Tim Gilmour, 29, a Los Angeles-based digital artist and Web designer who cut his creative teeth on Atari home computers in the 1980s, puts it this way: "Gaming is huge and getting bigger; it's at least as big culturally as TV or talking films in the 1930s."

And the numbers bear him out.

Computer and video-game software sold more than 221 million units in 2002, nearly two video games for every household in the United States, with $6.9 billion in revenues—an 8 percent increase from 2001. Toss in sales of game hardware and accessories, and annual revenues run well north of $9 billion. And that doesn't include an estimated $3 billion in revenues lost to game-software piracy (copying a program without buying it).

While games still fall shy of total movie industry revenues, which include box office, cable, and DVD and VHS sales, the gap is closing. Video-game system sales already outpace DVD player sales, three to two. In fact, Sony has shipped more than 50 million PlayStation 2 game systems since introducing the console in March 2000. And while competitors Microsoft (Xbox) and Nintendo (GameCube) enjoy much smaller market shares—they launched a year later than Sony—they're hardly teetering on bankruptcy. Xbox Live, the online gaming service Microsoft debuted last November, broke all company projections.

Indeed, about 60 percent of Americans age six and older, some 145 million people, play interactive games on a regular basis. That's a lot of gamers—which means that computer and video games are now serious business, and budgets and risks reflect it.

Mark Surfas, CEO of GameSpy, an online PC-gaming Web site that reaches 16 million customers a month, says, “Game-making is a lot like the movies. It’s a very risky business. You need the right product for the right market at the right moment. Making a movie is actually easier, because you have multiple ways to sell a film—at the theater, video sales, cable TV.” In contrast, computer and video games rise or fall by the shelf space they get in software stores, a ferociously competitive environment. So mediocre titles have short, unhappy lives.

“This is not an industry where you’ve got a couple of guys sitting around in a garage cooking up video games,” Surfas says. Ten or 15 years ago, maybe, and exceptions happen even today. But in 2003, development budgets typically run $3 million to $5 million per title. “And on a game like Lord of the Rings: The Two Towers, which had significant licensing costs and 130 people working on it for a year, the budget could easily be $15 million.”

Despite the uncertainty, the hugely lucrative nature of hit video games makes the risks worth it. Electronic Arts, publisher of popular sports titles like the football game Madden NFL 2003, reported a 48 percent revenue increase in its January 2003 quarterly report, along with $250 million in net profit.

Even the Army has choppered into the act. Last summer, it released America's Army—a game designed by the Modeling, Virtual Environments and Simulation Institute at the Naval Postgraduate School in California. The game doubles, not surprisingly, as a marketing tool. It targets avid male game players at exactly the right age to consider a military career.

And finally—a sure sign that something Officially Important is going on—video games have tripped the radar of mainline scholarship. Games now draw the serious gaze of Web sites like game-culture.com, game-research.com, digra.org, fragment.nl, and ludology.org. They inspire articles and research papers like “Leaping Into Cross-Gender Role-Play” and “Virtual Team Interactions in Networked Multimedia Games; Case: ‘Counter-Strike,’ Multi-player 3D Action Game.”

If, on some primordial level, Homo sapiens is also Homo ludens—“playing man,” as Johan Huizinga argued more than 60 years ago—and our instinct for play is one of the wellsprings of human culture, then computer and video games are a lot more than empty distractions. As MIT scholar Henry Jenkins says, “Computer games are art—a popular art, an emerging art, a largely unrecognized art, but art nonetheless.”

What Guys Really Do in the Basement

Art wasn’t on my mind, though, when I spent five nights in a row last year hand-converting—line by line in my basement home office—the computer code of 2,000 little files in a game called Black Thorn. What was on my mind, and the minds of a bunch of cyber-buddies from California to Germany, was getting this Windows-based software to play on my Macintosh computer, which I did. Sure, it occurred to me that I might be surfing a little too high on the foamy crest of obsession. But I had plenty of Internet company, and after all, my Dad had his workbench in the basement.

People love to play, and they love to play computer and video games for three main reasons: They’re fun, they’re challenging, and they often become a shared experience with friends and family. The industry’s Interactive Digital Software Association (IDSA) has the data to prove it.

Entertainment software covers a vast array of titles, from the simple computer card games secretaries play on their lunch break, to complex economic simulations, action games, military strategy programs, and role-playing games. Forty-three percent of game players are female. Forty percent of computer gamers are age 36 or older. And while violent or soft-porn titles get much of the media attention, 68 percent of all computer and video-game software is rated "E" for Everyone by the industry's Entertainment Software Rating Board.

Games create powerful fan communities, which is why people meet, marry, and break up over them. Massive online role-playing games like EverQuest and Ultima Online have hundreds of thousands of players globally, who together spend millions of hours living as imaginary characters in worlds that don't really exist. And they invest a lot more than their time. EverQuest's unreal world of Norrath generated $5 million in very real cash transactions last year as players sold each other virtual helmets ($137), battleaxes ($150), fortresses ($1,200) and even piles of extremely rare manure ($380) for use in the game.

The genius of the best games is that the players themselves co-create them. In EverQuest, Sony Corporation provides the playground, but the players create the story through their interplay. Action games like Half-Life have spawned a vast underground fan industry in which players create new maps, missions, weapons, and characters (called "mods," or modifications) to extend a game's enjoyment.

The key advantage that sets computer games apart from other entertainment media is control. Games are interactive in a way TV, films, and even theater can’t match. And that connects viscerally with the human need for power over our environment, competition, puzzle solving, adventure, and escape.

Play isn’t exclusive to humans. Games have always helped mammals learn about themselves, each other, and the world. But today’s video games get their special appeal from massive jumps in the quality of microchips, graphics cards, animation software, and artificial intelligence. Games are no longer just interactive—they’re lifelike and immersive.

Add high-speed Internet service and you have a network that allows 36 players from three different continents to join in the same tactical combat game (and chat while they’re doing it). The kind of cyber-living that goes on in EverQuest or Ultima Online comes straight out of the novels of William Gibson (Neuromancer) and Neal Stephenson (Snow Crash). A decade ago, it was science fiction. Now it’s plausible.

“Games are starting to affect people emotionally as story-driven experiences,” says Geoff Keighley, 24, who’s covered the video-game industry for Entertainment Weekly, Business 2.0, and other publications since he was 13 (“On the Internet, nobody knows you’re a teenager,” he jokes).

In the past, you couldn't make people cry with a game because the technology just wasn't good enough, he says. But that's changing fast. Games are now closing in on true cinematic quality, with lifelike faces, movements, and environments, "so a lot of creative people who otherwise might be great writers or artists are finding a home in the video-game industry, because it's a great way to bring together multiple skills," he says.

Keighley sees video games emerging as a new narrative form, a tool for creating and living stories in a fresh way. “Linear media just won’t be very interesting” in the future, he suspects. “There’s something compelling about being able to interact with a game, and that ‘something’ will draw more and more people to [entertainment software] in the years ahead.”

Through the Looking Glass

You can weigh the power of any new thing by the number and the intelligence of the people who worry about it. The video-game industry has a phalanx of very smart critics.

One of the toughest is David Grossman. A former paratroop officer and West Point psychology instructor who wrote On Killing: The Psychological Cost of Learning to Kill in War and Society (Little, Brown), Grossman drew national attention after the 1999 Columbine massacre for attacking the role of violent video games in shaping the minds of the killers.

Grossman doesn’t condemn all video games. In his book, he notes that many of the games “develop trial and error and systematic problem-solving skills, and they teach planning, mapping, and deferment of gratification.” He even suggests that video games are preferable to most television.

But he also writes that some games are superb at teaching violence, "violence packaged in the same [simulated] format that has more than quadrupled the firing rate of modern soldiers." In a post-Columbine interview with the online newsletter Skirmisher, he argued that the most violent video games serve the same purpose as flight simulators—in effect, they're killing simulators. "Violent games are particularly dangerous to children," he said, just as guns, tobacco, pornography, and drugs are. Sooner or later, he predicted, a jury will find the video-game industry criminally liable for its marketing.

In its December 19, 2002, “Video Game Report Card,” the National Institute on Media and the Family voiced many of the same concerns. The report argued that the potential harm from video games is much greater than previously understood. It especially worried that “the best-selling games of the past 12 months are not only ultra-violent but feature brutal violence” toward women.

It also named another growing problem: game addiction. In one study cited by the report, 20 percent of the children who took part were judged to be currently addicted to computer-based video games. In another study of about 4,000 EverQuest players, 62 percent said they were hooked on the game.

Whether game addiction really differs from the "television addiction" already enjoyed by so many American families remains unclear. And a lot of TV programming, especially on cable, can be extremely violent. But games are interactive; television isn't. Along with its wild popularity—more than 430,000 players pay monthly fees that make the game a cash cow for Sony—EverQuest has left enough psychic body bags strewn around the Internet to earn the label "EverCrack" and spawn online support groups like EverQuest Widows.

What appeals so strongly about massive online role-playing games is that they offer parallel realities—persistent fantasy universes where real-life people can take on new virtual-reality identities.

Liz Woolley knows about it firsthand. Her adult son, Shawn, shot himself in November 2001 while playing EverQuest at the computer in his apartment. She found his body on Thanksgiving Day. Shawn had 15 different EverQuest characters at the time, including a monk, a lizard, and other creatures—all of them in play as his various alter egos—and he hadn't left or cleaned his residence in weeks.

Woolley will never know for sure whether the suicide resulted from one of her son's characters being jilted in a romance within the game, but she thinks it did. The clues point in that direction. After the funeral, she joined EverQuest Widows and later started www.olganon.org as a support group for game addicts. It has about 800 members. Woolley holds Sony and similar companies "100 percent responsible" for tragedies like her son's because, in her view, the companies program, market, and constantly upgrade their games to deliberately encourage addiction. And Woolley's story is not unique. In October last year, a 24-year-old South Korean man collapsed and died after playing computer games nonstop for 86 hours.

Other critics, like educational psychologist Jane Healy, Ph.D., feel that video games overstress competition and breed impulsivity. The author of Failure to Connect: How Computers Affect Our Children's Minds—for Better and Worse (Simon & Schuster), Healy argues that many computer games disrupt sleep patterns, interfere with social and personal growth, create depression, and "desensitize children to other people's hurt, harm and suffering." She's not alone. Celeste Thomas, a veteran speech pathologist and teacher in Colorado's public schools, sees a decline of language and communication skills in students who habitually play video games. And since language is the basis of thought, she says, reasoning and imagination suffer as well.

Today's video games don't get much love from an older generation of game designers, either, many of whom started as war gamers and went on to work as defense consultants and simulation designers for government think tanks. Jim Dunnigan and Al Nofi—the Lennon and McCartney of war games, with hundreds of successful conflict-game designs between them, plus books, articles, and numerous military-related projects—shrug off most of today's software as "twitch games" that don't teach anything. Chris Crawford, another legendary simulation designer, feels that most current video games are still just high-tech versions of puzzle solving and throwing rocks. In his view, the games are actually narrower and "much, much nastier than they were 20 years ago."

But the words that may linger longest come from Eugene F. Provenzo Jr., author of Video Kids (Harvard) and the forthcoming Children and HyperReality: The Loss of the Real in Childhood and Adolescence. Provenzo, a cultural theorist and professor of education at the University of Miami, argues that the biggest difference in video games between ten years ago and today is that "now, instead of looking in at the game from the outside, we're part of the program." We're inside the box; we're inside the screen. "Go back to Scripture," Provenzo says, "and look up that verse about seeing through a glass darkly." It's a good image, he feels, because in our video games and in a hundred other artifices, we're now looking out at the real world in a way that's filtered, skewed, and obstructed by the simulated realities we've created. We're Alice, and we're very comfortable on the wrong side of the glass.

The Future of Meat Puppets

Of course, most video gamers—the overwhelming majority—don't shoot themselves or anybody else, don't turn into addicts, and do go about their daily lives responsibly. And that's why critics of video games don't make much of a dent in sales, which continue to climb through the roof despite a recession in nearly every other industry. Most gamers, like most other American consumers, hearken to the words of those two great English moral philosophers, Mick Jagger and Keith Richards: "I know it's only rock and roll—but I like it." Video games are only video games, but we like them—quite a lot. Why?

Because, at their best, they excite us, absorb us, amuse us, and transport us outside ourselves—like any good book, which, in a different age, was an equally dangerous technology. And they can also help us learn. It's no accident that economists and political scientists game the future, or that the U.S. Army has an entire command dedicated to simulation development—the Program Executive Office for Simulation, Training, and Instrumentation (www.stricom.army.mil).

Julian Dibbell, one of the cyber scene’s sharpest observers and a visiting fellow at Stanford’s Center for the Internet and Society, says that while some video-game behaviors seem empty—like buying virtual suits of armor with real money for a fantasy game world that doesn’t really exist—people inevitably “pick their own poison” when it comes to entertainment. For Dibbell, the value of anything is community-determined, so what makes a community of meat puppets (you and me in flesh and blood) more “real” than a community of digital puppets (you and me in cyberville)?

We already live in an artificial environment, he notes. It’s the habit of our species to make it so by shaping and changing nature. And why does anyone pay $10 million for paint on a canvas that just happened to be put there by Picasso? What makes that inherently more valuable than a virtual suit of armor? In a real-world nation obsessed with productivity and profit, Dibbell is glad to see some people carving out a little virtual reality refuge from the rat race, even if it consists of electrically charged computer guts and phone lines.

It’s a brave new world that’s only getting bigger and, for parents, more harrowing. When asked what video games would look like in another 15 years, Web designer Tim Gilmour said, “They’re going to look like life.” And that will probably include direct neural hookups, like in the Ralph Fiennes film Strange Days. Gene Provenzo said pretty much the same. Wait till we have 360-degree surround vision in these games, he said, a totally visually immersive environment. Pretty soon somebody will start putting pain in the machine—a little shock every time a cyber bullet hits you. “We have that technology now,” Provenzo said, “and we’ll absolutely get around to doing it, sooner than later.”

Pain in our games? Well, why not? It’s the flip side of pleasure, which makes the pleasure more real.

I asked Julian Dibbell if he’d hook himself up neurally to a video game if he could.

He didn’t skip a beat. “Oh, hell yes.”

Author

  • Francis X. Maier

    Francis X. Maier, the father of four, writes from Philadelphia.
