Why I write longhand

As a writer and game designer, I've spent a good chunk of the past 30 years trying to do various types of creative work while sitting, standing, or slouching at a computer keyboard (and, more recently, a touchscreen). The power of those devices has grown exponentially, enabling me with a tap or a keystroke to accomplish marvels that would have been inconceivable just a few years ago. ("Upload PDF to Dropbox"; "Open Scrivener file.")

And yet I've been increasingly bemused to realize that by real-world measures of productivity — words written, problems solved, good ideas crystallized — my output has not only not multiplied along with the power of my tools, it hasn't increased one bit.

Not only that: I've had for some time the gnawing feeling that my best ideas — the ones that really make a difference — tend to come while I'm walking in the park, or showering after a workout, or talking a problem through with a friend, or writing in a notebook; i.e., almost anywhere but in front of a screen.

For a long time I tried to talk myself out of this. I figured that if my computer time wasn't maximally productive, it was because I didn't have the right software, or wasn't using it right. I tried configuring panels and preferences differently. I created keyboard shortcuts. I downloaded apps to track time I spent using other apps, apps to make it easier to switch between multiple apps. Nothing changed the basic observed fact: There was an inverse relationship between my screen time and my productivity on a given day.

I started mentioning this to people. Cautiously at first. For someone who makes his living by putting stuff on screens, to question the fundamental symbiotic bond of user and machine could seem perverse, even a sort of heresy. But the more I brought it up, the more I discovered I wasn't alone.

It turns out that some of the most productive and successful people I know still write longhand. Screenwriters write on index cards and big rolls of paper, the way I did in elementary school. One dictates his first drafts out loud and has an assistant transcribe them. Game designers and directors scribble on whiteboards and in notebooks. And some of these people were born after 1980.

For myself, I've found that I spend the vast majority of my working computer time staring at the screen in a state of mind that falls somewhere within the gray spectrum from "passive/reactive" to "sporadically/somewhat productive," and in which a few minutes can stretch unnoticed into a quarter-hour, or a couple of hours, without breaking the seamless self-delusion that because I am at my desk, at my computer, I am therefore working.

It's so easy to move words and sentences around in Word or Scrivener or Final Draft that it feels like writing, even if what I'm actually doing would rate only a 2 on a scale where 10 is "getting an idea and writing it down." Writing down an idea, an actual idea, is something I can do as easily with a fifty-cent ball-point pen as with a thousand-dollar MacBook Air. Only with the ball-point, it's harder to fool myself. If the page stays blank, I can see it's blank.

Which is why, after years of making progressively heavier use of more apps and more devices to do things I used to do without any devices at all, I've thrown that train into reverse. I now keep my project notes and journals in actual notebooks. I've even switched to paper for my "to-do lists," and cross off action items literally, not figuratively. It's simpler and I get more done this way.

As much as I love my tricked-out MacBook Air, I try not to begin workdays automatically by lifting its lid, as if to say "I have arrived at work; now tell me what to do"; just as I try not to reach for my iPhone to fill the silence of a solitary moment. Ideally, I want my screen sessions to begin with a conscious choice, a clear intention of why I'm turning to that device at that moment and what I mean to accomplish.

It's easier said than done. The more I try, the more I realize that what I'm actually doing is fighting an addiction. The Apple II that first enchanted me thirty years ago as a tool to make fun games has evolved, one update and one upgrade at a time, into a multi-tentacled entity so powerful that it takes an ongoing effort of will for me not to be enslaved by it.

Guest article for The Huffington Post, originally published July 10, 2013.

20 tips for game designers

  1. Prototype and test key game elements as early as possible.
  2. Build the game in incremental steps — Don't make big design documents.
  3. As you go, continue to strengthen what's strong, and cut what's weak.
  4. Be open to the unexpected — Make the most of emergent properties.
  5. Be prepared to sell your project at every stage along the way.
  6. It's harder to sell an original idea than a sequel.
  7. Bigger teams and budgets mean bigger pressure to stay on schedule.
  8. Don't invest in an overly grandiose development system.
  9. Make sure the player always has a goal (and knows what it is).
  10. Give the player clear and constant feedback as to whether he is getting closer to his goal or further away from it.
  11. The story should support the gameplay, not overwhelm it.
  12. The moment when the game first becomes playable is the moment of truth. Don't be surprised if it isn't as much fun as you expected.
  13. Sometimes a cheap trick is better than an expensive one.
  14. Listen to the voice of criticism — It's always right (you just have to figure out in what way).
  15. Your original vision is not sacred. It's just a rough draft.
  16. Don't be afraid to consider BIG changes.
  17. When you discover what the heart of the game is, protect it to the death.
  18. However much you cut, it still won't be enough.
  19. Put your ego aside.
  20. Nobody knows what will succeed.

These practical tips were first published in 2004, at the time of the release of Prince of Persia: The Sands of Time.

Designing story-based games

Eons ago, in 1996, Next Generation magazine asked me for a list of game design tips for narrative games. Here's what I gave them.

Reading it today, some of it feels dated (like the way I refer to the player throughout as "he"), but a lot is as relevant as ever. I especially like #8 and #9.

  1. The story is what the player does, not what he watches.
  2. List the actions the player actually performs in the game and take a cold, hard look at that list. Does it sound like fun? (Resist the temptation to embellish. If a cinematic shows the player's character sneak into a compound, clobber a guard and put on his uniform, the player's action is "Watch cinematic." Letting the player click to clobber the guard isn't much better.)
  3. The only significant actions are those that affect the player's ability to perform future actions. Everything else is bells and whistles.
  4. Design a clear and simple interface. The primary task of the interface is to present the player with a choice of the available actions at each moment and to provide instant feedback when the player makes a choice.
  5. The player needs a goal at all times, even if it's a mistaken one. If there's nothing specific he wishes to accomplish, he will soon get bored, even if the game is rich with graphics and sound.
  6. The more the player feels that the events of the game are being caused by his own actions, the better — even when this is an illusion.
  7. Analyze the events of the story in terms of their effect on the player's goals. For each event, ask: Does this move the player closer to or further away from a goal, or give him a new goal? If not, it's irrelevant to the game.
  8. The longer the player plays without a break, the more his sense of the reality of the world is built up. Any time he dies or has to restart from a saved game, the spell is broken.
  9. Alternative paths, recoverable errors, multiple solutions to the same problem, missed opportunities that can be made up later, are all good.
  10. Don't introduce gratuitous obstacles just to create a puzzle.
  11. As the player moves through the game, he should have the feeling that he is passing up potentially interesting avenues of exploration. The ideal outcome is for him to win the game having done 95% of what there is to do, but feeling that there might be another 50% he missed.

Crafting a video game story

In 2001, a small team within Ubisoft's Montreal studio led by producer Yannis Mallat began concept development on the project that would become Prince of Persia: The Sands of Time. Initially a consultant, I later joined the team as writer and game designer. Being part of this project was a great experience and I'm glad to revisit it for this book.

By its nature, video game writing is inextricably bound up with game design, level design, and the other aspects of production. A film screenplay is a clean, written blueprint that serves as a starting point and reference for the director, actors, and the rest of the creative team. It's also a document that film scholars and critics can later read and discuss as a work distinct from the film itself. Video games have no such blueprint. The game design script created at the start of a production is often quickly rendered obsolete, its functions assumed by new tools created to fit the project's specific needs.

In this chapter I'll try to shed some light on the creative and technical decision-making processes that went into crafting the story and narrative elements of Prince of Persia: The Sands of Time (POP for short). The team's approach was practical, not literary; our challenge was to find the right story for a mass-market action video game. In the rapidly changing game industry, each project is unique and presents its own demands and opportunities, according to current technology and the nature of the particular game. What works for one game might not work for another.

Storytelling is, of course, just one aspect of game design. For those interested in reading more about the overall production process on POP, I recommend Yannis Mallat's postmortem article (Mallat 2004).

Rule #1: Do it, don't view it.

What kind of story does a video game need?

The traditional way to tell a story in a video game is to create a series of cinematic cutscenes that serve as "rewards" — transitions between gameplay levels. However, the cool way to tell a story in a video game is to eliminate or reduce the canned cutscenes as much as possible, and instead construct the game so that the most powerful and exciting moments of the story will occur within the gameplay itself.

The screenwriting maxim "actions speak louder than words" applies to video games as well as films, but in a different way. Video games, unlike movies, are interactive. Whereas in a film it's better to show than to tell, in a video game it's better to do than to watch. Give the story's best moments to the player, and he'll never forget them. Put them in a cutscene, and he'll yawn.

Philosophically, the POP team was pretty much united in our lack of enthusiasm for cutscenes. If we could have eliminated them altogether, we would have done so with pleasure. On the other hand, our mandate was to make a successful mainstream action-adventure game on a relatively tight budget and schedule. The game concept already called for pushing the envelope in a number of ways; an overambitious approach to storytelling could have sunk the ship.

Continue reading this article in the Electronic Book Review.

The Hollywood trap

New mediums have trouble escaping the shadow of their predecessors. At the turn of the last century, the dominant audiovisual medium was the stage play. So in their quest for mass-market success and artistic legitimacy, early filmmakers strove to be theatrical: They shot scenes as if the cast were onstage, with the cameraman stationed in the audience, seventh row center. Today, those early movies seem hoary.

Likewise, the evolution of videogames has been shaped by gamemakers' determination to be cinematic. A typical game features hours of cutscenes (the mini-movies shown at designated moments). Yet for the most part, cutscene-heavy games based on Hollywood mega-productions like The Matrix and The Lord of the Rings have failed to set the industry on fire. The truly seminal, breakthrough hits of the new art form — the Dooms and Zeldas and Metroids and Simses — are original properties.

That may be because movie storytelling and game storytelling follow totally different logics. A great film sequence and a great game sequence may look similar. But videogames are interactive. To appreciate a videogame, you need to play it — an experience that can consume dozens of hours, encompassing moments of joy and anguish so intense that you reminisce about them years later.

In a movie, the story is what the characters do. In a game, the story is what the player does. The actions that count are the player's. Better game storytelling doesn't mean producing higher-quality cinematic cutscenes; it means constructing the game so that the most powerful and exciting moments of the story occur not in the cutscenes but during the gameplay itself. To simply watch a few recorded snippets of game footage as you would a film is to miss the point.

One small example: In Prince of Persia: The Sands of Time, the hero doesn't realize he's gained the power to turn back time until the player discovers that he has a new controller button at his disposal — and uses it to save his life by rewinding a fatal mistake. Had this revelation occurred in a cutscene instead of during active play, it would not have had the same impact.

Cinematic cutscenes have their place in videogames. But they are not the engine that moves the story forward. The key moments, emotional highs and lows, surprising twists of a videogame story are played — not watched. If the object is "Shoot every spaceship you see," packing the cinematic cutscenes full of human relationships, dialog, and backstory won't deepen the experience.

As we gamemakers discover new ways to take storytelling out of cutscenes and bring it into gameplay, we're taking the first steps toward a true videogame storytelling language — just as our filmmaking forebears did the first time they cut to a close-up. One day soon, calling a game "cinematic" will be a backhanded compliment, like calling a movie "stagy."

Guest article for Wired, published in April 2006.