Saturday, May 3, 2014

Procedural Generation and Emergent Gameplay as Storytelling Devices

I've been sitting on this one for a while, with a pile of hyperlinks to various articles about procedurally generated worlds in gaming stacked up in a post titled "Drafty Draft," but it was finally starting to play Minecraft seriously again this week, after not playing much since, oh, 2010, that helped me figure out what I really wanted to say. Which is, briefly, that procedural generation in gaming is mostly awesome.

Not entirely awesome, mind you. As I ranted about at length previously on my now-locked personal blog, "procedural generation" has recently gotten roped in with "retro aesthetic," "roguelike," and "RPG elements" in the quadfecta (?) of current gaming buzz-qualities, a combination that ensures it ends up in a lot of games that don't really use it effectively, just because it looks good on a feature list. For instance, taking procedurally generated worlds out of something like Teleglitch might actually add to the tension of the game, because you'd have some idea, at least, of what's coming, and precisely what to be afraid of. Not procedurally generating the worlds of Don't Starve might actually lead players to accumulate knowledge of a world's consistent geography and learn how to use it to their advantage when they next respawn, creating a situation not unlike the amazingly effective respawn mechanic at play in Rogue Legacy. But then there are games like Terraria and Minecraft, which arguably wouldn't work at all without the procedurally generated worlds their play takes place in.
[Image caption: The game that taught me not only to embrace death, but to look forward to it.]
But what I wanted to write about today has less to do with how procedural generation plays into a game's mechanics and more to do with how it might factor into a game's story.

I got my start writing legitimately (if you want to call it that) about games by writing about how narratives and stories function (and fail to function) in gameworlds. Given that most modern video games are a complex combination of pre-scripted "storytelling" through cutscenes and/or "events" that eliminate 99% of player agency, paired with spur-of-the-moment player interactivity, I believe that games create unprecedented possibilities for cooperative storytelling/worldbuilding (they're the same, I'd argue) between the "authors" (developers) and their audience, but that they also contain unprecedented pitfalls. The only reason most people would scoff at the idea of a "Citizen Kane of video games" is that, currently, the pitfalls are like sinkholes and the potential peaks are fragile, windswept pinnacles.
[Image caption: Like this one.]
In this article, Beraldo argues that "the main point is that games are a shared authorship experience. You will better remember that time you managed to evade cops opening fire at you by diving into an air duct in Deus Ex than the fact that one of the bosses was an ubercyborg with an accent." And this is what really got me thinking. I'm a huge Deus Ex fan, and this describes me completely. I remember, and appreciate, the game's Gibson-influenced cyberpunk story as something engaging and interesting and fairly audacious for a video game, especially in 2000. But I remember and appreciate it precisely because it was a very effective wrapper within which my role as a player interacting with and reacting to the gameworld around me played out. I cared about the story because the unexpected moments of air-duct-diving that emerged through gameplay were so compelling. I cared more about the air-duct-diving than I would have otherwise because the story wrapped those moments in a sense of fictional gravitas that is lacking in a game like, say, Tetris or Super Mario Bros. So it's a symbiosis, in a way. But it's a lopsided symbiosis, one that leans disproportionately on the strength of the gameplay. A game without a compelling story can still be a fun game. A good story without compelling gameplay is just a slow, QTE-ridden, low-budget movie.
[Image caption: It's unequivocally true.]
But I'm oversimplifying. The reason Beraldo's example works so well for me is that he's not just talking about Deus Ex as a game that contains both story and gameplay, but as a game that contains both story and emergent gameplay. The reason diving into an air duct to escape gunfire in the game is such a tense, compelling, and ultimately fun experience is precisely that there is no particular "level" or scripted scene in the game in which this occurs. Instead, it occurs as a result of the player's behavior in the world, the reactions of a guard or enemy robot to that behavior, and then the by-the-seat-of-your-pants improvising the player must do to escape what is likely a much better armed and armored (and thus unbeatable by main force) enemy. The scenario emerges from the way the game's rules and its world are designed, not as a discrete, pre-plotted encounter whose outcome(s) are already predetermined by the game's code (at least, not in a direct sense).

Warren Spector believes that games that enable these types of player-driven experiences are "the only thing that sets us apart in any meaningful way from other mediums." But it's not just a matter of being able to claim a niche: he refers to games that depend on emergent play as "engines of perpetual novelty." Not only do these games offer novel play scenarios in the short term (like Beraldo's example), they also focus more in general on the player's interaction with and experimentation within the gameworld than on herding the player along one developer-curated path, the only possible "right" experience. As Spector says, developers interested in emergent gameplay should "[e]mbrace the idea that your job is to bound the player experience, to put a sort of creative box around it -- but you don't determine the player experience. It's not about 'here's where every player does X.'"
[Image caption: Yeah, no more of this, please.]
To me, emergent gameplay adds not just moment-to-moment novelty, but potentially limitless replay value (in games like Don't Starve, for example) that isn't reliant on new DLC or achievements that require you to replay the exact same content over and over again to achieve some esoteric level of mastery. Replay value comes not from the desire to hone your skills in a vacuum to become really good at doing one specific thing, but from the near-limitless possible permutations of your gameplay, its results, and your reactions to those results that can occur in the gameworld.

Of course, non-emergent types of games are just fine (Spector and I agree on this, and I recently had a hell of a good time playing [...]), but I've also found his claim that "once players get a taste of [an emergence-based] game, it's very hard for them to go back" to be very true in my experience. This really started for me with Minecraft way back in 2010. There were a lot of other things that were unique about the game to me at the time, but what really stuck out (and what still does, honestly) was the experience of trying to survive the game's nights early on while slowly learning the rules of the world and building a primitive shelter. This was emergent gameplay at its finest, determined entirely by the rules of the world and my actions within them rather than the limited gameplay dimensions of reaching a particular, required goal or the need to complete a particular level.

I think that this type of emergent gameplay is something that procedural generation can mesh with quite well. Where the effect of procedural generation is perhaps blunted in something like Risk of Rain, in which the ways you can interact with the world around you are necessarily very limited (in this case by a Contra-esque sort of gameplay), in a game like Minecraft (or Deus Ex) there are enough gameplay possibilities that randomized worlds, and randomized content within those worlds, actually force a much wider variation in gameplay experiences than "Oh, I'm the purple guy that shoots the big green guy with the grey gun" or "Oh, I'm the red girl that shoots the little black dogs with the yellow bow... so that's different!"
[Image caption: I hope you like finding the teleporter.]
And this is where story comes back in. In games with emergent gameplay, procedurally generated content does even more to encourage players to make their own compelling experiences within the "creative box," by making sure that every player's experience is unique in its details while taking place in a gameworld that would still be recognizable to other players who are familiar with the particular game.

This is how you end up with pieces like Quintin Smith's "Mine The Gap" or Tom Francis' "The Minecraft Experiment," both "virtual travel narratives" that I examined during my dissertation work.

When a player's experience is unique in its details but relatable in its broad strokes, their story becomes simultaneously more personal and more shareable; it seems more like a compelling narrative that they are invested in and have helped create, and less like just some game they're playing to pass the time. Any Deus Ex player understands Beraldo's example immediately, and likely has a similar example of their own. But, because of emergent gameplay, their own example is also importantly different, importantly personally theirs. Any Minecraft player understands the broad strokes of Smith's narrative immediately, but because of emergent gameplay and procedural generation, they necessarily have importantly different, importantly personal stories, because it's not only their gameplay experience that's fundamentally different, it's also the world that that gameplay takes place in.

And, well, I think that's neat. 

Friday, May 2, 2014

What We Have Here Is a Failure To Communicate

So, when I started this new blog, I was really excited by the prospect of being able to write publicly and semi-officially (though not too officially) about teaching, researching, and so on, in the hope that what I had to say would both help people who might be wrestling with the same or similar problems and encourage others to suggest solutions for me.

Well, for the last four months, apparently I haven't actually had anything to say. Well, that's not true. Let's just say instead that expecting to be able to blog about academia regularly while also taking on a load of committee service work, teaching four classes a term, and trying to get an entirely new minor program off the ground in my first two terms as a faculty member was a fool's errand.

My workload is a bit lighter now, but I'm not going to guarantee it's going to stay that way for long. In the meantime, though, I wanted to write briefly about one of the biggest stumbling blocks I've hit in my first few terms here; again, both in the hope that it might be helpful to readers and in the hope that someone out there might have some solutions of their own that can help me reframe and better understand the issue(s) I'm facing.

So, I'm currently teaching a lot of different courses. In two terms, I've introduced six entirely new courses, and next term (starting in a few weeks!) I'm going to introduce three more classes, for a total of nine new classes in my first year. Logistically, this is a ridiculous amount of work to juggle, but thematically, it's pretty coherent. At least in my head it is.

All my courses, however they are listed in the catalogue, fall under the aegis of "the humanities," and this, combined with the specificity of my own training and research experience, means that all the courses have some elements in common. Students read stuff, and write stuff. They work in groups to discuss the issues that arise in the processes of reading and writing. They make arguments, and (hopefully) back those arguments up with evidence, source synthesis, and critical thinking. Whether we're reading novels or poetry or comic books, playing video games, or watching documentaries, our approach to the material is, generally speaking, straightforward literary analysis. This is, at least for now, a limitation of my training, but also an expression of my belief that literary analysis gives one the fundamental tools to analyze and question all works that fall under the rubric of what I call "culturally expressive media." People write novels, film movies, and make video games as expressions of their personal and cultural experience of the world (sometimes more intentionally than others). By studying these works through this lens, I believe we can simultaneously better understand the creators' cultural situations and how various media implicitly and explicitly make arguments, while interrogating our own assumptions about society, or what-have-you.

Pretty straightforward cultural-studies-literary-analysis-humanities-type stuff, right?

Well, it's totally not working. Which is weird, because it worked before.

I find that my students are asking variations of "Why do we have to do this?" much more frequently than they ever did at WSU, and I'm finding it much harder to convince them (on average) to become invested in what they see as squishy, non-objective, non-quantifiable work beyond the level of "If I don't work sort of hard, I'll get a bad grade and hurt my GPA."

Now, the problem certainly isn't that I'm incapable of addressing the value of the humanities in today's university environment. In fact, sometimes that seems like all I do. Sometimes it feels like 90% of my job is to explain to other people why my job is important. So, I've gotten pretty good at it.

I also realize that when you're teaching at a technical institute, you're going to get a certain kind of student, to whom learning about the cultural significance of The Iliad might pale in comparison to the things they do in their other classes, like building robots (no, seriously, they do).

But. I guess it's difficult for me to understand precisely how a student goes from thinking one of those things is preferable to the other to thinking that one is useful and the other is pointless. I can throw all the statistics I want in the faces of many of my students, showing that graduates with humanities knowledge get further down their career paths than those without it, and thus make more money in the long term. That employers look for critical thinking skills and cultural awareness in addition to your "piece of paper." That many of the professional associations that eagerly await our graduates to fill their open positions specifically contact us regularly to make sure that their future hires are getting an appropriate education in ethics, logic, critical thinking, and global citizenship. And, yes, that there's really not that big of a gap in employment between those who have humanities degrees and those who have science degrees.

None of it seems to make any difference, though. Which leads me to believe that there may well be a profound shift occurring right now, not just away from liberal arts education, but away from education in general. The kind of "education" many of my students seem to want to get at this school is actually what we'd call "vocational training" if there weren't such a stigma attached to that term. And that's fine, I guess: college has become such an expensive, loophole-ridden process, while simultaneously becoming so necessary for employment, that it's easy to see why students want to know what they're paying for before they pay for it, and a $70,000/year job at the end of four years is a much better return on your investment (one that lands you in massive debt) than "being a better person" or "understanding other cultures."

So I get it. But I think there are a lot of hidden costs to replacing universities with huge job-training factories, and if we continue to go down that road as a culture, I think there are a lot of potential dangers specifically in continuing to call what we're doing "education." Because I believe education should extend beyond just job training, regardless of whether students can immediately see the value in learning more ephemeral subjects and skills or not. And I think every student who can afford it will reap the vast benefits of this broader kind of education throughout the rest of their lives, whether they go on to manage a McDonald's, work as the CEO of a corporation, or do anything in between.

But when I say these things to many of my students, they just roll their eyes and settle in to suffer through 10 of their 30 hours (30 in-class hours, out of four years of school!) of humanities education.

So, I guess my question is: with so little time in the classroom, how do we convince students that there is more to education than just skill-based training and certification? How do we convince students that sometimes it's worthwhile to invest money in something other than getting more money?

I have a lot of reasons that I could share convincingly with, say, the audience of The Chronicle, but I don't have any reasons that 18-year-olds who have already chained themselves to massive, possibly lifelong debt want to hear.