Having just gotten back from GDC, with a number of new things on my plate, I have a lot I should be talking about. I plan a “Reflections on GDC” post in the near future, but until then, I feel obligated to grind out my thoughts on something that's been quite the topic of discussion among the people in the industry I associate with.
Today's topic is: Cutscenes in Games
It seems to be the general opinion floating around in most circles that cutscenes are a negative thing to have in games. Kind of like narration in a screenplay, they're thought to be something that can be used to great effect, but that most of the time just represents laziness on the part of the writer. The thought is that when you're playing a game, you're engaging in some amount of interactivity; cutscenes take that away from the player, and so are counterproductive to the game experience. If you can't tell the story while still allowing the player some measure of control, you're better off writing for movies instead of games. Everybody with me so far?
Personally, I strongly disagree. The idea that the ultimate goal of games is interactivity confuses me. Yes, I understand that it's the salient characteristic of games, and could therefore be argued to be their most important one, but art as a whole has as its goal the phenomenological experience of the person beholding it; nothing else matters.
I agree that cutscenes that rob the player of control are frequently just a hack solution, but what about a moment in which the precise feeling you're looking to foster is a lack of control? An appropriately dramatic cutscene is precisely what you want at that moment.
But virtually everyone can agree with that. What I have a hard time swallowing is the idea that a story doesn't need cutscenes to garner a significant amount of emotional involvement. Everybody I've talked to has been pointing to games like Portal, Bioshock, Deus Ex, and Half-Life 2 as examples of games that tell an engaging story without relying heavily on the cutscene format.
I feel like the technology just isn't there yet for that to be true. Yes, I loved all of those games, but I think Natalie said it best while I was up in San Mateo for GDC: “I remember playing through Half-Life 2, and someone said something that revealed a facet of Gordon Freeman's personality, and I thought, 'Oh, I'm like that? I didn't know!'”
Thinking about this discussion has revealed my vague dissatisfaction with sandbox-style RPGs in general. I think it's time the game industry swallowed a bitter pill: tabula rasa characters don't get people personally involved in stories. Period.
If I'm playing through Fallout 3, and I do a bunch of evil things in the beginning of the game, presumably I'm setting myself up for an evil character. For some reason, though, Bethesda is afraid to limit your choices too heavily later in the game, because the idea of having a lot of options open to you is some kind of holy grail that cannot be interfered with by any mere mortal. If you force me to be evil late in the game because I was evil early in the game, you're robbing me of choices. In this case it's a partial lack of control, and in cutscenes it's a total lack of control, but the difference is quantitative, not qualitative. Again, I'm a little unclear on why this is so bad. Game developers seem to value interactivity over the general quality of the experience.
Because game developers are desperately concerned that I be able to choose to be evil or good at every possible intersection, the only story that can be crafted around this kind of main character is one of two things: a pair of stories, one where you're Mother Teresa and one where you're Hitler (Bioshock, KotOR), or a story that doesn't actually account for your morality in any way beyond a footnote (Arcanum, Fallout 3).
Newsflash: in attempting to make a game that responds dynamically and appropriately to your character's morality, 99% of the time you're actually making a game that presents a ridiculous parody of actual ethical behavior and consequences.
There's a place for this kind of game, and I enjoy Fallout 3 a ton, to be sure, but if you want to talk plot, give me a call when the technology is there. I'll be in the other room, playing Chrono Trigger on my DS.
(Oh, and for those who haven't heard, you should check here to see a conversation at GDC between me, Jeff Ward, Corvus Elrod, and Darren Torpey, filmed by Darius Kazemi.)
6 comments:
The ultimate goal of games is interactivity.
The ultimate goal of videogames, on the other hand, is a bit more complicated. ;) Videogames provide two important things that traditional games (apart from tabletop RPGs) don't - immersion and identification.
Immersion is the real reason why cutscenes are considered a bad thing - if what I enjoy most in a game is the feeling of existing in a new world, a cutscene in which my avatar does something I wouldn't do would break my immersion. Tabula rasa characters are created to avoid this - Gordon Freeman (or, alternatively, Crono) doesn't talk because otherwise he'd say things the player might not want him to say, and he's supposed to be the player.
Choice in WRPGs follows from a similar logic - if the player himself were standing in front of an NPC with a gun, he would be physically capable of shooting them, regardless of his past morality.
This kind of thinking, while reasonable, becomes problematic when people consider it the only way to take advantage of the unique qualities of the medium, and therefore the only way to advance it as an artform.
It's unfortunate, because videogaming provides some distinct advantages to empathetic identification as well - but only when cutscenes are available to use in the creation of characters worthy of empathy. Desperately forcing a dying character to keep fighting can be more powerful than watching the character do the same, but only if you're given enough reason to care about him in the first place.
Ironically enough, I think that "lack of control" is one of those feelings that can be increased by giving the player just enough control to show him that his actions won't make a difference. Cutscenes are better used when the player can't be expected to do something that defines the character as a person, like saying specific lines of dialogue.
Honestly, I don't think that's the only thing that videogames provide that traditional games don't. They also provide computing power.
You see this most clearly these days in the RPG context. Fallout's SPECIAL is a good example of an RPG system that could be run using pen and paper, but would be very cumbersome. I doubt SPECIAL would sell very well in book form; in videogame form, it's beloved. In fact, if I'm not mistaken, the system was written specifically with a videogame's computing power in mind.
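Just to make the bookkeeping point concrete, here's a rough sketch of the kind of derived-stat arithmetic the computer quietly redoes for you; the formulas below are approximations of SPECIAL from memory, placeholders rather than the exact published rules:

    # A sketch (not the published rules - formulas approximated from memory) of the
    # derived-stat bookkeeping a videogame recomputes instantly, but that a
    # pen-and-paper player would have to redo by hand after every stat change.

    def derived_stats(s):
        """s maps the seven SPECIAL attributes to values in 1..10."""
        return {
            "hit_points": 15 + s["ST"] + 2 * s["EN"],
            "action_points": 5 + s["AG"] // 2,
            "carry_weight": 25 + 25 * s["ST"],
            "melee_damage": max(1, s["ST"] - 5),
            "sequence": 2 * s["PE"],
            "skill_points_per_level": 5 + 2 * s["IN"],
        }

    print(derived_stats({"ST": 5, "PE": 6, "EN": 7, "CH": 4, "IN": 8, "AG": 6, "LK": 5}))

On paper, every point shifted into Strength or Endurance means re-deriving all of that (plus skills, perks, and modifiers) by hand; in a videogame, it's a function call.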
RPGs are the most common these days, but I don't think they were the first genre to recognize the value of the medium for processing complex systems. That distinction, I'm pretty sure, belongs to the wargame genre. Games like the Combat Mission series are essentially board games, but the systems that run them are way too complex for practical pen-and-paper play. Even the FPS genre can be seen as an extension of the squad-level board games published in the '70s and '80s. The videogame medium provides not just a different perspective (first person as opposed to the top-down third-person that most board games are limited to); it also allows the game to be run with a much more complex system than its pen-and-paper incarnation could support.
I don't disagree with lkkin's points about immersion and interactivity, but I don't think that's what the goal of a videogame is - or rather, has to be. Plenty of highly immersive, highly interactive videogames (think of your favorite RPG) use the medium primarily as a play aid to help you run the system behind the game. To me, that makes it clear that "videogame" is merely a medium, which in turn gives the lie to claims that videogames have interactivity as their ultimate goal. Oil paints don't have an ultimate goal - they're a medium, a tool. The ultimate goal is the artist's, not the medium's. Same for videogames.
I think Natalie is on to something here, but I think it would be a little disingenuous to suggest that oil paints don't have, at their core, the idea of providing visual stimulation.
Sure, this is entirely socially constructed, but I think it then just turns into a semantic discussion about the exact meaning of the word "purpose" or "goal".
The fact of the matter, I think, is that most people think of videogames like oil paints. Videogames do interactive immersion the same way that oil paints do visual stimulation.
I agree that that's a good idea, but if you can find some OTHER use for oil paints that's intriguing, go for it.
Also, I think it's important that we ask: why just videogames? I think nobody thinks of tabletop games in these terms because tabletop games don't enjoy a large enough audience and budget for people to think they merit this kind of discussion, and that's kind of a shame.
I didn't mean to imply that any of the things that I mentioned were the only way that games could go - the fact that some of the immersion advocates do imply that is something that's frustrating to me.
In fact, I think videogames might be even more varied in terms of goals than oil paintings are. They can be, as Natalie suggested, vessels for a system of rules - or, their rules might exist only in service to an immersive world. They can provide a space for multiplayer competition - or, they can provide an opportunity for a single player to take a walk in someone else's shoes.
The lack of an agreed-upon distinction between these different portions of gaming probably hinders attempts at discourse - lumping them all into one category tends to push discussion towards the question of which goal is "right" and away from the more useful discussion of how best to meet any particular goal.
As for tabletop RPGs, I think they share a significant amount with videogames in terms of potential beyond pure play. The problem is, they run on imagination, so it's a lot more difficult to make one into a cultural artifact - if you lack the hardware for it, you can't just go pick it up at a store. ;)
I think it's a good point about the lack of appropriate distinctions inhibiting discourse. I can't remember the number of times I've been talking to someone about "Serious Games", and they went, "Oh, you mean like Final Fantasy VII?"
The sad part is that even though I use the term to refer to games created for some purpose other than entertainment, I know in my heart that that's just kind of an incorrect definition. =P
Yeah, "Serious Games" are a definite example of the communication problem in game discussion. Both the words "serious" and "games" are ambiguous enough that the combined phrase might be easily misunderstood.
For serious games, though, it might be easiest to avoid the term game in favor of "interactive," to get rid of the connotation of fun inherent in it. Then again, that could sound a bit pretentious - maybe it could be contracted down into "interactivism" or something like that?