Here’s a modern horror story:
“The teacher takes the girl’s paper and rips it in half. ‘Go to the calm-down chair and sit,’ she orders the girl, her voice rising sharply.
‘There’s nothing that infuriates me more than when you don’t do what’s on your paper,’ she says, as the girl retreats.
. . .
After sending the girl out of the circle and having another child demonstrate how to solve the problem, Ms. Dial again chastises her, saying, ‘You’re confusing everybody.’ She then proclaims herself ‘very upset and very disappointed.’”
Let’s briefly set aside the hilarious irony of an irate adult sending a child to the “calm-down chair,” because this is actually important. It’s not about being “mean,” or the teacher “losing control,” or whether the kids are being “terrorized” or need to “toughen up.” It’s about ideology.
That’s why this person is an idiot:
“Some parents had another view. Clayton Harding, whose son, currently in fourth grade, had Ms. Dial as a soccer coach, said: ‘Was that one teacher over the line for 60 seconds? Yeah. Do I want that teacher removed? Not at all. Not because of that. Now if you tell me that happens every single day, that’s a different thing. But no one is telling me that, and everyone is telling me about all the amazing things that she does all the other days.’”
One of the more dangerous things about the internet is that it creates the illusion that “data” just pops up out of nowhere instead of having specific contingent physical sources (also “data” is a conceptual category and not actually a type of physical thing, but that’s another story). In this case, obviously, the assistant teacher would not have been recording this incident unless they already knew that something was up, i.e. this type of thing had in fact been happening on a regular basis.
More than that, though, the fact that this is how the teacher behaves when she has a “lapse in judgment” means that the rest of the time she’s biting her tongue. What we’re seeing here is what she actually believes: that “underperforming” children deserve to be ostracized and humiliated. And the fact that the school supports her means that’s also what the school believes.
Per standard procedure, the New York Times spends the entire article wringing its hands over a bunch of nonsense, then buries the lede right at the end:
Dr. McDonald, the N.Y.U. professor, who also sits on the board of the Great Oaks Charter School on the Lower East Side, said that the behavior in the video violated an important principle of schooling.
“Because the child’s learning was still a little fragile — as learning always is initially — she made an error,” he said in his email. “Good classrooms (and schools) are places where error is regarded as a necessary byproduct of learning, and an opportunity for growth. But not here. Making an error here is a social offense. It confuses others — as if deliberately.”
Whether this is in fact a “principle of schooling” is precisely the issue. As I’m sure you’re aware, what is euphemistically referred to as “education reform” is in fact a major ideological conflict over this exact point. But even this guy doesn’t have it quite right – he’s still framing things as though a child giving an unexpected response is an “error” that needs to be “corrected,” as though “errors” are “byproducts” of growth rather than the substance of growth themselves.
Naturally, since we’re talking ideology here, this confusion is not limited to schooling. Labelling something as an “error” is a pretty obvious value judgment. What we’re actually talking about is what it means for something to be “correct” in general; what, in a practical sense, is the right thing. In the past, we had the idea that “might makes right,” that those who happened to be victorious were by that fact necessarily of superior ability or favored by god or whatever. Despite the phrase now being shorthand for barbarism, this philosophy has one major advantage: the winning party has to actually win. They have to do something to deserve it. Today, we’re enlightened enough that we don’t have to worry about reality anymore. We now live in a world where “right makes right,” where what’s right is right by virtue of it being accepted as such and for no other reason, where filling in the bubble labelled “B” on a standardized test is correct if and only if the grading rubric specifies “B” as the correct answer.
Wikipedia, for example. How do you know that the information on Wikipedia is accurate? Well, if it weren’t, someone would have corrected it.
—
There’s a broad misconception that people only know things that they have been explicitly taught. This is most dramatically demonstrated in the area of language. Children learn their first language (or two) without ever being explicitly taught anything about it. It’s actually not at all clear how it would even be possible to “teach” language to someone who can’t talk; it would be very much like teaching the proverbial blind person to see colors. Yet, as children age, we cling to the idea that they must be educated out of their “errors,” that a language is a big stone tablet of rules against which one checks each utterance for “correctness.” In the saddest case, one reaches adulthood with a disorganized basket stuffed full of “rules” that one then goes about waving in the face of anyone who says anything “incorrectly.”
The truth is not that verification implies correctness, but that learning implies error. Language correction is self-contradictory: the fact that you have to tell someone that they said something wrong means that there wasn’t actually anything wrong with it. If there had been, the error would have occurred organically: they would have been misunderstood. It’s clear that this is how we actually learn things about language: we fail to express ourselves, and then we try again.
(Of course, this only applies to people who are paying attention. We’re all familiar with the type of person who talks so much and so inattentively that they end up creating their own unique mishmash of noises and gesticulations, such that they are able to utter on endlessly without ever intersecting reality.)
The point is that things we are explicitly taught account for probably about 1% of our actual knowledge base. What actually happens is that we have experiences and then we try to create a framework under which those experiences make sense. As such, it’s theoretically possible to accelerate the process, to create a sort of hyper-pedagogy in which the student is constantly barraged with miniaturized interactions designed to create a specific understanding. And by “theoretically” I mean video games.
The basic framework for modern video games is the challenge/failure/retry loop. The historical-material basis for this was arcade games. Arcade games were required to eat quarters, which meant each play session had to 1) provide a dopamine jolt, 2) terminate itself (eat the quarter), and 3) provide an incentive for initiating another session (inserting another quarter). The most popular solution to this equation was something called “extra lives.” Your quarter bought you a certain number of lives (usually 3, a psychologically significant number), and then the game started trying to kill you.
The critical moment comes when you fail a challenge and then have the opportunity to try it again. If the game is at all decently designed, you’ll have some idea of what happened and what you want to try to do next time. So in your typical action game like Metal Slug or whatever, the right thing to do is to hit the enemies with your attacks and the wrong thing to do is to get hit by the enemies’ attacks. So you’ll be thinking about how to position yourself and when to attack and so forth, and you’ll want to try again in order to test these ideas out. By hitting you with this sort of scenario over and over again, the game locks you into whatever its idea of a good time is. Materially speaking, the failure loop happens as often as possible so that the game stays as short and dopamine-intense as possible. In other words, games are very educational.
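A purely illustrative sketch, not code from any actual game: the loop described above fits in a few lines of Python. Every number in it is a made-up placeholder; the only point is the shape of the loop: a fixed stock of lives, a reward on success, a lost life on failure, and a session that ends itself once the lives run out.

```python
import random

def attempt_challenge(skill: float) -> bool:
    """One miniaturized interaction: succeeds with probability `skill`."""
    return random.random() < skill

def play_session(lives: int = 3, skill: float = 0.4) -> int:
    """One quarter's worth of play; returns the score at game over."""
    score = 0
    while lives > 0:
        if attempt_challenge(skill):
            score += 100   # the dopamine jolt
        else:
            lives -= 1     # failure burns a life...
        # ...but every attempt, pass or fail, teaches the player a little
        skill = min(0.95, skill + 0.02)
    return score           # game over: insert another quarter

if __name__ == "__main__":
    print("final score:", play_session())
```

Note that the skill bump happens on every attempt, not just the successful ones; the learning is in the retry, which is exactly why the failure loop works so well as pedagogy.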
And what’s being taught is the thing that every one of these interactions has in common. You’re presented with a given situation with given rules, and there’s a “correct” set of actions to take that will result in the outcome that has been defined by the game as “success.” Executing this set of actions is the right thing to do. Anything else is the wrong thing to do. In certain extreme cases, progressing in a game will require you to do something that is obviously wrong in terms of narrative, such as aiding an enemy or falling into a trap. In such cases, the game implicitly frames doing the wrong thing as the right thing to do.
So the original problem is obviously that games have mostly been about dumb things like avoiding projectiles and jumping on turtles. And this is still largely the case; increased substance in games has been well outstripped by increased flash and pretentiousness. More fundamentally, though, the material situation has changed. Since games have stopped needing to eat quarters, the “failure” part of the loop has atrophied. Now that we pay for games once and play them until “finished,” failure becomes a mere impediment that may as well be done away with. Instead, games now give the player some actions to perform, reward them for doing so, and that’s it.
But “bringing back” failure isn’t a solution. “Failure” on its own isn’t any more significant than “success.” In fact, there’s currently a countertrend in the form of “ultra-hard” games which jam the failure loop into overdrive, and this isn’t any better. Failing the same meaningless challenge 100 times is exactly as pointless as successfully executing the same meaningless task 100 times, in exactly the same way.
What we ought to be looking for are forms of success and failure that are interesting, that cause you to reassess your situation in some way, to question your assumptions, and to gain new insights. Which is what art is supposed to do. It’s deeply sad that people are so zealous in insisting that games “count” as art, yet so blasé about actually getting them to do the things that art is good for.
Of course, not all games are derived from the arcade model. Sim games, for example, tend to lack explicit goals and thereby make room for interesting failures. SimCity allows you to explicitly sic disasters on yourself just to see what happens. Dwarf Fortress is mostly known for players’ stories of the hilarious catastrophes they’ve suffered. And of course there are pure story games, as well as games that are entirely focused on providing aesthetic experiences. But the failure loop is still at the core of how video games are generally conceived, and these exceptions are often ones that prove the rule. For example, story games often have “correct” choices that you need to make in order to get the “true” ending.
On the other hand, I’d be remiss not to mention that, throughout the history of games, players have often ignored what games are supposed to be about and created their own goals and rules of interaction. The most popular example is speedrunning, which usually involves subverting the normal progression of a game and playing it in a way it wasn’t designed for. This provides heartening evidence that structure isn’t everything, that people can find their own truths even in the midst of the labyrinth. Still, the motivation for finding these alternate paths in the first place is often the poverty of the intended experience. The fact that people can make do with garbage is hardly a justification to keep producing more. On the contrary, this sort of player creativity provides us with new vistas to set out for.
(Games that explicitly support speedrunning have entirely missed the mark in this regard: the point is not to incorporate speedrunning as a new task in the same type of game, the point is that there are different types of games to be designed.)
If connecting all of this to the current state of society seems hyperbolic, that’s probably because it is. Video games are barely doing anything right now. We’re lucky that they’re still in their infancy; the problem is that they’re enfants terribles, and they’re going to grow up. And this may in fact happen sooner rather than later.
—
Going back to our unfortunate charter school students: what are they actually learning? For a few minutes each day, they are presented with some facts or rules or something that they are instructed to internalize. But every second they’re at school, they are being taught a deeper lesson: that the goal of life is to respond to challenges by producing the right answers. What we’re looking at here is the mentality for which “failure is not an option.” This phrase is, first of all, a category error, because failure isn’t something you choose, but more importantly, it represents an extremely dangerous way to think. It assumes that everything’s been figured out, that our society’s assumed goals are not only correct, but worth any sacrifice.
In education, it’s unavoidable that students will say or do things that a lesson planner could never have anticipated. This is a good thing. They’re children, and they’re human. When it comes to games, inconveniences like these can be abstracted away. The player can be given no actions to perform but the “correct” one, and no tools except those needed to do so. One can then be assured that they will do the right thing. Imagining such a system applied to actual humans is obviously horrendous. And yet, those who think that the purpose of education is to train children to answer correctly are advancing precisely this dystopia – the same dystopia enjoyed by millions as their primary form of entertainment. And so it is that, by an astonishing coincidence, the rise of video games has coincided exactly with the rise of neoliberalism.
When Shakespeare said that “all the world’s a stage,” what he was actually saying was obviously “I’ve got plays on the brain 24/7.” All the world’s an anything if you’re obsessed enough with whatever that thing is. We can just as easily conceive of the world as a game: one in which we are constantly presented with tiny tasks governed by rules of interaction. Every time we act, the world reacts; we get feedback; we learn something. But our actions also create the context in which further interactions happen: we’re designers as well as players. Every second of every day, we are creating ideology, and knowing this gives us a small amount of control over the process. If all the world’s a game, it’s badly designed. But we, as the players, still have a choice. We can choose to go for the high score and unlock all the achievements, or we can choose to play a different game of our own design.