Rationality Is Not Systematized Winning

Author's Note: I said in my previous post that the next would be about systems for building common knowledge. That post is running well behind schedule, so I'll be publishing others in the meantime.

“Do not ask whether it is 'the Way' to do this or that. Ask whether the sky is blue or green. If you speak overmuch of the Way you will not attain it.”

— Eliezer Yudkowsky

Rationality has been defined as:

- Using probability theory to pick out features held in common by all successful forms of inference [Bensinger]
- The ability to do well on hard decision problems & the art of how to systematically come to know what is true [Roko]
- Systematized winning [Yudkowsky]
- [To] win when things are fair, or when things are unfair randomly over an extended period [Alicorn]
- Drawing correct inferences from limited, confusing, contradictory, or maliciously doctored facts [Alexander]
- The Way of the agent smiling from on top of the giant heap of utility [Yudkowsky]

I'm a little worried writing this post. Definitions are one of the more abused tools of thought. Originally meant to facilitate common understanding, they've become a refuge for pedants and smart alecks to pretend at insight. It seems possible that people will ignore an essay discussing definitions on the grounds that definitions are trivial things of trivial consequence. Perhaps so, but I think people ignore poetry at their own peril. The systematized winning definition of rationality fails to constrain expectations, and I worry it's one of the significant things holding LW-flavor rationality back.

Systematized Winning: An Intangible Un-definition

The Bug

In his post Rationality Is Systematized Winning Eliezer writes:

There is a meme which says that a certain ritual of cognition is the paragon of reasonableness and so defines what the reasonable people do. But alas, the reasonable people often get their butts handed to them by the unreasonable ones, because the universe isn’t always reasonable. Reason is just a way of doing things, not necessarily the most formidable; it is how professors talk to each other in debate halls, which sometimes works, and sometimes doesn’t. If a horde of barbarians attacks the debate hall, the truly prudent and flexible agent will abandon reasonableness. No. If the “irrational” agent is outcompeting you on a systematic and predictable basis, then it is time to reconsider what you think is “rational”.

"Rationality is systematized winning" is a slogan that was adopted to patch a bug in human cognition. Namely our endless capacity to delude ourselves about how we did in an attempt to save face. The concept seems to have been absorbed, but I'm skeptical it's translated into more effective action. Certainly it produced many essays on why winning isn't happening. But the fact that we've been publishing essentially the same essay for a decade now implies something fairly fundamental is wrong. This slogan was chosen because it patches the bug, but I fear at the cost of neutering our ability to focus.

The Bug, Disputed

Other, more rigorous ways of patching the bug were possible. Tim Tyler responds:

Wikipedia has this right: “a rational agent is specifically defined as an agent which always chooses the action which maximises its expected performance, given all of the knowledge it currently possesses.” http://en.wikipedia.org/wiki/Rationality Expected performance. Not actual performance. Whether its actual performance is good or not depends on other factors—such as how malicious the environment is, whether the agent’s priors are good—and so on.

And Eliezer replies:

Problem with that in human practice is that it leads to people defending their ruined plans, saying, “But my expected performance was great!” Vide the failed trading companies saying it wasn’t their fault, the market had just done something that it shouldn’t have done once in the lifetime of the universe. Achieving a win is much harder than achieving an expectation of winning (i.e. something that it seems you could defend as a good try).

This is not a philosophical objection; it's a social-emotional one. Eliezer is saying here that regardless of the correctness of this answer, people can't be trusted with it. Reader, the virtues are cruel. Often when we knowingly lapse in them we already know the direction from which danger will come, and merely underestimate the magnitude. I don't want to exaggerate, but this moment of choosing convenience over rigor may have greatly sabotaged the rationalist project. It's a meme that takes beneficial, actionable knowledge and pushes it out in favor of meta hand-wringing.
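To make the expected-versus-actual distinction concrete, here is a minimal sketch (mine, not anything from the thread) of an agent that picks the action with the best expected performance and still has to face a realized draw; the actions, probabilities, and payoffs are invented for illustration.

```python
import random

random.seed(7)

# Hypothetical actions: (probability of success, payoff on success, payoff on failure).
actions = {
    "safe":  (1.00, 1.0,  0.0),   # guaranteed small win
    "risky": (0.60, 3.0, -1.0),   # higher expected value, but can lose
}

def expected_value(p, win, lose):
    return p * win + (1 - p) * lose

# The Wikipedia-style "rational agent" chooses by expected performance...
best = max(actions, key=lambda name: expected_value(*actions[name]))
print("chosen:", best, "expected value:", expected_value(*actions[best]))  # risky, 1.4

# ...but the single realized outcome can still be a loss, and "my expected
# performance was great" is then both true and useless as outcome evidence.
p, win, lose = actions[best]
print("realized outcome:", win if random.random() < p else lose)
```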

I Notice You Are Confused: Try Frequentism

Later in the thread Eliezer observes:

I guess when I look over the comments, the problem with the phraseology is that people seem to inevitably begin debating over whether rationalists win and asking how much they win—the properties of a fixed sort of creature, the “rationalist”—rather than saying, “What wins systematically? Let us define rationality accordingly.” Not sure what sort of catchphrase would solve this.

There is no catchphrase that solves this; the problem as stated is intractable. You've probably heard of Nate Silver, right? He did really well forecasting elections, which made him the standard wisdom for contrarian know-it-alls in 2016. The problem was that primary polls aren't particularly accurate, and in 2016 it was the primaries that drew the most interest. So Nate had kind of a rough time with his primary predictions, and then he made a final analysis before the general: Hillary Clinton favored to win, 71-29. Of course, Trump won. This made a lot of people very angry on both sides. Hillary didn't win, so does that mean Silver is bogus? It's hard to tell. From a pure probability perspective, events that only occur 30% of the time happen every day, and one of them happening isn't particularly notable. But was Silver right to think Trump wasn't favored to win? You can analyze his process and evaluate its quality & correctness, but from outcomes alone the question is intractable. One way to make progress is to use a scoring rule like the Brier score.
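As a rough sketch of what that looks like, the snippet below computes a Brier score (squared error between a stated probability and the 0/1 outcome) for Silver's final 71% Clinton forecast, alongside a hypothetical coin-flip forecaster; the comparison is mine, for illustration only.

```python
def brier(forecast_prob, outcome):
    """Squared error between a probability forecast and a 0/1 outcome.
    0.0 is a perfect score, 1.0 is maximally wrong."""
    return (forecast_prob - outcome) ** 2

# Forecast: 71% chance Clinton wins. Outcome: she didn't (0).
print(brier(0.71, 0))  # 0.5041
# A know-nothing 50/50 forecast scores *better* on this one event:
print(brier(0.50, 0))  # 0.25
# Which is exactly why a single outcome settles nothing; the score only
# starts to reflect the quality of the process once it's averaged over
# many forecasts.
```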

But that only treats the surface issue. Why does all this confusion and heat exist in the first place? Naively we might expect it's because people don't understand statistics, or that they're stupid. But I think it's probably more subtle than that. As Eliezer points out, we insist we were right even when our ideas were dumb, to save face. There is however a related meta issue, one level up from fretting about particular outcomes of dice rolls. Namely: people have an odd tendency to be okay with letting single random outcomes decide their success, even when it's unnecessary. This is common in role-playing games. Players will often run headlong into situations that kill them unless the dice come up a certain way. They take it for granted that the dice roll happens, and focus on how to make that roll survivable. Usually, with a bit of forethought, they could have avoided relying on a lucky save entirely. But it's generally not until the mechanics are particularly punishing that players get smart about this. I suspect that if this is common in gaming, it's common in real life too: people get so invested in singular outcomes because they've staked too much on them. In short: if you find that single micro-scale dice rolls are key components of your success, that's a strategy smell.
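As a toy illustration of how fast the required luck compounds (the 65% figure is made up), consider a plan that only works if several independent rolls all go your way:

```python
p_single = 0.65  # assumed chance that any one crucial roll goes your way

for n_required in range(1, 6):
    p_plan_works = p_single ** n_required  # all n rolls must succeed
    print(f"{n_required} must-succeed roll(s) -> plan works {p_plan_works:.0%} of the time")

# 1 roll: 65%, 3 rolls: 27%, 5 rolls: 12%. Restructuring the plan so no
# single roll is load-bearing beats trying to buff the roll itself.
```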

So what's all this got to do with traders claiming their strategy was good but chance screwed them? Well, everything of course. They're essentially the same problem. We have the same key features:

- A strategy or forecasting process whose quality is what we actually care about.
- An outcome that depends heavily on chance.
- Too few trials to separate skill from luck on outcomes alone.
- A strong temptation to save face when the outcome goes badly.

And the solution is the same in both cases. You can criticize process, but if outcomes are all you have to go on, then scoring rules over multiple trials are the best you can do. I have to wonder whether Eliezer's insistence on a Bayesian frame is why he didn't notice the problem is intractable. After all, the Brier score is essentially frequentist in attitude. The answer to this particular bug in the brain is to change your perspective.
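Here's a sketch of what that buys you, with invented forecasters and event counts: averaged over enough events, the Brier score reliably separates a calibrated forecaster from an overconfident one, even though no single outcome could.

```python
import random

random.seed(42)

def average_brier(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(outcomes)

# Simulate 1,000 events with known true probabilities.
n_events = 1000
true_probs = [random.uniform(0.1, 0.9) for _ in range(n_events)]
outcomes = [1 if random.random() < p else 0 for p in true_probs]

# "calibrated" reports the true chances; "overconfident" rounds toward certainty.
calibrated = true_probs
overconfident = [0.95 if p > 0.5 else 0.05 for p in true_probs]

print("calibrated   :", round(average_brier(calibrated, outcomes), 3))    # ~0.20
print("overconfident:", round(average_brier(overconfident, outcomes), 3)) # ~0.27
```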

Is It Really A Problem?

Changing our perspective might have significant benefits. Systematized winning is not an actionable definition. Most domains already have field-specific knowledge on how to win, and in aggregate these organized practices are called society. The most powerful engine of systematized winning developed thus far is civilization. Most people trying to explain the value of rationality privilege the hypothesis. They assume that there is such a thing as instrumental rationality: methods to systematically win over and above the usual practices of civilization. It would be a mistake to assume your audience will privilege the hypothesis with you. The first question you have to answer is why rationality at all. If someone asks:

“Look, if I go to college and get my degree, and I go start a traditional family with 4 kids, and I make 120k a year and vote for my favorite political party, and the decades pass and I get old but I'm doing pretty damn well by historical human standards; just by doing everything society would like me to, what use do I have for your 'rationality'? Why should I change any of my actions from the societal default?”

You must have an answer for them. Saying rationality is systematized winning is ridiculous. It ignores that systematized winning is the default; you need to offer more than that to be attractive. I think the strongest frame you can use to start really exploring the benefits of rationality is to ask yourself what advantage it has over societal defaults. When you give yourself permission to move away from the "systematized winning" definition, without the fear that you'll tie yourself in knots of paradox, you can really start to think about the subject concretely.

If not "systematized winning", then what definition is suitable? I have my answer, but I'd prefer not to lose the chance to hear yours. If you agree with me, then I think you owe it to yourself to stop letting other people tell you what rationality is for a bit. Try to name the void for yourself, then compare your answer to others.