A Brief Note On Epistemic Rationality

When I go to write a post for this blog I spend a lot of time doing research to establish that the things I'm talking about exist. It can be oddly difficult to preempt a motte-and-bailey argument by providing proof that people believe the things you say they believe. In today's post, for example, I would like to address the prevailing attitude that epistemic rationality is a sort of red-headed stepchild to instrumental rationality. The problem is that while this is implied frequently, finding people on short notice who come right out and say it isn't always easy. Still, a common sentiment holds that "the inclination to choose epistemic rationality is evidence of being bad at life". Nassim Taleb says there is no such thing as rational belief, only rational action. Anna Salamon wrote an entire post responding to the notion that epistemic rationality is for undisciplined nerds. That's three examples, and hopefully they're enough to establish that I'm responding to a real thing when I say epistemic rationality is actually pretty important.

But why? As we all know, it's suspiciously challenging to show the benefits. I've seen the basic answer given by Jordan Peterson during one of his lectures on self-improvement:

What could you do to improve yourself?
Well, let's take one step backwards.
The first question might be: Why should you even bother improving yourself?
And I think the answer to that is something like: so you don't suffer any more stupidly than you have to.
And maybe so others don't have to either. It's something like that.
You know, like, there is a real injunction at the bottom of it.
It's not some casual self-help doctrine, it's that if you don't organize yourself properly, you'll pay for it!
And in a big way. And so will the people around you.

People refuse to unconditionally accept the truth because 'accepting the truth' is often pretty painful. Learning that people are less nice than you think, and have lots of unsavory motives for doing good things, is uncomfortable. Accepting that you can't cure your pancreatic cancer with acupuncture, and that you have to let the doctors poke you with horrible instruments, is pretty damn uncomfortable. To the extent it seems improbable that god exists, that's uncomfortable. The notion of no life after death is frightening. We tell ourselves lies to deceive others, but we also tell ourselves lies to avoid pain. And the sad fact is that when you do that, you trade pain now for pain later. Usually, a lot more pain later. In the case of Steve Jobs and his cancer, denial about his situation led to lifelong regret and an early death. The first, most basic reason to care about epistemic rationality is that the lies you tell yourself are just like the lies you tell others: they have a way of catching up with you.

What is true is already so.
Owning up to it doesn't make it worse.
Not being open about it doesn't make it go away.
And because it's true, it is what is there to be interacted with.
Anything untrue isn't there to be lived.
People can stand what is true,
for they are already enduring it.
    —Eugene Gendlin

Before you get into anything esoteric like Bayesian priors, or forecasting techniques, or performing complicated statistics to figure out when you should check the mail, it helps to start with the basics. The most valuable posts in a collection like The Sequences aren't the ones which dissect subtle errors in thinking; they're the ones which provide short, memorable phrases to help you avoid doing stupid things. If I've gotten anything lasting out of my reading on this topic, it is probably the wide bank of patterns I've learned to notice in myself and intervene on. By seeing the conditions which lead to something dumb happening, I can stop and say "okay, but what if I didn't do the stupid thing this time?". Part of why it's hard to point to the benefits is that they're not exactly spectacular riches and glorious accomplishment. They're subtle, they're utilitarian, they're things like "I didn't waste years of my life on mediocrity because my standards were too low": hard to prove and unimpressive. "I didn't walk into that giant spike pit, that was pretty cool" isn't exactly inspiring stuff.

In classical logic things are either true or false: the moon is made of cheese or it isn't. But the most consequential lies we tell ourselves usually aren't cheese-moon lies, they're comfortable interpretations of uncomfortable facts. Some common examples:


Then the Gods of the Market tumbled, and their smooth-tongued wizards withdrew
And the hearts of the meanest were humbled and began to believe it was true
That All is not Gold that Glitters, and Two and Two make Four
And the Gods of the Copybook Headings limped up to explain it once more.

As it will be in the future, it was at the birth of Man
There are only four things certain since Social Progress began.
That the Dog returns to his Vomit and the Sow returns to her Mire,
And the burnt Fool's bandaged finger goes wabbling back to the Fire;

And that after this is accomplished, and the brave new world begins
When all men are paid for existing and no man must pay for his sins,
As surely as Water will wet us, as surely as Fire will burn,
The Gods of the Copybook Headings with terror and slaughter return!