Monday, November 14, 2011

Strong feelings undeserved and overserved

I had a good friend in elementary school who went away to Israel and came back poisoned with hatred for The Enemy.  It doesn't really matter which side he was on.  If my memory is to be trusted, I didn't much care about the situation in the Middle East at the time.  I just remember being sad that my friend had been changed in such an overt and terrible way.

I have a big problem with blind hatred, which unfortunately seems to describe most hatred.  It is possible to understand something and still hate it.  But it's much easier -- and so more common -- to hate something you don't understand.  Because, though the world is a pretty ordered place, we often fail to see that order, and then we tend to assume it isn't there.  And if something makes no sense, we don't have to take it seriously.  We can drop it in the mud and walk all over it without a second thought, because it has no right to exist, making so little sense and all.  And the people who believe it are just as easily dismissed.

We try not to execute people without being pretty sure they're guilty.  Hatred is, I think, a sort of mental execution that we shouldn't be so hasty to dole out in advance of understanding.  How can hatred be so firmly buttressed by the conviction that the enemy is being unreasonable, when we don't even understand why the enemy is behaving in such a manner?  Extreme reactions should require a high degree of confidence.

So, what bothers me is hatred in advance of understanding.  What bothers me is the depressingly default assumption that the burden of understanding falls on the person (or thing) to be understood.  That it is their job to be understood by us, not our job to understand them.  That if we do not understand them, it must be that their position doesn't make sense, not that we are failing to make sense of their position.

Let me inject a little perspective.  Since the dawn of man, we have looked upon the world and seen no shortage of black boxes.  The world was impossibly mysterious at first.  We just didn't understand its workings.  In retrospect, this lack of understanding was a property of our brains, not the world.  But instead of recognizing that, we took our inability to explain things as strong evidence that they were, in fact, inexplicable.  There was no underlying order that we just hadn't discovered yet; rather, things were inherently mysterious.  Of course we gave names to the mysterious things -- called them "magic" and so forth -- but you can't actually demystify a cat by calling it Mittens.  Cats are complicated, you know.

Of course there are still many black boxes to be cracked open, although by now, in the realm of science at least, we've pretty much got the idea that the boxes have stuff in them even before they're opened.  We understand that discovering an unopened black box should probably weaken our confidence in our understanding of the world, not readily convince us that the world actually makes no sense.

But it seems we haven't quite absorbed the analogous moral on the topic of why people do what they do.

Suppose half the world believes A and the other half believes not-A.  Quite apart from the question of whether A or not-A is the truth, there is the question of why a person might think one or the other is true.  And if we can't understand why people on the other side of the table think what they think, our brains have failed.  It should weaken our confidence in our true understanding of the situation, not convince us that those people actually make no sense.

And when we fail to understand the other side, we are in no position to make a legitimate judgment about what ought to happen, or to hate anyone for supporting an outcome we don't even know how to reasonably evaluate.

Of course, transactions and debates and negotiations and wars can occur without either side understanding the other.   And I don't mean to say that's inherently wrong; it will sometimes be optimal to go to war in advance of full understanding.  But confusion should make us uncomfortable, not somehow magically bolster (!) our eagerness.

[This post was inspired by a depressing display of blind hatred in a Facebook debate between several people from different countries in the Middle East.]


  1. Exogen(e)ous Combustion, November 15, 2011 at 7:57 AM

    "And when we fail to understand the other side, we are in no position to make a legitimate judgment about what ought to happen, or to hate anyone for supporting an outcome we don't even know how to reasonably evaluate."

    Perhaps. In his Theses on Feuerbach, Marx's eleventh and last point translates to:

    "Philosophers have hitherto only interpreted the world in various ways; the point is to change it."

    Perhaps the world is changed for the bad, and millions die. Perhaps for the good. A long-run steady climb of worldwide living standards and decline of poverty makes the case that it is on average changed for the good. Because we have memories and are able to keep the good shifts, we really want to sample the space of possible changes. What drives change?

    I propose it is rarely actually rational thought that shapes the world. It is passion. Sometimes the two go hand in hand. Deirdre McCloskey (I think rightly) criticizes economics for focusing too much on prudence. In a similar (but normative) manner, I think that the fundamentals of why we do what we do are driven by our great loves, joys, and hates. Rationality, or prudence, is just a tool. And understanding others is just one tool to get there, but one that may harm our end goal and so should be used judiciously.

    I think that blind hatred can so often be an efficient fuel for the engine of man. I have yet to see a consistent way of bringing up individuals who are both deeply intellectual and not self-paralyzed by their own intellect (in the way that Bertrand Russell meant when he said, "The biggest cause of trouble in the world today is that the stupid people are so sure about things and the intelligent folks are so full of doubts.").

    This is a tradeoff I haven't seen overcome very often, and certainly not systematically. To each their own. But as for me, I'd rather try to change the world, even with only a noisy signal about truth, than interpret it perfectly but be sterile and unable to affect it.

    Interested in your thoughts, as this is why I ended up not really being able to grok with the rationalists of the world. It seems like the people arguing on popular rationalist sites are too often paralyzed in an almost narcissistic and wholly internal pursuit (which is fine, it's just not the route that looks most important to me).

  2. EC, thank you for the pushback. I'm glad to have you around.

    First, I think this is a fair point. But let me just say that I'm not thinking about the optimal response to current equilibrium behavior so much as complaining about current equilibrium behavior and wishing we were in a different sort of world altogether.

    It's like how, in academia, people are supposed to be really committed to their point of view, and defend it against all attacks, and in the end the strongest theories are supposed to survive. An economist who acknowledged at a seminar, "This is my model, and my beliefs about its accuracy are actually properly calibrated; I don't necessarily think it's true, but it's a possibility," would not be received well. He's supposed to fight as hard as he can to convince people, not acknowledge his rational doubts! This sort of thing annoys me, but in an equilibrium where we expect people to fight, it's not optimal for one person to stop fighting.

    Similarly, the world would not necessarily be a better place if all Palestinians instantly turned into perfect rationalists but the other side continued to hate them blindly. I suggest that if both sides suddenly became perfect rationalists, though, all fighting would probably stop.

    Unfortunately, rationalism isn't necessarily the stuff of (evolutionary, let's say) equilibria, since as you say, a deviation to something like blind certainty goes with greater willingness to impose one's will on the world, and such people will propagate. But against this, rationalism has two potential weapons. First, it is at best like a nigh-incurable virus, which cannot easily be shed once it has taken hold of a mind. Second, perhaps it has numerous personal benefits that could attract individuals to it, even if it is no way to win a war collectively. Indeed, perhaps becoming a rationalist makes you more satisfied with states of the world in which you're losing battles, so that even if it does hinder your ability to get your way, you're better off. If I could get angry with people more easily, perhaps I would get my way more often, but I wouldn't necessarily be happier.

    [You are correct that ignorance may be instrumental to achieving our goals. But if you saw the Facebook argument I saw, you would be depressed too. These people think they're arguing rationally, but really they are slaves to their biases -- hideous biases that have led to a lot of death and destruction, and maybe at some point in the future will mean "advancement" via survival of the stronger party. In this particular case, though, I would favor advancement via preference modification that makes people actually okay with one another's existence. I do not always think an injection of rationalism is optimal for a person, but in this case I would administer it if I could.

    Of course your commentary was more general.]