Monday, March 21, 2011

Cheating, part 2: "The situation just sucks"

In Cheating, I gave an example where it is socially optimal to cheat, given that others are cheating. The idea of the example is that when only relative ranking matters, everyone's cheating cancels out, preserving the original, socially optimal (i.e. true) ranking.

Let's tweak this a tad by saying that absolute performance matters too, but just a little. In fact, let's say that our top priority is getting the ranking right, but conditional on getting the right ranking, we also would prefer that the test scores accurately measure a student's true ability. In this case, it is no longer socially optimal for everyone to cheat; the ranking would be the same, but everyone's scores would overstate their true abilities.

On the other hand, it's still socially optimal for any individual to cheat, given that others are cheating. And here we come to one of the great subtleties of economics. Even though social welfare is not maximized by everyone cheating, each person is behaving socially optimally when everyone cheats!

What's going on here? Somehow, putting decisions into the hands of lots of individual actors changes the game, even when those individuals still have the same goal as the society they make up. It works like this:
  1. As an individual, if your goal is social welfare, then certainly you should choose the course of action that maximizes social welfare, taking as given all the things about the world that you can't change.
  2. Since no student has any say in the behavior of the other students, all a student can do is decide how to personally behave, taking other students' behavior as given.
  3. When everyone else is cheating, it's socially optimal for an individual to cheat, since that's the only way to preserve the true ranking of students, which is the most important thing.
  4. So when everyone is cheating, each cheater is behaving socially optimally!
As the example demonstrates, socially optimal behavior given the behavior of others is simply not the same as jointly socially optimal behavior, even when everyone is doing it.
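The logic above can be sketched as a tiny coordination game. The payoffs below are my own illustrative numbers, not anything from the post: both students share one "social" payoff, where a correct ranking with accurate scores beats a correct ranking with inflated scores, and a scrambled ranking is worst. Checking best responses confirms that both "everyone honest" and "everyone cheats" are equilibria.

```python
# Toy sketch (illustrative payoffs of my own) of the two-student cheating
# game as a coordination game with a common, purely social payoff.
from itertools import product

ACTIONS = ["honest", "cheat"]

def payoff(a, b):
    """Shared social payoff: matching actions preserve the true ranking
    (best if both are honest, since scores are then accurate too);
    a mismatch scrambles the ranking, the worst outcome."""
    if a == b:
        return 2 if a == "honest" else 1
    return 0

def is_equilibrium(a, b):
    """(a, b) is a Nash equilibrium if neither unilateral deviation pays."""
    best_a = max(ACTIONS, key=lambda x: payoff(x, b))
    best_b = max(ACTIONS, key=lambda x: payoff(a, x))
    return payoff(a, b) == payoff(best_a, b) and payoff(a, b) == payoff(a, best_b)

equilibria = [(a, b) for a, b in product(ACTIONS, ACTIONS) if is_equilibrium(a, b)]
print(equilibria)  # [('honest', 'honest'), ('cheat', 'cheat')]
```

Note that ("cheat", "cheat") survives the equilibrium check even though ("honest", "honest") gives everyone a strictly higher payoff: given that the other student cheats, cheating is each student's best response.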

I call this one of the great subtleties of economics because most noneconomists expect the behavior of a group of like-minded people to accurately reflect the underlying motives of those people. To be sure, this is eminently reasonable...I mean, if we all want the same thing, shouldn't we just be doing it? Without being tipped off, you could hardly be expected to realize there's even something worth considering. But in this case, the obvious thing is wrong. "Everyone cheat" is an equilibrium of a game in which everyone truly has good intentions. It's a bad equilibrium, in the sense that everyone would be happier in the "No cheat" equilibrium...but even good people can get stuck there. If everyone's planning to cheat because they think everyone else is planning to cheat, then who's to blame? It's not as if they're under the wrong impression or something; to the contrary, each person is right about everyone else, and they're doing the right thing in response! Sometimes the situation just sucks.

It's kind of important to understand the concept of a situation just sucking, without there necessarily being someone to blame, someone who caused it. To most people, when a horrible financial crisis happens, it must be that some bad people were behind it. When everyone in the bottom decile is cheating, it must be that they're bad people. But actually, it's easy to get this sort of behavior out of perfectly reasonable people. We've seen that even in the case where everyone literally wants social optimality, they can still get stuck in a bad equilibrium; relax this so that people are (reasonably!) allowed to care a little more about their own wellbeing, and it's cake to construct realistic examples where a bunch of decent people collectively do a bad thing.

*

This particular brand of misguided moralizing has so many faces. Athletes and their steroids. Lawyers and their...lawyering? Should we be angry at politicians for using any means necessary to get their way, no matter how ridiculous? Or could it be that in politics, weaseling is the socially optimal response to weasels?

When in Rome, my friends. Even if they're cheaters.

6 comments:

  1. Sometimes the situation sucks because we choose to measure the wrong things. Our obsessive focus on measuring things that shouldn't be measured is part of the problem. One example: is one successful if one is happy or rich? We can't really measure the former, but we can certainly measure the latter. I suppose it could also be a combination of those two factors (which it likely is). The second we try to measure things like success (in this case by using wealth as an indicator), people will lie, cheat and steal to improve what can be measured.

    The same can be said of grades in the educational system. I don't think cheating would be very prevalent in an educational system that didn't give out grades (I'm not necessarily suggesting that, I'm just trying to make a point about our focus on measuring and some of its ill effects).

  2. All of these posts basically follow the form of, "As long as everyone else is lying/cheating, it is optimal (socially and individually) for you to do so as well." Let's stick with the case where everyone is well-meaning. There are (at least) two equilibria, good and bad. Equilibrium selection is a touchy thing. If we all expect the bad equilibrium, then we all play the bad equilibrium and our expectations end up being justified in the end. Of course it could just as easily have gone the other way.

    Now suppose there is a fraction of the group who adopt a policy of never cheating regardless of their expectations of others. You claim that this policy is not socially optimal for a person who expects everyone else to cheat. However, if other people know that these honesty fanatics exist, they will consider that in their decision making. Essentially, the Boy Scouts provide the public service of equilibrium selection. It turns out that their behavior (really threat of behavior) was socially beneficial after all.

    It seems to me that one way of viewing religious leaders and other moral teachers is that they try to imbue people with a disutility of lying, cheating, etc. Once a person gets direct disutility from these behaviors, he can then credibly commit to being a Boy Scout and perform the public service of equilibrium selection.

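The commenter's equilibrium-selection point can be sketched with a toy tipping model (my own construction, not the commenter's): strategic players cheat only if they expect a majority of others to cheat (so as to match the majority and preserve the ranking), while a committed honest fraction, the "Boy Scouts," never cheats no matter what. The 0.5 threshold is an assumption for illustration.

```python
# Toy tipping sketch (assumed threshold model): a large enough committed
# honest fraction destroys the all-cheat equilibrium.

def best_response_share(expected_cheaters, boy_scouts, threshold=0.5):
    """Fraction cheating after everyone best-responds to expectations.
    Strategic players (share 1 - boy_scouts) cheat only if they expect
    a majority to cheat; Boy Scouts never cheat."""
    strategic = 1.0 - boy_scouts
    return strategic if expected_cheaters > threshold else 0.0

def settle(boy_scouts, start=1.0, rounds=50):
    """Iterate best responses starting from the most pessimistic
    expectation: that everyone is going to cheat."""
    share = start
    for _ in range(rounds):
        share = best_response_share(share, boy_scouts)
    return share

print(settle(boy_scouts=0.1))  # 0.9: too few Scouts; all strategic players cheat
print(settle(boy_scouts=0.6))  # 0.0: enough Scouts tip everyone into honesty
```

With few Scouts, pessimistic expectations are self-fulfilling and every strategic player cheats forever; once the committed honest fraction is large enough, even a player who expects the worst finds cheating unattractive, and the population settles into the good equilibrium, exactly the "public service of equilibrium selection" described above.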
  3. Greg,

    Indeed! Imperfect measurement incentivizes the wrong behavior, which can lead to suboptimal outcomes. Of course, that doesn't mean we should *abandon* our attempts to measure, but as you say, we should be wary of the consequences of setting up such a measure. Also, often some sort of measurement will happen implicitly even if it's not made explicit, in which case we might want to think about which is better. Several of my recent posts are relevant; see for example "A Moment of Silence."

  4. Ian,

    Absolutely, and we will be going exactly there in just a few days. (For approximately the right qualifier, see the last paragraph of my previous post). The above example should be interpreted quite narrowly, meant only to demonstrate that "given this situation, XYZ behavior is actually socially optimal." That's something to isolate on its own, as it is (I think) not obvious to enough of my readers. We will soon take a step back and ask how we got into that situation, and what sort of person it is socially optimal to be ex ante, and what that means for rationality.

  5. Exogenous Combustion, March 22, 2011 at 8:27 AM

    I don't buy dynamic inconsistency much, but for those who do, something to ponder: the reason people don't cheat has to do with dynamic inconsistency and "building" a future self.

    Ralph Waldo Emerson said:
    "Watch your thoughts. They become words. Watch your words. They become deeds. Watch your deeds. They become habits. Watch your habits. They become character. Character is everything."

    So too with cheating. I care about what I become in the future. While cheating now may be rational in the sense you describe, it contributes to some addictive stock for cheating, or some more general stock of character. In this sense, "holding the line" of character even when not locally rational (in the sense you describe) to do so may just be a method of building character so that when one comes to real grey areas, one's stock is strong.

  6. EC,

    Say there is good (socially optimal) cheating and bad cheating. It would be possible to have an inherent "cheating stock," rather than a "bad cheating stock," insofar as it's possible to have a stock of anything. And the moment we build in addiction to something that can sometimes be bad for you (the cheating stock in "bad cheating" instances where it hurts your larger goal to maximize social welfare), it introduces long-term concerns into your local decision of whether or not to cheat, as you say.

    By the way, note that in the world of game theory, commitment issues are another sort of dynamic inconsistency with which you surely have no problem. You might think of a repeated game where people avoid cheating even when optimal in the short run, so that in the long run they have good reputations and can credibly commit to not cheating. Here, insistence on playing locally rationally strips one of the ability to commit to a course of action which is more globally optimal, as Ian says. Actually Ian is saying something more than that, but more to come.

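The repeated-game commitment story in the reply above has a standard textbook form, which can be sketched quickly (the specific payoff numbers below are assumptions for illustration, not from the post): under a grim-trigger strategy, one act of cheating destroys your reputation and ends cooperation forever, so honesty is sustainable exactly when the one-shot gain from cheating is outweighed by the discounted value of continued cooperation.

```python
# Standard grim-trigger sustainability check (textbook condition;
# payoff numbers below are illustrative assumptions).

def honesty_sustainable(coop, temptation, punish, delta):
    """Honesty holds iff the discounted stream of cooperation payoffs
    beats a one-shot temptation followed by punishment forever:
        coop/(1-delta) >= temptation + delta*punish/(1-delta)."""
    return coop / (1 - delta) >= temptation + delta * punish / (1 - delta)

# Assumed payoffs: mutual honesty = 2, one-shot cheating gain = 3,
# mutual punishment after a defection = 1.
print(honesty_sustainable(2, 3, 1, delta=0.9))  # True: patient players can commit
print(honesty_sustainable(2, 3, 1, delta=0.1))  # False: impatient players cheat
```

With these numbers the cutoff is delta = 0.5: sufficiently patient players can credibly commit to honesty, which is exactly the sense in which insisting on local rationality in the one-shot game strips one of the ability to commit to the globally better course.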