A striking feature of polarization is that it's predictable. If you're a liberal, you can predict that you'll continue to become more confident that Trump is a bad president; if you're a conservative, vice versa.
Does this mean that polarization is due to irrational causes? Recently I described a result suggesting it does: in standard models of rational belief, predictable polarization is impossible.
But there's a catch: those models (implicitly) assume that evidence is always unambiguous, in the sense that you should always know what to make of it. This isn't true of real-life politics. Often you should be unsure what to think: for instance, you might wonder to yourself, "Am I right to be confident that Biden will win the election—or am I being overconfident?"
That raises a question: what happens to that theoretical result once we allow for ambiguous evidence?
The key result driving this project is that ambiguous evidence can lead to predictable polarization:
Fact. Whenever evidence is ambiguous, there is a claim on which it can be predictably polarizing. (The Technical Appendix contains all formal statements and proofs.)
In other words, someone who receives ambiguous evidence can expect it to be rational to increase their confidence in some claim. Therefore, if two people are each going to receive ambiguous evidence, it's possible for both of them to expect their beliefs to diverge in a particular direction.
As I demonstrated in an experiment—and as I’ll explain in more depth next week—this means that ambiguous evidence can lead to predictable, rational shifts in your beliefs.
Without going into the formal argument, this is something that I think we all grasp, intuitively. Consider an activity like asking a friend for encouragement. Suppose you just interviewed for a job, but you’re nervous and come to me seeking reassurance. What will I do?
I’ll provide you with reasons to think you will get it—help you focus on how your interview went well, how qualified you are, etc.
Of course, when you go to me seeking reassurance you know that I’m going to encourage you in this way. The mere fact that I’m giving you such reasons isn’t, in itself, evidence that you got the job. Nevertheless, we go to our friends for encouragement in this way because we do tend to feel more confident afterward. Why is that?
If I’m a good encourager, then I’ll do my best to make the evidence in favor of your getting the position clear and unambiguous, and the evidence against it unclear and ambiguous. I’ll say, “They were really excited about you in the interview, right?”—highlighting unambiguous evidence that you’ll get the job. And when you worry, “But one of the interviewers looked unhappy throughout it,” I’ll say, “Bill? I hear he’s always grumpy, so it’s probably got nothing to do with you,” thus making the evidence that you didn’t get the job more ambiguous, and so weaker. On the whole, this back-and-forth can be expected to make you more confident that you’ll get the job.
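To see how this dynamic can produce a predictable rise in confidence, here is a toy numerical sketch. The numbers and the model are my own illustrative assumptions, not the formal model from the Technical Appendix: the idea is simply that good news is unambiguous (you update fully), while bad news is made ambiguous (you can't tell whether it's evidence at all, so your update is dampened).

```python
# Toy model of the encouragement dynamic (illustrative numbers only).
# Hypothesis H: "you got the job", with prior 0.5.
# Signal is "good" or "bad": P(good | H) = 0.8, P(good | not-H) = 0.2.

prior = 0.5
p_good_given_h = 0.8
p_good_given_not_h = 0.2

# Probability of each signal, averaged over H and not-H.
p_good = prior * p_good_given_h + (1 - prior) * p_good_given_not_h
p_bad = 1 - p_good

# Unambiguous good news: full Bayesian update.
post_good = prior * p_good_given_h / p_good            # = 0.8

# Unambiguous bad news would give the full Bayesian posterior:
post_bad_bayes = prior * (1 - p_good_given_h) / p_bad  # = 0.2

# Ambiguous bad news: the encourager makes you unsure whether the bad
# signal is evidence at all ("Bill's always grumpy"). Model this as
# treating it as pure noise with probability `ambiguity`, so the
# downward update is dampened.
ambiguity = 0.5
post_bad = ambiguity * prior + (1 - ambiguity) * post_bad_bayes  # = 0.35

# Your expected posterior, computed before seeing any signal.
expected_no_ambiguity = p_good * post_good + p_bad * post_bad_bayes  # = 0.5
expected = p_good * post_good + p_bad * post_bad                     # = 0.575

print(f"Expectation without ambiguity: {expected_no_ambiguity}")
print(f"Expectation with one-sided ambiguity: {expected}")
```

Without ambiguity, the expected posterior equals the prior (the familiar martingale property: no predictable movement). With one-sided ambiguity, the expectation rises from 0.5 to 0.575: before the conversation even starts, you can predict that it will leave you more confident.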
This informal account is sketchy, but I hope you can see the rough outlines of how the story will go. We’ll return to fill in the details next week.
The point? That polarization is predictable does not mean that it's irrational. Next week, I'll explain, in particular, how the polarization found in my experiment has rational causes.