
How do reasonable people disagree?

A study by philosopher Kevin Dorst explains how political differences can result from a process of “rational polarization.”
Caption: In a new paper, MIT professor of philosophy Kevin Dorst explores how people might rationally come to hold very different views about some political matters.
Credits: Image: Jose-Luis Olivares, MIT


U.S. politics is heavily polarized. This is often regarded as a product of irrationality: People can be tribal, are influenced by their peers, and often get information from very different, sometimes inaccurate sources.

Tribalism and misinformation are real enough. But what if people are often acting rationally as well, even in the process of arriving at very different views? What if they are not being misled or too emotional, but are thinking logically?

“There can be quite reasonable ways people can be predictably polarized,” says MIT philosopher Kevin Dorst, author of a new paper on the subject, based partly on his own empirical research.

This may especially be the case when people face substantial ambiguity while weighing political and civic issues. Those ambiguities generate asymmetry: people consider evidence in predictably different ways, leading them to different conclusions. That doesn’t mean they are not thinking logically, though.

“What’s going on is people are selectively scrutinizing information,” Dorst says. “That’s effectively why they move in opposite directions, because they scrutinize and selectively look for flaws in different places, and so they get overall different takes.”

The concept of rational polarization may help us develop a more coherent account of how views differ, by helping us avoid thinking that we alone are rational — or, conversely, that we have done no real thinking while arriving at our own opinions. Thus it can add nuance to our assessments of others.

The paper, “Rational Polarization,” appears in The Philosophical Review. Dorst, the sole author, is an assistant professor in MIT’s Department of Linguistics and Philosophy.

Looking for flaws

To Dorst, rational polarization stands as a useful alternative to other models about belief formation. In particular, rational polarization in his view improves upon one type of model of “Bayesian” thinking, in which people keep using new information to hone their views.

In Bayesian terms, because people use new information to update their views, they will rationally either change their ideas or not, as warranted. But in reality, Dorst asserts, things are not so simple. Often when we assess new evidence, there is ambiguity present — and Dorst contends that it is rational to be unsure about that ambiguity. But this can generate polarization because people’s prior assumptions do influence the places where they find ambiguity.
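For readers unfamiliar with the Bayesian picture the paper pushes against, here is a minimal sketch (not from Dorst’s paper; the numbers are illustrative assumptions) of standard Bayesian updating. Note that on this simple model, two people with different priors who see the same evidence both shift in the same direction — shared evidence narrows disagreement rather than widening it:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Return the posterior P(hypothesis | evidence) via Bayes' rule."""
    numerator = prior * likelihood_if_true
    denominator = numerator + (1 - prior) * likelihood_if_false
    return numerator / denominator

# Two people with different priors see the same piece of evidence,
# which is twice as likely if the hypothesis is true (0.8 vs. 0.4).
for prior in (0.3, 0.7):
    posterior = bayes_update(prior, likelihood_if_true=0.8, likelihood_if_false=0.4)
    print(f"prior={prior:.2f} -> posterior={posterior:.2f}")
# Both posteriors move upward, toward the hypothesis.
```

It is this convergent tendency of simple Bayesian models that makes real-world polarization look irrational — and that Dorst’s account of ambiguity is meant to complicate.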

Suppose a group of people have been given two studies about the death penalty: One study finds the death penalty has no deterrent effect on people’s behavior, and the other study finds it does. Even reading the same evidence, people in the group will likely wind up with different interpretations of it.

“Those who really believe in the deterrent effect will look closely at the study suggesting there is no deterrent effect, be skeptical about it, poke holes in the argument, and claim to recognize flaws in its reasoning,” Dorst says. “Conversely, for the people who disbelieve the deterrent effect, it’s the exact opposite. They find flaws in the study suggesting there is a deterrent effect.”

Even these seemingly selective readings can be rational, Dorst says: “It makes sense to scrutinize surprising information more than unsurprising information.” Therefore, he adds, “You can see that people who have this tendency to selectively scrutinize [can] drift apart even when they are presented with the same evidence that’s mixed in the same way.”
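The drift Dorst describes can be illustrated with a toy simulation (this is a simplified assumption-laden sketch, not Dorst’s formal model): two agents receive the same mixed run of studies, and each scrutinizes only the studies that contradict its prior, sometimes “finding a flaw” and discounting them. The illustrative parameters (`flaw_find_rate`, `step`) are hypothetical:

```python
import random

def updated_belief(belief, studies, flaw_find_rate=0.7, step=0.15, rng=None):
    """Toy model of selective scrutiny: each study nudges belief by
    `step`, unless the agent, scrutinizing evidence that contradicts
    its prior, 'finds a flaw' and discounts that study entirely."""
    rng = rng or random.Random(0)  # fixed seed for a reproducible sketch
    for supports in studies:  # True = study supports the hypothesis
        surprising = (supports and belief < 0.5) or (not supports and belief > 0.5)
        if surprising and rng.random() < flaw_find_rate:
            continue  # flaw found: the surprising study is discounted
        belief += step if supports else -step
        belief = min(max(belief, 0.0), 1.0)  # keep belief in [0, 1]
    return belief

mixed_evidence = [True, False] * 5  # equal numbers of pro and con studies
believer = updated_belief(0.7, mixed_evidence)
skeptic = updated_belief(0.3, mixed_evidence)
print(believer, skeptic)  # the believer drifts up, the skeptic drifts down
```

Even though both agents see identical, perfectly balanced evidence, the believer ends up more confident and the skeptic less so — the predictable polarization the article describes.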

By the letter

To help show that this habit exists, Dorst also ran an online experiment about ambiguity, with 250 participants on the Prolific online survey platform. The aim was to see how much people’s views might become polarized in the presence of ambiguous information.

The participants were given an incomplete string of letters, as one might find in a crossword puzzle or on “Wheel of Fortune.” Some letter strings were parts of real words, and some were not. Depending on what kinds of additional information participants were given, the ambiguous, unsolvable strings had a sharply polarizing effect on how people reacted to that information.

This process at work in the experiment, Dorst says, is similar to what happens when people receive uncertain information, in the news or in studies, about political matters.

“When you find a flaw, it gives you clear evidence that undermines the study,” Dorst says. Otherwise, people often tend to be uncertain about the material they see. “When you don’t find a flaw, it [can] give you ambiguous evidence and you don’t know what to make of it. As a result, that can lead to predictable polarization.”

The larger point, Dorst believes, is that we can arrive at a more nuanced and consistent picture of how political differences arise even when people process similar information.

“There’s a perception that in politics, rational brains shut off and people think with their guts,” Dorst says. “If you take that seriously, you should say, ‘I form my beliefs on politics in the same ways.’”

Unless, that is, you believe you alone are rational, and everyone else is not — though Dorst finds this to be an untenable view of the world.

“Part of what I’m trying to do is give an account that’s not subject to that sort of instability,” Dorst says. “You don’t necessarily have to point the finger at others. It’s a much more interesting process if you think there’s something [rational] there as well.”
