MICHEL MARTIN, HOST:
And now we want to take a few minutes to talk about why people seem willing to believe or at least tolerate assertions that may or may not be grounded in truth. Social scientists call this confirmation bias. That's when we are drawn to information that aligns with our worldviews and when we hold onto those beliefs even in the face of compelling evidence to the contrary.
To hear more about this, we called Jonathan Ellis, a professor of philosophy at the University of California, Santa Cruz. He researches confirmation bias at the Center for Public Philosophy, and we reached him in Santa Cruz. Professor Ellis, welcome.
JONATHAN ELLIS: Thank you for having me.
MARTIN: I understand that you prefer the term reasoning with an agenda. But either way, tell us more about this subject of confirmation bias or reasoning with an agenda.
ELLIS: Well, first of all, it's worth pointing out that human beings have been doing this forever. Thucydides, the ancient Greek historian, wrote that it's a habit of human beings to use sovereign reason to thrust aside what they do not fancy. And what he was describing, and in fact what countless playwrights, philosophers and novelists have described ever since, are these human tendencies toward confirmation bias, rationalization and self-deception. And I think this is the same problem that we're observing in our political culture today.
MARTIN: Well, you know, to that point, I was going to ask you this - do we all engage in this or are there certain times when we are more disposed to this? Are there patterns here?
ELLIS: Well, that's a good question. And there's a lot of research going on about that. When we have a lot at stake, we find that these subconscious processes distort our reasoning. One thing that we all need to do is to acknowledge that we're all susceptible to it.
MARTIN: What about this phenomenon that we are now hearing about called alternative facts?
ELLIS: Well, there's - I think there's an important distinction to make here between two things, and it's not easy to know which is going on in many instances. In one case, the person doesn't really believe what they're saying. In the other case, they really do believe what they're saying - their mind has found a way to make that conclusion seem the right one. So do they really believe that these are facts and these are correct, or is this simply a move in a political chess game?
MARTIN: But the question I have is what's the chicken and what's the egg here? I mean, are people seeking out the news sources that confirm their biases? Are news organizations feeding those biases because they think that's what their audience wants? I mean, do you have an opinion about that?
ELLIS: I think all of it is going on. And I'll add a third one, which is that because people hold certain beliefs and positions, they're also going to be more inclined to see the media outlets that share them as more trustworthy. But then, going the other way, we are polarized, as you're pointing out, in the media that we take in.
MARTIN: If that's a business model that works for you, where you confirm the biases of your viewers and they are satisfied with that and continue to support you because you do that, why would you change it?
ELLIS: One of the reasons that motivated reasoning and rationalization evolved is that, in the short term and sometimes in the long term, they actually serve our individual interests. But they don't serve us as a democracy, which depends on more than that. And in the long run, perhaps the consequences are not in our best interests.
MARTIN: That's Jonathan Ellis. He's a professor of philosophy at the University of California, Santa Cruz. Professor Ellis, thank you so much for joining us.
ELLIS: Thank you so much for having me.