This is an automated transcript by Descript. Please excuse any mistakes.
Turi: [00:00:51] Today we are thrilled to be talking to Adrian Bardon. Adrian Bardon is professor of philosophy at Wake Forest University, USA, and recently published ‘The Truth about Denial: Bias and Self-Deception in Science, Politics and Religion’ (OUP, 2020). Adrian, we’re thrilled to have you on the Parlia podcast.
Adrian: [00:01:10] Thank you so much for having me.
Turi: [00:01:12] We live in polarised times, not just around ideas, but also around facts: endless contestation around alternative facts, fake news, selective statistics. You talk in your book about a sort of breakdown in the reality consensus, and that’s frightening, whether we should blame the economic crash of 2008, the tech revolution and automation, the media and social networks, or even postmodernism for relativising the idea of truth. Your book, The Truth about Denial, asks us not to ignore something you call the murky psychological processes that motivate us to believe in things which aren’t true. Can you unpack this core idea? What is denial? What is denialism? And how do you see it at work today?
Adrian: [00:02:11] Well, we have a new colloquial understanding of what it is when we say someone is in denial. We mean there’s some kind of emotional motivation to deny fact or evidence. So there’s going to be something about ourselves, or our situation, or about the world that we are accustomed to believing, that we find comfortable to believe, that we find threatening not to believe. And it’s a really evident fact about human psychology that we’re very capable of denying fact and evidence when we feel threatened enough by it. I distinguish between denial and denialism in the book just for technical reasons. Denial, in the traditional colloquial sense, refers to denying facts about my personal situation that I find threatening or uncomfortable, whereas denialism I use to refer to denying facts that are threatening to some sort of ideological worldview that I have. So I reserve the term denialism for that. Either way, whatever you’re committed to believing with regard to your personal situation, or with regard to some belief system you have about the world, when facts show up that are threatening to what you’re really committed to believing, we experience, a term I’m sure you’re familiar with, cognitive dissonance, which is this uncomfortable feeling that we get when we see there’s a discrepancy between our existing belief system and some incoming fact or evidence. So we are spurred to resolve dissonance, but we have a choice on some level of how to resolve it. One way is by altering our views, altering our belief system. The other is by denying the evidence, and it’s just an evident fact about human beings and human psychology that we are quite capable of being very evidence- and fact-resistant.
Turi: [00:04:06] So that’s frightening and fascinating. Before we go into how denialism works, how we respond to cognitive dissonance, you start your book with a very useful reminder that we must think about the mind in a slightly different way. Descartes was wrong: the mind is not a disembodied thing. It exists very much as part of nature, and therefore, if the mind is a bodily function, we must ask what causes it to work the way that it does. As I understand your framing of it, the fundamental point here is that the mind sometimes has two contradictory motives for working: epistemic on the one hand (to ascertain knowledge) and emotional/social on the other (for collaboration).
Can you help open that up a little bit?
Adrian: [00:05:02] Well, you mentioned Descartes first of all, who in the classic Judaeo-Christian tradition and the Platonic tradition drew this sharp boundary between the mind and the body. If all we really are, essentially, is the mind and mentality, then the workings of the mind should be self-transparent to us, but that’s the misconception. The mind is part of nature. It’s a product of our physical being and physical activity on the part of our bodies. And so, in that light, there’s no reason to think that its operations should be any more transparent to us than how our liver works or what’s in our stomach. To explain motivated reasoning, I think the frame to start with is just a quick summary of our evolutionary history, which makes sense of the motives we have beyond accuracy, motives that would make us believe things that are not true and that, historically at least, gave us an advantage in doing so. This is a point that’s been made by a lot of people: Homo sapiens and our pre-Homo sapiens ancestors evolved primarily in little groups, little family groups, clan groups, little tribes, through most of our history, and Homo habilis, our pre-Homo sapiens ancestor, probably had a very similar lifestyle. In that context, we are forming beliefs and views about the way the world works. And in that context, there are certain beliefs that we really want to be accurate. What plants can we eat, and what plants are poisonous? That would be important to know. Is it day or night? Is it warm or cold? What do we do when we’re cold? What warms us up? For those types of beliefs there would be an advantage in forming them accurately, and in developing systems of reasoning that allow us to form accurate beliefs.
But consider that our evolutionary niche is the social group. Social cooperation is what gives us our advantage; it’s made us the dominant species on the planet, our ability to get together with other people in our group and make plans and coordinate our activities. If you see human evolution in that light, you ask what’s going to allow social cooperation. For the individual, what’s going to be really advantageous, what’s going to be necessary, is to be able to integrate socially into your group. So you need to know accurately what plants to eat and what plants are poisonous. But is it that important to be accurate with regard to what god to worship? If your group worships god XYZ, or Poseidon, are you going to be that person who says, well, I’d like to see the evidence for this, I’m not sure we have an accurate belief about what god to worship? What’s your best-case scenario coming out of that? Ostracism from the group at least, and quite possibly much worse. So on the level of large worldviews or ideological cultural beliefs on the part of the group, that our religion is the one true religion, our race is the superior race, you’re really better off adaptively having a belief system that integrates with the group’s belief system, rather than views composed of accurate beliefs about the world. So all reasoning is motivated. There’s always a goal to reasoning, even if the motivation is just accuracy. Sometimes we want accuracy for the sake of survival or individual wellbeing. But there are all kinds of unconscious motives that can influence belief formation, particularly this social motive. We need to integrate into our larger community, and this gives us a psychology, in the long run, that makes changes or informational threats to our existing worldviews feel really threatening to ourselves. It’s about identity. If I say to somebody, who are you? Tell me about yourself. Turi, tell me about yourself, and you’d be likely to give me an answer like the following.
Although these are not true things about you personally, this is the kind of answer you would give when someone asks you to tell them about yourself. You’d say: I was born in Texas and my great-grandparents immigrated from Scotland. I’m a Methodist. I’m a social conservative and a member of the Republican party. I also volunteer for the Special Olympics, and volunteer for groups that advocate on behalf of autism because my child has autism. So you just told me all about yourself, but everything you just told me has to do with your affiliation with some other group that is defined, at least in part, by its belief system, its worldview. And that’s your identity; that’s who you are. So when some information comes in that’s threatening to some aspect of your identity, that’s going to feel like a personal attack.

Turi: [00:10:15] My old business partner, who I should say was a sort of radical contrarian on almost everything, used to like to say that people prefer to be wrong together than right alone.
Adrian: [00:10:30] Yes.
Turi: [00:10:32] The point I think you’re making here is that from an evolutionary perspective, that’s the right call.
Adrian: [00:10:37] Well, historically it has been the right call. It’s not necessarily the right call now. It’s an etiological explanation, in terms of our history, of why we have these dispositions. But that doesn’t make them rational in the sense that we necessarily make the best decisions about what to believe in our current context.
Turi: [00:11:00] That makes sense. So, having anchored reasoning very much in the emotional and social: can you walk us through how motivated reasoning works? I think you would say that all reasoning is motivated in some way, but can you walk us through some of the most egregious examples, or the egregious forms, of that motivated reasoning, confirmation bias, for example?
Adrian: [00:11:28] Well, there’s the personal and the political or ideological. One instance of motivated reasoning on the personal side would be: ‘I’m not an alcoholic’, in the face of the evidence. We call that motivated reasoning because we’re stipulating that in this case you actually have tons of evidence that you have a serious drinking problem. Your family has all left you and has declared that you have a drinking problem, your doctor is telling you that your organs are failing, etc. But people work very hard to rationalise their way out of that kind of admission. On the larger societal or ideological scale, I might be committed to a small-government, pro-business ideology, but now you’re telling me that we have to completely revamp our economic system and our systems of industrial production in response to climate change. And so that turns into motivated reasoning. We’re really good at rationalising beliefs we feel attached to, beliefs we feel we need to hang onto for some personal, emotional or social reason.
Turi: [00:12:45] . Going back to your description of me as a Texan Republican, which I quite liked. Those were declarations of identity masquerading as philosophical declarations. They weren’t conceptual articulations so much as, articulations of my appurtenance where I belong. Is that the primary evolutionary function of that kind of rationalisation? What are the end goals of motivated reasoning in that regard evolutionarily?
Adrian: [00:13:23] Well, when you talk about function, that can mean two different things. One is a teleological function, that is to say a goal-oriented or intended function, and artefacts have that kind of function. I make a table to put things on and a chair to sit on; it has that teleological function. Human psychology, perhaps unfortunately, is not an artefact. It’s something that’s naturally developed. And when we say that aspects of our psychology have a function, we mean an etiological function. That’s to say that, adaptively, this has worked for our ancestors and contributed to survival and fecundity. The term function can therefore be a little bit misleading when you’re describing something like human psychology or human anatomy. That’s why we can’t really answer questions like: what’s the function of the human appendix, or what’s the function of the tailbone? Well, the tailbone doesn’t have a function. It has a history. Right? So human psychology has a history. Our tendency to affiliate ourselves very strongly, and very early on in our lives, with identity groups has a history to it. Does it serve us? Sometimes, in some contexts, sure, absolutely. It’s pro-social in certain contexts, and it can even be protective. If you’re in a community that’s very hostile to certain other points of view, the ability to integrate yourself into that community in terms of their worldview would be advantageous to you now. But there’s nothing necessarily advantageous about human psychology. It’s a historical story.
Turi: [00:14:58] I think I asked my question badly. What I meant to ask was: what is useful to us? What has been useful to us, evolutionarily, about the motivated reasoning that you see in the way that we think today? What’s the evolutionary psychology, in a sense, of the way we think today?
Adrian: [00:15:21] Well, I think it goes back to social integration, and thus our ability to cooperate with each other and our ability to fit into a community. We have lots of emotional motives that go along with that, like the intense need for a sense of belonging and companionship, which also stems from the way our ancestors had to live, which was together, in groups. That turns into what we call ingroup/outgroup thinking, because groups define themselves in terms of their ideologies, in terms of their belief systems. They’re going to contrast themselves with other groups, and that naturally turns into hostility and conflict. But, switching back to advantages, it also turns into group solidarity. It turns into a sense of community, and it turns into living amongst people who you can rely on and who you can trust. That’s why we see this divergence in what people are willing to believe: it’s because most of our beliefs are based on trust. There are a few things that I know just from direct observation. There’s a table in front of me. The walls are painted blue. Two plus two is four; I can figure that out for myself. But what about almost everything else you know about the world? How does electricity work? How does chemistry work? How does my toaster work? The planet Neptune. Turi, do you believe that the planet Neptune exists?
Turi: [00:16:52] I have no doubts that the planet Neptune exists.
Adrian: [00:16:56] But if you’re like most people, you actually haven’t seen it personally. So in that case you’re relying on expertise, or on someone you regard as authoritative on the matter. And we’re wired up, because of this sociality, to discriminately trust other people who we feel are on our side, who we identify with in some way. That’s why we get to the point where there are certain politicians we listen to, and certain politicians whose claims we automatically discount, no matter what they’re saying. There are certain media sources we listen to and certain sources we discount no matter what they have to say. If Donald Trump tells us COVID is a hoax, that’s it, it’s been politicised. If we identify with Trump or Trumpism, with the Republican party or conservatism, we automatically take this person to be on our side, and most of our beliefs are based on some kind of trust. We’ve already chosen to trust this person. If he had said the exact opposite thing from the beginning, we would probably see some very different behaviours.
Turi: [00:18:03] So here you’re talking to the point that we outsource a huge amount of our thinking.
Adrian: [00:18:09] Yeah.
Turi: [00:18:10] And perhaps some of the most important elements of our thinking, both as to whether Neptune exists, despite us not having seen it, through to whether God exists, despite us not having seen him or her. As I was reading your book, I wondered whether there was a difference in quality between the knowledge that we have that two plus two equals four, and the felt knowledge of believing in God. Is that felt experience of belief a different form of knowledge, a different experience of knowledge, to the kind of knowledge that we’re used to employing day to day?
Adrian: [00:18:50] Well, that’s indeed a very deep and subtle question about belief and our confidence in it. I think that everything that happens that we’re aware of could be described as a kind of feeling. So I have a felt confidence that two plus two is four, as you put it. And some people certainly have a deeply felt confidence that Jesus is Lord. I find beliefs to be, on the face of it, more likely to be reliable if I can’t identify some motives behind the belief that are not accuracy-related. It’s very unlikely that you have some kind of social or ideological motive to believe that two plus two is four, as opposed to something else. A notable exception, while we’re on that subject, is Winston in 1984, who is eventually convinced that two plus two is five. Right? It’s a remarkable instance of being given an emotional motive to feel differently about that kind of statement, but that would be very unusual for that kind of statement. Now, what God do I believe in? Well, just look at the very high correlation between the God you were raised to believe in and the God you wind up believing in; it’s a very strong correlation. That automatically makes you a little more suspicious about what’s going on behind the scenes, in terms of how this person is assessing evidence, how this person is reasoning, and what’s driving that reasoning.
Turi: [00:20:25] So again, back to the motivations around it: we’re always asking, I suppose, cui bono, is it accuracy or is it belonging? And in that instance, it feels like it’s a lot more about belonging. But if we take as a starting point that a lot of our reasoning is done for emotional or social reasons, I want to ask you whether motivated reasoning actually works. We recently interviewed, on the Parlia podcast, Karen Douglas, who’s an expert on conspiracy theory, another form of quite extreme motivated reasoning. Conspiracy believers very much end up where they are to satisfy profound emotional wants. But it turns out, in that case, that believing in conspiracy theories does not satisfy those wants; it actually alienates people ever more from the society they’re trying to be part of. This motivated reasoning of the kind that you’re describing, does it work? Does it make people happier? I’m struck by the fact that all the statistics seem to show that religious people declare a higher satisfaction with their lives. You talk in your book about the value of positive illusions, that having hope dialled up versus reality does very good things for us. Does motivated reasoning make us happier?
Adrian: [00:21:55] I think it definitely can. Going back to religious ideology or religiosity: you automatically have a really tight community right there. As soon as you adopt that belief, you have a place you can go where everyone’s going to welcome you and see you as one of them, as one of their own. And that satisfies really powerful emotional, psychological needs that we have. Conspiracy theories are an extreme disruption of the reasoning process based on some underlying motivations, but you can get a sense of community out of being a conspiracy theorist, especially in the age of social media. And I think pro-sociality is our number one motivator, but it’s not the only thing that can make this sort of thing work for us. You mentioned positive illusions. I think when it comes to things like sexual competition or athletic endeavours, having a kind of unrealistic view of your capabilities can actually correlate with you deciding to keep going and keep trying, and actually, you develop some skills in the process. You have a better chance of succeeding if you start out having unrealistic expectations. Maybe that’s also the reason why we always tell our children that they’re special. They can’t all be special. Approximately half of them are below average, as a matter of fact, but it’s probably better if they don’t think that they’re below average from the beginning.
Turi: [00:23:23] Talking of competition, you discuss in your book the argument that the capacity to reason may actually have been a marker of success, not for accuracy, but purely to demonstrate strength in debate: that the capacity to argue almost any side and win is a signifier of status and power in a group, the debating being more important than the accuracy.
Adrian: [00:23:55] Well, that sounds like what’s been called the argumentative theory of reasoning. Going back to our evolutionary history: if integration and dominance within your group had to do with persuasion and getting people to cooperate with you, then the ability to argue successfully, whether your conclusions were right or wrong, might have been more important in a lot of contexts than having a set of beliefs about the world that were generally true. And anything that contributes to social success, like being tall, or good-looking, or being good at arguing, is attractive to other people, because that’s our number one signal that this is someone we want to be with, someone we want to make friends with, we want to mate with, someone who’s likely to be socially successful. So being able to debate or argue persuasively would be on that list of traits.
Turi: [00:25:00] Finally, on this abstract question: we’ve gone over some of the reasons why the emotional parts of reasoning were so important evolutionarily. But are you, on some level, not surprised that we haven’t evolved it out, that the need for accuracy in reasoning has not finally triumphed over the need for the emotional and social parts of reasoning?
Adrian: [00:25:27] Well, unfortunately, human evolution doesn’t work on that kind of timescale. We’ve only been in large societies for 8,000 years or so, thanks to the development of agriculture, and we’ve only been an industrial society for a few hundred years. You can find possibly some micro-changes in human physiology in response to industrial production over the course of a hundred years, but not much. So yeah, as people say, we’re walking around dealing with the modern world with our Neanderthal brains, and they’re not going to change on that timescale. Unfortunately, we don’t have the time to wait. It’s a very serious situation.
Turi: [00:26:10] I’m going to come back to that later. Let’s move into politics now. You talk in detail about some of the motivated reasoning associated with our various different political tribes, the motivated reasoning that informs our political choices. One of the examples that you give in your book is conservative support for supply-side economics. Can you give us a little bit more info on that?
Adrian: [00:26:38] Well, if you look at even how political conservatives define themselves, they define themselves in terms of the status quo. They’re comfortable with traditional culture on a social level and with the existing social and economic order. On the economic level, conservatives by definition view the social order, and particularly the economic order, with the inequalities that have grown out of industrial production over the last 100 years, like this: if this is the social and economic order that we’ve arrived at while making ourselves so much richer and better off in the process, that order must be good, so we should conserve it. That’s really what political conservatism is. Now they’re faced with evidence of rampant economic inequalities that are based not on people’s choices but on people’s circumstances. And this is the fundamental disagreement between the left and the right on economics: the extent to which one’s situation is based on hard work, productivity and the choices you make, versus circumstances and the fairness of the system. If you ask people, as a matter of fact, not opinion: when someone is poor, what is the primary cause? There is a stark difference between what a conservative and what a liberal or progressive will tell you: whether it’s primarily due to their choices and the amount of work they put in, or primarily due to their circumstances and structural facts about the economy. This is absolutely definitive of the distinction between conservatives and progressives. So, as you’d expect with any other kind of ideological belief, when conservatives are presented with evidence that there’s structural inequality, that there’s unfairness, that it has more to do with circumstance than with choice, they will predictably deny that evidence and call up their own experts and their own think tanks, which will give you a different factual picture of the world.
In another world, or maybe in a communist or heavily socialist context where you have structural equality, where the state is really imposing equal outcomes regardless of what people put into the system, you can have the opposite problem: the left denying that we really should put more choice into the system, denying that we’re not being very productive as a society under this arrangement. And so you’d see the opposite effect, denial of reality and expertise from the left in that context.
Turi: [00:29:25] Where do these biases come from? What are the inputs to those original biases? You discuss in your book the idea of the family as our model for governance.
Adrian: [00:29:38] Well, this is a controversial matter. That would be the George Lakoff take on it: that it has a lot to do with your upbringing, and indeed with your family structure, where conservatives have this kind of patriarchal, hierarchical, top-down, authoritative parent model. He suggests that liberals are raised under the nurturant parent model, which is more egalitarian and focuses on circumstance over personal responsibility. So that’s his story about how it comes about. There’s also some very controversial research about innate, or learnt, personality types that on some level are predictive of your political orientation: your scores on the OCEAN personality test. Liberals will score higher on the O scale, the openness-to-experience scale, and conversely lower on the conscientiousness scale, and the reverse for political conservatives. If you just do the test, that predicts about 40% of the difference. So it’s probably a combination of all of the above. There’s cultural influence, there’s how you were raised in your family context, and there might even be some innate personality traits. But the highest correlation, at least early on in your life, is with what your parents believe. So that’s probably your parents, and your parents are influenced by their community. Your parents and your community have the biggest influence on how people turn out politically, and thus not just on what their values are, but on what facts they choose to believe about reality, like how economics works, and which experts they choose to believe in terms of telling them how society works.
Turi: [00:31:28] One of the areas that you touch on most deeply is science denialism, which in the current context of climate change is obviously an existential question for us all. But you differentiate between the kind of science denialism we see on the left and on the right. Can you give us an overview of what the landscape looks like before we jump into the actual differences?
Adrian: [00:31:58] So in the modern context, that’s from the sixties and seventies onward, the most salient science out there is what the sociologists call impact science. Hitherto we had several hundred years of ramping up the opposite, which is called production science, and society overall benefited from the wealth created by the increased ability to produce and to exploit resources. But then the costs came due. The bill came, and we started to recognise the environmental, personal and societal impact of industrial production. The science being produced started to turn more towards that subject matter. And that’s very well correlated with a change in view on science on the part of the different sides of the political spectrum. In the 1960s, famously with Rachel Carson for example, the left was considered the anti-science side. You know, they’re worried about pesticides, they’re worried about nuclear power, they’re worried about plastics, they’re worried about pollution and industrial production. And people would say, this has created the greatest society ever, we’re healthier than ever, we’re richer than ever, what’s wrong with you? Right? But then science itself started to turn, in terms of what became the predominant focus of various scientific fields, towards discerning the impact of production. And that’s when you see conservatives, measurably, for example in the U.S. General Social Survey starting in 1974, starting to express less and less trust in science as an institution, and in scientists as figures who are reliable experts. So something very predictable happened when science turned against production: those conservatives committed to the status quo on production, on the social and economic order, started to get more and more suspicious of the scientific consensus. And this has just gotten worse and worse.
It’s gotten people more and more polarised on the subject, to the point where we have, as you put it, this massive existential threat to our very existence, and large swaths of the political spectrum just denying the overwhelming scientific consensus on the issue.
Turi: [00:34:34] So let’s go deeper. I’d like to really understand what the motivated reasoning is behind that conservative rejection of manmade climate change. What are the triggers for conservatives here?
Adrian: [00:34:52] The triggers, I think, are pretty clear. Conservatives have allied themselves with the traditional industrial order, and that includes small government, pro-business policy, low regulation, low taxes on business. These are all classic elements on the economic side of the spectrum that political conservatives favour. But there’s no small-government solution to climate change, or overpopulation, or resource depletion. Or at least it’s certainly pretty hard to figure out what that small-government solution would be. But even before you get to the point of talking about solutions, even admitting to yourself that there is such a grave problem is threatening right off the bat. As the evidence has built up, what we’ve seen is first the denial that there even is a problem. Then it starts to shift over to: there’s a problem, but it’s not human-caused. Or: there’s a problem, but any major attempt to intervene will be too disruptive to the economy and will cause more suffering than good. So the forms of rationalisation have evolved. There’s a little bit of retreat in the rationalisation, but the reason why the rationalisation is happening is pretty clear. It’s threatening to the traditional economic order. There’s just no way to fix the problem without large disruptions, and that’s completely opposite to the conservative worldview.
Turi: [00:36:26] Adrian, there’s an element here which I’m hearing, which is that aversion to big government requires you not to believe in climate change, because solving it would require big government. There’s cognitive dissonance there. Is there also cognitive dissonance in just change of that magnitude, full stop?
Adrian: [00:36:45] Yes, I think so. Conservatism is characterised by what’s a universal human bias: status quo bias. Other things being equal, we always prefer what is familiar, and conservatives, if we go back to personality types, are definitely, measurably, more subject to favouring the status quo over change. It’s not clear that in another universe progressives couldn’t be the more status quo group. If you lived in an aggressively egalitarian society, to the point where human flourishing was being limited by how aggressively egalitarian it is, you might find the left expressing more status quo bias. But the fact is, when things are going badly you need change, change that’s disruptive to the traditional order. And that’s what conservatives are all about: preserving the traditional order. They’re also more comfortable with hierarchy, and modern capitalist society is hierarchical in terms of inequality, and they are comfortable with that by definition. What attracts them to the ideology in the first place is, in part, this comfort with hierarchical and traditional systems of order.
Turi: [00:38:04] The idea here is that the response to climate change would have to be so systemic as to upend any recognisable order, and also the hierarchy, that currently exists. That’s the cognitive dissonance, and that’s what’s being battled against with this motivated reasoning.
Adrian: [00:38:23] Yes, that’s probably true, but really more to the point is that there’s a perception that that’s the case, and that’s all you need to launch into politicising the science on climate change. There’s tons of science that isn’t politicised. Most science isn’t politicised. If you tell a conservative, this is how electricity works, they’re like, ‘okay, what’s the big deal’, right? If you tell them about the planet Neptune, they’re like, ‘Oh, that’s nice, there’s a planet’. But the question is what’s been politicised. Climate change was politicised pretty easily early on because, of course, it’s inherently threatening to the traditional order, as I mentioned. Then there are things that get politicised in other ways. In the case of COVID and wearing masks and social distancing, that probably started out being personally politically threatening to certain autocratic leaders around the world. That’s exactly where you see the worst national responses to COVID. And that was kind of a top-down thing. If the leader thinks, my chances of reelection are being impacted by this, they start denying it. And if you have people who are already aligned with them, who already feel a sense of identification with them and their movement and their way of seeing things, those people will just go along with it. So it’s kind of a top-down politicisation of the science. So there are different avenues to politicising science. But once it gets politicised, it’s politics now. It’s not science anymore. And people are vicious about it.
Turi: [00:39:53] Adrian, there’s no research on this, but just anecdotally, can you start to unpack the motivated reasoning around the political tribes that we’ve built up in response to COVID? The no-mask brigade, which seems to be very much a conservative response, versus the progressive or liberal one, which has erred on the side of extreme caution. How would you unpack that?
Adrian: [00:40:22] Yeah, there’s what people say their reasoning is behind mask refusal, and then there’s what’s really going on, which might be different. I think there might be some discrepancy between how people represent to themselves why they’re against masks, or against aggressive measures to address the disease, and how they really got to that point. They’ll give you the libertarian case: ‘Well, you know, you can’t tell me to wear a mask’. It sounds like the same rationale as for not having to wear seat belts or motorcycle helmets. That goes all the way to conspiracy theories: ‘It’s actually Bill Gates trying to inject microchips into us using vaccines’, or something like that. Or, more broadly, an attempt to impose socialism somehow on society. But I think the real origin is just that the politicians and media figures that I identify with, and that my community identifies with, the kind of viewpoints that signal that I fit in with my community, told me from the beginning that our side doesn’t believe in this virus, or our side at least discounts the significance of it, and our side is in favour of keeping the economy open. And now masks become a signal. Signals are very important in our social lives; we’re always trying to send signals to others that we belong, that they can trust us, that we’re part of that community. And masks have become an extraordinary symbol in certain political contexts. Not in a lot of the world, but in the U.S., for example, there are a lot of places where you walk into a grocery store wearing a mask and people just yell at you. You’re immediately identified as the other. You are othered. In another context, or in many contexts, your skin colour identifies you as the other, or some indication that you belong to a different religion identifies you as the other.
Now masks have become a very powerful signal. Once it becomes politicised, it has a very powerful and immediate effect on people. Then I think they rationalise it. Then they say, ‘well, it’s about liberty’. But I think that, like in many cases of denial, that’s just a case of rationalising what you already believed. Go back to climate change for a second, if I may. We know that if you’re a conservative, you’re more likely to deny climate change if you’re college educated, if you score higher on science literacy tests. You’re more likely generally to engage in politicised science denial if you have higher political sophistication, if you have better quantitative reasoning skills. What does that tell us? Well, that tells us that what’s going on here is not reasoning, but rationalisation. Being educated or having some political sophistication just gives you more ammunition for rationalising what you’re already triggered to believe, or triggered to deny in this case. So in the case of climate change, the more you know about solar flares or Milankovitch cycles, the more you can articulate a rationale for the belief you were already inclined to adopt, or the belief system you’re already attracted to. In the case of masks, the more language of libertarianism you have available, the more you can rationalise your refusal to wear masks, even while at the same time, you know, on the inside, you’re like, ‘ah, don’t breathe in my face’.
Turi: [00:44:01] I feel like we’re beating up a lot on conservative responses to science or conservative rationalisation. Can you give me some examples of progressives or liberals doing the same thing with science?
Adrian: [00:44:14] Leftists, progressives, liberals are definitely not some kind of superior species of human being. They have the same fundamental human psychology. They probably have an easier time trusting climate science or public health experts, because they’re already basically ideologically aligned with the kinds of messages they’re getting from those sources. To give some examples of scientific conclusions that really will trigger leftist denialism, a really good example is the evidence of the safety of genetically modified foods. There was a big report, a meta-study from the National Academy of Sciences, I think about four or five years ago, where they analysed a set of 70 or 80 scientific papers on the impacts of genetically modified foods on personal health, on the environment, on communities. And they just reported: we’ve looked at all these studies, and we just can’t find any consistent negative impact. For leftists, from the beginning, as soon as you say genetically modified food or genetically modified crops to an environmentalist progressive, that sounds bad to them. They feel bad about it from the beginning, and when the NAS report came out, I saw my own friends on the progressive side immediately say, ‘Oh, the NAS is in the pocket of big agriculture’. Jumping immediately to a conspiracy theory: Monsanto is behind this report. The NAS is made up of really distinguished, very conscientious people. And you see that kind of immediate denialism when anything inconvenient pops up. Because leftists don’t know more about science than right-wingers. There’s no advantage in science literacy. It’s not as though they have a better grasp of genetics or climate than the right does. They’re just allied with, and predisposed to believe, the groups that are giving them messages on climate, for the most part. The same thing on the right side. It’s all fundamental human psychology, and it just depends on what’s most salient at the time.
Like what’s the message and who’s delivering the message.
Turi: [00:46:35] That’s fascinating. So I’d like to end with some tips from you. We’ve got a major personal challenge, which is that all of us engage in this on a daily basis; as a starting point, all our reasoning is motivated. How do we address that issue in ourselves? You talked about alcoholics at the start of this podcast; the first step in Alcoholics Anonymous is acknowledging that you have a problem, but how do we take that forward on a personal level? And then the second question I’ve got to ask you is really around this systemic, existential issue of climate change. We don’t have the time to fix motivated reasoning on an evolutionary level. Climate change is happening, and happening at an existentially threatening pace today. What do we do there? How do we fix that problem?
Adrian: [00:47:33] Okay. So first, on alcoholism: the first step, and the hardest step, is admitting that you have a problem. The problem with denial generally is that people in denial are motivated to stay in denial. The premise of your question is that we have some motive to un-motivate ourselves. In the case of alcoholism, you do have an interest in your own personal health and in your relationships with your family and friends, and alcoholism would disrupt both of those things. So there is a motive there. What about the motive to make a healthier environment, to fix the climate? Well, if that’s already important to us, then we’re already listening to the relevant experts. So the problem is the folks who aren’t particularly motivated to un-motivate themselves. Presumably they want a future for humanity and a healthy environment too, but they’re obviously not prioritising that over the other motives that are leading them towards climate science denial. If you’re primarily motivated to preserve the existing social and economic order, you’ve got to somehow be converted to finding an interest in undermining that belief system, in rejecting that belief system in favour of rather radical change. Any way you slice it, that’s tough. So there is a school of thought, and it’s the most attractive and immediate option to most people looking at the problem of science communication on the subject of climate, which says that what we need to do is frame the message better. And framing is basically a kind of marketing. We’ve got to show people that it’s within their value system, within their belief system, to want to preserve or conserve the environment. We talked about white evangelicals, who are very high on rejecting climate science. Obviously that has nothing to do with science. We tell them that it’s about caring for God’s creation. We tell conservatives it’s about conserving the environment. So these are ways of framing the message.
We try to have in-group messengers deliver these messages, as it seems like that would be the most effective way. The most famous example a lot of people talk about is Katharine Hayhoe, the evangelical Christian climate scientist. She goes to these right-wing evangelical communities and tries to work her identity as an in-group messenger and craft her messages accordingly. That sounds like the answer to a lot of people. I don’t think the data really supports the efficacy of that kind of intervention. What you’re going up against is an entire lifetime of cultural identifications and loyalties; when you try to throw a little bit of marketing at that, it’s like King Canute standing on the shore trying to stop the tide from coming in. I favour the Dan Kahan approach. He’s a very well-known social scientist at Yale, and anyone interested in this topic should certainly be looking at what he’s done. He talks about depoliticising the issue, not by trying to reframe it, but by taking it down to the local level. In particular, where he has seen the most success, and has argued persuasively, is when you go to a local community and you say: we have a problem with the water supply, we have a problem with drought, we have a problem with rising ocean levels, we have a problem with flooding in this community. And as members of this community (notice the in-grouping that happened right there: as members of this community), we’re all concerned about flooding, or we’re all concerned about drought. So let’s start with that and not talk about climate change. That’s his route towards depoliticising the issue. And you see some success using this approach at the local level. For example, there’s a Southeast Florida inter-community movement dealing with seawater intrusion into the freshwater supply. And that’s obviously something everybody’s concerned about. Everybody can feel like they’re part of the same group.
They’re part of the same interest group with regard to that. So that’s where you see the best success, as opposed to just helicoptering in, doing some marketing, and then helicoptering out again. It’s still a bit of a pessimistic view to say that’s what’s effective, because if that’s all that’s effective, it’s not going to do the trick: by definition, you’re just addressing issues of local concern. The question is how you broaden that out to larger, national and global action. It’s like with COVID denialism. You would start by saying: our hospitals are full, our schools are not open, what do we do about that? But it’s very hard to get from that to major change at the federal level. It’s hard to see the bridge from one to the other, and it’s even worse with climate, because you need action at the global level.
Turi: [00:53:05] That is not consoling, but super interesting. So if we’re stuck in our ideological trenches, and we wield motivated reasoning brilliantly, with this motivated reasoning having evolved over millennia to keep us in those trenches, how does culture change? Cohort replacement? By which I mean, generations passing away?
Adrian: [00:53:33] Well, we’re seeing generational change. I’m just talking about the United States again, but it’s probably also applicable to a lot of Western nations. We’re seeing a lot of difference between young and old: the younger generations are much more concerned about the environment, for kind of obvious reasons. And I would expect that to continue, because the problem’s not going away. In their lifetime it’s going to get worse, and they know that. So their interest is being triggered; their interests are at stake here. Just like you’d see more anti-war activism when there’s a draft than when you’re dealing with a volunteer-only army; when the draft goes away, the interest in anti-war activism disappears. But the environmental problems are not going away. So I’m optimistic in the sense that I think, generationally, there’s going to be a greater and greater interest in these issues, a greater prioritisation of these issues. I hope it all works out in time. It’s difficult for me. I mean, I know I sound a little pessimistic about it. It’s a little difficult for me to see, given the slowness with which human psychology evolves and with which we get societal change and cultural change. These changes happen, but they take a lot of time. And the situation is urgent.
Turi: [00:55:01] I wish we weren’t ending there, Adrian. I wish you had a tune to whistle. But it’s precisely to your point about reasoning: sometimes we reason for emotional reasons, and sometimes for accuracy. So I’m grateful for the accuracy, and unhappy emotionally. Adrian, thank you so much for joining us today. It’s been fascinating and I’ve loved talking to you.
Adrian: [00:55:20] It’s been great. Thank you so much.