Transcript: Equalitarianism, with Cory Clark

This is an automated transcript from Descript.com - please excuse any mistakes.

Turi: [00:00:00] We’re thrilled to be talking to Cory Clark. Cory is a social psychologist and visiting scholar in psychology at the University of Pennsylvania. She’s interested particularly in political bias and the ways in which the groups that we belong to influence how we approach information. Cory, it’s great to speak to you. Thanks so much for joining us.

Cory: Thanks for having me.

Turi: So perhaps let’s start with tribalism. What is it?

Cory: So tribalism is essentially the way the groups that we belong to shape how we approach information and how we approach other people. People are embedded in lots of different social groups, and once they’re part of a group, they start to identify with that group. They want that group to be successful, and they also want to gain status within that group. And that leads people to have a variety of psychological tendencies that affect how they evaluate information and other people. They want to perceive their group members as better people than other kinds of group members, and they want to advance the interests of their in-groups. I primarily look at this within politics.

So, you know, the reason people would have evolved to be tribal in the first place, the reason they would have this tendency to be loyal to their in-groups, is because throughout human history groups would get into conflict with other groups; they would compete for land and resources. And the groups that were the best at coordinating within their in-group are the ones that would survive and pass their genes on to later generations of humans. So the people who are alive today came from people who were really successful at working within their groups, really successful at cooperating with their in-groups. Now we kind of have this within politics. We’re not necessarily fighting life and death a lot of the time, but we are fighting for status and resources: who’s going to pay for what, who’s going to receive what from other people, what rights should people have. So you still have these really highly consequential debates happening within the realm of politics. And because there’s something to win or there’s something to lose.

Being a member of a political tribe brings out a lot of these same psychological tendencies. It brings out these loyalties and these group commitments. And it causes us to see our political in-group members as better people than our political out-group members, and to see our political in-group as being right about the world and our political out-group as being wrong about the world.

And we engage in a variety of psychological tendencies that allow us to maintain that perception, to bolster our in-group, and to make our in-group members like us more.

Turi: So this talks to a lot of the things that we see all over the press, particularly around this issue of polarization, and the fact that we’re supposed to be more polarized than we’ve been in a generation. Is that right?

Cory: That we’re more polarized now than we have been? That does seem to be the case; there have been a lot of analyses of this, and political groups seem to hate each other more than ever. There is a lot of debate about what exactly has caused it. Some people would say that social media has something to do with it.

But I’m not sure we really know that for sure. It could be something that really comes from the people. What you seek out on social media, what you thumbs-up or retweet, however you engage with that information, that’s the demand you’re creating for that kind of content. It causes the media to create more of that content, and it causes your friends and family members and other people to do the same: if they get a lot of likes on a really nasty political tweet where they completely slammed the other side, that’s a signal to them to do that more. So social media might be sort of a tool that helps facilitate this polarization, but it’s really coming from people.

And I think that’s because this is just a natural tendency we have, and it’s something we kind of crave. We want to be able to show our in-group that we’re really fiercely loyal members and that we’re going to advance the interests of the in-group.

Turi: On that in-group/out-group dialectic: there are a couple of potential causes for this rise in polarization that I like, because they talk directly to it. One is the end of the Cold War: as soon as the Cold War goes, we lose this big terrifying enemy that we can unify against. An extension of that is the wonderful research by Karen Stenner, who’s worked on authoritarianism. She’s demonstrated that one way to help people expand their in-group is to give them a really frightening external enemy, like aliens. So her approach to dealing with this problem of polarization and tribalism is: go invent a much more frightening other tribe, like aliens, so that everybody else can unify around hatred of them rather than hatred of each other.

Cory: Yes, that would be a non-ideal way to solve the polarization problem. Being under attack by aliens, or even, within countries, being under attack by another country, could potentially unify people, but that’s of course not the way you’d want to do it. But yeah, I think it is complicated, because there’s no easy way to solve the polarization problem. So long as you have political competition, so long as you’ve got groups competing for leadership, I think it’s kind of inevitable to have this sort of thing. Now, of course, it seems to be getting worse, so that suggests it used to be better. One argument could be... well, I was going to say there aren’t these huge threats to humanity, but this is a bad time to make that argument while we’re undergoing a pandemic and all sorts of other problems. But life is pretty comfortable right now for the most part; even though maybe we’re not getting to go out and have fun, humankind isn’t struggling to survive as a whole. So there could be a lot of reasons why polarization seems to be getting worse.

Turi: Okay, I agree. So you’ve nicely described tribalism for us, and the reasons for tribalism, and the reasons for that tribal extension into political thinking. It also has an impact on the way that we think. So help us understand tribal thinking.

Cory: So I try to break tribal thinking into two categories that aren’t completely separate, but I think you can still distinguish them. One is in-group favoritism, and this is how we approach people. We like members of our in-group better than members of our out-group, and what that means is that we’re more charitable, more forgiving. Say a politician has a marital affair. If it’s somebody you like, you might say, well, it’s not that big of a deal; it has nothing to do with their leadership ability; sure, it’s not great, but we shouldn’t judge them too harshly. Whereas if it’s a political out-group member, a politician you dislike, you might say this reflects his total lack of moral character: we cannot trust this person, he’s completely corrupt, he’s just not a good person. So it’s not just that we like in-group members better than out-group members; it’s that we hold them to different standards. We’re more generous, forgiving, and accepting of the flaws of people in our in-group, and more judgmental, harsh, and critical of people in the out-group. Even if we’re talking about the exact same thing, the exact same behavior, we are going to be more generous toward our in-group members.

Turi: Bill Clinton’s affair versus Donald Trump’s multiple…

Cory: Exactly. The thing with real-world examples is that you can always point to small ways that they differ. But we do this in the laboratory. You run experiments where you literally manipulate it: you have the exact same immoral action, and you say a Democrat did this, or a Republican did this, how bad is it? And people will say it’s worse if it’s a member of their political out-group. So we know people have this tendency. The thing that I’m more interested in, because I think it’s more problematic for coming to a shared understanding of the world and of human nature, I call ideological epistemology, but what it really is, is the way we approach and evaluate information. People are more likely to seek out information that’s going to support their group’s beliefs and avoid information that might challenge their group’s beliefs. And once they’re actually confronted with information, if it’s something that supports their group’s beliefs, they’re pretty uncritical. They accept it quickly, they don’t look for flaws, and they think: of course, that’s totally in line with everything I already thought, this must be true. Whereas if it’s a piece of information that might challenge the group’s beliefs, you spend more time trying to find the flaw, why might they be wrong, and you’re more critical. So people are really credulous toward information if it confirms what their group believes, and highly skeptical if they confront information that opposes what their group believes. What this creates is a tendency for the way people [00:10:00] interact with information to constantly reinforce their in-group. People become more certain over time that they’re right and that their out-group members are wrong. And how could they be so wrong?
It must be that they’re bad people, because look, all of the information supports our side.

Turi: So on the Parlia podcast, we’ve spoken quite a lot about motivated reasoning: where it comes from, what impact it has on our capacity to think rationally, whether motivated reasoning itself is a rational approach, especially in the light of ambiguous evidence. So maybe let’s jump into this, because I think ambiguity is something that you focus on in this question of ideological epistemology. It’s particularly in moments where there is ambiguous evidence that motivated reasoning comes rushing to the fore, where [00:11:00] tribal thinking goes into overdrive. Is that right?

Cory: Yes. It seems that people are more politically biased, or more tribal, when they’re confronted with ambiguous information. And the reason we think this is, is because, granted, people have a lot of sort of ludicrous beliefs, but people don’t want to be blatantly tribal. They don’t want to lose credibility as people who are generally concerned for the truth. So if you’re confronted with a mountain of evidence that challenges your beliefs, it’s really hard to resist that mountain of evidence, because people will say: how could you ignore this mountain of evidence? It’s so obvious. But in more ambiguous cases, when it’s actually really hard to know the truth, people can’t call you out for being tribal or being politically biased, because nobody knows what the truth is, so you can get away with it more easily. [00:12:00] One example I think is good is the debate over the gender pay gap. How big is it? What are all the reasons for it? Is some kind of discrimination accounting for some of that gap, or is it all due to other kinds of differences, natural differences between men and women? I don’t think anybody really fully knows all of the causes of the gender pay gap, and in some cases we don’t even really know how large it is. If you’re trying to compare identical jobs and identical work, that’s really hard to do, because the world is a messy place. The world doesn’t occur in a carefully controlled laboratory experiment where you can say the man and the woman are doing the exact same amount of work and producing the exact same amount of whatever it is they’re trying to produce. So this allows both sides to interpret information in a way that’s going to reinforce their side.

So maybe liberals want to say, no, it’s a hundred percent due to [00:13:00] discrimination, and conservatives want to say, no, it’s a hundred percent due to natural differences between groups. Both sides can advance some information that’s going to support their particular perspective, and neither side is going to be a hundred percent right or a hundred percent wrong. So when you have these cases where it’s hard to know the truth, that gives you a lot of wiggle room; it gives you a lot of ability to pick and choose what data you’re going to forward and to explain away other data, because there isn’t going to be some answer out in the world that’s going to be impossible to deny, at least at the present moment.

Turi: That’s one element of the value of ambiguity, if you want, in so far as it allows for a proper, or at least on the surface plausibly rational, debate. But the other key point about ambiguity is that when it occurs in the context [00:14:00] of these sort of tribal conflicts that you described, it actually increases the variance, doesn’t it? I don’t want to put words in your mouth, but it seems like when there’s lots of ambiguous evidence out there, it provides people with the opportunity to show more loyalty to their tribe, because in fact it’s not a loyalty based on reason, it’s a loyalty based on feeling. Can you express that thought in a way which actually makes sense, please?

Cory: No, that’s interesting. I think what you’re saying is that when something isn’t so clear, when we don’t know with a hundred percent certainty whether the group’s position is correct, being a strong defender of that position would signal more loyalty to your group, because you’re willing to be [00:15:00] certain even in the face of uncertainty. So it’s kind of a stronger signal of commitment to the group when there’s ambiguity. But one thing, and this is a potential counter-argument to what I’m saying, although I think the reality is they’re probably both true, is that when information is ambiguous, when I cannot know for sure what the actual truth is, then deferring to my group is perhaps a rational heuristic. I say: on average, my group tends to be more right than the out-group; that’s why I’m part of the group to begin with. So let’s say I’m a Democrat. Democrats tend to be more right; therefore, on this issue where I can’t tell what is true, I’ll just believe what the Democrats believe, because they tend to be right more often. That would be a kind of rational explanation that you might say isn’t necessarily motivated reasoning; it’s sort of a rational heuristic that is applied in cases [00:16:00] when it’s impossible to know the truth. And although I think that’s not the whole story, I suspect it’s at least part of it, and part of the reason why you see more tribalistic thinking in cases of ambiguous information.

Turi: Because you can do two things here: one, it’s useful to you, it helps you accelerate your thinking (we all outsource our thinking in many, many ways, most of the time anyway); but two, it gives you an opportunity to demonstrate loyalty. Which is interesting. Okay, that’s fascinating. Now, onto the differences between liberals and conservatives. On a previous podcast we spoke to Adrian Bardon, who’s a professor of philosophy at Wake Forest University and has written a great deal recently about denialism, science denialism, particularly on the conservative side: there are some systemic reasons why conservatives would be more averse to theoretical science, and to things like climate change, than [00:17:00] progressives. So we have science denialism aimed at conservatives. Conservatives are statistically less educated than liberals, both in the UK and the US. And they’re also explicitly far more committed to in-group loyalty than, say, liberals or progressives. All of these things suggest that conservatives would be way more prone to bias, to not questioning their ways of seeing the world, to not being able even to project what the other side might be thinking. And yet your work suggests that that’s not true: that liberals are just as good at biased thinking and motivated reasoning as conservatives. Is that right?

Cory: So I agree with some of what you say, not quite all of it, and I’ll maybe even soften a little the claim that you say I’m making. [00:18:00] So it is true that liberals on average are more educated. And I think it is probably true, plenty of work would support this, that liberals are sort of more friendly to science; they’re more trusting of science in general. I think there are probably good reasons for that. One would be that scientists also tend to be liberal, but which came first, the chicken or the egg? Was it that because liberals are more pro-science they become scientists, or is it that scientists are liberal, so liberals are more friendly toward science? It’s probably both. The thing that I’m not a hundred percent sure about is whether there are real differences in in-group loyalty. Now, I know some scholars say that conservatives are more pro-in-group-loyalty, and on some measures that seems to be [00:19:00] true, but in actual behavior, I don’t know that that’s true.

Turi: Sorry, I should be a little more precise, because it’s sort of at the very edge. What I was suggesting was that the conservative response to migration, for example, or the conservative response to other religions, tends to be stronger and less inclusive, or less open to the idea of inclusion, than the progressive or liberal one. That’s all I meant.

Cory: Yes, yes. And also there are other things: conservatives tend to be more religious, and that creates a lot of issues. They may be more willing to deny evolution, and that’s one of the most important theories in terms of understanding human behavior. If you deny evolution, you deny a lot of science, and I don’t even know how you come to understand human behavior at that point. So I do think liberals and conservatives are different in a variety of ways. The [00:20:00] work you’re referring to is a meta-analysis I conducted with some of my former labmates at the University of California, Irvine. We pulled together every single study we could find, every single experiment we could find, that tried to measure political bias. These are cases where you’re giving participants identical pieces of information and you’re manipulating something about it. So maybe they’re evaluating a policy: they say, here’s policy X, and then they say Democrats support policy X, or they say Republicans support policy X, and ask how good this policy is. And they find that participants who are Democrats say it’s a better policy if a Democrat supports it, and Republicans say it’s better if Republicans support it. They do this with scientific findings too. The classic example is the Lord, Ross, and Lepper study from 1979. They have people read about a scientific study that says the death penalty either deters [00:21:00] crime or increases crime, so it basically says the death penalty is either effective or counterproductive. And then they ask people: how good are the methods of the study? So they’re not evaluating the conclusions; they’re evaluating the way the study itself was conducted, before the conclusions were found. And people will say the methods are better when the results support their previous position on the death penalty. I think we had 51 of these studies.

You put them all together and you look at the average effect size: how biased are Democrats, how biased are Republicans. We also included conservatives and liberals, the different ways you can define those groups. And what we found was that liberals and conservatives were virtually equally likely to engage in this kind of political bias, equally likely to evaluate information more [00:22:00] favorably when it supports their group than when it challenges their group. So this suggests that in these kinds of behaviors, where you’re evaluating information, in this particular tendency, liberals and conservatives seem equally politically biased. They seem equally likely to allow their group commitments to shape how they evaluate information. So this was puzzling

Turi: Not terribly surprising, right? Because it would be surprising if we had evolved differently as political tribes.

Cory: Right. So it’s not surprising to me, for one reason, and that’s that humans are humans and they do human things. Liberals and conservatives are both humans; they evolved in similar contexts; they’re not fundamentally different groups of people. So it’s not surprising [00:23:00] that they’re similar. It was surprising to a lot of scholars, though, and in fact a lot of scholars still reject the results. They think there are problems with how we did the meta-analysis, or with the studies themselves that were included in the meta-analysis, because so much of the social sciences focuses on everything that’s wrong with conservatives and all of the reasons you would expect them to be worse in this way. One thing I’ve tried to reflect on is this: in modern society, in the US, we have the Democrats and the Republicans, both very successful political coalitions that have existed for a long time. They go back and forth in who’s winning at a given moment, in terms of who’s the president, who’s controlling the Senate. If one group of people, liberals or Democrats, were much less tribal, that is, they don’t necessarily defer to [00:24:00] their group every time, they’re willing to not vote for their candidate when they don’t like the candidate that’s up that year, they don’t vote the party line, then you would expect the other group to eventually demolish them. If one group was so much more loyal, all of their members voting Republican on every issue, every time there’s a Republican candidate, and the other members weren’t willing to do that, then the Republicans would win every time. Now, that’s not necessarily true; that’s more just me thinking out loud about this.

Turi: Right, but that also makes sense: there is a strong evolutionary suggestion that humans are humans.

Cory: yes.

Turi: Okay, so this is a super interesting finding, probably particularly for our listeners, who I imagine are predominantly liberal. [00:25:00] It’s good to be reminded that we make the same mistakes just as well as the other side. Cory, one of the defining features of bias, of course, is that you don’t know you’ve got it, or you don’t know that you’re performing it. And that talks to a fundamental quality of these sacred values that you talk about: these deeply held beliefs that both liberals and conservatives have, and are perhaps not capable even of verbalizing, certainly not capable of seeing. So can I ask you, just because most of our listeners will be on the liberal side of the spectrum, to help us identify what those hidden biases are on the liberal side?

Cory: Yes. So the [00:26:00] term that my colleague and co-author on a lot of this work, Bo Winegard, and I use is equalitarianism. And we think it stems from sort of a good place among liberals. We think it stems from,

Turi: Can I just ask you to say that again? So equalitarianism, not egalitarianism.

Cory: Yes, we call it equalitarianism, but we think it stems from egalitarianism in part. What I mean by that is that liberals have a greater desire for groups to be treated equally, to have equal outcomes. They are more opposed to hierarchies than conservatives; conservatives are more comfortable with hierarchies, with the fact that some people are going to have more than other people, for example. And liberals particularly have a lot of empathy for people who are on the lower end of the hierarchy, lower-status groups of people. These are things [00:27:00] that liberals, I think, explicitly identify with: they identify as egalitarian, they identify as having empathy for relatively low-status out-groups. But we think that in their desire for groups to be equal, they have a bias toward perceiving groups to be equal. They want to see groups that are higher status and lower status as being more equal in some sort of fundamental way. And so we perceive liberals as having a bias toward explaining all group differences through discrimination, prejudice, and these kinds of things, that it’s the powerful people holding the low-power people down, and they oppose other kinds of explanations. Since we’ve already talked about the gender pay gap: liberals would want to think that the only reason women earn less than men on average is because they’re being [00:28:00] discriminated against, because of sexism, and not because men and women actually are different in ways that would cause that. Maybe women don’t want to work as much, maybe they have different interests, maybe they have slightly different talents. So liberals are opposed to scientific findings that would suggest that men and women are actually different.

Turi: So Cory, to restate the motivated reasoning here, if I understand correctly: liberals desire the world to be equal, and therefore imagine the world to be equal as a starting point for all their work. So any findings that would suggest the world is not equal, they will discount, because they want it to be so. And that’s what the motivated reasoning is.

Cory: Well, by discount, I mean that they will explain that inequality by appealing [00:29:00] to some kind of discrimination explanation, rather than accepting that there is some underlying difference, whether it be due to genetics or some sort of evolved difference between groups. There are lots of inequalities in the world; lots of groups aren’t exactly the same on a lot of different things, in their performance in a lot of different domains. So when liberals are confronted with inequality, they don’t want it to be the case that those differences reflect some underlying difference within the groups themselves, but rather some kind of structural difference that society is imposing on those groups. Does that make sense?

Turi: Yep. So liberals’ obsession with systemic inequality is at the very heart of the liberal project, if equalitarianism is [00:30:00] the fundamental sacred value, or the liberal metaphysics.

Cory: Yes. And I should add that there is another piece to this, because liberals have particular concern for the wellbeing of relatively low-status groups in whatever domain they’re looking at. Because they have a particular concern for the low-status groups, they want to see the world in a way where those groups are either equal to or better than the high-status groups. So one finding among social scientists, which I find to be pretty interesting, was a study performed by Bill von Hippel and David Buss. They emailed a survey to a bunch of social psychology professors and asked them how much they think certain group discrepancies could be explained by evolved differences. One was: [00:31:00] men appear to be somewhat more mathematically talented than women; how much could that be an evolved difference? And: women appear to be somewhat more verbally talented than men; how much could that be an evolved difference? And they were more willing to say that women could have evolved to be more verbally talented than men than that men could have evolved to be more mathematically talented than women. I don’t know of any good reason why you would expect one to be more likely than the other. And the fact that they’re more comfortable with the possibility that women evolved the superior quality, but not that men evolved the superior quality, I think reflects liberals’ aversion to the possibility that there could be some real difference between men and women in which men would be better than women at something, in this case math.

Turi: So that’s the logical flaw, which is to say that they’re perfectly happy to entertain the fact that there may be [00:32:00] differences between the genders, but not if that difference between the genders explains current inequality,

Cory: Yeah, I think that’s…

Turi: Or perhaps reinforces current inequality.

Cory: Yes, I think that’s one way of saying it. So in another set of studies that I ran, we looked at this with gender and IQ. These weren’t with academics; these were with normal people. We had people evaluate a scientific study that said there was some genetic explanation for why women have higher IQs than men, or why men have higher IQs than women, and we also had an equal condition, where we said this gene explains why men and women have equal IQs. And we found, among both liberals and conservatives, although there was more consistency among liberals, that they found the scientific study roughly equally compelling [00:33:00] if it said men and women are equal or women have higher IQs than men, but they found the study particularly uncompelling when it said that men have higher IQs than women. So they have this sort of aversion to the possibility that there could be a real difference between men and women, in this case a genetic explanation for why men might have a more desirable quality.

Turi: So that’s motivated reasoning being performed. I wonder whether an extension question here would be: could your respondents be aware of the politics of those questions to start with? [00:34:00] In other words, what I’m asking is: could it be deliberately motivated reasoning, not just knee-jerk motivated reasoning? Is it possible for somebody to say, I understand what’s happening here, but I don’t want to accept the results as they are, because those results are politically dangerous?

Cory: Yeah, [00:34:00] that’s an interesting question. And I think there is probably some of that going on. So I’ve, I’ve talked about those findings. We did a similar one to where we had people read about. Men or women evolve to be better leaders and liberals wanted to censor the argument that when many evolve to be better leaders, there’s more than women evolve to be better leaders. Um, I presented these findings, similar findings to this finding that, you know, liberals are essentially, um, biased against the possibility that the high status groups have some more desirable quality than the low status groups. And some people. We’ll want to deny that it exists altogether either. Um, but what is a more common response? And this is related to what you brought up is they’ll say, well, It’s not a bias. It’s sort of, um, a couple of things. It could be they’re skeptical of the findings because powerful people are the ones forwarding findings. And [00:35:00] whenever powerful people say powerful people have a better quality, we should be skeptical. So this is actually a rational thing. And any time the high status group is better, we should be skeptical of that information because the high status people control the information. So that’s one argument and that’s, that’s compelling. I think, um, And then the other one, is that okay, maybe, maybe it is a sort of bias, but it’s a bias. It’s an intentional bias aimed at correcting current inequality. So it would be worse if, um, and some people would explicitly even say this. And in fact, there was a scholar, I don’t know how to pronounce this first name. S U w E last name Peters. He’s a philosopher. He said. Liberals and professors, liberal professors have this egalitarian bias, but it’s a good thing and we should encourage it. 
Even if it pushes us further from the truth, and we're unwilling to accept [00:36:00] certain empirical realities that are in fact empirical realities, it's better for the world to have this bias, and so we should support or encourage its existence.

Turi: Because of course it's a tough hill to die on right now, certainly in our sort of political climate, and certainly in academia, to want to defend what to liberals looks like oppression. From a liberal perspective, the hill that conservatives would like to die on often feels like a racist one, or a sort of gender-essentialising one. And that's tough.

Cory: Yeah. So do you mean among liberal scholars, defending the bias, or do you mean...

Turi: Defending the bias, yes.

Cory: Yes. So I don't actually know for sure how many people would defend the bias. I would suspect it would be less [00:37:00] than half of scholars, because I still believe that most scholars think what we're trying to do is pursue true and accurate information and a better understanding of the world as it really is. I don't think that everyone supports this, though. In fact, I have some data on it. Back when I didn't have that many Twitter followers, and most of my Twitter followers were just other academics, I did a poll on whether we should be pursuing the truth or making the world a better place (I believe refusing to answer may have been an option too), and most people said we're pursuing the truth. But I do think a lot of people think that scholars should be making the world a better place, or perhaps they think we should only pursue the truth in cases where it would make the world a better place, and definitely not pursue truth when it would make the world a worse place. I don't think there's a definitively correct philosophy behind what academics are doing. I'm [00:38:00] in the pursue-truth camp, I suppose. But part of that is because I think that pursuing truth generally does make the world a better place. If we know what causes group disparities, and we care about minimizing group disparities, you'd want to start with the best information on the causes of those disparities, I think.

Turi: It's a joke, but it's also a true story. My Jewish grandfather used to say, in relation to precisely this question, that he didn't like anti-Semites, but he didn't like philo-Semites either. He wanted people to be a-Semitic.

Cory: Mm.

Turi: And that feels sort of like the pursuit of the truth, rather than advancing one or the other side of the question.

Cory: Yes. And what you do see is a little bit of a reversal in how people are willing to treat information. So if people were not treating people based on their gender or based [00:39:00] on their race or whatever group membership category, if they were completely neutral, I think that would be causing fewer problems than what seems to be happening now, which is a bit of a reversal among liberals. There have been a bunch of findings recently showing that the typical results we used to see 20, 30, 40 years ago, where people seemed to be biased against women, or biased against black people, or biased against whatever minority group, that was the general tendency, have flipped. A lot of people are finding now, in similar kinds of studies where they manipulate the race or the gender of someone, that especially among liberals you get the flipped finding, where people are actually biased in favor of black people, or biased in favor of women. And even though that might be, let's say, virtuous, in that you're trying to get rid of inequality, [00:40:00] rebalancing the scales of inequalities, I'm not sure it's necessarily a sustainable strategy, because people pick up on it. And I think you already see quite a bit of this pushback. A lot of white people now feel like they're the victims in this country, or men. There are men's interest groups popping up all over the place, and you don't necessarily want men's interest groups popping up all over the place. But part of their concerns might be legitimate. In fact, they are being evaluated in unfair, negative ways,

Turi: So actually

Cory: in a lot of domains.

Turi: reading your paper was the very first time that I was able to understand the idea of ALM, All Lives Matter, not exclusively as a self-consciously racist movement, which I, as a liberal, of course would see it as, but to understand it as a reaction, a perhaps legitimate or at least understandable [00:41:00] reaction, against the bias of the other side.

Cory: Yeah, it could come from a little bit of fear that if we're not being inclusive of all lives in that sort of statement, then white people become the bottom of the hierarchy. And I'm not sure there's a risk of that sort of thing happening in any big, important, meaningful way, but it is at least understandable that people might have that fear. And I think the vision that so many people had was that we would stop evaluating people on the basis of their skin tone, or their gender, or whatever group category they fall into. And I think people are seeing a new tolerance for that, a new tolerance for taking into consideration race, taking into consideration gender, in [00:42:00] determining how we should treat people. And it's a different philosophy than what I think a lot of people expected when we were making so much progress on these issues.

Turi: Right. Yeah, I understand that. Okay, so we have clearly established metaphysics on either side, sacred value clusters on left and right, conservative and liberal. Why, evolutionarily, do we have these two groups? Do you see the existence of these two very, very different sets of ideas in an evolutionary context, or do you think of it as a purely cultural phenomenon? What's the value of having groups like this?

Cory: Yeah, that's a good question, and I don't think I have a very good answer for you. So there are some people who think liberals and conservatives [00:43:00] differ in big, fundamental ways. There's some evidence that political ideology is a little bit heritable. And, as we've mentioned earlier, there are differences between these groups; I would agree they probably are different. And there are people who make arguments, I think even one of my post-doc advisors, Roy Baumeister, argues this, that you want some people really focused on generating wealth and other people focused on redistributing wealth, and these two actually balance each other out. So if you look at a lot of conservative positions and liberal positions and the way they differ, one's really pushing to change things and one's really pushing to preserve things, or whatever it is, and they actually balance out in a nice way in a society. You want all kinds of people. You don't want everyone pushing for change all the time, because that creates all kinds of instability, plus we don't know what's going to happen when you create change, although the [00:44:00] human condition seems to improve with time, so change is generally a good thing. So you could say that the reason we have these groups is because, when they're in competition with each other and each is having some victories, they're slowing each other down from realizing their strongest vision of what they're trying to do, and that's what's best for society. I don't really know. I think liberals and conservatives are different, but I don't think they're different in huge ways. And I think what happens is people might lean a little bit one way or the other.
They might have a little bit more of a liberal-type personality, or a little bit more of a conservative-type personality. And then once you identify with a group, I'm a liberal, I'm a conservative, you surround yourself with other liberals or other conservatives, they reinforce your behavior, you [00:45:00] become more liberal or more conservative, and that's how you get these divisions where people split off and can be categorized into two camps, when humans actually exist on a spectrum. I would think they must exist on a spectrum.

Turi: Even if they exist on a spectrum, the fact that that spectrum exists is sort of interesting. And again, I'm asking from an evolutionary perspective: are there other ways of understanding it? Because I would love to believe this. It would do wonderful things to reinforce my deeply, deeply held belief in democracy if we could actually say that society as we have it today is optimized to have a spread of political opinions in it, precisely because the two opposite approaches to life calm the sort of fury on either side and advance society in a way which is best for society. But what [00:46:00] would your alternative explanation be?

Cory: An alternative explanation would be randomness. I really don't know, I really don't know. I think a lot of people would make the argument that you're making, but I'm not 100% sure it's right. It does make sense. And you do seem to get this sort of liberal-conservative-ish divide that can be at least somewhat compared across different countries and different political systems. They're not identical, but there is some kind of cultural similarity there, which would seem to suggest there might be a reason you have that kind of spectrum.

Turi: So tell me why you're nervous of that.

Cory: No, I'm not nervous of it. I just don't know, is all. If someone's making another [00:47:00] argument, then I'm not aware of it. And I just wouldn't have a lot of confidence; it's one of those things that's just really hard to know.

Turi: Ah, you're terrible, in the way that you require, or seem to pretend to require, evidence before forming judgements. Non-academics like me have it so much easier.

Cory: I try. Research shows that intellectual humility is good in a variety of ways, so I try to exercise it when possible.

Turi: I'll take that as a personal attack. Cory, this has been a thrilling conversation. Thank you so much.

Cory: Thanks for having me. It was very fun.

This page was last edited on Wednesday, 13 Jan 2021 at 09:59 UTC