Miriam Schoenfield, Transcript: Can we trust what we believe?

This is an automated transcript by Descript. Please excuse any mistakes.

Turi: [00:00:44] Today we’re thrilled to be bringing on Miriam Schoenfield to talk to the Parlia Podcast. Miriam is professor of philosophy at the University of Texas at Austin, and her primary focus is Epistemology: understanding how we know what we know. For Parlia, that’s at the heart of our project to map the world’s opinions, but also to understand exactly where they come from. Miriam, thank you so much for joining us.

Miriam: [00:01:11] It’s my pleasure. Thank you for having me.

Turi: [00:01:13] Miriam, you’ve spent a long time looking at how we acquire our perspectives on the world, and one of your key areas of focus has been how irrelevant influences impact us. Can you help us understand that? What are these irrelevant influences that you talk about?

Miriam: [00:01:31] Sure. Irrelevant influences on belief are just things that have an influence on your beliefs, but that aren’t evidence for them. So an example might be, if you get high on mushrooms and start believing that aliens are talking to you, then the fact that you took the mushrooms is an influence on that belief, it’s causing you to have that belief, but it’s not actually any evidence for it.

Turi: [00:01:59] Okay, and we’re assuming that you’re not talking from experience here. But taking this outside of the context of fabulous Texan mushrooms, can we open this up a little bit? Because I think you’ve spoken elsewhere about the fact that a great many of us actually inherit our beliefs, and the mere fact of inheriting our beliefs sort of makes those influences irrelevant.

Miriam: [00:02:27] So, the mushrooms example was a sort of fantastical one. My primary focus has been social influences on beliefs, so things like growing up in certain communities or going to certain schools or having certain friends. It appears that those things have a very significant influence on our forming the worldviews that we do. That’s the sort of starting observation, that a lot of beliefs that are very fundamental to who we are and to how we think about the world are influenced by things that at least appear to be kind of arbitrary and irrelevant to the truth of the matter.

Turi: [00:03:03] So there’s a lot to unpack there. The first thing that I want to ask you about is: why do we hold onto our inherited beliefs, those beliefs which are sort of socially impressed upon us, so forcefully?

Miriam: [00:03:22] That’s to some extent a psychological question, which I’m not really an expert on, but I think when we’re growing up, we’re forming a picture, and once we’ve formed a picture, it’s just easiest to go around in the world working with the picture we’ve got: that’s how we continue to interpret new evidence and how we continue to evaluate new arguments. So I think in a sense it’s probably deeply rooted in the fact that humans are very social creatures and that there are benefits to groups of people coordinating on what to believe. So we tend to merge with the opinions of those around us and proceed accordingly.

Turi: [00:04:06] There’s an old line that we prefer to be wrong together than right alone. Is that sort of what you’re saying?

Miriam: [00:04:13] Yeah, kind of. I won’t say that we necessarily prefer to be wrong together. In a sense I think that even people who are sincerely seeking the truth and trying to figure out what’s really the case, and so they would prefer to be right alone, are still not necessarily immune from these influences. It’s more that the forces on belief that surround us may lead us to be wrong together rather than right alone, because what they prioritize is the togetherness rather than the rightness. So, we form beliefs in these social environments, which results in whole communities of people having very similar beliefs and that seems to be one of the driving forces on beliefs about really important matters.

Turi: [00:05:01] So essentially red states stay red, Catholics stay Catholic, Jews tend to stay Jews, in a sense, in the same way that supporters of a particular football club tend to have children who also support that club. But the difference there, perhaps, is that we know the allegiance to the football club is irrational.

Miriam: [00:05:19] Yeah, I think that is right. I think we have a very different attitude towards our religious beliefs and our political beliefs and our moral beliefs than we do towards our sports allegiances, despite the fact that on some level, they’re being influenced by similar factors.

Turi: [00:05:36] So if, as you say, we acquire not all, but many of our most important beliefs in this way that you describe as sort of irrelevant or unreliable, are we left epistemologically stranded? What can we do with this if the very foundations of our sense of the world are conditioned in that way? They aren’t rational.

Miriam: [00:06:03] I think we have a choice as to how to respond to this fact. So I don’t think recognizing that your beliefs were influenced in these ways forces you to come to think that the beliefs are not rational or that you should give them up or that they’re not true. I think what happens once we’re confronted with this fact that a lot of these important beliefs are formed in the ways sports allegiances are, is that we face a choice of whether to take up what I call a perspective of doubt. So one option that we have is to say, okay, you know what? I’m really worried about all these beliefs that I formed in these ways, so I’m going to sort of set them aside and try to think about the world in a way that is not influenced by these factors and one of the things I argue for is that if you do that, you’d basically end up agnostic. So it would be very hard from this very neutral perspective to try to get your beliefs back, but I don’t think there’s any rational requirement to take up that perspective of doubt. So another thing you could do is just say, well, yes, there are all these different belief systems and I might’ve ended up with one of those if I’d grown up elsewhere or had different friends or gone to a different school, but I didn’t, and this is the true belief and, lucky for me, I got it right. There is something sort of strange sounding about that, but I do think that’s a genuinely rational option that’s available.

Turi: [00:07:50] So just to unpack those. On the one hand you’ve got taking this perspective of doubt, which pulls the rug from under any established truths or absolutes that you could rely on and therefore puts you in a permanent position of agnosticism; or you go, well, you know, it was either this or the other, and I’m this one, and I was lucky, and therefore I’m sticking inside it. Would that be sort of like a willful suspension of disbelief, or is that actually just a straight move?

Miriam: [00:08:23] I actually think it’s like a willful commitment to believe rather than a willful suspension of disbelief. So if you say, hey, I’m lucky I got it right, you’re in a sense saying like, look, I’m going to choose to not engage in this process of doubt, which I can see is going to lead me down the road to a very sort of global agnosticism where I’m going to end up really uncertain about everything. I’m not going to do that. Instead I’m going to use the beliefs that I find myself with as my starting point. I’m going to continue to assume that they’re true and proceed from there.

Turi: [00:09:04] And you call both approaches rational. Can you explain the rationality of that sort of “lucky me” position?

Miriam: [00:09:12] Well, there’s a lot of disagreement, as you might imagine, in philosophy about what the word rational means and what it means to call something rational. When I say that both positions are rational, what I’m trying to convey is that I don’t think there is an argument that privileges one of those routes over the other. So basically, which of those routes looks good to you, let’s say the agnostic route or the lucky-me route, already depends on whether you’ve decided to take up this perspective of doubt. When you’re in the perspective of doubt, you’re going to think to yourself, oh my goodness, the lucky-me route looks terrible, because that could very easily lead me astray. Everybody could go that way. Everybody could think: oh, lucky me! I happened to get it right. But if you find yourself already occupying the lucky-me perspective, if you find yourself with a commitment to the beliefs that you in fact have, then you’re just going to think, well, of course, this is the way the world is; it would be disastrous if I gave up my accurate picture of the way the world is. So given that which way of going seems to make sense depends so much on whether you’ve already decided the matter, on which perspective you’re currently occupying, I don’t think there’s any rational deliberative route to preferring one of these paths over the other.

Turi: [00:10:49] So there’s possibly a third move as well, which is to start inside one belief system and jettison it in favor of another. Does that also count as a rational move?

Miriam: [00:11:00] That’s a tricky question, and it’s one that’s familiar to me because that’s something that happened to me. I started out with one belief system and then I jettisoned it in favor of another. At one point I wondered whether I was sort of immune to this challenge, because the challenge says, hey, look, we’re all just sheep: we just believe whatever beliefs were given to us. Then there was the question: if that’s not what happened to you, if you did something different, does that mean you’re immune to this challenge, and how should we think about that move from one belief system to another? Ultimately, I don’t think any of us are immune from the challenge, because the sorts of influences that lead us to, say, adopt the belief systems we were raised with might, depending on one’s psychological tendencies, lead one to reject the belief systems we were raised with instead. Jettisoning one belief system in favor of another, at least in a lot of cases, is going to require a kind of jump that I think would be difficult to rationally reconstruct, because once you have a certain belief, once you think the world is a certain way, some other way of looking at the world is going to be incompatible with your current way. So the move you make, where you drop the first way and take up the second way, in a lot of cases isn’t going to be explained by, oh, you got some new evidence or you heard some new arguments. It’s more like a gestalt shift: the world starts seeming to you different from how it was.

Turi: [00:12:41] Miriam, can you explain what a gestalt shift is and what you mean by it?

Miriam: [00:12:44] Sure. A gestalt shift is a transition in the way that you think about things that’s based on a bigger shift in perspective, as opposed to some really small tweak to a particular belief. When it comes to perception, a lot of people are familiar with the duck–rabbit picture: there’s an image that can look like a duck or look like a rabbit, and if you spend a lot of time looking at it, whether it looks like a duck or a rabbit to you will just change back and forth, even though the image is exactly the same. That’s an example of a gestalt shift. You’re looking at the same image, but in one moment you’re seeing it as a duck and in another moment you’re seeing it as a rabbit. Something similar can happen intellectually. You look at a body of evidence, and it could be that in one moment you see that evidence as supporting one worldview, and at another moment you see that evidence as supporting another worldview. So a gestalt shift is a kind of big, global shift in the way that you’re seeing the world and the way that you’re interpreting evidence.

Turi: [00:13:56] I want to jump in here a little bit, because I also had that experience as an adult, moving away from a system which believed in God to one of agnosticism and then atheism. I have always thought to myself that this was a rational move: all the scientific evidence around me told me that the big guy in a cloud didn’t make sense. But even that move towards what I think of as rationalism, you’re describing as a sort of gestalt shift, not a rational progression.

Miriam: [00:14:27] I think in any given case we’d really need to look very closely at the details. There are certainly examples where you can change your mind through a very straightforward, rational progression and come to see things differently. For example, if I think it’s going to be a nice sunny afternoon, and then I look out the window and see clouds in the sky, and then I think it’s probably going to rain later, no gestalt shift was needed: that was a perfectly rational process that the traditional frameworks in epistemology can capture very well. We’re just responding to new evidence as it comes in. When I’m talking about these cases where your basic worldview changes, and I think in some cases it’s better to think of it as a gestalt shift than a rational transition in thought, it’s because they’re not cases where you’re getting new information and responding to that information using your basic framework for thinking about the world; rather, they are cases in which you start using a different system to evaluate information. Just to make this a little more concrete, and I don’t want to talk about your case in particular, but suppose that somebody grew up with some religious beliefs, and then they decided that the scientific evidence does not support the existence of God and they abandoned those beliefs. Well, within the religious perspective they previously occupied, it’s not as if those people had never heard of science or scientific evidence; it’s rather that they think that evidence isn’t good enough to undermine the position. The reasons to believe in God perhaps have to do with things like religious experiences, and those can override scientific evidence; or maybe they think the scientific evidence isn’t actually that decisive, or maybe they question the methodology of science. So they have a whole picture with which to incorporate that information, and according to that picture, that information doesn’t undermine the position. If you then move into adopting a view where you now think that information does undermine the position, that has to be a gestalt shift, because the framework you were in previously didn’t regard it as information that undermines the position, and now you get the information and you do regard it that way.

Turi: [00:17:14] That’s beautiful and very destabilizing, thank you.

Miriam: [00:17:22] My pleasure.

Turi: [00:17:22] One of your papers talks about al-Ghazali, who’s an early Sufi mystic who wrote a beautiful pamphlet called Deliverance from Error, which I actually read as a teenager because my parents brought us up as Sufis. So from their perspective, this deconditioning of thought, which is at the heart of Deliverance from Error, this idea that you look at the world from a particular gestalt perspective and need to decondition yourself to see it as it truly is: for my parents in the 1970s, that meant moving out of a materialist world and into a mystical one, and for me in the 1990s, it meant moving out of a mystical one back into the material one.

Miriam: [00:18:07] Wow. That’s fantastic.

Turi: [00:18:09] So I want to come back to this very powerful idea that in fact, the rationalist world that we rationalists think we exist in, is not particularly more rational than the world of somebody who believes in God or supernatural forces, that the “lucky me” move that you described earlier applies just as much to materialists, rationalists, et cetera, as it does to Orthodox religious people.

Miriam: [00:18:39] Yeah. So there’s certainly a temptation on behalf of rationalists and materialists to think that in some sense they’re following the light of reason and truth and other people are not, and therefore they are on the correct path and other people are on the incorrect path. And it’s even tempting, say as an atheist, to think, oh, all these different religious groups are really subject to this worry about irrelevant influences, because Christians believe what their Christian parents told them, and a lot of Muslims believe what their Muslim parents told them, and so forth, but atheists are able to stand back from all that and say, actually, this whole picture is kind of a mess, and we’re not subject to those concerns. But I’m inclined to think that atheism is just one position among many, and it is subject to the same sorts of influences. So I think atheistic beliefs are also heavily socially influenced. I don’t think anybody is really immune to this challenge, and I think everybody, whether you’re a rationalist or not a rationalist, has this choice point available: whether to step back and say, actually, I recognize that my beliefs were influenced in this way and as a result I’m going to be very skeptical and I’m not going to commit to any of them, or to say, sure, I recognize my beliefs were influenced in this way, but this is how I see things and I’m going to go ahead with them. So I don’t think rationalists or materialists or atheists are privileged with respect to this particular challenge to belief over anybody else.

Turi: [00:20:31] So there’s an epistemological rug which has been pulled out from under everybody’s feet at this point, and, to repeat, we have two moves. One is to say, okay, I’ll just doubt everything, I’m agnostic about everything; the other is to say, lucky me, I’m on the right team, to hell with everybody else. That seems really bleak as an alternative. Is it possible to hold that agnosticism, that knowledge that an enormous number of the influences on our beliefs are irrelevant? Is it possible to hold that idea and still know how to act in the world, to still know our own minds? Is it logical or rational to take any political views? Is it logical to intervene in the world from that agnostic position?

Miriam: [00:21:19] So your question is basically about what are the consequences for action, if you take this agnostic position, am I understanding the question right?

Turi: [00:21:29] Yes, you frame it so much more succinctly. Thank you.

Miriam: [00:21:33] Sure. Yes, it is possible to intervene in the world from an agnostic perspective. But what’s going to happen is that if you choose to intervene, you’re going to want to prefer much more conservative actions. Here’s what I mean by that: I don’t mean politically conservative, I mean conservative in the sense of taking fewer risks. In general, when you’re less sure about something, it makes sense to choose actions that are less risky. So if I’m really unsure about what the weather is going to be this afternoon, then I don’t want to invest a ton of resources into planning some outdoor picnic; instead I might want to hedge my bets and plan for an activity that would be OK whether it rains or whether it doesn’t. Whereas when I’m really confident in something, I can go all in, make firm plans, and really invest myself fully in whatever project I choose to pursue. If you adopt a very agnostic perspective on a lot of matters, then I think what that means is, yeah, you might want to engage politically in certain ways, but you wouldn’t want to invest a whole lot into trying to bring about one outcome rather than another, because from your perspective you’re not sure whether that outcome is better than the other. So you’re going to be very careful as far as your actions are concerned. That doesn’t mean you’re not going to do anything; it just means that you’re going to move in very cautious ways.

Turi: [00:23:22] So, agnostics make excellent centrists.

Miriam: [00:23:27] Yes, precisely.

Turi: [00:23:30] Got it. This is actually not a podcast, this is just therapy for me, understood. If we assume that rationality is a sort of reason-based, intellect-driven thought system, is that more or less of a belief? Is it more or less subjective? Is it more or less anchored in the real than, say, other ways of thinking about the world?

Miriam: [00:23:58] I think that using highly intellectual methods that involve reasoning and arguments is a way of proceeding that’s not necessarily privileged from some objective perspective. There are questions about whether, for example, emotional reactions can be a good guide to the truth, or whether certain spiritual experiences can be a good guide to the truth. There are lots of aspects of human experience, and argumentation and logic is one aspect, and that might be one way of getting to the truth. But in principle, spiritual experiences might be a better way to get to the truth, or emotional experiences might be a better way to get to the truth. So I think it is a substantive position to say, actually, intellectual methods are the best methods for getting to the truth. That might be correct. It might be that intellectual methods are the best methods for getting to the truth, but that’s not something that can be apprehended from a neutral standpoint; that’s a substantive position to take about how inquiry is best conducted.

Turi: [00:25:19] There’s a political context to this because of course, many of the rationalist communities on the internet claim to be working at a level of truth that hovers above the fray of Twitter, social media, et cetera, et cetera. But if you look at the makeup of those communities, they all look very, very, very similar. They all tend to be very white and very male and very educated and that claim to a superior version of the truth is therefore complex and political in that case. Right?

Miriam: [00:25:47] Yes, arguably it is. I think that they have certain commitments: they’re committed to their methods being truth-conducive, and as I said before, they might be right about that. And like we were talking about before, they have the option of making that choice of saying, well, here’s how I’m going to go: I’m going to prioritize these intellectual methods, I’m not going to doubt them, and I’m going to proceed from there. That’s one way of going, but I think they, like anybody else, could also step back and say, well, wait a second, if I suspend judgment for a moment about the reliability of these methods, it looks like there’s a wide-open space with lots of different ways one could go. And I suspect that from that perspective, in which those methods are doubted, those methods won’t necessarily look privileged over the alternatives.

Turi: [00:26:49] That’s fascinating. So my last question to you, Miriam, is around feeling. You’ve just said that feeling may be just as good a way into the truth as thinking, but my last question is: is there an argument that our feelings are also learned, in the same way that many of our beliefs are learned?

Miriam: [00:27:10] Absolutely. So, when I say it may be that one of these methods is better than another, I’m not suggesting that any of them might be immune to the sort of challenge I’m describing. All I’m saying is that, as a matter of fact, it could turn out that, for people who believe things on the basis of their guts and their feelings, that way of going about things tends to lead to more true beliefs, for all we know. But it’s still true that how people respond with their emotions and their guts is highly influenced by their social surroundings and their political environment more broadly, and so are our spiritual experiences. If you look at different religious traditions around the globe, you’ll see that people have different sorts of spiritual experiences, and the kinds of experiences they have are heavily influenced by the particular religious tradition they’re embedded in. So all of these methods, how we think, how we feel, how we experience, are, I think, deeply influenced by the world around us. So I don’t think going any of these ways makes you immune. Any way you go that’s not agnosticism involves some version of the lucky-me response, some version of endorsing a set of commitments, even though, stepping back from those commitments, some others might look just as good.

Turi: [00:28:46] And, do you have a recommendation? Is there any evidence to suggest that people who go one route or the other are happier or less depressed or more fulfilled or live better, more interesting lives?

Miriam: [00:28:58] I don’t have a recommendation. Well, really there are two questions: there’s what leads to happiness and fulfillment, and then there’s the question of what leads to the truth. I certainly have no expertise on what leads to happiness and fulfillment; that’s something all sorts of other experts, I’m sure, have views about.

Turi: [00:29:21] But for the truth?

Miriam: [00:28:58] For the truth, is there any evidence? Boy, that is a really difficult question, and it gets into a lot of very thorny issues about how we evaluate evidence for the reliability of a method. What I would want to say, if we’re not going to get really deep into the weeds here, is that I don’t have an official stance on that. I work at kind of a meta level, where I look at the general structure of these sorts of challenges to belief and try to think about what the options are and in what ways they could be justified. Other epistemologists would be more in the business of asking, well, what is the evidence that this method rather than that one is better, but that’s not something I really take a stand on, at least in my professional life.

Turi: [00:30:18] You do enough to destabilize the basis of our thoughts about the world without giving us the answers as well. Come on, we’ll do that on the next podcast.

Miriam: [00:30:30] That’s right. Exactly.

Turi: [00:30:32] Miriam, huge thanks for this. It’s been a great, great pleasure talking to you here.

Miriam: [00:30:35] Yeah, you too. Thank you so much.
