I stumbled across an old note to myself about Stanley Fish's 2009 review of Terry Eagleton's book, which offers examples of "theological questions," such as: "Why is there anything in the first place?" "Why what we do have is actually intelligible to us?" "Where do our notions of explanation, regularity and intelligibility come from?" Eagleton's position (as Fish portrays it) is that these questions are valuable as a sort of art form. They are not meant to establish facts or achieve an instrumental goal; we have science for that. Religion is not after facts. It is trying "to tackle what is at stake" (in Eagleton's words); it has something to do with humility and hope that properly ought to come prior to the search for facts.
It occurs to me that a question like "Where do our notions of explanation...come from?" is difficult to answer by any method — never mind the science-vs.-religion debate — because it is an attempt to use consciousness, reason, and language to interrogate and defend itself. A living being can't pick itself apart to learn more about how it thinks and moves. It has to take certain basic parts of itself for granted.
I wonder if what is being said here is that religion pursues a type of metacognition, and that "faith" is a type of humility that exposes the metacognition we do or don't have and can or can't develop. Here is "metacognition" as explained by Tom Nichols:
Students who study for a test, older people trying to maintain their independence, and medical students looking forward to their careers would rather be optimistic than underestimate themselves. Other than in fields like athletic competition, where incompetence is manifest and undeniable, it’s normal for people to avoid saying they’re bad at something. As it turns out, however, the more specific reason that unskilled or incompetent people overestimate their abilities far more than others is because they lack a key skill called “metacognition.” This is the ability to know when you’re not good at something by stepping back, looking at what you’re doing, and then realizing that you’re doing it wrong.

* * *

The lack of metacognition sets up a vicious loop, in which people who don’t know much about a subject do not know when they’re in over their head talking with an expert on that subject. An argument ensues, but people who have no idea how to make a logical argument cannot realize when they’re failing to make a logical argument. In short order, the expert is frustrated and the layperson is insulted. Everyone walks away angry.

— Tom Nichols, The Death of Expertise: The Campaign Against Established Knowledge and Why It Matters (Oxford University Press, 2017)
(Overestimating one's own ability in this way has been called the "Dunning-Kruger effect," after the psychologists who studied it; set the eponym aside, and it might simply be called a lack of metacognition.)
But often these basic assumptions (the parts of ourselves we take for granted) cannot be questioned very thoroughly or for very long, and what we end up with as the product of "faith" is not humility but arrogance. Part of the reason is that we begin with certain identities, and as a result we see what we want to see in the world. Even when we say we're trying to interrogate our beliefs, sometimes all we're doing is massaging data to reaffirm them, deluding others and perhaps even ourselves.
Another passage from Nichols' book:
...a 2014 study of public attitudes about gay marriage went terribly wrong. A graduate student claimed he’d found statistically unassailable proof that if opponents of gay marriage talked about the issue with someone who was actually gay, they were likelier to change their minds. ... It was a remarkable finding that basically amounted to proof that reasonable people can actually be talked out of homophobia. The only problem was that the ambitious young researcher had falsified the data. ...

As Konnikova put it in her examination of the fraudulent gay-marriage study, confirmation bias is more likely to produce “persistently false beliefs” when it stems “from issues closely tied to our conception of self.” These are the views that brook no opposition and that we will often defend beyond all reason, as Dunning noted:

Some of our most stubborn misbeliefs arise not from primitive childlike intuitions or careless category errors, but from the very values and philosophies that define who we are as individuals. Each of us possesses certain foundational beliefs — narratives about the self, ideas about the social order — that essentially cannot be violated: To contradict them would call into question our very self-worth. As such, these views demand fealty from other opinions. Put another way, what we believe says something important about how we see ourselves as people.
The "foundational beliefs" might be some identity, intrinsic or cultural. They might be tied to an organized religion. But, I imagine, they might also be a set of assumptions that are harder to pin down to any one "thing." We sometimes try to prove a point for the sake of proving a point, and the way it forms and buttresses our identity is a downstream effect.
If you'd like to learn more about my work, I've published books, and I write for Medium, where readers with a paid membership can read past the paywall.