"But how to be morally severe in the late twentieth century? How, when there is so much to be severe about; how, when we have a sense of evil but no longer the religious or philosophical language to talk intelligently about evil? Trying to comprehend ‘radical’ or ‘absolute’ evil, we search for adequate metaphors. But the modern disease metaphors are all cheap shots. The people who have the real disease are also hardly helped by hearing their disease’s name constantly being dropped as the epitome of evil. Only in the most limited sense is any historical event or problem like an illness. And the cancer metaphor is particularly crass."
— Susan Sontag, Illness as Metaphor (1978)
Friedrich Nietzsche, Beyond Good and Evil, Section 187:
"Quite apart from the value of such assertions as 'there exists in us a categorical imperative' one can still ask: what does such an assertion say of the man who asserts it? ...moralities too are only a sign-language of the emotions."
"They Get to Me." Jessica Love. The American Scholar. Spring 2010. p. 71. :
"German bilinguals consistently described the bridge with more feminine adjectives (elegant, beautiful), and Spanish bilinguals described it with more masculine adjectives (sturdy, dangerous). Here’s the kicker: instructions were given in English, descriptions were written in English, and the photograph of the bridge was just that—a photograph. This suggests that pronouns might be important, not just to how we use language, but to how we experience the objects in our world (although, as dear Steven Pinker points out, “Just because a German thinks a bridge is feminine, doesn’t mean he’s going to ask one out on a date”)."
In P. Glassen, "The Cognitivity of Moral Judgments," Mind 68 (1959), pp. 57–72, and P. Geach, "Assertion." See also C. Wellman, "Emotivism and Ethical Objectivity," American Philosophical Quarterly 5 (1968), pp. 90–99. Quoted in Richard Joyce. The Myth of Morality. Cambridge University Press, 2001. p. 13:
"Glassen's point is that if all the evidence suggests that we intend to use our moral language in an assertoric manner, then all the evidence suggests that our moral language is assertoric, for assertion is entirely a matter of our intentions. The evidence that Glassen assembles I would employ to a slightly different end: as confirmation that the linguistic conventions that govern moral discourse are those of assertions. Here is Glassen's list..."
1. They (moral utterances) are expressed in the indicative mood
2. They can be transformed into interrogative sentences
3. They appear embedded in propositional attitude contexts
4. They are considered true or false, correct or mistaken
5. They are considered to have an impersonal, objective character
6. The putative moral predicates can be transformed into abstract singular terms (e.g., "goodness"), suggesting they are intended to pick out properties
7. They are subject to debate which bears all the hallmarks of factual disagreement.
We can add to this list the two related characteristics highlighted by Peter Geach:
8. They appear in logically complex contexts (e.g., as the antecedents of conditionals)
9. They appear as premises in arguments considered valid"
Painting Dragons: What Storytellers Need to Know About Writing Eunuch Villains
"The world comes to us in a tremendously complex tangle. The norms of contemporary journalism—maybe just journalism, period—insist on the present in a way that is flattening and not true to the thickness of time. In general, and definitely in the US, we are discouraged from historical thinking. Even in terms of what’s going on right now, in Israel and Palestine, you hear people say that referring to the occupation or anything that preceded October 7 is a distraction from the present. That attitude is not going to help us understand the violence of our world order. And it won’t help us transform it. I would say the same about nationalism. It’s not explanatory, and we miss so much if we insist on framing things that way. I come from self-consciously diasporic communities, but even if I didn’t, I hope I would still have enough sense to keep my moral focus on people rather than states.
* * *
In general, I’m not interested in a kind of criticism where people retweet it and say, 'This is the last word on X or Y. Mic drop.' I've never been interested in those kinds of proprietary claims. I'm interested in a form of criticism that really opens up other desires, associations, lines of inquiry — because to me, an object is never exhausted, no matter how many people write about it. But there’s also so much where the idea of authority or expertise barely comes up because critics haven’t seen those objects as worthy of analysis. That’s my sweet spot."
— Carina del Valle Schorske, interviewed by Merve Emre. The Tuning Fork in the Ear. Episode Ten of “The Critic and Her Publics”. New York Review of Books. June 25, 2024.
"America is the only country where a significant proportion of the population believes," or so comedian David Letterman suggests we already know, "that professional wrestling is real but the moon landing was faked."
People avoid knowledge
Information avoidance, according to several authors in a 2010 paper published in the Review of General Psychology, is "any behavior designed to prevent or delay the acquisition of available but potentially unwanted information."
On the topic of avoidance, Katy Montgomerie in 2023 notes a similarity in argument styles of “GC people” (i.e., so-called Gender Critical people) and “religious cultists.” Montgomerie says:
“When you find something in their worldview that's an internal contradiction or that's disproved by evidence they do this thing where they avoid confronting the realisation just before they get there. Like you set them up and just before they close the loop they deflect. They do it in a way where it could only happen as consistently as it does if they were already aware of the contradiction and are just unwilling to face it.” (tweet 1, tweet 2)
One way to avoid learning: Choose leaders and authorities who don't know anything
Someone who is committed to a conspiracy theory probably spends a lot of time deep in their denialism, because they have to buttress those walls, and so they have less time to spend learning things broadly. Things they learn about the world broadly might eventually challenge their conspiracy theory on a limited topic, and they won't allow that. This amounts to a refusal to have useful knowledge of multiple kinds. When people elevate these limited knowledge conspiracy theorists to, say, political office, everyone is stuck with an incompetent leader.
Anti-trans activists make political parties lose elections. Their singular Qanon-style obsession makes them incapable of *any* sort of job, let alone political office. Hire one as a kitchen hand, they'll refuse to combine the vegetables & tell you George Soros made the plates.
Lawrence Davidson characterized the arguments in Rick Shenkman's 2008 book Just How Stupid Are We?: Facing the Truth About the American Voter as saying that Americans are: "(1) ignorant about major international events, (2) knew little about how their own government runs and who runs it, (3) were nonetheless willing to accept government positions and policies even though a moderate amount of critical thought suggested they were bad for the country, and (4) were readily swayed by stereotyping, simplistic solutions, irrational fears and public relations babble." Davidson then said that this is "a default position for any population," but that it is still a concern when, for example, "polls show [that] over half of American adults don’t know which country dropped the atomic bomb on Hiroshima, or that 30 percent don’t know what the Holocaust was." Such confusion isn't unique to the United States. "In the middle of March 2008," wrote Javier Cercas (translated by Anne McLean) in The Anatomy of a Moment, "I read that according to a poll published in the United Kingdom almost a quarter of Britons thought Winston Churchill was a fictional character."
In 2013, a poll by the Kaiser Family Foundation (reported in The Week on May 10) found that 19 percent of Americans believed that the Affordable Care Act, known popularly as "Obamacare," had already been repealed or overturned, and another 42 percent weren't sure.
In 2014, the National Science Foundation said that only a slight majority of Americans polled were able to correctly respond that viruses can't be treated with antibiotics and that 26 percent said that the sun revolves around the Earth. Citing a prior poll by this organization on this same question, Susan Jacoby wrote in 2008 that "The problem is not just the things we do not know...it’s the alarming number of Americans who have smugly concluded that they do not need to know such things in the first place. Call this anti-rationalism...The toxic brew of anti-rationalism and ignorance hurts discussions of U.S. public policy on topics from health care to taxation." Tom Nichols, commenting on Jacoby's column in his 2017 book The Death of Expertise, said: "Ordinary Americans might never have liked the educated or professional classes very much, but until recently they did not widely disdain their actual learning as a bad thing in itself. It might even be too kind to call this merely “anti-rational”; it is almost reverse evolution, away from tested knowledge and backward toward folk wisdom and myths passed by word of mouth — except with all of it now sent along at the speed of electrons."
Since 2014, a small but growing group of "Flat Earthers" has met regularly in Fort Collins, Colo., with sympathetic meetings occurring in a half-dozen other U.S. cities. A leader recalls seeing a YouTube video that promoted the idea of a flat earth. “It was interesting, but I didn’t think it was real. I started the same way as everyone else, saying, ‘Oh, I’ll just prove the earth is round.’ Nine months later, I was staring at my computer thinking, ‘I can’t prove the globe anymore.’” The article in the Denver Post says of this group: "Many subscribe to the 'ice wall theory,' or the belief that the world is circumscribed by giant ice barriers, like the walls of a bowl, that then extend infinitely along a flat plane." In 2017, searching YouTube by the exact phrase "flat earth" (with quotation marks) yields three-quarters of a million videos. In 2018, CNN reported, "a YouGov survey of more than 8,000 American adults suggested last year that as many as one in six Americans are not entirely certain the world is round, while a 2019 Datafolha Institute survey of more than 2,000 Brazilian adults indicated that 7% of people in that country reject that concept, according to local media."
In 2010, the Corporation for Public Broadcasting received funding amounting to 0.00014% of the U.S. federal budget. CNN/Opinion Research found early the next year that "Forty percent of those polled believe funding the CPB receives takes up 1 to 5 percent of the budget, 30 percent believe public broadcasting takes up 5 percent or more of the budget and 7 percent of respondents believe the non-profit receives 50 percent or more of the federal budget." The final cohort of respondents who thought it was more than half of the budget may also suffer from general mathematical or political illiteracy, but it seems nonetheless fair to say that many people have false beliefs about the funding for public broadcasting. (For comparison, when a Roper poll in 2007 accurately informed participants that the Public Broadcasting Service (PBS) receives funding equivalent to about $1 per American per year, half of the respondents said this amount was "too little.")
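Taking the figures above at face value (the 0.00014% actual share and the poll's belief bands, both quoted from the sources; the sketch below is purely illustrative), a few lines of Python show how large the overestimates were:

```python
# How far poll respondents' beliefs overshot the actual CPB share of
# the 2010 federal budget, using the figures quoted above at face value.
# All values are percentages of the federal budget.
actual_share = 0.00014  # percent of federal budget, as stated in the text

belief_bands = {
    "1 to 5 percent (40% of respondents)": 1.0,    # lower bound of each band
    "5 percent or more (30% of respondents)": 5.0,
    "50 percent or more (7% of respondents)": 50.0,
}

for label, believed in belief_bands.items():
    factor = believed / actual_share
    print(f"{label}: at least {factor:,.0f}x the actual share")
```

Even the lowest belief band (1 percent) overstates the quoted actual share by a factor of roughly seven thousand.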
In 2019, when asked about Arabic numerals by an opinion polling firm, a majority of Americans (56 percent) said the numerals should be excluded from the curriculum in American schools. ("Arabic numerals" are the shapes we recognize as numbers: 0, 1, 2, 3, 4, 5, 6, 7, 8, 9.) The question was designed to highlight how quickly most Americans would respond in a prejudiced manner to anything labeled "Arabic."
"There’s no shame in not knowing; there’s shame in not wanting to know. For years I’ve said this to my college students as a way of telling them that learning should never stop. But I have reluctantly come to the conclusion that, at a certain point, there should be shame in not knowing," Charles Taylor wrote in an opinion piece for the Boston Globe. He fretted over "creative-writing students who have never heard of Edith Wharton or Ralph Ellison; journalism students who can’t identify the attorney general; students who don’t know what the NAACP or the Geneva Convention are."
"The emerging narrative of this election is that Donald Trump was elected by people who are sick of being looked down on by liberal elites. The question the people pushing this narrative have not asked is this: Were the elites, based on the facts, demonstrably right?
* * *
That Trump voters chose an easily disprovable myth over readily available facts is one sign of their willful ignorance.
And still this imperviousness to fact pales next to the racism and xenophobia and misogyny — in other words, the moral ignorance — that Trump’s supporters wallowed in. All of the condescension of which liberals have been accused can’t begin to match the condescension of the current storyline that Trump voters are too disenfranchised or despised or dismissed to be held morally responsible for their choices.
* * *
The apologists for Donald Trump voters have given their imprimatur to a culture that equates knowledge and expertise with elitism, a culture ignorant of the history of the country it professes to love and contemptuous of the content of its founding documents."
It isn't clear from this brief column how Taylor thinks factual knowledge and moral knowledge might be related. Most people would say that moral knowledge depends on drawing conclusions that incorporate factual knowledge. (For example, you have to know whether someone else is threatening you before you can properly decide how to act in "self-defense." As another example, you have to know whether a crime occurred before you can express your opinion about it. Berel Lang wrote: "...the most extreme Holocaust 'revisionists' — Faurisson, Rassinier, Butz — do not deny that if the Holocaust had occurred, it would have been an enormity warranting moral reflection, judgment, and whatever else followed from these, presumably including condemnation and punishment; they deny only that it did occur.") Some would also say that moral knowledge is not merely a concatenation of ordinary beliefs and social agreements but that it exists in some separate sphere.
“According to a poll taken in August 2017, 47% of Republicans believe he [Trump] won the popular vote — even though he lost it to Hillary Clinton by nearly 3 million votes. Even more horrifying, 68% of Republicans mistakenly believe that millions of illegal immigrants voted in the 2016 contest. And here’s the authoritarian kicker for good measure: 52% of Republicans surveyed said that they would support postponing the 2020 presidential election if President Trump suggested that doing so was necessary to ensure that only legal citizens could vote. Democracies die when presidents can postpone elections based on the mythology of a pernicious lie.”
And in November 2020, days after Joe Biden won the popular vote by at least 4 million votes and also won the Electoral College (with the same number of electoral votes that Trump had won in 2016, incidentally), a Reuters/Ipsos poll found that only 6 in 10 Republicans said Biden had won. The false belief grew more popular and stuck. An Ipsos poll conducted March 30-31, 2021 found that 6 in 10 Republicans believed that Trump was the rightful victor and that the election had been somehow stolen from him, while nearly as many believed more specifically that Biden had won due to some form of vote fraud or election rigging.
Some political choices are motivated by beliefs that are not merely incidentally false, but superstitious and, I would argue, horribly immoral. Newsweek reported in January 2018 that "many evangelical Christians believe that Trump was chosen by God to usher in a new era, a part of history called the 'end times,'" and, accordingly, this group "overwhelmingly support[s] President Donald Trump because they believe he'll cause the world to end." [Emphasis mine.]
Many Americans said in an August 2018 poll that “the president should have the authority to close news outlets engaged in bad behavior,” despite the First Amendment guaranteeing the freedom of the press. The breakdown was partisan: 12 percent of Democrats, 21 percent of Independents, and 43 percent of Republicans believed the president should be allowed to shut down newspapers.
In a 2019 study spanning various countries, about 10–30% of atheists held superstitious beliefs, and agnostics were slightly more likely to do so. Meanwhile, 30–70% of the general population, which includes religious believers, held such beliefs. So while atheism correlates with a lower likelihood of superstitious belief, it does not root out superstition entirely. Belief in God and belief in superstitions are separate things.
Truth ought to matter
Stronger reasoning skills or more relevant background knowledge do not always help someone arrive at the correct answer. If a person's reasoning is motivated by something other than seeking truth (see the definition of "motivated reasoning"), then enhanced reasoning skills will only abet them in pursuing their ulterior motive. Ezra Klein wrote in Why We're Polarized:
If the problem was truly that people needed to know more about science to fully appreciate the dangers of a warming climate, then their concern should've risen alongside their knowledge. But here, too, the opposite was true: among people who were already skeptical of climate change, scientific literacy made them more skeptical of climate change.
This will resonate with anyone who’s ever read the work of a serious climate change denialist. It’s filled with facts and figures, graphs and charts, studies and citations. Much of the data is wrong or irrelevant. But it feels convincing. It’s a terrific performance of scientific inquiry. And climate-change skeptics who immerse themselves in researching counterarguments end up far more confident that global warming is a hoax than people who haven’t spent much time studying the issue. This is true for all kinds of things, of course. Ever argued with a 9/11 truther? I have, and they are quite informed about the various melting points of steel. More information can help us find the right answers. But if our search is motivated by aims other than accuracy, more information can mislead us — or, more precisely, help us mislead ourselves. There’s a difference between searching for the best evidence and searching for the best evidence that proves us right.
"Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.
* * *
Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs — and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.”
We care more about facts when we feel good about ourselves
“The 2000 [presidential] campaign was something of a fact-free zone,” said Brendan Nyhan, who was an undergraduate at Swarthmore at the time and who subsequently founded a political fact-checking website called Spinsanity, which led to a book, All the President's Spin. In his doctoral program at Duke University, he moved on to ask, as Maria Konnikova put it: "If factual correction is ineffective, how can you make people change their misperceptions?"
From Konnikova's article:
"Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research.
* * *
One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief — that is, a more lasting state of incorrect knowledge — and not all false beliefs are difficult to correct.
* * *
When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur.
In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior.
* * *
Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would."
It is also important to note the difference between actually believing something and merely claiming to believe it to maintain one's public image. Public image is more obviously related to one's identity and also to one's material interests. Alexander C. Kaufman provided this example:
"In December 2006, Exxon Mobil Corp. convened a two-day summit of environmental and ethics experts at a rural retreat near the base of the Blue Ridge Mountains in Virginia....For decades, Exxon had funded far-right think tanks that seeded doubt over the scientific consensus on climate change. [The new CEO Rex] Tillerson and Ken Cohen, Exxon’s PR chief and chair of its political action committee, wanted to broaden the company’s political reach. One step was changing their messaging about climate change, moving away from the denial the company had been attacked for supporting....Not long after the summit, Exxon began to modify its public stance on climate change."
Sometimes what is claimed publicly is done to maintain relationships and make money. Edward S. Herman and Noam Chomsky on how the American mass media operate:
"But a critical analysis of American institutions, the way they function domestically and their international operations, must meet far higher standards; in fact, standards are often imposed that can barely be met in the natural sciences. One has to work hard, to produce evidence that is credible, to construct serious arguments, to present extensive documentation — all tasks that are superfluous as long as one remains within the presuppositional framework of the doctrinal consensus. It is small wonder that few are willing to undertake the effort, quite apart from the rewards that accrue to conformity and the costs of honest dissidence."
A user writing as "Exiled Consensus" in December 2018 said that it gives "too much credit" to climate change denialists to say that they are primarily motivated by science denialism.
"It is equivalent to saying that cigarette company lobbyists analyze the facts, read the studies, conclude that cigarettes do not cause lung cancer, and are happy when their children develop smoking habits. Quite simply, such analysis does not occur, as author Ari Rabin-Havt explains in Lies, Incorporated. The intention of denial is to pollute discourse and sow doubt. That alone is considered a victory as it stalls action which can harm cigarette sales, or fossil fuel production. It is not science denial; rather, it is anti-profit denial, as the science is not even considered in the first place."
Or consider that Trumpist radio host Bill Mitchell tweeted in March 2020 that the coronavirus was "climate change 2.0," his way of saying that politicians were scaremongering. Mitchell is inclined to worry neither about climate change nor pandemic — or so he publicly claims. Intentionally ignoring scientists is part of his political ideology, as "GOP Climate Change Denial Set The Stage For Trump’s Coronavirus Conspiracies" more broadly (to quote a Huffington Post headline in July 2020).
Nyhan's work, by contrast, seems to be about more privately held beliefs.
Pessimism can drive down the accuracy of our estimates, according to Steven Pinker in his book Enlightenment Now:
"In a recent survey Hans Rosling found that less than one in four Swedes guessed that it [the average global life expectancy] was that high [71.4 years in 2015], a finding consistent with the results of other multinational surveys of opinions on longevity, literacy, and poverty in what Rosling dubbed the Ignorance Project. The logo of the project is a chimpanzee, because, as Rosling explained, 'If for each question I wrote the alternatives on bananas, and asked chimpanzees in the zoo to pick the right answers, they'd have done better than the respondents.' The respondents, including students and professors of global health, were not so much ignorant as fallaciously pessimistic."
So they say
Albert Einstein is reported to have said, "Only two things are infinite, the universe and human stupidity, and I am not sure about the former." Elbert Hubbard: "Everyone is a damn fool for at least five minutes every day. Wisdom consists in not exceeding that limit." George Bernard Shaw is credited with the warning that it would be better to know that one does not know: “Beware of false knowledge; it is more dangerous than ignorance.” As a note of hope, nonetheless, the words of Phyllis Bottome: "There is nothing final about a mistake, except its being taken as final."
Cryptids
"According to Chapman University’s Survey of American Fears, more than 20 percent of Americans believe Bigfoot is real, the same number who believe the Big Bang actually happened," William Giraldi wrote in 2019.
Animals often act smarter than us
Brian Klaas wrote in The Evolution of Stupidity (and Octopus Intelligence): What we can learn about intelligence, stupidity, and ourselves—from some of the smartest, strangest, alien-like creatures on the planet (The Garden of Forking Paths, April 23, 2024):
"Sadly, so much of our discourse around intelligence and stupidity gets hijacked by pseudoscience, racism, and debates over whether arbitrary measurements like IQ are valid. We ignore more interesting questions around intelligence and stupidity that we can learn not from ourselves, but from other species. In particular:
1. What, specifically, does it mean to be “intelligent?” What do we mean when we say that humans and chimps and dolphins and crows are intelligent?
2. Why did some species—including us—become smart, while others didn’t?
3. Why is stupidity still so widespread in humans?"
"Powell had earned his fortune as a lawyer representing the tobacco company Philip Morris, parrying the blows of critics who asserted that cigarettes were causing an epidemic of cancer in America. Though Powell and his associates fought these accusations tooth and nail, secretly they knew them to be true. ... This tactic would later be replicated by the oil industry as their own scientists confirmed that the burning of fossil fuels was beginning to endanger the environment and contributed to calamitous global climate change. By 1968 they already knew their businesses were creating 'serious world-wide environmental changes' that could lead to 'the melting of the Antarctic ice cap, a rise in sea levels, warming of oceans,' and a whole host of apocalyptic scenarios. Rather than deal with the looming crisis responsibly or ethically, the oil companies borrowed a page out of Big Tobacco’s playbook by using their resources to fund spurious science, often relying on the same individuals and companies to break consensus and inspire falsified skepticism."
Maybe everything's determined and we believe we have free will only because we haven't observed and identified why everything is happening?
"Richard Herrnstein, one of [B. F.] Skinner’s most-accomplished students, later his colleague in Harvard’s Psychology Department, and a luminary of radical behaviorism, once explained to me that any action regarded as an expression of free will is simply one for which 'the vortex of stimuli' that produced it cannot yet be adequately specified. We merely lack the means of observation and calculation."
— Shoshana Zuboff
In 2023, we're seeing the Supreme Court's "systematic discounting of, & even hostility to, the expertise of administrative agencies. It's an aggressive, assertive form of ignorance from a very powerful body", i.e., the court is embracing "populist epistemology". — Kevin Elliott on Bluesky
In reply, someone mentioned this paper: Anya Bernstein and Glen Staszewski, "Populist Constitutionalism," 101 North Carolina Law Review 1763 (2023).
Also
This is one reason (among several) that it’s worth society’s trouble to fully fund a community of *full time nonsense specialists*: scholars, reporters, and communicators on misinformation, disinfo, pseudoscience, and fringe, quack, and paranormal claims (plus research on extremism, propaganda, etc)
Most working scientists are traditionally very reluctant to engage with fringe claims because it wastes time, invites abuse, and also *is not their area of expertise* (except for a few who make nonsense an additional specialty). Creationists clobbering scientists in debates is the classic example
Society has never come close to supporting skeptics, debunkers, misinfo reporters, and disinfo researchers at any level remotely approaching the scale of the problem—and even today, during the “infodemic” or “misinformation apocalypse,” there are Very Serious People who deny there’s any such need
"...Republicans are often significantly better at ignoring the policy details and focusing instead on shaping schemas within which voters perceive the world. Too many people in politics wrongly think you’ll win the argument if you have better facts. But winning the argument in politics isn’t often about finding more or better facts. It’s about perception and the cognitive shortcuts we use to process information as we sort our world into neat categories that make sense."
Brian Klaas, Schemas and the Political Brain: The neuroscience and psychology of why better facts don't make winning arguments in politics. Garden of Forking Paths, Apr 18, 2025
Javier Cercas. The Anatomy of a Moment: Thirty-five Minutes in History and Imagination. (2009) Translated from the Spanish by Anne McLean. New York: Bloomsbury, 2011. p. 3.
One question is whether empiricism itself can be empirically verified. Michael Lerner:
"Consider its [scientism’s] central belief: ‘That which is real is that which can be verified or falsified by empirical observation.’ The claim sounds tough minded and rational, but what scientific experiment could you perform to prove that it is either true or false?"
Another is whether, although empiricism is indeed a value, there may be other methods and approaches that are also valuable. Sir James Baillie:
"Empiricism is so true that the closer one keeps to it – without becoming an empiricist! – the better. Just as, on the contrary, Idealism is so questionable that the farther one keeps from it – without ceasing to be an idealist! – the truer will one's view of reality be."
Daniel C. Dennett:
"This spell must be broken, and broken now. Those who are religious and believe religion to be the best hope of humankind cannot reasonably expect those of us who are skeptical to refrain from expressing our doubts if they themselves are unwilling to put their convictions under the microscope. ... If the case for their path cannot be made this is something that they themselves should want to know. It is as simple as that. They claim the moral high ground; maybe they deserve it and maybe they don’t. Let’s find out."
In philosophy, it is important to define one's terms precisely, as advised by the Marquis de Condorcet:
"One of the essentials for any sound philosophy is to produce for every science an exact and precise language in which every symbol represents some well defined and circumscribed idea; and so by rigorous analysis to guarantee that every idea is well defined and circumscribed."
Defining terms enables them to be verified. However, if defined too rigorously, one loses the flavor, subtlety and ambiguity of ideas, as well as the different human perspectives that generate them.
Sources
Michael Lerner. The Left Hand of God: Taking Back Our Country from the Religious Right. HarperSanFrancisco, 2006. p. 132.
Sir James Baillie. Reflections on Life and Religion. London: George Allen and Unwin, 1952. pp. 252–253.
Daniel C. Dennett. Breaking the Spell: Religion as a Natural Phenomenon. New York: Penguin Group, 2006. p. 17.
Marie-Jean-Antoine-Nicolas Caritat, Marquis de Condorcet. Sketch for a Historical Picture of the Progress of the Human Mind. (1794) Translated by June Barraclough. Westport, Conn.: Hyperion Press, Inc., 1955. p. 44.
A character in a novel by Gregory David Roberts called truth a "bully". People feel they have to pursue and honor truth even when they don't like it or it doesn't seem to serve them.
"...take yesterday, for instance, when we were all talking about truth. Capital T Truth. Absolute truth. Ultimate truth. And is there any truth, is anything true? Everybody had something to say about it – Didier, Ulla, Maurizio, even Modena. Then you said, The truth is a bully we all pretend to like. I was knocked out by it. Did you read that in a book, or hear it in a play, or a movie?"
This truth, this bully, requires its own bully – the challenge of falsehood – to compel it to strengthen itself. John Stuart Mill wrote, "If opponents of all important truths do not exist, it is indispensable to imagine them and supply them with the strongest arguments which the most skilful devil's advocate can conjure up."
But then, maybe the devil's advocate is also correct, as Niels Bohr put it: "There are trivial truths and there are great truths. The opposite of a trivial truth is plainly false. The opposite of the great truth is also true."
Some see a "hierarchy of truth," as explained by Kovach and Rosenstiel:
"It is interesting that oppressive societies tend to belittle literal definitions of truthfulness and accuracy, just as postmodernists do today, though for different reasons. In the Middle Ages, for instance, monks held that there was actually a hierarchy of truth. At the highest level were messages that told us about the fate of the universe, such as whether heaven existed. Next came moral truth, which taught us how to live. This was followed by allegorical truth, which taught the moral of stories. Finally, at the bottom, the least important, was the literal truth, which the theorists said was usually empty of meaning and irrelevant. As one fourteenth-century manual explained, using logic similar to what we might hear today from a postmodern scholar or a Hollywood producer, 'Whether it is truth of history or fiction doesn't matter, because the example is not supplied for its own sake but for its signification.'
"The goal of the medieval thinkers was not enlightenment so much as control. They didn't want literal facts to get in the way of political/religious orthodoxy. An accurate understanding of the day threatened that control – just as today it is a weapon against oppression and manipulation."
What reins in truth? Every truth requires "a measure of some kind," or else it is not recognized as truth. Nicholas Fearn:
"The wider conclusions of Protagoras may be self-refuting, but he did hit upon an important insight. This is the thought that every truth requires a measure of some kind. Truths are not true of and in themselves, but are true within a system of thought, or according to certain rules that test their veracity. This would be the case even if there were only one objective measure of truth. It is unequivocally true that two plus two equals four, but only because four is always the result when we apply the rules of addition correctly. The value of a pair of shoes, on the other hand, may be different according to whether they are given to a beggar or a king, but in each case their value is a value to someone. In both cases, the measure of the truth is external to what it evaluates. How we are to evaluate the measure is another issue, and one that does not always have an easy answer. It will certainly not do to say that this measure is simply 'reality' or 'the way things are,' since how we divine the nature of things is precisely what is in question."
Sources
Gregory David Roberts. Shantaram. New York: St. Martin's Griffin, 2003. p. 60.
John Stuart Mill. On Liberty. Quoted in Rollo May, Power and Innocence: A Search for the Sources of Violence. New York: W. W. Norton and Co., 1972. pp. 109–110.
Bill Kovach and Tom Rosenstiel. The Elements of Journalism: What Newspeople Should Know and the Public Should Expect. New York: Three Rivers Press, 2001. p. 38.
Nicholas Fearn. How to Think Like a Philosopher. New York: Grove Press, 2001. p. 15.
"True ideas are those that we can assimilate, validate, corroborate, and verify," James summed up his empiricist theory of pragmatism. "False ideas are those that we cannot."
There is no epistemological chasm between reality and knowledge; the space between them is filled with ideas and sensations. The universe is made of relationships that are as much defined by experience, and therefore as subject to debate, as the things they relate.
Adding context to experiences--for example, realizing the identity of someone seen--gives knowledge. Knowledge is having an idea that resembles and impacts reality. Solipsistically copying the universe in our minds, such as knowing the number of hairs on a head, achieves no purpose. "All that the pragmatic method implies, then, is that truths should have practical consequences." Scientific laws are a "human device" and "true so far as they are useful." James wrote, "'The true' is only the expedient in the way of our thinking, just as the right is only the expedient in the way of our behaving."
"Truth" means connection or relation to "terminal experiences," the "linchpins of all reality." The linchpins themselves are not "true." We think we have knowledge when our propositions are consistent (often achieved by leaving out contradictory or unknown facts). Pragmatism, a method of thinking and behaving, is not necessarily a call to action because ideas can be said to "work" with other ideas.
Broadly applied, pragmatism can be called humanism, which holds that an experience is "true" if it minimizes contradiction and yields satisfactory results with related experiences. You know a building's location if you can lead someone there. Truth is the event of verification; a belief isn't true until proven. Therefore, "experience and reality come to the same thing." The knower and the known are both parts of experience; "experience as a whole is self-containing and leans on nothing."
In contrast to pragmatism, "absolutism" or "transcendentalism" maintains that certain propositions are true regardless of any useful consequences to believing them. However, James notes, the only "cash-value" of a transcendent reality is whether there are practical results to knowing it. "The transcendentalist believes his ideas to be self-transcendent only because he finds that in fact they do bear fruits. Why need he quarrel with an account of knowledge that insists on naming this effect?" Pragmatism fleshes out a definition of truth that absolutism phrases only in the abstract. "We offer them the full quart-pot, and they cry for the empty quart capacity." The view that "concrete workings" are irrelevant to truth is "the renunciation of all articulate theory."
Pragmatism is inaccurately accused of holding that anything is true if one thinks it true at the present moment. Rather, pragmatism emphasizes the context in which idea and object relate. Another objection is that the pragmatist thesis is not itself meant to be pragmatically understood. James responds that it is indeed; an idea is true if it is satisfactory, and the pragmatic thesis is "ultra-satisfactory" to pragmatists.
William James. The Meaning of Truth: A Sequel to "Pragmatism." New York: Greenwood Press, 1968. (Originally 1909.)
This summary was written in 2005, along with a series of other 500-word summaries of philosophy books, as an exercise in brevity.
Atheist neuroscientist Sam Harris's book The Moral Landscape: How Science Can Determine Human Values (2010) argues for a consequentialist and realist moral theory where the good is whatever promotes human well-being. Harris says that insofar as there are objective facts about what makes humans thrive, there are objective facts about how we ought to treat other people. In any one situation, there may be multiple relevant types of well-being to be considered, it may be difficult to gather data and assess risk, and there may be dissenting opinions, but this does not mean the moral question has no correct answer. Moral facts may exist even if we are untalented at perceiving them and tend to disagree about them.
There are facts about morality, he says, because there are facts about well-being. "If we were to discover a new tribe in the Amazon tomorrow, there is not a scientist alive who would assume a priori that these people must enjoy optimal physical health and material prosperity," Harris writes. Scientists would want to study the tribe's health and prosperity before passing judgment. On the other hand, he suspects that
"news that these jolly people enjoy sacrificing their firstborn children to imaginary gods would prompt many (even most) anthropologists to say that this tribe was in possession of an alternate moral code every bit as valid and impervious to refutation as our own...The disparity between how we think about physical health and mental/societal health reveals a bizarre double standard: one that is predicated on our not knowing - or, rather, on our pretending not to know - anything at all about human well-being."
We do indeed know some things about individual and social well-being, Harris says, such as that killing children doesn't contribute to it. "But this notion of 'ought' is an artificial and needlessly confusing way to think about moral choice," he qualifies. "For instance, to say that we ought to treat children with kindness seems identical to saying that everyone will tend to be better off if we do."
He bravely acknowledges some of the problems with consequentialism. For example, he quotes philosopher Patricia Churchland who has pointed out that "no one has the slightest idea how to compare the mild headache of five million against the broken legs of two, or the needs of one's own two children against the needs of a hundred unrelated brain-damaged children in Serbia." He also mentions the problem of calculating well-being as raised by philosopher Derek Parfit: If total well-being is our aim, it must be better to fill the Earth with hundreds of billions of people who have the barest glimmer of joy than to make the current seven billion people very happy. If, on the other hand, we are to focus on the average well-being per capita, we should euthanize our unhappiest people and we should prefer large numbers of mostly miserable people over one exceedingly miserable person. We would not normally intuit these to be moral goals, so perhaps something is amiss with these approaches to consequentialism.
Harris also acknowledges John Rawls's criticism that fair play and optimum outcomes sometimes seem at odds with each other. What if enslaving a few people would make the rest of humanity very happy? In a consequentialist moral theory, why can't we treat people as means to the ends of others? Harris doesn't try to tackle too much in his response to these objections - which was probably the right thing to do within the scope of a book that is already quite ambitious - but it is important to realize that these questions are left open. This kind of radical uncertainty doesn't lend support to his idea that moral questions have right and wrong answers. It isn't just that the calculation is too hard to crunch; it's that it looks suspiciously like it can't be run at all.
Harris explores the evolution of morality and its roots in cooperation and language, referring to the work of William Hamilton on kin selection, Robert Trivers on reciprocal altruism, and Geoffrey Miller on sexual selection. He talks at some length about how the brain's medial prefrontal cortex (MPFC) is associated with both mathematical and ethical beliefs; people with damage to that area are more likely to remove emotion from moral analyses. He correlates a healthy amount of fear with moral understanding.
He points out that, as Adam Smith noted, we tend not to be swayed by anonymous and distant suffering, even if we know it involves large numbers of people. He mentions the work of psychologist Paul Slovic, who has pointed out that this leads to "genocide neglect." Harris concludes: "Clearly, one of the great tasks of civilization is to create cultural mechanisms that protect us from the moment-to-moment failures of our ethical intuitions." Another gap in our moral reasoning is that we suffer from poor intuition on risk analysis. We tend to gravitate toward the power to "save" some and shy away from the complementary choice of "losing" the rest, even when these amount to the same thing and the only difference is in the linguistic expression of the glass being half full or half empty. Harris explains: "Another way of stating this is that people tend to overvalue certainty: finding the certainty of saving life inordinately attractive and the certainty of losing life inordinately painful."
He doesn't believe in free will and thinks we trump up the idea of it more than we actually feel a sensation that represents free will. First of all, from a neurological point of view, unconscious brain activity associated with an action has been documented to precede our awareness of having chosen to do anything. It's as if the car starts before we turn the key. The sense of having "chosen" may be an illusion. Secondly, although we have the freedom to emphasize certain facts, we cannot, given a body of available evidence, freely choose to believe something that contradicts the evidence. He concludes that scientific and ethical judgments have something in common; beliefs about facts and beliefs about values are not very different from a neurological perspective.
The book's shortcoming lies in his attempt to straddle the fence on the question of how much we should listen to our moral intuition. For example, he says that the fact that his moral theory fits our commonsense definition is a reason in favor of accepting his theory: "While moral realism and consequentialism have both come under pressure in philosophical circles, they have the virtue of corresponding to many of our intuitions about how the world works." This does not quite dovetail with his list, mentioned above, of all the ways in which our moral intuitions tend to let us down due to rational and emotional flaws in our brains. If we're so fallible about the details of calculating consequentialist outcomes, why should we assume that consequentialism itself is a correct theory? Who cares what seems right to a species that is so often wrong?
A specific application of this problem occurs when he takes anthropologist Scott Atran to task for claiming that the religious motives articulated by violent Muslim jihadis may not be what really causes them to become violent; they may be radicalized by a lack of social integration, for example. Harris seems outraged as he insists that "given the clarity with which they articulate their core beliefs, there is no mystery whatsoever as to why certain people behave as they do." This is an odd comment, coming from him; isn't one of the main arguments of his book that people frequently are mistaken about their own motives? As mentioned above, Harris did say that we should create "cultural mechanisms" that would, in essence, trick us into behaving better. There seems to be a trace of unexamined bias here: is it well-meaning atheists who would benefit from paternalistic cultural guidance designed to draw out increasingly better behavior that reflects their authentic inner goodness, while militant Muslims aren't expected to have psychological depth beyond what they announce publicly and are in this respect unreformable?
A related criticism is that he's unclear about the specific roles of reason and emotion in our moral judgments. Mentioning the work of philosopher Jonathan Haidt who has argued that moral decisions are usually made on closed-minded gut instinct rather than on open-minded careful reasoning, Harris insists that just because we're not always rational doesn't mean we shouldn't set rationality as a goal. But wasn't his mention of the MPFC's role in injecting emotion into moral decisions meant to imply that emotion is an inseparable part of moral decision-making? "Of course," he says, "it is now well known that our feeling of reasoning objectively is often illusory. This does not mean, however, that we cannot learn to reason more effectively, pay greater attention to evidence, and grow more mindful of the ever-present possibility of error." One wonders whether he would be willing to apply this charge of an "illusory feeling of reasoning" to his own "intuition about how the world works" as mentioned above.
Skepticism of any kind is a slippery slope to radical skepticism. Perhaps we should be radical skeptics about many of our moral intuitions, but Harris doesn't let us know exactly when and why he jumps off the skeptic's bandwagon. Alternatively, rather than a problem of skepticism, it could be a problem of subjectivity that confronts all scientists, and perhaps especially those scientists that study the human mind. How can a person use his or her own subjectivity to study other subjective beings and call it an objective process?
The conflation of reason and emotion, and of the metacognition that chooses which method of judgment to apply in a given circumstance, also plagued Jonah Lehrer's recent book on the neuroscience of decision-making. In How We Decide, Lehrer claims that reason and emotion are interrelated phenomena, and he prods this blurry area without defining (or successfully redefining) these terms. Lehrer's book is not listed in Harris's prodigious bibliography. Might Harris have avoided making the same error had he studied Lehrer's presentation?
Overall, this is a provocative book worth reading, especially for those with prior exposure to these popular topics in philosophy and neuroscience. Harris sticks to the narrow path and, for the most part, avoids wading into the treacherous waters of bitter diatribes against religion. He effectively demolishes the allegation of a metaphysical basis for morality that is defended by most religions, not by making sport of religious traditions and ancient texts, but simply by offering a substantive, nuanced, functional, science-based alternative.
"I...deny the existence of sexual virtues," Wollstonecraft wrote. "For man and woman, truth, if I understand the meaning of the word, must be the same..." By "sexual virtues," she means any virtues assigned primarily to one sex or the other. She would be happy to learn that, in the United States today, there is general agreement that we should not have "double standards" of virtue for men and women, especially in areas such as education, career, athletics, sexual behavior, and child rearing.
But on another point of Wollstonecraft's, there is less agreement. She continued: "I do earnestly wish to see the distinction of sex confounded in society..." Many people, including feminists, do not want to promote androgyny for everyone or the erasure of differences between the sexes and genders. Some believe that there are biological differences between males and females, the importance of which should not be discounted; others simply feel that differences in gender performance make the world a more interesting place.
Yet it is hard to see how these two claims can be separated from each other. How can one make the former claim that men and women are alike in all morally relevant respects without also making the latter claim that there should be no social distinction between them?
Mary Wollstonecraft. A Vindication of the Rights of Woman. (1792) Chapter 3: The Same Subject Continued, and Chapter 4: Observations on the State of Degradation to Which Woman Is Reduced by Various Causes.
"Mary had furiously written Rights of Men in 1790 to defend the French Revolution from the conservative gripes of philosopher Edmund Burke, who described working people as a 'swinish multitude' unworthy of the vote. Men was an instant bestseller. Paine's 1791 Rights of Man, which was similarly sympathetic to the Revolution, was an even bigger hit; it sold as many as a million copies."
* * *
Mary followed up Rights of Men with A Vindication of the Rights of Woman in 1792; it was an influential work of early feminism, calling for drastic upgrades in gender equality and asserting the importance of education for girls.
— Brian Merchant. Blood in the Machine: The Origins of the Rebellion Against Big Tech. Little, Brown and Company, 2023.