Saturday, October 14, 2017

On tolerating intolerance: Thoughts from 'The Atheist Muslim'

Ali Rizvi begins The Atheist Muslim with a memoir of his formative years growing up Muslim and then proceeds to general atheist arguments. He is concerned about human rights and also about logical arguments for the existence of God. He identifies the moment "when I let go of religion completely" as the moment when he learned that the Big Bang created time itself and therefore eliminated the necessary frame for the idea of creation.

The most valuable chapter for me is the sixth one, “Islamophobia-Phobia and the ‘Regressive Left,’” where he ties together modern identities and arguments for Islam and atheism. In this chapter, he begins by identifying as a “free speech absolutist” because individuals, not governments, should decide what constitutes “hate speech.” For one thing: “Criminalizing hate speech like France does infantilizes people. It doesn’t just take away someone’s right to speak; it takes away your right to form your own opinions and response to them.” Furthermore, “The uncomfortable truth is this: if you really wanted to ban all hate speech, the Bible and Quran would be the first to go. Next would be the preachers who read from them and quote them in their sermons.” (p. 132)

He goes on:


“In their well-intentioned effort to protect what they see as a targeted minority [people with Muslim identity], Western liberals unwittingly find themselves fighting to guard and protect the same backward values [of the Muslim religion] that their counterparts in Muslim-majority countries are fighting against.” (p. 133) He asks us to “Consider the case of my friend Raif Badawi, the liberal Saudi blogger who is currently serving a ten-year prison term with a sentence of one thousand lashes; or all the Bangladeshi bloggers who have been hacked to death for writing critically about Islam.” (p. 134)


It is more important now than ever to challenge and criticize the doctrine of Islam. And it is more important now than ever to protect and defend the rights of Muslims. Both of these must go together. … The only rational position between Islamic apologism and anti-Muslim bigotry is one espousing secular and liberal values. This is the only position that allows both the right to criticize bad ideas and the right to believe in them — both of which must be protected in order to set the stage for meaningful dialogue. (p. 135)


“Again, it’s crucial to emphasize the difference between criticism of Islam and anti-Muslim bigotry. The first targets an ideology. The second targets human beings. This is an obvious, significant distinction, yet both are frequently lumped together under the unfortunate, reductive umbrella term ‘Islamophobia.’ Again, human beings have rights and are entitled to respect. Ideas, beliefs, and books don’t and aren’t. The right to believe what one wants to believe is sacred. The beliefs themselves aren’t. Challenging ideas moves societies forward. Demonizing people rips societies apart. If anything, it’s precisely because of how I’d seen ordinary Muslims suffer under theocratic policies and Sharia law that I wanted to start a dialogue to help shatter the taboo of criticizing religion.” (p. 137)

Some nuance is missing here. Much of the modern debate over tolerance of sexual orientation has centered on the question of whether same-sex desire is something that one chooses or something that is an immutable feature of one's being, and therefore whether same-sex behaviors (and the tolerance thereof) can be said to be ideologically motivated. My point in bringing this up is not to shift the topic from tolerance of Muslims to tolerance of gays, but rather to suggest that a similar dynamic might be at play in assumptions about how much of religious belief is ingrained in someone's personality from an early age and can't easily be unwound upon mere instruction from others.


“Criticizing, satirizing, and even mocking any belief system is never bigoted or racist.” (p. 143)


“When legitimately criticizing illiberal elements of Islam — as we might do any other religion or political ideology — elicits accusation of bigotry and racism, it abruptly ends an important conversation that needs to be had. Calling someone a bigot, racist, or Islamophobe isn’t a counterargument. It’s a lazy substitute for one. Yet we all fall for it.” (p. 146)


He quotes Reza Aslan: “People don’t derive their values from their religion — they bring their values to their religion...Those interpretations have nothing to do with the text, which is, after all, just words on a page, and everything to do with the cultural, nationalistic, ethnic, political prejudices and preconceived notions that the individual brings to the text.” Rizvi challenges Aslan’s overstatement that religious texts have “nothing” to do with values. “So, every time a jihadist yells ‘Allah Akbar!’ and severs an infidel’s head from his body with a knife, citing verses like 47:4 and 8:12-13 from the Quran,” Rizvi asks, “you would blame every possible factor for his actions except the one that literally contains the words, ‘Smite the disbelievers upon their necks’?” (pp. 148-149) He also observes that blaming the people (identified, for example, by their culture that supposedly determines their values) does tend toward bigotry.


“Our criticisms of religion aren’t an attack on people, but a challenge to what we consider bad ideas that drive bad behavior, and the sacred status afforded to them. Our opposition to religion isn’t a demonstration of bigotry; it is a demonstration against it.

Bigotry against bigotry isn’t bigotry, and tolerance of intolerance isn’t tolerance.

...

Liberalism isn’t just about tolerance of dissent. It is also about an intolerance of those that don’t tolerate dissent.” (pp. 159-160)


Page numbers from:

Ali A. Rizvi. The Atheist Muslim: A Journey from Religion to Reason. New York: St. Martin’s Press, 2016.

Saturday, October 7, 2017

Fatal flaws in 'Mere Christianity' by C. S. Lewis - Part 1, The Moral Law

Mere Christianity by C. S. Lewis is a book beloved by millions of people. When I first read it out of curiosity, at age 18 and having just started college, I found it to be a string of fallacies. I did not expect ever to change my mind about this, but I did expect that one day I would be asked to explain why. Nearly 20 years later, I reread the book, and here is my explanation.

My objections address the following 14 flaws in C. S. Lewis' positions and arguments, and they concern only the first part of the book. (The rest of the book is also flawed, but I have not yet written an explanation as to why I believe so.)

Part I - The Moral Law

1. He assumes that there are facts about moral right and wrong, that individuals generally have intuitive knowledge of the correct answers to these moral questions, and that moral obligation presents itself as an "impulse."

The moral law is a kind of natural law, except that, as distinct from physical laws such as the law of gravity, the moral law can be disobeyed.

He says we are full of instincts upon which it may be moral (or not) to act depending upon the situation. When one "hear[s] a cry for help from a man in danger," one may feel conflicting desires about whether to help the man or to keep oneself safe. There must be, then, "a third thing which tells you that you ought to follow the impulse to help, and suppress the impulse to run away. Now this thing that judges between two instincts, that decides which should be encouraged, cannot itself be either of them." The Moral Law is not itself an instinct, but rather it is something that can judge between instincts based on circumstances.

The problem: For any individual in the moment, one impulse might win out simply because it is stronger than the other. It isn't logically necessary that there has to be an outside arbiter deciding which impulse is right. He is wrong to say that "this thing that judges...cannot [my emphasis] itself be either of them." From an evolutionary perspective, whatever tends to promote survival of individuals and groups will be the trait that is passed down. Another possibility is that these "impulses" might not be discrete, unchanging things, but might be facets of a dialectic conversation in which each "side" informs the other before an action is chosen. The "weaker impulse" might be seen as weaker in retrospect because it was the losing side, the path not taken, or a path that was ultimately taken only circuitously and with extra support. It is not really a weak impulse. It may be a strong impulse for the sake of which we have to fight against social or political currents.

But if Lewis is correct in that there really is a Moral Law, he needs to do more work to prove why it cannot be an instinct. Perhaps it is the only instinct upon which we always ought to act. If it is not an instinct, he needs to explain how it is that we are intuitively aware of it. By what mechanism does it work?

He furthermore says that "at those moments when we are most conscious of the Moral Law, it usually seems to be telling us to side with the weaker of the two impulses."

The problem: It is unremarkable that we are most conscious of making a moral judgment when there is some kind of battle between impulses and we feel a calling to side with the one we might not automatically choose. Only when there is an internal battle are we aware that we are making a hard decision. Our “stronger” impulses let us run on autopilot, and our society endorses them and does not stand in our way; our "weaker" ones have to be consciously chosen and politically defended. This does not prove that there is a Moral Law that favors the "weaker" impulse and helps us decide.

Later he claims that the mistaken worldview he calls “Dualism” pits good and evil against each other as two equal forces. The two forces cannot be perfectly equal, he says, since the mere identification of one force as “good” reveals that “one of them is in a right relation to the real ultimate God and the other in a wrong relation to Him.” Also, the evil force cannot exist independently of the good force, since “you cannot be bad for the mere sake of badness…badness is only spoiled goodness.” Thus, while Christianity maintains that there is a dualist war between good and evil, Christianity presents it more specifically as “a civil war, a rebellion,” with the devil himself as a fallen angel.

The problem: If he is here suggesting that a good God is stronger than an evil Devil, this doesn't align with his claim that in humans the impulse that tends toward the good is the weaker impulse.

2. His depiction of moral agreement and disagreement is extremely oversimplified.

On the subject of moral agreement, he says we can speak coherently of "moral progress," or "changing for the better" on a societal level, because of the assumption that "Reformers or Pioneers...understood morality better than their neighbours did....that there is such a thing as a real Right, independent of what people think, and that some people's ideas get nearer to that real Right than others."

On the subject of moral disagreement, he says that people may "quarrel" when one points out an unfairness, cruelty, or forgotten promise and the other replies by rationalizing "that what he has been doing does not really go against the standard, or that if it does there is some special excuse." Yet this disagreement simply illuminates, he believes, "some sort of agreement as to what Right and Wrong are". When people are asked to account for their behavior and specifically to explain why it does not conform to a moral standard, they provide "a string of excuses as long as your arm". Lewis interprets this as meaning that "we believe in decency so much...that we cannot bear to face the fact that we are breaking [the moral law], and consequently we try to shift the responsibility."

The problem: Assuming that moral truths are facts (rather than individual or social constructs) is a large assumption. Even granting him that assumption, he hasn't given samples of the varied types of moral agreements and disagreements. Our intuition about what is "right" is not infallible. How do we know when our intuition works and when it fails us? Surely there are examples when a majority of people are in agreement, but we want to say that the majority's conclusion is wrong. His example of a "quarrel" is limited to one specific scenario: one person calling out another on a violation of a shared standard, probably one of which they were both already conscious. Other kinds of quarrels include accusations of "violations" of expectations that were never articulated and arguably never existed, genuine disagreements over what the shared standard ought to be, or pleas for privacy and autonomy to make one's own way independently of others' expectations (i.e. a claim that the issue isn't properly a moral question at all). All of these would complicate his approach. He should also acknowledge that individual psychology helps determine one's moral intuitions, meaning that people care about different virtues and reach different conclusions (see the 21st century work of George Lakoff and Jonathan Haidt) and that moral reasoning skills vary in sophistication based on age and other ability levels (see the work of Lawrence Kohlberg from 1958 onwards). These particular thinkers were not available to Lewis when, in the early 1940s, he gave his BBC radio talks that were later published as Mere Christianity, but the concepts are timeless and he might have begun to inquire about these problems himself. He should also acknowledge the role of power (in the sense of a dominant culture exerting control over individuals) in each individual's ability or willingness to come up with moral answers.

Here is one example of a moral disagreement. To the common objection that God’s salvation is exclusivist, Lewis says, “if you are worried about [the salvation of] the people outside [Christianity, who haven’t heard or can’t believe the Christian message], the most unreasonable thing you can do is to remain outside yourself.”

The problem: To remain outside deliberately in this situation is to make a principled protest against exclusivity. To refuse to embrace a specific religion can be a form of protest. This does not reflect an underlying "agreement as to what Right and Wrong are" and an embarrassed rationalization of one's own disobedience. Rather, it reflects real disagreement about a fundamental assumption.

3. He offers limited insight into when we should oppose others' moral agenda, and no insight into when our opposition should lead us to intervene and when it should remain a privately held belief.

He assumes that the moral consensus allows us to judge others. Even when the person being criticized does not admit to recognizing the law, they can still be held to its standard and blamed for transgressing it: "What was the sense in saying the enemy [in WWII] were in the wrong unless Right is a real thing which the Nazis at bottom knew as well as we did and ought to have practiced?"

This is not an adequate investigation of how we should properly construe and defend human rights. A plausible competing explanation is that human rights are a political construct that was invented because it objectively improves people's lives and satisfies our subjective, empathy-driven concerns; that such rights are best understood and implemented in conditions of political freedom; and that when a society falls into totalitarianism there is a perceived moral need to rescue it or help it rescue itself from that condition so that human rights can be restored.

4. He downplays the significance of the diversity of moral opinions.

He assumes that, in its most general form, the content of the moral law is valid across all cultures: "Men have differed as to whether you should have one wife or four. But they have always agreed that you must not simply have any woman you liked." The differences, he says, are "not nearly so great as most people imagine".

The problem: This is not a comprehensive statement on the extent of cultural diversity. He cannot just assume that human differences are not significant and meaningful. A man's attitude "that you must not simply have any woman you liked" may be a reflection of individual psychology, gender roles, or political power. It can be any or all of these things independently of whether it is also an objectively valid moral fact. Furthermore, if it is a moral fact, it is curiously devoid of content. What would be the use of a transcendent law that says only "Thou shalt obey rules of sexual conduct"? The content of the rule needs to be defined. If it is humans who write the content, and if this content can vary across cultures, that undermines Lewis' point greatly.

5. As part of defending the idea of shared core moral beliefs, he is too quick to dismiss the counterobjection of why atrocities happen.

He argues that significant collective moral lapses are not really exceptions to the rule that "everyone has intuition of correct moral answers" but rather that there must have been mitigating circumstances that confused the moral analysis. In the following example, he excuses moral error on the basis of factual ignorance. It was reasonable for English people to execute suspected witches, given – so goes his apology – the widespread, factually mistaken belief that some people really were evil witches; as he puts it, "surely we would all agree that if anyone deserved the death penalty, then these filthy quislings did."

The problem: This implies that he should want to at least partially morally exonerate the Nazis due to their factually incorrect beliefs about the targets of their witch hunts, and that is an undesirable conclusion. More abstractly, he hasn't explained why a failure of factual knowledge should overpower, and why it should excuse the overpowering of, our usual empathy and political systems that serve as checks and safeguards on the possibility of human violence. Nor has he demonstrated how we know when we are disagreeing about facts on the ground and when we have a more fundamental disagreement about a moral standard – in other words, what is the distinction between scientific progress and moral progress, both of which are needed for us to be able to treat others well.

6. He suggests that virtue needs no explanation.

He says: "If we ask: 'Why ought I to be unselfish?' and you reply 'Because it is good for society,' we may then ask, 'Why should I care...' * * * You would have said just as much if you had stopped at the statement, 'Men ought to be unselfish.' And that is where I do stop."

The problem: This is a straw man. The interlocutor could give a better reply than that: for example, either about the evolutionary origins of unselfishness (why it can promote survival, and thus how we acquired it), or about the merits of being selfish or unselfish in any particular situation, since, as he acknowledged earlier, it does vary based on circumstance. Just because he did not provide a better answer doesn't mean there isn't one.

7. He wants to limit the way in which the Moral Law can demonstrate its existence.

He says "If there was a controlling power outside the universe, it could not show itself to us as one of the facts inside the universe" (which is fine), and then follows up with a more problematic statement: "The only way [emphasis mine] in which we could expect it to show itself would be inside ourselves as an influence or a command trying to get us to behave in a certain way." He says that this subjective experience of directly perceiving the Moral Law is the only way we can assure ourselves of its existence. Empirical observation of how we actually do behave cannot reveal the fact that we also have opinions of how we ought to behave.

The problem: It is not obvious what he means by saying that the Law can't be "inside the universe" yet can be "inside ourselves." Furthermore, his statement on behaviorism isn't obviously true. Psychological experiments are performed on animals to investigate their moral capabilities, and, based on this empirical observation, researchers draw conclusions about whether animals have opinions about proper behavior. An ordinary person who watches a dog and a fish would conclude that the dog has opinions of how it ought to behave while the fish does not. Whether or not this perception is accurate, it is easy to form the perception.

8. He too quickly dismisses different spiritual approaches.

He disapproves of the humanist/evolutionary view of a Life-Force because he doesn't think people feel accountable toward it. It is just a pleasant idea and does not enforce moral behavior. The belief in absolute good and bad, together with the association of God with the absolute good, also precludes what he calls “Pantheism,” the belief that “the universe almost is God...and anything you find in the universe is a part of God.” This is, he explains, because some things are bad; therefore, some things are not part of God. It is God who gives us the ability to distinguish good from bad.

The problem: The charge that this is a pleasant idea that doesn't hold us accountable could also be made against more traditional ideas of God. A god need not recommend moral behavior, and there is not necessarily any way for God to enforce it or any interest on the part of people in obeying it. Philosophy addresses this under the term "divine command theory."

9. He posits a God who is disappointed in us, and he mistakenly suggests that this belief is more comforting than atheism.

The Moral Law “tells you to do the straight thing and it does not seem to care how painful, or dangerous, or difficult it is to do.” We may conclude that God himself “is not soft,” since the Moral Law comes from God and tells us about God’s character. This makes God rather terrifying, since “if there does exist an absolute goodness it must hate most of what we do.” Yet atheism would be no comfort since it would amount to nihilism: “If the universe is not governed by an absolute goodness, then all our efforts are in the long run hopeless.” So we begin with despair. We should not seek comfort, but we should seek the truth about our situation and we may find comfort.

The problem: We will have concerns about mortality and posterity regardless of whether there is a God and whether we believe in God. Facing the truth of our evanescence is not specific to theism and is not necessarily comforting.

10. His theodicy (resolution of the question of why evil exists) is unsatisfying.

He says that God has given humans free will to obey or disobey. Why? Because God is “probably the same” as a parent who wants the children to learn. Generally, God helps us “love and reason” in the manner of a teacher who holds a child’s hand while it writes. God wants the world this way because it’s the only way there can be real meaning in human life rather than a world full of robots (“automata”). Furthermore, God invented humans to run on him as a car runs on gasoline, so we can never find happiness without religion or while arguing against God.

The problem: While other theistic approaches are no better at resolving this question, it is significant that he hasn't adequately addressed it because he acknowledged it as an important problem and he seems to believe that he has adequately addressed it. What's wrong? First, we have no reason to assume that a god would "probably" feel and behave "the same" as a human parent. Second, if we need this much guidance, do we really have free will in this department? Do we only have free will as a child has free will? Third, it is also demonstrably untrue that nonreligious people can't be happy.

11. The "Mad, Bad, or God" claim does not offer a full set of options.

He says that Jesus' claim to divinity means that we can only conclude that he was crazy, lying, or telling the truth. If Jesus was not telling the truth on this point, then he cannot be considered a great moral teacher. Lewis says it is “obvious” that Jesus isn’t mad or bad, so therefore he’s God.

Lewis says that Jesus identified himself with a God outside the world, not a pantheistic God inside the world, for two reasons: because Jesus was limited by his Jewishness and that meant he couldn’t be a pantheist, and because Jesus said he forgave all sins which doesn’t make sense unless he (as God) was personally offended by the sin.

The problem: Those reasons are invalid. Anyone in any time or place can have a sense of mystical oneness with God, and anyone can be self-righteous or sensitive enough to believe they are injured by others’ conduct that really doesn’t concern them. The potential explanations that Jesus was a mystic or that he was self-righteous would lead to a different conclusion than Lewis’. Lewis' conclusion is not as obvious as he asserts it to be. In the "Mad, Bad, or God" argument, he bypasses alternatives such as the idea that Jesus was honestly mistaken or speaking metaphorically, as well as the observation that the written record of this character named Jesus, at least of these particular words, may have been more of a literary or folk representation or a theological lesson than anything historically attributable to an actual person. And it is not clear why the same argument that constitutes a great moral teaching when spoken by God is mad or bad if spoken by a human. It might mean something a little different but it isn’t pure wrongness and evil.

12. He admits that his religion's beliefs and rituals are weird.

Just as God created human sexual reproduction, God created mechanisms for transmitting and strengthening Christian belief, he says. This stuff may initially seem weird to us because we didn’t invent it. However, it’s real and it works.

The problem: The common thread is evolution, not divine creation. Sexual reproduction works the way it does because it evolved that way. Religious beliefs and rituals also evolved (socially), and therefore they probably serve a function, too, but it is not necessarily the function that the adherents of those religions believe them to have.

13. He holds that God's suffering as Jesus is essential.

God came to Earth to suffer and die to have that experience so God could help us with our process of repentance, too. If God has special expertise or ability in the “suffering and death” department, that’s all the more reason to accept him as a teacher.

The problem: God’s qualifications to teach need not concern us. Some of our human teachers have suffered more than others, and that does not determine our ability to learn from them, even on subjects like repentance. The issue is whether we need to believe in a limited, temporary version of free will that is sufficient to give meaning to life or whether we will adopt a more wholehearted humanistic outlook.

14. He believes the world will be forcibly ended.

After all this talk about free will, he abruptly says that God will come back into the world and take it by force. God delays his return because he wants to give us a chance to use our free will to believe in him.

The problem: He leaves many unanswered questions, such as: Why more than one generation delay? Give everyone alive a chance to exercise their free will, then let history reach its conclusion. What is the purpose of delaying the conclusion for two thousand years to observe so much exercise of free will? And, if it is free will that gives meaning to life, why ever take it away at all? Wouldn’t the end of history take away the meaning of life? To put the choice as he might: If there is a purpose to ending the world, go ahead and do it already; but if free will gives meaning to life, then give us unending generations of that.

Sunday, March 26, 2017

'War eunuchs' in Hirschfeld's 'The Sexual History of the World War' (1930)

In 1930, Magnus Hirschfeld published Sittengeschichte des Weltkrieges in German. Panurge Press produced an abridged, adapted English translation as The Sexual History of the World War in 1934. Another English edition was produced by Cadillac Publishing Co. in 1941. The latter, since 2015, has been available to read free online through the Internet Archive. Chapter 12, "Genital Injuries, War Eunuchs, etc.," includes the following information.

"Above all, it was shot wounds in the testicles and also injuries to the spinal marrow which induced a complete disappearance of the sexual functions. Injuries of this sort were not uncommon during the war which explains their frequent occurrence in literature. Yet it appears that poetry gave much more attention to this problem of emasculation during the war than did science. One of these cases became famous in medical literature because the patient became a subject for transplantation experiments."

Dr. Robert Lichtenstern reported having to remove both testicles from a soldier in 1915 in Vienna due to an infected gunshot wound. The patient immediately ceased to have erections "despite various devices calculated to arouse him"; he rapidly lost his facial and body hair; and

"he read nothing and manifested no interest whatever in the war....For the most part the patient sat near his bed or at the window, ate voraciously, slept a lot, and busied himself with absolutely nothing at all. The loss of both testicles resulted in a remarkable increase of adipose tissue, especially around the neck which gave the patient a peculiarly stupid appearance."

Doctors then transplanted another man's testicle into him, with these alleged results: "Various castration symptoms, such as adiposity, altered trichosis, loss of libido and psychic indifferentism, all receded temporarily so that the patient actually entertained the idea of marrying."

Dr. F. Pick's study found "commotion neurosis" in 10 out of 25 officers and in 7 out of 75 soldiers. These men were unable to ejaculate and in some cases also unable to get erections. Pick attributed this to physical and psychic stresses of battle, including sexual abstinence.

Several literary passages are referenced in this same chapter of Hirschfeld's book:

From an author named Bruno Vogel: "I saw Sczepczyk again. With amazing precision his generative organs had been shot from his body. 'Herr Leutenant,' he whispered, a little bit ashamed and in deep confidence, 'Herr Leutenant, and I have never yet had a girl.' He gladly accepted the cigarette I gave him and I softly stroked his hair and forehead. Finally I slipped my hand over his eyes and, as a little smile of pleasure curled over his mouth, I pushed my mercifully brutal sword into his side." The title was not mentioned, but possibly this was Vogel's Es lebe der Krieg! (1924).

The Siberian diary of Edwin Erich Dwinger The Army Behind Barbed Wire: A wounded soldier says that his wife (whose picture shows her to be "a perfect child-bearing machine") wanted at least six children. "Until now we weren't able to have any children because there wasn't any money for them." When he is told that he cannot have children due to his wound, "he turned around slowly and walked to his bed, stretched himself out painfully and never spoke to anyone else until they sent him to Siberia. It is significant that we meet the tragic figure of this emasculated man further on in the novel, but at this later stage, he rejoices that he does not have to suffer the sexual hunger which the others are being plagued by."

The poet Ernst Toller has a man named Hinkemann who "may be regarded as the final literary formula of the emasculated soldier who returns home from the wars, and the inability of his wife to continue a veritably inhuman sacrifice in his behalf....we are dealing with a group of men who will never be able to find their lost happiness by the side of a woman. From every outcry of Toller's hero, we hear the whole dismal and appalling tragedy of a creature who has gone through the vast hell of war, and it is a cry which can never be silenced. How brutal is the reply to Hinkemann by his wife's seducer, Paul Grosshahn, who rebukes the cripple for seeking to keep his wife a nun. Hinkemann is informed by the seducer that he is in reality nothing more to his wife now than a ground for divorce!"

Panurge Press and other early 20th century distributors of erotic books

Jay A. Gertzman's Bookleggers and Smuthounds: The Trade in Erotica 1920-1940 gives an engaging history of the difficulties in New York City with distributing literature that had any sexual content. "The federal antiobscenity statutes, lobbied through Congress by Anthony Comstock in 1873 and enforced just as powerfully half a century later, called their wares 'obscene, lewd, lascivious, indecent, filthy, or vile.' * * * By the early 1920s, a group of young New York publishers was providing Americans with literature from European writers whom the older publishers considered too subversive to touch. Beginning in 1922, a series of court rulings made it more difficult to suppress sexually explicit material that could not be termed flagitious by any general consensus." (pp. 1, 10)

The figures Gertzman treats in most detail include Esar Levine of Panurge Press (Esar was editor, and his brother Benjamin was business manager) and Benjamin Rebhuhn of Falstaff Press (he ran it with his wife and nephew). The Levines and Rebhuhns both had mail-order businesses and were close friends with each other. "Many Panurge titles were transferred to Falstaff in 1936 (and reprinted as new editions), and later became property of Metro Books, distributed by Benjamin Levine." (p. 30) The most important character is probably Samuel Roth, whose Golden Hind Press at 122 Fifth Avenue was raided on October 4, 1929. (p. 16) These men endured repeated prosecutions and incarcerations.

The majority of the names of booksellers in this narrative belonged to Jews. "In New York at least, during the period from 1880 to 1940, many [erotica dealers] were members of Jewish immigrant families," Gertzman writes. He adds that "German immigrants were skilled printers, lithographers, and typesetters". (pp. 28-29)

"Although avoiding ethnic scapegoating, John Sumner [secretary of the New York Society for the Suppression of Vice] sometimes specifically described the purveyor of 'obscenity' as a Jew (or Italian or German). Rooted in his opposition to erotic literature was a fear of contamination by the unclean outsider. Society as a whole, as well as the immigrant neighborhoods, was in danger of contagion. Sumner's annual reports stigmatize individuals arrested (whether convicted or not) as 'foreign looking,' 'mentally defective,' 'exhibitionists,' 'fly-by-night.' 'Most of these defendants,' he wrote in his 1928 report, 'were of the young, radical, irreligious and over-educated type. Their personal writings wherever found, indicated an utter disregard for the law, public decency or any of the proprieties of organized society. They are literally anarchists.'" (p. 45)

The Panurge books were overpriced for the Depression era. Consequently, "Panurge classified its clients into groups. There were twenty-five 'prominent individuals'...ten 'professors'; fifty 'army officers'; twenty 'reverends'; two hundred eighty 'lawyers'; and fourteen hundred 'doctors,' including more dentists than physicians — thirty-five fully typed pages were needed to list them." (p. 57) Gertzman also says: "Judge Learned Hand appears to have recognized the more complex reality, when he found Esar Levine guilty of pandering to prurience with the circulars for his Panurge Press books. He refused to admit into evidence the Panurge Press mailing list, with its 'professors,' 'army officers,' and 'physicians.' 'Even respectable persons may have a taste for salacity,' he wrote." (p. 144)

Sunday, March 19, 2017

Alan Turing's story as told in the film 'The Imitation Game' (2014)

"The Imitation Game" (2014) stars the actor Benedict Cumberbatch playing the mathematician Alan Turing. Turing was famous for his work on early computers. During World War II, he worked for the British government on a team that deciphered intercepted Nazi communications. His successful cryptography is believed to have shortened World War II.

In the film, Turing is portrayed as a reclusive personality without strong ties to friends or family. He knows from an early age that he is attracted to other men. This was illegal in Britain at the time; sexual relations between men were punishable by prison. He is briefly engaged to a fellow codebreaker (Joan Clarke, played by the actor Keira Knightley), but he breaks it off with her, admitting his true feelings.

When finally convicted of "gross indecency," Turing was given the choice between prison and a "treatment" of chemical castration that was supposed to moderate or eliminate his sexual feelings. Both possibilities devastated him; Turing chose treatment. The film depicts him as gaunt and frail after beginning the chemical castration. He lasted one year on treatment and then committed suicide on June 7, 1954 by biting an apple poisoned with cyanide.

Wednesday, January 11, 2017

The long and misguided history of swearing in on Bibles

Using any particular religious scripture for the swearing-in ceremonies for politicians and court witnesses poses the obvious problem that not everyone endorses the content of the given scripture. If someone does not believe at all in a particular God or scripture, then they may object to being forced to invoke this foreign or disagreeable belief system. Even if they are willing to recite the words and mimic the gestures, their oath would not carry the intended religious weight, since they do not believe that this particular God holds them accountable. This problem applies on a "sliding scale" to people who believe in the Bible in diverse ways or with loose or inconsistent interpretations. People do not all believe in the same God in the same way, and there is no sense in making them recite words that presume they do.

For example, president-elect Donald Trump, who was raised Presbyterian, was heavily influenced by the so-called "prosperity gospel" and doesn't currently belong to any church, according to Ken Briggs, writing for the National Catholic Reporter in January 2017.

In secular contexts, swearing on the Bible is nonsensical and causes dissension. Its practice for politicians' swearing-in ceremonies in the United States nevertheless has an interesting history that can be traced hundreds of years back to England. Melissa Mohr explains it well in her 2013 book "Holy Sh*t: A Brief History of Swearing," which is about the history of oaths as well as obscenities.

When England was a Catholic country, swearing oaths on physical copies of the Bible held a prominent place in the culture. A religious movement whose adherents were known as Lollards opposed this practice in the early 15th century, as did Quakers in the 17th century. Lollards were willing to swear verbally by God, but were burned at the stake for being unwilling to swear on the Bible. Quakers would not swear at all, which meant that they couldn't take oaths of allegiance and couldn't testify in court. Mohr writes, "A good technique for getting rid of a Quaker you didn't like was to accuse him of doing something illegal. Whether or not he was guilty, when he refused to take an oath his property would be confiscated and he would be thrown in jail for contempt of court."

Aware of this religious history in England, the American founding fathers aimed for a more secular start to the nation in the 18th century. The U.S. Constitution prescribes this presidential oath of office: "I do solemnly swear (or affirm) that I will faithfully execute the office of President of the United States, and will to the best of my ability preserve, protect, and defend the Constitution of the United States." This secular statement avoids the difficulties that presented themselves in England. Article VI of the Constitution additionally clarifies: "No religious test shall ever be required as a qualification to any office or public trust under the United States."

'Book-oath'

The term 'book-oath' goes back at least as far as Shakespeare's Henry IV, Part 2, which contains the words: "I put thee now to thy / book-oath: deny it, if thou canst." In pre-Revolutionary America, swearing on the Bible served as a religious test "designed to marginalize infidel deists like Thomas Paine, and religious dissidents especially like members of the Dutch Reformed Church," according to information received from Ray Soller.

Placing one's hand on the Bible

Despite this, many U.S. presidents have recited the oath with their hands on a Bible. George Washington did so at his first inauguration. (For the next several presidents after him, there are only persistent but unconfirmed national myths.) The next well-substantiated instance is the seventh U.S. president, Andrew Jackson, at his inauguration in 1829, followed by the eleventh U.S. president, James Polk, who also kissed the Bible when he swore on it at his 1845 inauguration, an event that was publicized by telegraph. Social critic and comic Dean Obeidallah singled out "two presidents, Teddy Roosevelt and John Quincy Adams, [who] did not use a Bible at their swearing-in ceremonies," but many others certainly did.

Saying 'So help me God'

David B. Parker wrote for the History News Network:

"...we have no convincing contemporary evidence that any president said 'so help me God' until September 1881, when Chester A. Arthur took the oath after the death of James Garfield. William Howard Taft, Warren G. Harding, Calvin Coolidge, and Franklin Roosevelt said 'so help me God,' as has every president since then. But before 1933, we have good evidence for only four (of thirty-one)."

Of potential interest, see "Kiss the Book...You're President...: 'So Help Me God' and Kissing the Book in the Presidential Oath of Office," Frederick B. Jonassen, 2012 in the William & Mary Bill of Rights Journal, Vol. 20, Issue 3, Article 5.

Dissent

In the nineteenth century, England's laws for swearing-in ceremonies were challenged by the elections to Parliament of Lionel de Rothschild and David Salomons, who were Jews, and Charles Bradlaugh, who was an atheist. The Jews' proposed modifications to the oath were not accepted, while the atheist was willing to swear the Christian oath but was denied the opportunity. For showing up to work in the chamber to which they'd been elected, Salomons was fined heavily and ejected from the room, and Bradlaugh was arrested and jailed. With perseverance, eventually the Jewish Relief Act (1858) and the Oaths Act (1888) enabled non-Christians to complete the oath of office.

A secular approach seems the obvious solution to the conflict. U.S. CIA Director John Brennan was sworn in on a copy of the U.S. Constitution in 2013. Yet some politicians, seeing that Christian politicians swear in on Bibles, wish to swear in on a copy of their own religious text. Rep. Keith Ellison (D-MN), the first Muslim to be elected as a member of the U.S. Congress, was sworn in on a copy of the Koran that was published in 1764 and was owned by Thomas Jefferson. House Speaker Nancy Pelosi (D-CA) participated in the ceremony and also placed her hand on the book.

Endorsers of the Bible, meanwhile, often are reluctant to allow others the opportunity to use their own texts, so the conflict perpetuates itself. Rep. Virgil Goode (R-VA) took advantage of Ellison's pending swearing-in to release a statement calling for stricter immigration laws, without which, he said, "there will likely be many more Muslims elected to office and demanding the use of the Quran." Goode claimed that restrictions on immigration, particularly from the Middle East, "are necessary to preserve the values and beliefs traditional to the United States of America and to prevent our resources from being swamped." (All this, despite the fact that Ellison is African-American and was born in Detroit.) Similarly, Dennis Prager, a talk-show host and a member of the council that oversees the U.S. Holocaust Museum in Washington, D.C., complained about Ellison's anticipated use of the Koran. He said he feared that the nation would "abandon its Judeo-Christian values" and that he himself, as a Jew, would "get hurt" as a consequence. At their base, Goode's and Prager's expressed concerns are not about the ritual use of the Koran in American politics, but rather about Muslim Americans in public service.

Debates like this occur in many countries. For example, Israel's national anthem, "Hatikva," is written from a Jewish point of view and refers to Jews living freely in their land of Zion. This often causes distress for the one-quarter of Israelis who are not Jewish. In the February 2013 swearing-in ceremony for new parliament members, several Arab politicians left the room to protest the words of the anthem. Mere suggestions to make the language more inclusive, even when those suggestions are vague and are made by Jewish politicians, still prompt strong opposition.

Conclusion

In short, the use of the Bible for swearing oaths originated hundreds of years ago as a Catholic tradition, and despite some Protestant opposition and American secularist reform, the practice continues today. The custom is confusing and unnecessary. Unless one literally believes in a God who holds people accountable for their oaths, one cannot believe that such an oath has any inherent force that makes people keep their promises.

From an irreligious or non-literal religious perspective, the only extra force of a public religious oath lies in its potential activation of reverence and shame in the oath-taker. But this assumes that the oath-taker (or perhaps the audience) has certain religious sensibilities. Not everyone does, so mandatory swearing on Bibles is a transparent affront to individuals' true belief systems. It is a coercive effort to tamp down intellectual and religious diversity in favor of a public show of conformity. Some find the ritual inspiring, but others find it off-putting. Therefore, it discourages unity while being mostly useless in enforcing promise-keeping.

This article was originally published to Helium Network on Dec. 10, 2013. It has been significantly revised in January 2017 thanks to input from Ray Soller.
Image by: Adrian Pingstone, 2005. The photograph is of a Latin Bible made in Belgium in 1407. © Public domain. The Bible is on display in Malmesbury Abbey in England. Wikimedia Commons.

Sunday, December 25, 2016

Avoiding and correcting false beliefs

People believe these things

Lawrence Davidson characterized the arguments in Rick Shenkman's 2008 book Just How Stupid Are We?: Facing the Truth About the American Voter as saying that Americans are: "(1) ignorant about major international events, (2) knew little about how their own government runs and who runs it, (3) were nonetheless willing to accept government positions and policies even though a moderate amount of critical thought suggested they were bad for the country, and (4) were readily swayed by stereotyping, simplistic solutions, irrational fears and public relations babble." Davidson then said that this is "a default position for any population," but that it is still a concern when, for example, "polls show [that] over half of American adults don’t know which country dropped the atomic bomb on Hiroshima, or that 30 percent don’t know what the Holocaust was." Such confusion isn't unique to the United States. "In the middle of March 2008," wrote Javier Cercas (translated by Anne McLean) in The Anatomy of a Moment, "I read that according to a poll published in the United Kingdom almost a quarter of Britons thought Winston Churchill was a fictional character."

In 2014, the National Science Foundation said that only a slight majority of Americans polled were able to correctly respond that viruses can't be treated with antibiotics and that 26 percent said that the sun revolves around the Earth.

Since 2014, a small but growing group of "Flat Earthers" has met regularly in Fort Collins, Colo., with sympathetic meetings occurring in a half-dozen other U.S. cities. A leader recalls seeing a YouTube video that promoted the idea of a flat earth. “It was interesting, but I didn’t think it was real. I started the same way as everyone else, saying, ‘Oh, I’ll just prove the earth is round.’ Nine months later, I was staring at my computer thinking, ‘I can’t prove the globe anymore.’” The article in the Denver Post says of this group: "Many subscribe to the 'ice wall theory,' or the belief that the world is circumscribed by giant ice barriers, like the walls of a bowl, that then extend infinitely along a flat plane." Today in 2017, searching YouTube by the exact phrase "flat earth" (with quotation marks) yields three-quarters of a million videos.

In 2010, the Corporation for Public Broadcasting received funding amounting to 0.00014% of the U.S. federal budget. CNN/Opinion Research found early the next year that "Forty percent of those polled believe funding the CPB receives takes up 1 to 5 percent of the budget, 30 percent believe public broadcasting takes up 5 percent or more of the budget and 7 percent of respondents believe the non-profit receives 50 percent or more of the federal budget." The final cohort of respondents who thought it was more than half of the budget may also suffer from general mathematical or political illiteracy, but it seems fair to say that many people have false beliefs about the funding for public broadcasting. (For comparison, when a Roper poll in 2007 accurately informed participants that the Public Broadcasting Service (PBS) receives funding equivalent to about $1 per American per year, half of the respondents said this amount was "too little.")
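The scale of those misperceptions is easier to grasp as ratios. A minimal sketch, using only the percentages quoted above; the "overestimate factor" figures are illustrative arithmetic, not poll findings:

```python
# Compare the CPB's share of the 2010 U.S. federal budget as quoted
# above (0.00014%) with what CNN/Opinion Research respondents believed.
# Each factor shows how many times larger the believed share is.

actual_share = 0.00014  # percent of the federal budget, per the text

believed_shares = {
    "1% (low end of the 40% cohort)": 1.0,
    "5% (low end of the 30% cohort)": 5.0,
    "50% (the 7% cohort)": 50.0,
}

for label, believed in believed_shares.items():
    factor = believed / actual_share
    print(f"{label}: overestimates the actual share by ~{factor:,.0f}x")
```

Even the most modest common guess, 1 percent, overstates the quoted actual share by a factor of several thousand.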

"There’s no shame in not knowing; there’s shame in not wanting to know. For years I’ve said this to my college students as a way of telling them that learning should never stop. But I have reluctantly come to the conclusion that, at a certain point, there should be shame in not knowing," Charles Taylor wrote in an opinion piece for the Boston Globe. He fretted over "creative-writing students who have never heard of Edith Wharton or Ralph Ellison; journalism students who can’t identify the attorney general; students who don’t know what the NAACP or the Geneva Convention are."

"The emerging narrative of this election is that Donald Trump was elected by people who are sick of being looked down on by liberal elites. The question the people pushing this narrative have not asked is this: Were the elites, based on the facts, demonstrably right?

* * *

That Trump voters chose an easily disprovable myth over readily available facts is one sign of their willful ignorance.

And still this imperviousness to fact pales next to the racism and xenophobia and misogyny — in other words, the moral ignorance — that Trump’s supporters wallowed in. All of the condescension of which liberals have been accused can’t begin to match the condescension of the current storyline that Trump voters are too disenfranchised or despised or dismissed to be held morally responsible for their choices.

* * *

The apologists for Donald Trump voters have given their imprimatur to a culture that equates knowledge and expertise with elitism, a culture ignorant of the history of the country it professes to love and contemptuous of the content of its founding documents."

It isn't clear from this brief column how Taylor thinks factual knowledge and moral knowledge might be related. Most people would say that moral knowledge depends on drawing conclusions that incorporate factual knowledge. (For example, you have to know whether someone else is threatening you before you can properly decide how to act in "self-defense." As another example, you have to know whether a crime occurred before you can express your opinion about it. Berel Lang wrote: "...the most extreme Holocaust 'revisionists' — Faurisson, Rassinier, Butz — do not deny that if the Holocaust had occurred, it would have been an enormity warranting moral reflection, judgment, and whatever else followed from these, presumably including condemnation and punishment; they deny only that it did occur."). Some would also say that moral knowledge is not merely a concatenation of ordinary beliefs and social agreements but that it exists in some separate sphere.

We care more about facts when we feel good about ourselves

“The 2000 [presidential] campaign was something of a fact-free zone,” said Brendan Nyhan, who was an undergraduate at Swarthmore at the time and who subsequently founded a political fact-checking website called Spinsanity that led to a book All the President's Spin. In his doctoral program at Duke University, he moved on to ask, as Maria Konnikova put it: "If factual correction is ineffective, how can you make people change their misperceptions?"

From Konnikova's article:

"Until recently, attempts to correct false beliefs haven’t had much success. Stephan Lewandowsky, a psychologist at the University of Bristol whose research into misinformation began around the same time as Nyhan’s, conducted a review of misperception literature through 2012. He found much speculation, but, apart from his own work and the studies that Nyhan was conducting, there was little empirical research.

* * *

One thing he learned early on is that not all errors are created equal. Not all false information goes on to become a false belief — that is, a more lasting state of incorrect knowledge — and not all false beliefs are difficult to correct.

* * *

When there’s no immediate threat to our understanding of the world, we change our beliefs. It’s when that change contradicts something we’ve long held as important that problems occur."

[For more examples of how this might work, see these Disruptive Dissertation blog posts. In religious thought: "The specious claim that human calamities are caused by an angry God" In political thought: "False reports that President Obama is a Muslim"]

Konnikova went on to say:

"In a series of studies that they’ve just submitted for publication, the Dartmouth team approached false-belief correction from a self-affirmation angle, an approach that had previously been used for fighting prejudice and low self-esteem. The theory, pioneered by Claude Steele, suggests that, when people feel their sense of self threatened by the outside world, they are strongly motivated to correct the misperception, be it by reasoning away the inconsistency or by modifying their behavior.

* * *

Normally, self-affirmation is reserved for instances in which identity is threatened in direct ways: race, gender, age, weight, and the like. Here, Nyhan decided to apply it in an unrelated context: Could recalling a time when you felt good about yourself make you more broad-minded about highly politicized issues, like the Iraq surge or global warming? As it turns out, it would."

It is also important to note the difference between actually believing something and merely claiming to believe it to maintain one's public image. Public image is more obviously related to one's identity and also to one's material interests. Alexander C. Kaufman provided this example:

"In December 2006, Exxon Mobil Corp. convened a two-day summit of environmental and ethics experts at a rural retreat near the base of the Blue Ridge Mountains in Virginia....For decades, Exxon had funded far-right think tanks that seeded doubt over the scientific consensus on climate change. [The new CEO Rex] Tillerson and Ken Cohen, Exxon’s PR chief and chair of its political action committee, wanted to broaden the company’s political reach. One step was changing their messaging about climate change, moving away from the denial the company had been attacked for supporting....Not long after the summit, Exxon began to modify its public stance on climate change."

Sometimes what is claimed publicly is done to maintain relationships and make money. Edward S. Herman and Noam Chomsky on how the American mass media operate:

"But a critical analysis of American institutions, the way they function domestically and their international operations, must meet far higher standards; in fact, standards are often imposed that can barely be met in the natural sciences. One has to work hard, to produce evidence that is credible, to construct serious arguments, to present extensive documentation — all tasks that are superfluous as long as one remains within the presuppositional framework of the doctrinal consensus. It is small wonder that few are willing to undertake the effort, quite apart from the rewards that accrue to conformity and the costs of honest dissidence."

Nyhan's work, by contrast, seems to be about more privately held beliefs.

So they say

Albert Einstein said, "Only two things are infinite, the universe and human stupidity, and I am not sure about the former." Elbert Hubbard: "Everyone is a damn fool for at least five minutes every day. Wisdom consists in not exceeding that limit." George Bernard Shaw said it would be better to know that one does not know: “Beware of false knowledge; it is more dangerous than ignorance.” As hope, nonetheless, the words of Phyllis Bottome: "There is nothing final about a mistake, except its being taken as final."

Sources

"Why Americans Are So Ignorant: It's Not Just Fox News," Lawrence Davidson, Consortium News, April 8, 2013.

Javier Cercas. The Anatomy of a Moment: Thirty-five Minutes in History and Imagination. (2009) Translated from the Spanish by Anne McLean. New York: Bloomsbury, 2011. p. 3.

"Poll: Americans way off on public broadcasting funding," Politico.com, April 1, 2011.

"Highlights of the 2007 Roper Public Opinion Poll on PBS."

"Yes, there is shame in not knowing." Charles Taylor. Boston Globe. Dec. 19, 2016.

Berel Lang. Heidegger’s Silence. Ithaca and London: Cornell University Press, 1996. p. 14.

"I don't want to be right," Maria Konnikova, New Yorker, May 19, 2014.

"Rex Tillerson Supposedly Shifted Exxon Mobil’s Climate Position. Except He Really Didn’t." Alexander C. Kaufman. Huffington Post. Dec. 26, 2016.

Edward S. Herman and Noam Chomsky. Manufacturing Consent: The Political Economy of the Mass Media. New York: Pantheon, 1988. p. 305.

Elbert Hubbard, quoted in The Village Voice, quoted again in The Week, Feb. 22, 2013. p. 19.

George Bernard Shaw, quoted in RefDesk.com, quoted again in The Week, July 18, 2014. p. 15.

Phyllis Bottome, quoted in the Associated Press, quoted again in The Week, June 13, 2014. p. 15.

Saturday, February 20, 2016

Pierre Darmon on impotence trials in pre-Revolutionary France

Pierre Darmon's Le Tribunal de l'Impuissance (Paris: Editions du Seuil, 1979) was translated into English "for the general reader," minus "technical appendices dealing with juridical matters, and all footnotes and detailed references," according to the publisher's note. The English edition is Damning the Innocent: A History of the Persecution of the Impotent in pre-Revolutionary France (New York: Viking, 1986).

The 17th century was a perilous time for a man to be accused of failure to consummate a marriage. It often led to pseudo-medical examinations by the Church, to a public trial in court, and to legal dissolution of the marriage. The examination was generally informed by “the eternal trinity which confirms a man’s virility: ‘erecting, entering, emitting’, in the words of the surgeon Guillemeau.” (p. 13) The examination of at least one man in 1694 was invasive enough to involve the surgical removal of a large bladder stone. (p. 115) By the 19th century, there was a backlash against this entire enterprise, and the public debate about impotence subsided.

Darmon writes:

"Deep in the shadowy unconscious of every man lurks a terrible fear of castration. The myth of virility can be seen as the sublimation of this anxiety into a more abstract form which is the basis of a man’s prestige, yet completely beyond his control. The forces that do control it are certainly mysterious. From one day to the next, its happy functions, the tangible proof of male prestige and perfection, can vanish like smoke in the wind, for no apparent reason. It is a cruel state of uncertainty. For all his pride, the virile male is a man trapped by his celebrated and over-estimated body, and the endless escalation of his physical prowess, which, far from reassuring him, plunge him into an infernal spiral of anguish. At any moment, the world which he has built up around himself may disintegrate. For this reason, the trial and condemnation of his impotent fellow men affords him substantial consolation. It serves to idealise his own position, and allows him the opportunity to display his own conformity with the collectively approved sexual norms – a conformity which is statutory, mandatory and of divine ordinance. The priests themselves submit to it, and the Church casts out all those whose virile organ, though sworn to inaction, reveals the slightest anomaly." (p. 2)

Thus impotence trials had a “dual objective: to reassure and to incriminate.” (p. 13) "Is it any coincidence that, from Sparta to Nuremberg, the most disastrous ideologies have been those founded largely upon a coherent mythology of virility?" (p. 229)

Darmon says we need a "categorical" and "visceral rejection" of the impotence trial, and a recognition that "to debate impotence in the first place is to trigger off a poisonous chain of logic" that will lead to an impotence trial. (pp. 228-229) These trials were especially bad for men, constituting a form of public shaming. They did seem to empower women to divorce their husbands, but this could backfire, as sometimes the woman would be taken into custody. In 1667, a woman who petitioned the High Court of Provence for separation from her husband was sent to a convent by the bishop. Darmon says this was about "depositing temporarily in the hands of a third party an object whose ownership was in dispute. The husband had first of all to prove his ability to operate the object before he could reclaim it. In the eyes of the law, the title deed was acknowledged as valid only under these conditions." (p. 127)

Despite this opinion, however, Darmon's writing conveys humor when relating many situations. In describing the idea of "relative impotence" (when a man is able to perform sexually with one person but not another), he says this idea "was responsible for situations which made complete nonsense of the institution of marriage," including a man who fathered children with several concubines in his house but had never slept with his wife. "After twelve years of this regime, his wife finally came to realise that her household was in certain respects infringing the laws of Christian marriage." (pp. 25-26) He also quotes a transcript of an investigation in which a physician Croissant de Garengeot noted of one man's erection: "far from lasting longer upon being handled, as is the case with the true erection, it immediately faded away." After which Maître Simon: "What penis, he [Simon] exclaimed, however swollen it might be, when handled and examined by Garengeot (since M. Col de Vilars was content to hold the candle all the time) would not wilt on the very spot? Did this oracle from Bourges flatter himself that he was of sufficiently comely appearance and had a sufficiently enlivening hand to stroke the imagination of a man of modesty?" (p. 183)

There was a longstanding fear that if a woman was bound to a man who could not have the kind of intercourse that was sanctioned by the authorities, she was at risk of being abused as he took out his sexual frustration on her. “For the theorists of the classical period, the ‘counsel’ and advocacy of chastity drew its inspiration from an excessively refined conception of human nature. Sexual continence in marriage was merely a fiction, an abstraction. In reality – which was rather more prosaic – the wife of an impotent man was prey to the lascivious excesses and perverse wit of a spirit unhinged by its disability.” (p. 66) In part because of this, from the 16th century, “the marriage of a eunuch was seen as an intolerable threat to the doctrine of Christian marriage.” (pp. 12-13) As an example, in 1655, Denis Pinot wanted to marry Marie Bulot, but their parish priest would not publish the banns because Denis Pinot was, “by common knowledge,” a eunuch. The couple appealed to the High Court of Paris, but the court rejected the appeal. This case was famous for being a rare outcome. Generally, partners who wanted to marry would not reveal such a secret. (pp. 68-69, 71) In the eyes of the authorities: “By marrying, the impotent man commits an act of larceny, profanes a sacrament, and indulges in an inhumane, cruel and dangerous act. These were the three themes which, from the beginning of the sixteenth century to the end of the eighteenth century, recurred incessantly in the writings of jurists and theologians.” (p. 59) Yet: “At the beginning of the seventeenth century, the surgeon Guillemeau noted that the removal of one or even of both testicles did not put a definitive end to sexual relations.” (p. 23)

Accusations of impotence could, of course, be brought up as a pretext for preventing marriages that others opposed for other reasons and interests.

“Thus in 1639 an inhabitant of Pamiers decided to get married. His brother tried to prevent him, on the grounds that he lacked ‘those natural parts necessary for marriage.’ The accused retorted that ‘the beauty, whom the matter most concerned, had no complaints, and, when this was no longer the case, she would have him tamquam patrem, non tanquam maritum’ (as a father if not as a spouse). The parliament of Toulouse ruled that the marriage could proceed. Needless to say, ‘the beauty’ had been made heiress to a tidy little fortune.” (pp. 67-68)

Darmon mentions one brief anecdote from Ancillon's 1707 Traité des Eunuques, but it is possible he learned about it from a secondary source, as this book is not mentioned in his bibliography. (It was a rare book that seems never to have been reprinted until it was digitized in 2010.) He does, however, address the question of eunuchs, since this overlaps significantly with the question of impotence.

"In the case of eunuchs – those afflicted by 'overt impotence' (Tagereau) resulting from some congenital malformation – there were fewer problems. In its fullest sense, the term 'eunuch' means a complete absence of genitals. Occasionally these do exist, but with some flagrant anomaly. Tagereau reported the case of a man 'possessed of two penises, each hindering the other', of another whose testicles were situated above his penis, and of a third 'who had a penis the size of a wart and testicles that of two peas, hardly discernible.'

"The contempt in which these unfortunate individuals were held by the general public reduced them to the level of outcasts or monsters. 'They bear the mark of their ignominy upon themselves,' wrote the theologian Fevret. 'How might we consider them to be whole, since this blemish in them is so considerable, and what reasons might induce us to believe them fit for marriage?' The verdict of Sébastien Roulliard was even more savage: 'These castrates are incapable of progeny, and we do take it as given that it is these which either sex is accustomed to hold in abomination, and whom the Greeks termed half-men or half-women, neither men nor women.' In their hatred of eunuchs, some went so far as to call them creatures of the devil. According to the lawyer Peleus, 'spados [eunuchs] are incubate demons, for which reason, in the manner of those spirits that have relations with women, their substance is colder than the icy wastes of Scythia, and does spoil the reproductive faculty that women possess by nature.'

"As a result, eunuchs found themselves outcasts, rigidly ostracised by ecclesiastical and civil legislation....the articles of canon law...prescribe[d] a scrupulous and intimate physical examination of candidates for the monastic orders. ...eunuchs were prohibited from admittance to [civil] public office..." (pp. 17-18)

Through the 16th century, eunuchs – men with no testicles and no ability to ejaculate, but sometimes with the ability to get erections – were permitted to marry, but if their wives brought them to an impotence trial, the marriage could be dissolved. On June 27, 1587, Pope Sixtus V required that the marriages of eunuchs be dissolved regardless of whether the married couple wanted that. This was followed at the turn of the century by the prolonged legal battle of the Baron d'Argenton, who sought to prove that his testicles, while undescended and thus not visible, nevertheless existed. He lost the case. In 1604, "[t]he whole of Paris rushed to be present for the autopsy of his body," and, when the doctors found his undescended testicles, "D'Argenton was posthumously declared potent, and the Faculty of Medicine in Paris pronounced by decree 'that for the purposes of engendering offspring, it is not requisite that the testicles be present in the scrotum of a man, provided nevertheless that he display other and sufficient marks of virility.'" (pp. 21-22)

Meanwhile, "hermaphrodites" – those who exhibited physical characteristics of both sexes – were at risk of summary execution from Roman times onward, despite the third-century jurist Ulpien's Lex Repetundarum statute, which dictated that they should be assigned to one sex or the other and treated accordingly, a precedent that continued to influence pre-Revolutionary France.

Sunday, October 25, 2015

Should philosophical ideas be subject to empirical verification?

One question is whether empiricism itself can be empirically verified. Michael Lerner:

"Consider its [scientism’s] central belief: ‘That which is real is that which can be verified or falsified by empirical observation.’ The claim sounds tough minded and rational, but what scientific experiment could you perform to prove that it is either true or false?"

Another is whether, although empiricism is indeed a value, there may be other methods and approaches that are also valuable. Sir James Baillie:

"Empiricism is so true that the closer one keeps to it – without becoming an empiricist! – the better. Just as, on the contrary, Idealism is so questionable that the farther one keeps from it – without ceasing to be an idealist! – the truer will one's view of reality be."

Daniel C. Dennett:

"This spell must be broken, and broken now. Those who are religious and believe religion to be the best hope of humankind cannot reasonably expect those of us who are skeptical to refrain from expressing our doubts if they themselves are unwilling to put their convictions under the microscope. ... If the case for their path cannot be made this is something that they themselves should want to know. It is as simple as that. They claim the moral high ground; maybe they deserve it and maybe they don’t. Let’s find out."

In philosophy, it is important to define one's terms precisely, as the Marquis de Condorcet advised:

"One of the essentials for any sound philosophy is to produce for every science an exact and precise language in which every symbol represents some well defined and circumscribed idea; and so by rigorous analysis to guarantee that every idea is well defined and circumscribed."

Defining terms enables them to be verified. However, if terms are defined too rigorously, one loses the flavor, subtlety, and ambiguity of ideas, as well as the different human perspectives that generate them.

Sources

Michael Lerner. The Left Hand of God: Taking Back Our Country from the Religious Right. HarperSanFrancisco, 2006. p. 132.

Sir James Baillie. Reflections on Life and Religion. London: George Allen and Unwin, 1952. pp. 252-253.

Daniel C. Dennett. Breaking the Spell: Religion as a Natural Phenomenon. New York: Penguin Group, 2006. p. 17.

Marie-Jean-Antoine-Nicolas Caritat, Marquis de Condorcet. Sketch for a Historical Picture of the Progress of the Human Mind. (1794) Translated by June Barraclough. Westport, Conn.: Hyperion Press, Inc., 1955. p. 44.

Philosophy to bring joy

Epicurus said: "Philosophy is useless if it does not drive away the suffering of the mind."

What will end suffering and bring joy? Is it a final analysis, or is it the process of thinking itself?

First, we may need to begin with reverence, as Sir James Baillie wrote: "The final and supreme destiny of the scholar is to unite wisdom with kindness, knowledge with love, care for truth with love of man – and without reverence that is not possible." We may also need curiosity, as St. Augustine wrote: "We learn better in a free spirit of curiosity than under fear and compulsion." We may need to accept our humble limitations and contradictions, as Johannes Climacus wrote:

"That which makes understanding so difficult is precisely this: that he [the learner] becomes nothing and yet is not annihilated; that he owes him everything and yet becomes boldly confident; that he understands the truth, but the truth makes him free; that he grasps the guilt of untruth, and then again in bold confidence triumphs in the truth."

And we may need to resist assumptions, conclusions, labels, and roles that are given to us, as James Baldwin wrote:

The world's definitions are one thing and the life one actually lives is quite another. One cannot allow oneself, nor can one's family, friends, or lovers – to say nothing of one's children – to live according to the world's definitions: one must find a way, perpetually, to be stronger and better than that.

Sources

Epicurus, quoted by Alain de Botton. The Consolations of Philosophy. Vintage Books, 2000. p. 55.

Sir James Baillie. Reflections on Life and Religion. London: George Allen and Unwin, 1952. Title page.

Augustine, Confessions, I:14

Johannes Climacus (Søren Kierkegaard). Philosophical Fragments. Edited and translated by Howard V. Hong and Edna H. Hong. Princeton University Press, 1985. pp. 30-31.

James Baldwin, quoted by John Stoltenberg as an epigraph to Refusing to Be a Man: Essays on Sex and Justice. New York: Meridian, 1989.