Entries for July 2005

July 29, 2005

How governments lie

I am sure all readers of this blog are aware of the shooting of an innocent Brazilian electrician by British police in the wake of the second attempt at bombing the London Underground.

The question of how police should deal appropriately with fast moving events is a complex one and is beyond the scope of this posting. But this incident does provide a good example of how governments use the media to get their version of events into the public consciousness first, knowing that this is what most people remember.

The New York Times of Saturday, July 23, 2005 had a report that said that immediately after the shooting:

Sir Ian Blair, the commissioner of the Metropolitan Police, said the dead man had been "directly linked" to the continuing terrorism investigation, but he would not say how or why or identify him by name or nationality.

The Plain Dealer of that same day added that the Washington Post was reporting that the man had been "under surveillance." Both these items seemed to provide evidence that there was a good reason for the police to decide to shoot.

But the very next day came another report that said:

Scotland Yard admitted Saturday that a man police officers gunned down at point-blank range in front of horrified subway passengers on Friday had nothing to do with the investigation into the bombing attacks here.

How could someone go from being "under surveillance" and "directly linked" to a terrorism investigation on one day, to having "nothing to do" with it the next?

The most likely reason is that after the man had been shot, the police immediately wanted to give the public the impression that the shooting was completely justified, and said whatever was necessary to achieve that goal, whether they knew it to be true or not.

This is one more reason why I always try to suspend judgment and never believe the initial reports that emerge from official spokespersons immediately after some major event. Those initial official reports often have only the remotest connection to the facts and are usually designed to imprint in the public mind what governments want the public to believe. Newspaper reporters usually have no choice but to report these statements without question since they have not had time to do any independent checking.

My skepticism has developed over many years because of incidents like the following two examples.

On July 3, 1988, during the war between Iran and Iraq, the American cruiser USS Vincennes, which was in the Persian Gulf, shot down an Iranian civilian passenger aircraft, killing all 290 people aboard. The American President at that time (Ronald Reagan) and the Chairman of the Joint Chiefs of Staff (Admiral William Crowe) immediately went on TV (I vividly remember watching them) and said that the shooting had been completely justified. They gave four reasons: (1) the Iranian plane had been diving towards the cruiser and gaining speed, typical of an attack aircraft; (2) the plane had been transmitting on a military frequency instead of a civilian one; (3) there were no scheduled commercial Iranian airline flights at that time; (4) the flight path of the plane was outside the corridor that commercial airlines use.

So the image we were given repeatedly in the days immediately following the disaster was that this huge Airbus A300 civilian passenger plane was essentially dive-bombing the US cruiser, possibly on a kamikaze-type mission, which meant that the commander of the cruiser had no choice but to shoot it down.

At that time I thought that it was unbelievable that the Iranians would sacrifice nearly 300 of their own people on such an insane mission, but the media did not dwell much on this implausibility. After all, memories of the US embassy hostage crisis (which ended in 1981) were still fresh in people's minds and Iranians, portrayed as fanatical Muslims, were thought to be capable of anything.

Months later, the news slowly trickled out in dribs and drabs, buried on the inside pages of newspapers, that every single one of the four justifications was false. (See this site for a history of the incident and the coverup.) The plane was on a regularly scheduled flight on a regular route, traveling at a steady altitude and speed, and transmitting on the civilian frequencies. Three years after the incident, Admiral Crowe admitted that the USS Vincennes had actually been in Iranian territorial waters. Five years after the incident, the International Court of Justice concluded that the US actions had been unlawful.

But no one apologized for the lies, no one was punished, and the matter was quietly forgotten, except by the Iranians. The lies had served their purpose, which was to rally the country around its government in the immediate aftermath of a tragedy.

A similar situation arose when President Clinton bombed a pharmaceutical factory in Sudan on August 20, 1998, claiming that it was manufacturing biological weapons (particularly VX nerve gas). The attack killed at least one person. (News of this bombing, and the simultaneous bombing of Afghanistan, shoved the Monica Lewinsky scandal off the front pages just as she was to give her much anticipated grand jury testimony.) The US government insisted that it had firm evidence of biological weapons production but said that it could not be revealed for security reasons. It also blocked a UN investigation into the bombing. No evidence was ever produced to support its case. Much later, the US government very quietly conceded that it had been wrong. In the meantime, the loss of the only pharmaceutical factory in a poor country like Sudan deprived a very needy population of medicines, resulting in serious health problems and deaths. Again, the government's lies had served their purpose.

The retraction by the British police in the latest incident was unusual in that it was quick. The usual policy (at least in the US) is for officials to keep stonewalling and throwing up one smokescreen after another until the public gets bored or another big story consumes the media. Then a quiet admission is made of the error, which gets buried at the bottom of page 20.

On Wednesday, I went to see the excellent film Hijacking Catastrophe: 9/11, Fear and the Selling of American Empire (see the postscript to this posting for details) which showed the propaganda process at work following the events of 9/11.

This is why I always take initial news reports of such events with a grain of salt. I believe that all governments, without exception, lie to their people. They do this routinely and without shame. But most people are uncomfortable accepting this fact and want to believe that their government is trustworthy. In the early stages of events, governments and official spokespersons take advantage of people's trust and use their dominance of the media to make sure that people's early impressions are favorable. The only reason governments will hesitate to lie is if the media quickly investigate the original story and give the subsequently revealed facts as much publicity as the original stories. But as we have seen, the present media have largely abdicated that role, playing it safe by simply reporting what the government says.

It will be interesting to see if the alternative press, via the internet, can help bring more honesty into political life by quickly exposing lies. In the meantime, what we can do is treat initial stories with a healthy skepticism until we are convinced that there is a basis for believing them.

July 28, 2005

Agnostic or atheist?

I am sure that some of you have noticed that you get a more negative response to saying you are an atheist than to saying you are an agnostic. For example, in a comment to a previous posting, Erin spoke about finding it "weird that atheism is so counter-culture. Looking back at my youth, announcing your non-belief in God was a surefire shock tactic." While people are shocked when someone says that he/she is an atheist, they are a lot more comfortable with someone saying that he/she is an agnostic. As a result, some people might call themselves agnostics just to avoid the raised eyebrows that come with being seen as an atheist, lending support to the snide comment that "an agnostic is a cowardly atheist."

I have often wondered why agnosticism produces so much milder a reaction. Part of the answer is public perception. Atheism, at least in the US, is associated with people who very visibly and publicly challenge the role of god in the public sphere. When Michael Newdow challenged the legality of the inclusion of "under God" in the Pledge of Allegiance that his daughter had to say in school, the media focused on his atheism as the driving force, though there are religious people who also do not like this kind of encroachment of religion into the public sphere.

In former times, atheism was identified with the flamboyant and abrasive Madalyn Murray O'Hair, whose legal action led in 1963 to the US Supreme Court voting 8-1 to ban "'coercive' public prayer and Bible-reading at public schools." (In 1964 Life magazine referred to her as the most hated woman in America.) I discussed earlier that the current so-called intelligent design (ID) movement, in its "Wedge" document, sees this ruling as the beginning of the moral decline of America and is trying to reverse that course by using ID as a wedge to infiltrate god back into the public schools. Since O'Hair also founded the organization American Atheists, some people speculate that the negative views Americans have of atheism stem from the movement's close identification with her.

I think it may also be that religious people view atheism as a direct challenge to their beliefs, since they think atheism means believing that there definitely is no god and hence that they must be wrong, whereas they think agnostics keep an open mind about the possible existence of god, thereby accepting that they might be right.

The distinction between atheism and agnosticism is a bit ambiguous. For example, if we go to the Oxford English Dictionary, the words are defined as follows:

Atheist: One who denies or disbelieves the existence of a God.

Agnostic: One who holds that the existence of anything beyond and behind material phenomena is unknown and (so far as can be judged) unknowable, and especially that a First Cause and an unseen world are subjects of which we know nothing.

The definition of atheism seems to me to be too hard and creates some problems. Denying the existence of god seems to me to be unsustainable. I do not know how anyone can reasonably claim that there definitely is no god, simply because of the logical difficulty of proving a negative. It is like claiming that there is no such thing as an extra-terrestrial being. How can one know such a thing for sure?

The definition of agnosticism, on the other hand, seems to me to be too soft, as if it grants the existence of god in some form, but says we cannot know anything about she/he/it.

To me, a good starting point is the phrase attributed to the scientist-mathematician Laplace in a possibly apocryphal story. When Laplace presented his book The System of the World, Napoleon is said to have noted that god did not appear in it, to which Laplace is supposed to have replied, "I have no need for that hypothesis."

If you hold an expanded Laplacian view that you have no need for a god to provide meaning or explanations and that the existence of god is so implausible as to be not worth considering as a possibility, what label can be put on you, assuming that a label is necessary? It seems like this position puts people somewhere between the Oxford Dictionary definitions of atheist and agnostic. But until we have a new word, I think that the word atheist is closer than agnostic and we will have to live with the surprise and dismay that it provokes.

July 27, 2005

Simplifying difficult texts - 2

To illustrate the problems of simplifying original texts, we can look at examples from Shakespeare and the Bible. I came across a site that seeks to make Shakespeare's plays easier to understand by re-writing them:

Here is the original text from HAMLET Act III, Scene i, lines 57-91

To be, or not to be? That is the question—
Whether ’tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles,
And, by opposing, end them?
….
Thus conscience does make cowards of us all,
And thus the native hue of resolution
Is sicklied o’er with the pale cast of thought,
And enterprises of great pith and moment
With this regard their currents turn awry,
And lose the name of action.

Here is the simplified text:

The question is: is it better to be alive or dead? Is it nobler to put up with all the nasty things that luck throws your way, or to fight against all those troubles by simply putting an end to them once and for all?
….
Fear of death makes us all cowards, and our natural boldness becomes weak with too much thinking. Actions that should be carried out at once get misdirected, and stop being actions at all.

Do the two passages have the same meaning? They convey different senses to me.

Or take the famous passage from Ecclesiastes 9:11 of the Bible. Here is the familiar King James Version:

I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.

And here is the simplified modern language of the New Living Translation:

I have observed something else in this world of ours. The fastest runner doesn't always win the race, and the strongest warrior doesn't always win the battle. The wise are often poor, and the skillful are not necessarily wealthy. And those who are educated don't always lead successful lives. It is all decided by chance, by being at the right place at the right time.

Again, does the simplified passage capture the meaning of the original?

I am not criticizing the quality of the simplifications, although there may be better ones around. If you asked me what Shakespeare's passages mean, I probably would have come out with a more confused meaning than what was given above. But the point is that it is in the process of struggling to understand the author's original meaning that we make individual sense of the passage. I think that the best we can hope for is a shared consensus of the meaning, and we can never hope to exactly enter into the author's mind.

This problem is always present when the US Supreme Court tries to rule on the constitutionality of present-day issues using a document written over two hundred years ago. People who call themselves "strict constructionists" say that the constitution should be interpreted according to the text and the intent of the framers. But how can you glean intent? The text of the document, by itself, is not sufficient, because words can never capture exact meanings. Literary theorist and legal scholar Stanley Fish has an interesting article that is worth reading. In it he says:

It follows that any conclusion you reach about the intention behind a text can always be challenged by someone else who marshals different evidence for an alternative intention. Thus interpretations of the Constitution, no matter how well established or long settled, are inherently susceptible to correction and can always (but not inevitably) be upset by new arguments persuasively made in the right venues by skilled advocates.

This does not mean, however, that interpreting the Constitution is a free-form activity in which anything goes. The activism that cannot be eliminated from interpretation is not an activism without constraint. It is constrained by the knowledge of what its object is - the specifying of authorial intention. An activism that abandons that constraint and just works the text over until it yields a meaning chosen in advance is not a form of interpretation at all, but a form of rewriting.

This is why I am such a fan of collaborative learning and discussions to tease out meaning. I think you get more out of having a group of people read the original (difficult) text and then argue about what it means, than out of reading a simplified text alone, however 'clear' the latter might be.

Here is a Zen koan:

Hyakujo wished to send a monk to open a new monastery. He told his pupils that whoever answered a question most ably would be appointed. Placing a water vase on the ground, he asked: "Who can say what this is without calling its name?" The chief monk said: "No one can call it a wooden shoe." Isan, the cooking monk, tipped over the vase with his foot and went out. Hyakujo smiled and said: "The chief monk loses." And Isan became the master of the new monastery.

What is the message this koan is trying to convey? The words are simple but the ideas are deep and captured succinctly. I think that it illustrates the point I am making here and I can try and tell you what it means to me, using a lot more words than in the original. But what does it mean to you?

July 26, 2005

Simplifying difficult texts

Some time ago, Aaron Shaffer in his blog expressed his disappointment with the texts he was reading in his philosophy class, particularly the fact that the writers seemed to not take the trouble to be concise, with individual sentences running as long as paragraphs. He felt that this poor writing diminished them in his eyes, since the ability to express one's ideas briefly and concisely demonstrates intellect.

I have been thinking about his comment for some time. I too, on occasion, try to read some philosophy and tend to find it heavy going. The somewhat dense and obscure style of some branches of the arts and humanities (especially the post-modernist philosophers and the area known as cultural studies) led to a notable hoax pulled by physicist Alan Sokal, who wrote a deliberately meaningless paper disguised in dense language and the jargon common to the field of cultural studies. His article Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity was published in the journal Social Text on the same day he wrote a newspaper article exposing his hoax. (I will write about the Sokal hoax in a later posting. As is usually the case, the issue is more complicated than it might first appear, and raises serious ethical issues.)

Of course, physicists are not in a good position to throw stones at philosophers, because physics papers long ago stopped being intelligible to anyone other than those in the same sub-sub-field. But the reason for this is that scientists long ago made the transition from writing for the general public in the form of books to writing for fellow scientists in the form of the short paper. Once the link to the public was broken, there was no need to try to make oneself intelligible to outsiders, since other scientists know your jargon. Some argue that scientists have carried this too far, which is why the public generally has such a poor idea of what scientists actually do.

But philosophers are still, by and large, writing for a general audience, so why is their writing so hard to understand? Part of the answer is that philosophers are dealing with a difficult, very abstract subject, and this requires very skilful writing to make it clear to the non-specialist. Bertrand Russell was pretty good at this, but he was an exception.

Some people have tried to tackle this problem by rewriting philosophical works to make them easier to understand. Some time ago, I received an email from a professor of philosophy about his project to simplify the works of philosophers by rewriting them, to remove redundancies, simplify language, etc. But this raises the issue: Can you rewrite someone else's work without introducing distortions?

If we look at the path that ideas take, we can start with an idea in an author's brain. The author's meaning is translated into words, then the words are read by the reader and their meaning recreated in the reader's brain. Ideally we would like the process:

author's meaning ---> written words ---> reader's meaning

to occur with no loss of precision. I think that this ideal cannot be attained because it is intrinsically impossible for words to exactly capture ideas. At best they come close and make a good approximation. An author may think he/she has expressed an idea exactly because of the implicit meanings we individually assign to words, in addition to the explicit and agreed-upon meanings that we all share.

The reader also uses implicit meanings of words in reconstructing the ideas, but there is no guarantee that the reader's implicit meanings are the same as the writer's. Hence we end up with distortions. The author, if conscientious, tries to find the words and phrases that minimize the amount of implicit meaning and best capture the idea, but this cannot be done with 100% accuracy. The more you try to replace implicit meanings with words, the wordier the writing gets.
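
Since I have been describing this as a kind of lossy transmission, here is a toy sketch in Python (an analogy of my own, with entirely made-up "lexicons") showing how a round trip from idea to words to idea can distort meaning even when the words themselves are transmitted perfectly:

# A toy model of lossy communication: the writer and the reader map
# words to meanings using slightly different private "lexicons".
writer_lexicon = {"bittersweet": "sad but pleasant", "soon": "within a day"}
reader_lexicon = {"bittersweet": "mostly sad", "soon": "within a week"}

def write_down(idea_words):
    # The writer can transmit only the words, not the meanings behind them.
    return list(idea_words)

def read_back(words, lexicon):
    # The reader reconstructs meaning using his or her own lexicon.
    return {w: lexicon.get(w, "unknown") for w in words}

message = write_down(["bittersweet", "soon"])
print(read_back(message, writer_lexicon))  # what the writer meant
print(read_back(message, reader_lexicon))  # what the reader understood
# Same words, different reconstructed meanings: the distortion arises
# even though the words themselves were transmitted without error.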

So when someone tries to "clarify" the ideas of another author, that introduces a third filter of implicit meanings, and possibly greater distortions. This does not mean that it is not a worthwhile exercise. A good translator might be able to infer the original meaning of the author better than the novice reader can, and render those ideas in a form that makes it easier for the novice reader to understand. But there will always be some element of ambiguity that is intrinsic to the original work. And there is always the danger that the "simplified" work introduces new ideas that the original author did not intend.

In some areas, revisionist writings completely replace the original. For example, in teaching science, we almost never use the original papers of (say) Newton and Einstein. We use textbooks instead that explain the ideas originated by them. The reason for this may be that in modern science the community works with consensus meanings. The original idea is usually elaborated on and expanded by many others before it becomes the paradigm that the community accepts. This paradigm then represents a kind of consensus scientific view of the field, and people can set about trying to present the ideas in a simplified form, suitable for novices, which is how textbooks originate. We just give a nod to the originator of the idea, but the idea has ceased to be his or hers alone. When we talk of "Newton's theory" or "Einstein's theory," what we are referring to is not usually exactly what those people may have intended.

But in other areas (such as philosophy) there is no consensus paradigm and hence no consensus belief structure, so we keep going back to the original sources, trying to tease out what the author intended. This is why in physics we never refer to someone who uses quantum mechanics as an "Einsteinian" or a "Bohrian" (two people who had competing interpretations of quantum mechanics), but simply refer to the current consensus view, while in fields such as philosophy it is quite common to refer to someone as a "Kantian" or a "Lockean," implying adherence to that person's original views.

I'll write more about this tomorrow.

July 25, 2005

Will the real Americans please stand up?

Once in a while, the media decides to find out what the "real" America thinks about some major issue that is consuming the national media.

I can immediately predict what they will do. They will send a reporter out to somewhere in the mid-west, say Ohio or Iowa or Nebraska, and that reporter will go to a small town or rural area, and interview some people there. And typically, the person interviewed will be white, middle-aged, middle-class, religious and church-going, and having a conventional occupation (teacher, home-maker, small businessperson).

These are supposed to be the "real" Americans, who represent the true values of the country.

I always wondered about this particular journalistic cliche. What is it, exactly, that makes this particular group more truly representative of the country, more credible as speaking for the nation than, say, an elderly, white, New York City shopkeeper or an atheist black doctor in Mississippi or a young Hispanic farmer in Arizona?

I don't think that the reasoning behind this choice is purely demographic and statistical. It may be that if we slice the entire population according to color, age, class, religion, geography, and occupation, the group singled out for journalistic preference might come out as slightly more populous than other groups. I am not even sure that is true, but I think it is irrelevant.

The point is that it has become an ingrained part of conventional wisdom that this particular grouping has some special claim to speak for the country as a whole. It is as if there is a sense that "true" Americans are those who look as if they could have stepped out of a Norman Rockwell painting.

This comes back to a point I made in a previous posting, where I argued that the idea that one segment of the population is the rightful heir to a country, while others are merely "allowed" to be there, is a notion that can ultimately lead to chauvinism and conflict, as can be seen in the experience of other countries. In my native Sri Lanka, there is a feeling that the Sinhala Buddhist majority somehow represents the "true" Sri Lanka, and this sentiment has been the source of endless political and social unrest.

So the question is, can we define a "real American"? Is it just any citizen? Is it a citizen who was also born here? Or does it also require one to adhere to a certain set of beliefs and values? Does it depend on your physical appearance? Or is the whole exercise of searching for the "real" Americans simply pointless and should be abandoned?

I suggest that we should reject that kind of thinking altogether, along with corresponding journalistic tropes such as the "American heartland." They serve no useful purpose and only create divisions and hierarchies.

POST SCRIPT

The following is a notice from Case for Peace of which I am a member:

Film: Hijacking Catastrophe: 9/11, Fear and the Selling of American Empire
Date: Wednesday, July 27, 2005,
Time: 7:00 PM
Where: Peace House, 10916 Magnolia Drive, University Circle (near the parking entrance to the Auto Museum).
Admission: I believe that there is no admission charge.
Duration: full version 64 minutes (accompanied in DVD format by abridged version and additional footage). To be projected in full screen mode, followed by discussion.

Cleveland Peace Action invites you to view this remarkable film that documents how a radical fringe of the Republican Party used the trauma of the 9/11 terror attacks to advance a pre-existing agenda to radically transform American foreign policy while rolling back civil liberties and social programs at home.

The documentary places the Bush Administration's false justifications for the war in Iraq within the larger context of a two-decade struggle by neo-conservatives to dramatically increase military spending in the wake of the Cold War, and to expand American power globally by means of force. At the same time, the commentary explains how the Administration has sold this radical and controversial plan for aggressive American military intervention by deliberately manipulating intelligence, political imagery, and the fears of the American people.

The film is produced by The Media Education Foundation, narrated by Julian Bond, and features interviews with Col. Karen Kwiatkowski, Scott Ritter, Daniel Ellsberg, Jody Williams, Norman Mailer, Noam Chomsky, and many others.

Produced before the 2004 election, this film is particularly relevant now as a public education tool, since more information is being widely revealed about the background preceding the war.

July 22, 2005

False memories

The person being interviewed on the quirky NPR radio program This American Life told a story that happened to him many years back. He had been walking with his wife in New York City when he saw Jackie Kennedy across the street waving at him. Since he did not know her, he looked around to see if she was waving at someone behind him, but there was no one there. Not wanting to snub a former first lady, he waved back genially, just before a taxi pulled up in front of her and he realized that she had merely been hailing a cab.

This was a mildly amusing anecdote, but what is more interesting is that his wife interrupted him at that point to say that he had not been present on that occasion at all and that the incident had actually happened to her when she had been walking alone. But she had recounted the story to friends many times in her husband's presence, and somehow it had become embedded in his memory as his own story. Her husband took some persuading that he had imagined his role, because he remembered very vivid details about the incident, but his wife pointed out that he could not possibly have seen some of the things he described (like the kind of buttons on Jackie Kennedy's coat) because she had been too far away.

I think all of us have experienced the feeling of astonishment when something that we remembered quite clearly turned out to be untrue or quite different. I have a vivid memory that when I was about six years old, a major fire broke out in a house a few doors away from where we lived. I remember clearly the flames shooting up into the dark night sky. We watched the blaze for some time, but at some point my father decided it was coming too close and put us in the car and drove us away to safety. Our house was spared and we returned later that night. But recently I happened to ask my mother and older sister about this, and neither of them could remember the event at all.

How could this happen? The answer is that memory is a tricky business. When we experience something, we tend to think that the event is recorded by some kind of VCR in our brains to be played back later. In such a model of the brain, there may be some degradation in the recording and playback modes but there should be no major distortions. The main story should not change, and characters, chunks of dialogue, and events should not appear or disappear mysteriously.

But in actuality, the brain is not at all like a VCR; it is more like a computer. It appears that events are broken up into pieces and stored in multiple locations. Then when we "remember," the event is not recalled like a videotape playback; it is reconstructed from the separate stored bits. It is similar to the way a computer stores a program on its hard drive and then runs it later.

Of course, this means that there must be some algorithm in the brain that breaks up the experience into bits and stores them, another algorithm for retrieving these bits and reconstructing the memory, and some other mechanism that is constantly implementing these algorithms. The problem is that these things don't seem to function with 100% accuracy, with the result that when we recall something, it is possible (if not likely) that the memory is not a faithful reconstruction of the original. Even a slight malfunctioning in the algorithm can cause a major distortion in the memory. The distortions become more pronounced with greater time lags and if we have a strong emotional investment in the event.
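
To push the computer analogy a little further, here is a deliberately crude Python sketch (my own illustration, not a claim about actual neuroscience, and with invented "fragments") of how reassembling a memory from separately stored pieces can occasionally splice in a piece from a different event:

import random

# Fragments of two different experiences, stored in separate "locations".
store = {
    "fire":  {"image": "flames in the night sky", "people": "neighbors watching"},
    "drive": {"image": "view from the back seat", "people": "father at the wheel"},
}

def recall(event, error_rate=0.2):
    # Reconstruct the memory fragment by fragment; occasionally a slip
    # in the retrieval "algorithm" pulls a fragment from the wrong event.
    memory = {}
    for slot in store[event]:
        source = event
        if random.random() < error_rate:
            source = random.choice([e for e in store if e != event])
        memory[slot] = store[source][slot]
    return memory

random.seed(1)
print(recall("fire"))
# A faulty reconstruction can splice the flames from the fire together
# with details of the car ride, producing a vivid but partly false memory.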

(I have noticed this again when writing this blog. Many times I recall some news item or quote that is relevant to a posting. But when I look up the original source, I often find that the story is not quite what I thought it was. This is why academics place such a premium on reading and citing the original sources of anything, as far as possible. It minimizes the chances of distortions creeping in.)

So maybe my fire memory was due to an algorithm malfunction that erroneously coupled images of a fire (from some other source) with my father taking us out in his car, or something like that. This is why trusting memory alone is a dangerous business, especially if the stakes are high, and why notes taken at the time of an event are more valuable than later verbal recollections.

In the 1980s, there was a rash of high-profile court cases involving the sexual abuse of children by workers in day-care centers. There was a huge media frenzy, with several day-care workers going to jail. Many of those convictions were subsequently overturned, but the lives of the day-care workers were already ruined. The recent flurry of cases involving abuse by Catholic priests follows a familiar pattern.

Sexual abuse of minors is a serious problem in America (and anywhere else it occurs) and is one of the most despicable of crimes. But while the abuse of children is a horrible thing, we have to be careful not to let our disgust destroy the lives of innocent people who are convicted without sound evidence against them. This is particularly relevant when adults (as witnesses) recall events that happened to them as children, memories that were supposedly suppressed because of the trauma associated with them. The website of the National Center for Reason and Justice argues that some of the well-publicized cases of child sexual abuse were based on shaky science and shaky memories.

So how can we reliably distinguish false memories from real ones, since so much can depend on doing so? The book The Art of Changing the Brain by my colleague James Zull of the Biology Department (p. 84) explains how brain scanning science may be able to help.

[T]he brain behaves differently when we recall something that really happened and when we recall something that didn't happen. [Schacter's] studies showed that the part of the brain that is needed for memory formation, the parts around the hippocampus, was activated by both false memories and true ones. But when a true memory was recalled, a second part of the brain also became more active. This second part of the brain was the sensory cortex that was involved in the real event when it was sensed by the brain. In these experiments, people were asked to remember spoken words, so recall of words that had actually been spoken activated the hearing part of the cortex, the auditory cortex. This part of the brain was silent for the false memories.
(emphasis in original)

In other words, real events usually are accompanied by actual sights and sounds, and possibly also tastes, smells, and physical contact. These physical stimuli are stored in a different part of the brain from that where the memory is stored and should be activated in the memory reconstruction process. False memories do not have such recorded physical sensations.
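
Schematically (and only schematically, since real brain-scan data is far messier than this), the decision rule suggested by those experiments might look something like this:

def classify_memory(hippocampal_active, sensory_cortex_active):
    # Both true and false memories activate the memory-forming regions
    # around the hippocampus; only true memories also reactivate the
    # sensory cortex that recorded the original sights or sounds.
    if not hippocampal_active:
        return "no memory recalled"
    return "likely true" if sensory_cortex_active else "possibly false"

print(classify_memory(True, True))   # a recalled word that was actually heard
print(classify_memory(True, False))  # a "remembered" word that was never spoken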

I am not sure if brain scanning data will become reliable enough to be part of criminal trials. I am also not sure if the sensory cortex retains actual sounds and sights from very long ago. But it would be nice if we had help in identifying which memories are reliable and which are not.

July 21, 2005

Shafars and brights arise!

Sam Smith runs an interesting website called the Progressive Review. It is an idiosyncratic mix of political news and commentary, with oddball, amusing, and quirky items culled from various sources thrown in. Mixed in with these are his own thoughtful essays on various topics. One essay that is relevant to this series of posts on religion and politics is his call for "shafars" (an acronym he has coined for people who identify with secularism, humanism, atheism, free thought, agnosticism, or rationalism) to play a more visible and assertive role in public life and to not let the overtly religious dominate the public sphere.

Daniel Dennett and Richard Dawkins have started a similar effort, more serious than Smith's, to have people identify themselves as "brights". Who or what is a "bright"? The bright website says that a bright is a person who has a naturalistic worldview; a bright's worldview is free of supernatural and mystical elements; and the ethics and actions of a bright are based on a naturalistic worldview.

Smith playfully refers to the "faith" of shafarism and says that "Shafars are 850 million people around the globe and at least 20 million at home who are ignored, insulted, or commonly considered less worthy than those who adhere to faiths based on mythology and folklore rather than on logic, empiricism, verifiable history, and science." He goes on:

As far as the government and the media are concerned, the world's fourth largest belief system doesn't exist. In number of adherents it's behind Christianity, Islam and Buddhism but ahead of Hinduism. Globally it's 85% the size of Catholicism and in America just a little smaller than Episcopalians, Presbyterians and Lutherans put together. Perhaps most astoundingly, given today's politics, in the U.S. it is roughly the size of the Southern Baptist congregation.

Its leaders, however, are not invited to open Senate sessions. Our politicians do not quote them and our news shows do not interview them. And while it is a sin, if not a crime, to be anti-Catholic or anti-Semitic, disparaging this faith is not only permitted, it is publicly encouraged.

He argues that the overtly religious are given prominence in the media out of proportion to their actual numbers.

Further, omnipresent evocations of American religiosity ignore some basic facts. Such as the Harris poll that shows about half of Americans go to church only a few times a year or never. In other words, they are at best what is known in some Latin American countries as navi-pascuas, attending only at Christmas and Easter. And among these, one reasonably suspects, are numerous closet shafars, silenced by the overwhelming suppression of skepticism and disbelief. In fact, the same poll found that 21% of Catholics and 52% of Jews either don't believe in God or are not certain that God exists.

Such facts are blatantly ignored by a media which happily assigns absurdly contradictory roles to God in stories such as the recent shootings in Atlanta. In that case one was led to believe that religious faith saved the hostage, even though the abductor professed belief in the same almighty, as presumably did at least some of those killed by the perpetrator. But who needs journalistic objectivity when such cliches are so handy?

Smith makes the important point that there is nothing intrinsically virtuous about being a shafar. "None of which is to say that mythology and folklore are necessarily evil or that the non-religious necessarily earn morality by their skepticism. I'd take a progressive cardinal over Vladimir Putin any day. The thoughtfully religious, expressing their faith through works of decency and kindness, are far more useful, interesting and enjoyable than lazy, narcissistic rationalists."

But the key point is that there is no reason to give the leaders of traditional faiths any more respect than anyone else when they make pronouncements on public policy. As long as they stick to their pastoral and spiritual roles, they can enjoy the benefits of being treated deferentially by their congregants. But if they want to step into the political arena, they should expect to receive the same amount of slapping around that any politician or (for that matter) you or I can expect. This is something that seems to be lost on our media, which treat the statements of people like Pat Robertson, Jerry Falwell, and James Dobson with an exaggerated deference, even when they say things that are outrageous.

For example, in a program on the Christian Broadcasting Network just after the events of September 11, 2001, Falwell and Robertson suggested that the events were God's punishment on America for the sins of its usual suspects, especially the gays, abortion rights supporters, and the shafars. Falwell said:

"The ACLU has got to take a lot of blame for this. And I know I'll hear from them for this, but throwing God...successfully with the help of the federal court system...throwing God out of the public square, out of the schools, the abortionists have got to bear some burden for this because God will not be mocked and when we destroy 40 million little innocent babies, we make God mad...I really believe that the pagans and the abortionists and the feminists and the gays and the lesbians who are actively trying to make that an alternative lifestyle, the ACLU, People for the American Way, all of them who try to secularize America...I point the thing in their face and say you helped this happen."
Robertson said, "I totally concur, and the problem is we've adopted that agenda at the highest levels of our government, and so we're responsible as a free society for what the top people do, and the top people, of course, is the court system."

(See an interview with Pat Robertson on ABC's This Week on May 1, 2005 for another example of the kinds of things he says on national TV.)

Falwell and Robertson can think what they want and say what they want on their own media outlets. The question is why the rest of the media take people who have such bizarre views seriously and invite them over and over again to give the "religious" perspective on political matters, and treat them with excessive deference.

As Smith says:

If the Pope wants to tell Africans not to use condoms, then he has left religion and deserves no more respect than George Bush or Bill Clinton. If Jews encourage Israel to suppress the Palestinians then they can't label as anti-Semitic those who note the parallels to South Africa. And if the Anglican church wants to perpetuate a second class status for gays, then we should give the Archbishop of Canterbury no more honor than Tom DeLay.

In other words, if you want to pray and believe, fine. But to put a folkloric account of our beginnings on the same plain as massive scientific research is not a sign of faith but of ignorance or delusion. And if you want to play politics you've got to fight by its rules and not hide under a sacred shield.

Smith also makes an important point about the different standards that are applied to different groups.

After all, is it worse to be anti-Catholic than anti-African? Is it worse to be anti-Semitic than to be anti-Arab? Is it worse to be anti-Anglican than anti-gay? Our culture encourages a hierarchy of antipathies which instead of eliminating prejudices merely divides them into the acceptable and the rejected. Part of the organization of some 'organized' religion has been to make itself sacred while the devil takes the rest of the world.

Smith's essay is thought provoking. You should take a look at the whole thing.

POST SCRIPT

I'd like to thank commenters Cool and Becky for recommending the novel Good Omens by Neil Gaiman and Terry Pratchett, which is based on the rapture. I read it and it is a funny book, written in a style that is a cross between Douglas Adams' The Hitchhiker's Guide to the Galaxy and the British schoolboy fiction William series by Richmal Crompton.

There were some plans to make Good Omens into a feature film directed by Monty Python's Terry Gilliam, and starring Johnny Depp and Robin Williams, but the project apparently got cancelled due to lack of funding. Too bad.

July 20, 2005

Religious beliefs and public policy - 2

In the previous post I discussed the problems that can arise when religious beliefs start influencing public policy. Because issues of the environment and global warming are so long term, it is possible, in the short term, to ignore the inherent contradictions that can arise. But this luxury is not available when it comes to issues of war and peace.

For example, take the turmoil in the Middle East. Whatever one's political views, one would hope that all would agree that long-term peace is a good thing and that policies that increase the risk of violence and instability are bad things. So one would think that if one were convinced that a certain policy might lead to a greater risk of war in the Middle East, then that policy should be avoided.

But in the topsy-turvy world of rapture-based politics, such assumptions do not hold. Take, for example, the so-called "Road Map" for Middle East peace, a strategic plan proposed by the United States, the European Union, the United Nations, and Russia that is seen as providing hope for long-term peace between Israel and the Palestinians. The Dominionists (or dispensationalists) are not thrilled by it. As Barbara Rossing says in her book The Rapture Exposed (p. 46):

The influence of dispensationalism can be seen also in fundamentalist Christians' opposition to the U.S.-backed "Road Map" for peace in Israel and Palestine. "The Bible is my Road Map," declares an Internet petition circulated by [Pat] Robertson, [Jerry] Falwell, and LaHaye in opposition to a negotiated solution to the Israeli-Palestinian conflict. Peace and peace plans in the Middle East are a bad thing, in the view of fundamentalist Christians, because they delay the countdown to Christ's return. Israel must not compromise by giving back any occupied territory to the Palestinians. New Israel settlements and a rebuilt third temple are God's will for Israel, no matter how violent the consequences.

The dispensationalist version of the biblical story requires tribulation and war in the Middle East, not peace plans. That is the most terrifying aspect of the distorted theology. Such blessing of violence is the very reason why we cannot afford to give in to the dispensationalist version of the biblical storyline – because real people's lives are at stake.

You cannot persuade Dominionists that hard-line Israeli policies should be rejected because they will lead to instability, chaos, and bloodshed, because they see this as an argument in their favor: such turmoil is a good thing because it is a sign of the second coming. Similarly, policies that might lead to increased upheaval in Iraq, Iran, Syria, and so on are welcomed as fulfillments of their version of Biblical prophecy of the end-times.

It is somewhat bizarre that people who hold such views on what public policies should be adopted seem to have access to the media and influential policy makers in the government. Pat Robertson, Jerry Falwell, James Dobson, and a whole host of Dominionist people like to emphasize the fact that they have strong influence and access to the levers of government.

What should be the response to this? The next posting will examine the options.

POST SCRIPT

Well, it had to come, the inevitable link between capitalism and end-times theology. Mark Wilson's blog reports on a new series of video games based on the rapture to be released soon.

July 19, 2005

Religious beliefs and public policy

Barbara Rossing is an ordained minister in the Evangelical Lutheran Church and is a faculty member at the Lutheran School of Theology at Chicago. She is an evangelical who feels that the rapturists have, in trying to take the Bible literally, totally distorted its message. Her book The Rapture Exposed is her attempt to reclaim the message of the Bible. In the process, she argues that although this is a religious dispute between segments of Christianity, we should all, whatever our beliefs, take it seriously because it has public policy implications for all of us.

In a comment to a previous posting about the rapture, Professor Madigan spoke about her former sister-in-law who, back in the 1980s, was convinced that the rapture was imminent, that the day had been specified, and that she would be one of the chosen. She then proceeded to run up her credit card bills, thinking that she would not have to pay them back. Of course, she had to deal with all the bills when the rapture did not happen. Dave's comment in the previous entry seems to indicate that this kind of credit-card behavior is quite widespread. (Here is an interesting conundrum: Is it unethical to run up bills that you have no intention of paying if you think that the end of the world is about to occur?)

I would imagine that this kind of extreme behavior is somewhat rare and that most believers in the rapture hedge their bets and continue to make their mortgage, credit card, and insurance payments.

Even if some people are tempted to act recklessly, such actions by private individuals do not do much damage to the community at large. But not all rapture-influenced actions are that innocuous. Rossing's book reveals some startling information about rapture-influenced political appointees, which I had not been aware of (since I was not in the US at the time) but whose actions can affect all of us. One such person is James Watt, who was appointed Secretary of the Interior after Ronald Reagan was elected in 1980. Rossing says:

Reagan-era Secretary of the Interior James Watt told U.S. senators that we are living at the brink of the end-times and implied that this justifies clearcutting the nation's forests and other unsustainable environmental policies. When he was asked about preserving the environment for future generations, Watt told his Senate confirmation hearing, "I do not know how many future generations we can count on before the Lord returns." (p. 7)

One might wonder how someone like James Watt could ever have been confirmed to a post entrusted with protecting the environment. One would think that the job description for Secretary of the Interior calls for someone who takes a very long-term view, and that anyone who cannot envisage the need to take care of the environment beyond the next few generations would be eliminated. Perhaps there was a time when such people would not have been nominated to high positions, but that no longer seems to be the case. Nowadays, politicians seem to feel obliged to wear their religion on their sleeves and proudly proclaim how it influences everything they do.

Of course, most people are religious in some way and there is no doubt that their religious beliefs will have an effect on what they do and what policies they support. We should protect people's right to believe whatever they want. But should that protection also extend to public policies that they wish to implement that are based on their religious beliefs? Can we draw a line between policies based on religion that are acceptable and those that are not? Or is it better to simply say that any public policy that has religion as its only basis is not acceptable?

These questions become more apparent with issues such as global warming. If global temperatures are rising at about one degree per century, as experts suggest, then in a few centuries the melting of the polar ice caps, the loss of glaciers, and the consequent rise in sea levels would have catastrophic consequences, causing massive flooding of coastal areas and huge climatic changes. Suppose the Secretary of the Interior says that since the end of the world is going to occur long before then, we should not worry about it. Should that person be removed from office? Is it religious discrimination to say that we should not be basing public policy on religious beliefs?

If James Watt had been rejected as a nominee because of the feeling that his religion-based short-term views were dangerous for the environment, could it have been alleged that he was the subject of religious discrimination?

This is a very tricky question because while we do not want to impinge on people's right to their religious beliefs, we do have a responsibility to base policies on empirical evidence. The public policy implications of religion become even more alarming when applied to issues of war and peace, as we will see in the next posting.

July 18, 2005

The allure of rapture violence

I must say that since I recently started reading about the rapture (see here and here for previous posts on it), it has fascinated me. (Some readers of this blog who had never heard of the rapture before I started posting on it have told me they were startled to find people they know accepting the idea of it very matter-of-factly, as if it were nothing special.) Not that I take the basic idea of huge numbers of people being transported suddenly up into heaven seriously, of course. That strikes me as a wild flight of fancy that belongs in the same genre as Star Wars or Harry Potter films, i.e., enjoyable largely because it is so outrageously improbable.

No, what interests me is the sociology behind it, especially the question of what it is that attracts otherwise presumably regular people to believe in this dark tale of violence, revenge, cruelty, and blood.

The basic rapture story is that at some point (devotees think "very soon"), a select group of people will be raptured up to heaven by Jesus, and from that vantage point they will observe the Antichrist ruling the world, leading to seven years of violent struggle between the good and bad forces left behind on Earth. As the Left Behind novel series illustrates, the people propagating the rapture myths do not see the violence in the rapture story as something that is a necessary evil, to be passed over quickly before the final victory of God. No, they actually wallow in it, imagining it in the most lurid of details. In his highly entertaining review of the books, Gene Lyons recounts one passage:

Rayford watched through the binocs as men and women soldiers and horses seemed to explode where they stood. It was as if the very words of the Lord had superheated their blood, causing it to burst through their veins and skin. . . . Their innards and entrails gushed to the desert floor, and as those around them turned to run, they too were slain, their blood pooling and rising in the unforgiving brightness of the glory of Christ.

and says that "the slaughter runs on for close to eighty gleeful pages." (my emphasis)

I find it hard to fathom the attraction of something like this. I have never understood the appeal of violence, even fictionalized, so I avoid films that have excessive amounts of it. This dislike started early for me. I remember even as a very young child hating circuses, mainly because they had people (like trapeze artists) who were risking death and injury just for the sake of entertaining others, and this just made no sense to me. I could have tolerated it if the trapeze artists and tightrope walkers had performed with a safety net. I can understand that some people (like firefighters, police, and soldiers) have to take risks as part of their job, but they at least take as many safety precautions as they can. Taking huge and avoidable risks just for entertainment seems absurd to me. The recent mauling of Roy Horn by the tiger in his act sickened me because of the senselessness of it all.

I know that many people do not have the same reaction to violence as I do, but there is a difference (I feel) between being able to tolerate violence as an unavoidable component of life and actually enjoying it, and it is the latter response that I find hard to understand. I know that there are people who flock to the scene of some disaster, hoping to see the injured or dead, and then relish repeating what they have seen over and over to whoever will listen. That kind of attitude is incomprehensible to me. But Barbara Rossing in her book The Rapture Exposed suggests that one can get addicted to violence. She quotes Chris Hedges, who said that as a former New York Times war correspondent in El Salvador, Bosnia, Kuwait, Iraq, and elsewhere, he became addicted to war. It was like a narcotic, and he had to tear himself away from it.

It is clear that rapture enthusiasts have no such qualms about the violence associated with their story. Perhaps the reason that violence may be more enjoyable to them is that they are able to see themselves in a purely spectator role, like those who, safely from a distance, watch Siegfried and Roy with their tiger. Rossing (p. 138-140) suggests that this might indeed be the case. Since rapture believers are confident that they will be among the chosen few who are taken up to heaven and escape the carnage, they see themselves as essentially TV viewers up in heaven, safely watching the violence from that Lazy-Boy in the sky, as if it were some spectator sport. It was interesting that some rapture believers apparently got excited by the recent bombings in London, believing that they signaled the beginning of the rapture.

Rossing argues that all the violence associated with the rapture is a misreading of the Bible. It is quite possible to interpret the Biblical second coming of Jesus in a peaceful way but rapturists insist on interpreting everything in the most gruesome way, trying to put in the most blood and gore possible.

For those who do not believe in the rapture, it might be tempting to dismiss the rapture phenomenon as harmless fantasy, and treat devotees the way we treat those who indulge in (say) violent video games. After all, if I do not believe the rapture is going to happen, why would I care if others do?

But Rossing suggests that we should not be so complacent and that belief in the rapture has public policy consequences that affect all of us. More about this in the next posting.

July 15, 2005

Why I blog

I reached a kind of landmark this week with this blog. I have been making entries since January 26th, posting one item each weekday, except for a three-week break in June. As a result I have now posted over 100 entries, consisting of over 100,000 words, longer than either of my two published books.

Why do I blog? Why does anyone blog? The Doonesbury comic strip of Sunday, July 3, 2005 fed into the stereotype of bloggers as self-important losers who cannot get real jobs as writers, and feed their ego by pretending that what they say has influence. The idea behind this kind of disparaging attitude is that if no one is willing to pay you to write, then what you have to say has no value.

Of course, there are a vast number of bloggers out there, with an equally vast number of reasons why they blog, so any generalization is probably wrong. So I will reflect on why I blog. Some bloggers may share this view, others may have different reasons. So be it.

The first reason is that, because of the blog, I have written the equivalent of a complete book in six months. Writing is not easy, especially starting to write on any given day. Having a blog enforces on me a kind of discipline that would not otherwise exist. Before I started this blog, I would let ideas swirl around in my head without actually putting them down in concrete form. After a while, I would forget about them, but be left with a nagging feeling of dissatisfaction that I should have explored the ideas further and written them down.

The second benefit of writing is that it forces you to clarify and sharpen your ideas. It is easy to delude yourself that you understand something when you have the idea only in your mind. Putting those ideas on paper (or screen) has the startling effect of revealing gaps in knowledge and weaknesses of logic and reasoning, thus forcing a re-evaluation of one's ideas. So writing is not a one-way process from brain to screen/paper. It is a dialectic process. Writing reveals your ideas but also changes the way you think. As the writer E. M. Forster said, “How can I know what I am thinking until I see what I say?” This is why writing is such an important part of the educational process and why I am so pleased that the new SAGES program places such emphasis on it.

Another benefit for me is that writing this blog has (I hope) helped me become a better writer, able to spot poor construction and word choice more quickly. Practice is an important part of writing and the blog provides me with that. The fact that the blog is public and can (in principle) be read by anyone prevents me from posting careless or shoddy pieces. It forces me to take the time to repeatedly revise and polish, essential skills for writers.

When I started this blog, I had no idea what form it would take. Pretty soon, almost without thinking, it slipped into the form that I am most comfortable with, which is that of a short essay around a single topic each day. I initially feared that I would run out of ideas to write about within a few weeks but this has not happened. In fact what happens is what all writers intuitively know but keep forgetting: the very act of writing serves as a spur for new ideas, new directions to explore.

As I write, new topics keep coming into my mind, which I store away for future use. The ideas swirl around in my head as I am doing other things (like driving and chores), and much of the writing takes place in my mind during those times as well. The well of ideas to write about does not show any signs of going dry, although it does take time to get the items ready for posting, and that is my biggest constraint. Researching those topics so that I go beyond superficial "off the top of my head" comments and have something useful to say about them has been very educational for me.

Since I have imposed on myself the goal of writing an essay for each weekday, I have essentially written the first draft (which is the hardest part of writing, for me at least) of many topics that may subsequently become articles (or even books) submitted for publication. If I do decide to expand on some of the blog items for publication, that process should be easier since I have done much of the preliminary research, organization, and writing already.

All these benefits have accrued to me, the writer, and this is no accident. I think writing benefits the author more than anyone else, for all the reasons given above. But any writer also hopes that the reader benefits in some way as well, though that is hard for the author to judge.

I remember when I was younger, I wanted to "be a writer" but never actually wrote anything, at least anything worthwhile. Everything I wrote seemed contrived and imitative. I then read a comment by someone who said that there is a big difference between those who want to be writers and those who want to write. The former are just enamored with the idea of getting published, of being successful authors and seeing their name in print. The latter feel that they have something to say that they have to get out of their system. I realized then that I belonged to the former class, which is why I had never actually written anything of value. With that realization, I stopped thinking of myself as a writer and did not do any writing other than the minimum required for my work. It is only within the last ten years or so that I feel that I have moved into the latter category, feeling a compulsion to write for its own sake. This blog has given me a regular outlet for that impulse.

I would never have written so much without having this blog. I would recommend that others who feel like they have to write also start their own. Do not worry about whether anyone will read it or whether they will like it. Write because you feel you have something to say. Even if you are the only reader of your own writing, you will have learned a lot from the process.

POST SCRIPT

Paul Krugman is an economist at Princeton University and is a member of the reality-based community. His July 15, 2005 op-ed in the New York Times shows how far politics has moved away from this kind of world and into one in which facts are seen as almost irrelevant.

Thanks to Richard Hake for the following quote by F.M. Cornford, Microcosmographia Academica - Being A Guide for the Young Academic Politician (Bowes & Bowes, Cambridge, 4th ed., 1949; first published in 1908), which might well have been addressed to Krugman and other members of the reality-based community, although it was written nearly a century ago:

You think (do you not?) that you have only to state a reasonable case, and people must listen to reason and act upon it at once. It is just this conviction that makes you so unpleasant….are you not aware that conviction has never been produced by an appeal to reason which only makes people uncomfortable? If you want to move them, you must address your arguments to prejudice and the political motive….

July 14, 2005

"I know this is not politically correct but...."

One of the advantages of being older is that sometimes you can personally witness how language evolves and changes, and how words and phrases undergo changes and sometimes outright reversals of meaning.

One of the interesting evolutions is that of the phrase "politically correct." It was originally used as a kind of scornful in-joke within Marxist political groups to sneer at those members who seemed to have an excessive concern with political orthodoxy and who seemed to be more concerned with vocabulary than with the substance of arguments and actions.

But later it became used against those who were trying to use language as a vehicle for social change by making it more nuanced and inclusive and less hurtful, judgmental, or discriminatory. Such people advocated using "disabled" instead of "crippled" or "mentally ill" instead of "crazy," or "hearing impaired" instead of "deaf" and so on in an effort to remove the stigma under which those groups had traditionally suffered. Those who felt such efforts had been carried to an extreme disparaged them as attempts to be "politically correct."

The most recent development has been to shift the emphasis from sneering at the careful choosing of words to sneering at the ideas and sentiments behind those words. The phrase has started being used pre-emptively, to shield people from the negative repercussions of stating views that otherwise may be offensive or antiquated. This usage usually begins by saying "I know this is not politically correct but...." and then finishes up by making a statement that would normally provoke quick opposition. So you can now find people saying "I know this is not politically correct but perhaps women are inferior to men at mathematics and science" or "I know this is not politically correct but perhaps poor people are poor because they have less natural abilities" or "I know this is not politically correct but perhaps blacks are less capable than whites at academics." The opening preamble is not only designed to make such statements acceptable; it also lets the speaker claim the mantle of being daring and brave, an outspoken and even heroic bearer of unpopular or unpalatable truths.

Sentiments that would normally be considered discriminatory, biased, and outright offensive if uttered without any supporting evidence are protected from criticism by this preamble. It is then the person who challenges this view who is put on the defensive, as if he or she were some prig who unthinkingly spouts an orthodox view.

As Fintan O'Toole of The Irish Times pithily puts it:

We have now reached the point where every goon with a grievance, every bitter bigot, merely has to place the prefix, "I know this is not politically correct but.....'" in front of the usual string of insults in order to be not just safe from criticism but actually a card, a lad, even a hero. Conversely, to talk about poverty and inequality, to draw attention to the reality that discrimination and injustice are still facts of life, is to commit the new sin of political correctness......... Anti-PC has become the latest cover for creeps. It is a godsend for every sort of curmudgeon or crank, from the fascistic to the merely smug.
Hate blacks? Attack positive discrimination - everyone will know the codes. Want to keep Europe white? Attack multiculturalism. Fed up with the girlies making noise? Tired of listening to whining about unemployment when your personal economy is booming? Haul out political correctness and you don't even have to say what's on your mind.

Even marketers are cashing in on this anti-PC fad, as illustrated by this cartoon.

Perhaps it is my physics training, but I tend to work from the principle that in the absence of evidence to the contrary, we should assume that things are equal. For example, physicists assume that all electrons are identical. We don't really know this for a fact, since it is impossible to compare all electrons. The statement "all electrons are identical" is a kind of default position and, in the absence of evidence to the contrary, does not need to be supported by positive evidence.

But the statement "some electrons are heavier than others" is going counter to the default position and definitely needs supporting evidence to be taken seriously. Saying "I know this is not politically correct but I think some electrons are heavier than others" does not make it any more credible.

The same should hold for statements that deal with people, because I would like to think that the default position is that people are (on average) pretty much the same in their basic needs, desires, feelings, hopes, and dreams.

POST SCRIPT 1

I love movies but am not a big fan of Tom Cruise's films. I was surprised, though, by the way people went after him (and his Church of Scientology) for his recent comments on psychiatry and mental illness. I was bemused first that this topic arose in the interview at all, and then by the subsequent reaction, in which it seemed people felt that he had no right to his views on this subject. Even if people disagree with him, why do they get so upset? Why do people even care what his views on psychiatry are? I was thinking of writing something on this incident but then came across this article, which covers some of the ground I would have, and also raises the problematic role that the big pharmaceutical companies have played in this issue of treating illnesses with drugs.

POST SCRIPT 2

The invaluable website Crooks and Liars has posted a funny Daily Show video clip about the Valerie Plame affair. Since I do not have cable and don't watch much TV anyway, I depend on Crooks and Liars and onegoodmove to alert me to TV segments that I would otherwise miss. These two sites are well worth bookmarking.

July 13, 2005

Should professors reveal their views?

During the last academic year, UCITE organized a faculty seminar on whether, and how much of, their own views professors should reveal to the students in their classes.

One faculty member recalled one of her own teachers admiringly. She said that he had guided the discussions in her college classes very skillfully and in such a way that no one knew what his own views were on the (often controversial) topics they discussed. She felt that his not revealing his own views led to a greater willingness on the part of students to express theirs, since they were not agreeing or disagreeing with the authority figure. She felt that his model was one that others should follow.

Underlying this model is the belief that students may fear that going against the views of the professor might result in them being penalized or that agreeing with the professor might be viewed as an attempt at ingratiation to get better grades.

I am not convinced by this argument, both on a practical level and on principle, but am open to persuasion.

As a purely practical matter, I am not sure how many of us have the skills to pull off what this admired professor did. It seems to me that it would be enormously difficult to spend a whole semester discussing things with a group of people without revealing one's own position on the topics. It is hard to stay aloof from the discussion if one is intensely interested in the topic. As readers of this blog know, I have opinions on a lot of things, and if such topics come up for discussion, I am not sure that I have the ability to successfully conceal my views from students. Many of us will betray ourselves, by word or tone or nuance, despite our best efforts at concealment.

But I am also not convinced that this is a good idea even in principle, and I'd like to put out my concerns and get some feedback, since I know that some of the readers of this blog are either currently students or have recently been students.

One concern about hiding my own views is precisely that the act of hiding means that I am behaving artificially. After all, I assume that students know that academics tend to have strong views on things, and they will assume that I am no exception. Those students who speak their minds irrespective of the instructor's views won't care whether I reveal my views or not, or whether they agree with me or not. But for those students for whom my views are pertinent, isn't it better for them to know exactly where I stand so that they can tailor their comments appropriately and accurately, rather than trying to play guessing games and risk being wrong?

Another concern that I have arises from my personal view that the purpose of discussions is not to argue or change people's views on anything but for people to better understand why they believe whatever they believe. And one of the best ways to achieve such understanding is to hear the basis for other people's beliefs. By probing with questions the reasoning of other people, and by having others ask you questions about your own beliefs, all of the participants in a discussion obtain deeper understanding. In the course of such discussions, some of our views might change but that is an incidental byproduct of discussions, not the goal.

Seen in this light, I see my role as a teacher as modeling this kind of behavior, and this requires me to reveal my views, to demonstrate how I use evidence and argument to arrive at my conclusions. I feel (hope?) that students benefit from hearing the views of someone who has perhaps, simply by virtue of being older, thought about these things for a longer time than they have, even if they do not agree with my conclusions. To play the role of just a discussion monitor and not share my views seems to defeat one of the purposes of my being in the classroom.

The fly in the ointment (as always) is the issue of grades. I (of course) believe that I will not think negatively of someone who holds views opposed to mine and that it will not affect their grades. But that is easy for me to say since I am not the one being graded. Students may not be that sanguine about my objectivity, and may worry about how I view them if they should disagree with me.

When I raised this topic briefly with my own class last year, they all seemed to come down in favor of professors NOT hiding their personal views. But I am curious as to what readers of this blog think.

Do you think professors should reveal their views or keep them hidden?

POST SCRIPT 1

The website Crooks and Liars has posted a funny video clip from the Daily Show that addresses how high levels of fear are generated in America, a topic that I blogged about earlier.

This article by John Nichols compares the British response to the tragedy with the way the American media tried to frame it.

POST SCRIPT 2

Also, for those of you struggling to keep up with the complicated set of issues involved with the Valerie Plame-Joseph Wilson-Robert Novak-Judith Miller-Matthew Cooper-confidential journalistic sources issue, there is an excellent article by Robert Kuttner that (I hope) clears things up.

July 12, 2005

Catholic Church reversing course on evolution?

It was only on May 19 that I compared religious reaction to two major scientific revolutions, those identified with Copernicus and Darwin, and showed that in each case religious objections to the new theories only arose more than a half-century after the theories were published, and then began with Protestants, rather than the Catholic Church. The religious opposition may have been slow in coming because it took some time for the theological implications of the new cosmology to be realized. In fact, the religious opposition was rising just about the time that the scientific debates were ending, and the scientific community was coalescing behind the new theories as more and more supporting data were coming in.

Thomas Kuhn in his book The Copernican Revolution attributes the Catholic opposition that eventually arose to Copernicus as possibly connected to the challenges it felt from the newly emergent Protestant churches, who were quicker to criticize it.

I argued that this pattern had almost repeated itself with the evolution debate except that the Catholic Church in this case did not join the Protestants in vigorously opposing evolution, but seemed to be more accepting, with Pope John Paul II in 1996 saying that as long as the soul was divinely created, there was no problem with accepting the physical-biological aspects of evolution. I suggested that it was unlikely, given the bruising it had received over the Copernicus affair, that the Catholic Church would repeat its mistake and attack Darwin.

Well, so much for that prediction. Enter Cardinal Christoph Schonborn (the Roman Catholic archbishop of Vienna, who was the lead editor of the official 1992 Catechism of the Catholic Church) with his July 7, 2005 op-ed in The New York Times, which indicates the beginning of a serious back-pedaling from Pope John Paul II's stand. Although in interviews the cardinal claims that his essay represents a personal opinion and was not formally endorsed by the Church, there seems to be little doubt that this is a kite being flown by the new Pope Benedict XVI to see what the reaction might be.

So what is the cardinal's position? It seems clear that his fundamental cause of concern is that natural selection is not teleological, in the sense that it has no predetermined goal and is not directed towards anything. It is this feature that the cardinal finds objectionable and it is this feature that is also a cause for major concern for those in the so-called "intelligent design" (ID) movement. In fact, the cardinal's statement "Evolution in the sense of common ancestry might be true, but evolution in the neo-Darwinian sense - an unguided, unplanned process of random variation and natural selection - is not. Any system of thought that denies or seeks to explain away the overwhelming evidence for design in biology is ideology, not science" could be lifted straight out of the ID literature. Indeed Mark Ryland, a vice president of the Discovery Institute (which is a driving force behind ID) said that he had urged the cardinal to write the essay.

One of the main items of belief for many Christians is that human beings are special in some way, chosen by God for some special purpose. The heliocentric model proposed by Copernicus challenged early versions of this belief since it opened up the idea that the universe was very large, possibly infinite, and the Earth was just one of many planets. But some sophisticated theological footwork enabled that hurdle to be overcome and the Church eventually came to terms with Copernican views.

But Darwin poses a more difficult challenge to human specialness because his theory deals with humans directly, and not just with the place where humans live. The idea that human beings are the byproduct of the same natural processes that produced apes, butterflies, and daffodils raises the question of what exactly God's relationship to humans is.

Pope John Paul II (in a widely quoted statement in 1996) seemed to be satisfied with the idea that God breathes a soul into each human, that it is this action that creates the special relationship, and that the physical-biological aspects of evolution did not create any problems. It is this restricted view that now seems to be under reconsideration, and the op-ed argues that God should have some role in the biological development of humans as well. The cardinal tries to co-opt Pope John Paul II to support this revisionist view by appealing to an earlier statement of his (made in 1985) and making the curious argument that it is the earlier statement that represents his "robust" view on the topic. Although he tries to dismiss the 1996 statement as "rather vague and unimportant" (surprisingly dismissive language, I thought, about his former boss and someone being proposed for sainthood), he does not explain why that statement (which was not an off-hand remark but was addressed to no less a body than the Pontifical Academy of Sciences) should be taken less seriously than the 1985 one.

The cardinal seems to want a teleological theory of evolution, one that is goal-directed so that humans were pre-ordained to come into being as the ultimate expression of God's plan for the universe. In fact he goes further, quoting this statement of Pope Benedict XVI at his installation ceremony where he said: "We are not some casual and meaningless product of evolution. Each of us is the result of a thought of God. Each of us is willed, each of us is loved, each of us is necessary." (emphasis added)

I don't see how this can be made compatible with the theory of natural selection, or indeed with any scientific theory, since I have argued earlier that a necessary condition for scientific theories is that they be naturalistic, and this rules out any kind of "hidden hand" that acts outside the discoverable laws of science. So if this trial balloon by Cardinal Schonborn becomes official Catholic doctrine, then we are back on course for another collision between science and the Catholic Church. But this time, the tools of the Inquisition for enforcing orthodoxy do not exist, and placing books and teachings on a "banned" list would invite ridicule in the age of the internet.

It is sometimes said that the Supreme Court "follows the election returns," meaning that it is mindful of the popular mood when it rules on constitutional issues. The same is true of the church. It will not do for it to be too far out of step with its congregation's views. And it may feel that there is support (at least in the US) for the cardinal's view. A recent Harris poll (taken in June 2005) says that 47% of Americans reject the common ancestry of man and apes. Also, only 38% of Americans agreed with the statement "Yes, I think human beings developed from earlier species" (compared with 44% in March 1994), and 64% now believe that "human beings were created directly by God."

As I have said before, in the long run, scientific ideas tend to be more resilient than religious ones, losing out only to other scientific ideas and not to religiously based ones. But it will be interesting to see how this issue plays out in the days to come. I believe that whether the church adopts the new policy as formal doctrine will depend on the reaction to the cardinal's trial balloon. If there is a strong negative reaction, the cardinal's essay may lead to nothing and become just a curious footnote of interest to church historians.

July 11, 2005

Public and private grief

One of the things that strikes me is that America seems to have a fascination with memorials and ceremonies honoring the dead.

There are memorials for the various major wars, a memorial built for the Oklahoma City bombing, one for the Lockerbie disaster, and there is the present bitter argument over the proposed memorial at the site of the World Trade Center. But there is more to it than physical memorials. There are also memorial ceremonies held on the anniversaries of these events, complete with flags, prayers, political leaders, speeches, and media coverage.

Has it always been like this or is this a relatively new phenomenon? I ask because this extended public and organized brooding on tragedy seems strange to me. In my experience growing up in Sri Lanka, after a major disaster, people tend to quickly clear up the mess and move on. There are some memorials, but they tend to be for dead political figures and are built by their immediate families or their political supporters. The idea of public memorializing is not widespread.

Of course, the family and friends of people lost to tragedies feel grief, and this is a universal phenomenon, transcending national and cultural boundaries. It is perfectly natural for such people to feel a sense of sadness and loss when an anniversary date comes around, reminding them of those who are no longer part of their lives. The personal columns of newspapers in Sri Lanka are filled, like they are here, with the sad stories of loss, some from many, many years ago.

But I wonder how much of this memorializing and solemnity is widespread among people who do not suffer a direct personal loss. At each anniversary of 9/11, for example, the media solemnly report that the whole nation 'paused in grief' or something like that. But among the people I know and work with, no one talks about the events on the anniversaries. Are we a particularly callous group of people, or is my experience shared by others? Of course, people may reflect on the events on those days but how much of that is media inspired, because the newspapers and radio and TV keep talking about it? If the media ignored these anniversaries, would ordinary people give these anniversaries more than a passing thought? How many people feel a sense of grief or sorrow on the anniversaries of disasters that did not affect them personally?

In Sri Lanka, the recent tsunami killed about 40,000 people in a matter of minutes. It is the worst single disaster in a country that has known a lot of tragedies, both natural and human-caused. Like disasters everywhere, it brought out the best in people as they overlooked ethnic, religious, and linguistic barriers and joined in the massive relief efforts, helping total strangers using whatever means were at their disposal.

And yet, on my recent visit, I did not hear of any plans for a public tsunami memorial. I am fairly certain that if anyone proposed it, people would (I think rightly) argue that the money would be better spent on relief for the victims of the disaster rather than on something symbolic.

This made me wonder about the following: while private grief is a universal emotion, I wonder if public grief is a luxury that only the developed world can afford to indulge in. In countries where the struggle of day-to-day living takes most of one's energy, is grief a precious commodity that people can expend only on the loss of their nearest and dearest, except in the immediate aftermath of a major tragedy?

July 08, 2005

Undermining faith in the judiciary

I have always believed that people tend to behave better than one might expect them to when placed in positions of trust where high standards of behavior are expected of them. One particular kind of occupation exemplifies my belief, and that is judges.

The public expects members of the judiciary to act according to higher standards than the rest of us and I think that this expectation generally tends to be fulfilled. I believe that whenever someone enters a profession that has a noble calling, the very nature of the office tends to produce an ennobling effect.

This is particularly so in the case of the higher levels of judiciary. A person who becomes a Supreme Court judge, for example, is well aware that he or she is occupying a select position of great trust and responsibility, and I cannot help but believe that this will rub off on that person, making him or her strive to be worthy of that trust. This does not make them superhuman. They are still subject to normal human weaknesses and failures. They may still make wrong decisions. But I think that in general they behave better by virtue of occupying those positions than they might otherwise, and try to live up to the standards expected of them.

But this works only if the judges feel they are entering a noble calling and that they are expected to live up to it. If the prestige and dignity of the judiciary are undermined by treating judges as if they were just political hacks, then they will behave accordingly. This is why I view with concern attempts by people, especially political leaders, to undermine faith in the judiciary. There are two ways in which this happens.

The first way is to personally attack judges whenever a decision does not go the way they wanted it to go. This tendency has accelerated in recent years in the US, as can be seen by the ugly venom heaped on the Florida judge in the Terri Schiavo case. We have seen similar invective hurled at judges when they have ruled in ways that people have not liked on hot-button issues, ranging from First Amendment cases involving religion to flag burning to abortion. The judges have been decried as "judicial activists" and worse.

The second way to undermine the judiciary is by clearly seeking to appoint judges precisely because they have a particular political agenda, and not because they display the intellect and independence of thought that a good judge should have.

Sri Lanka again offers an unfortunate precedent for this degeneration. It used to have a fairly independent judiciary whose members were nominated by a Judicial Services Commission that was at least one step removed from direct political influence. It was expected that the JSC would nominate people who had serious credentials, and hence there was the belief among the general public that judges were, on the whole, impartial, although individual judges here and there may have been suspect. But again, beginning in the 1970s, the government started to severely criticize judges who ruled against it, even sending mobs to demonstrate in front of judges' homes to try to intimidate them.

After that, it was only a short step to create a more overtly political process for the selection of judges, in order to ensure that decisions would be more acceptable to the government. Despite this, the ennobling effect that I spoke of earlier helped to make the judges better than one might expect, but it was a losing battle. When I was in Sri Lanka last month, I was told that faith in the impartiality of the judiciary had been badly undermined by the cumulative effects over the years of such negative policies.

This is a real pity because this kind of credibility, once lost, is hard to regain. Undermining the judiciary in this way is a dangerous trend for any nation that values the rule of law. When you undermine faith in the impartiality and honesty of judges, you are just one step away from mob rule.

As I said above, the US seems to have already started down this unfortunate road. The upcoming battle for the Supreme Court vacancy created by the retirement of Sandra Day O'Connor will provide a good indication of the shape of things to come. It will be unfortunate if people focus on the nominee's views on specific issues. I would much rather see an examination of whether the person shows a scholarly mind; whether he or she has shown an independence of thought; whether the person bases judgments and reasoning on evidence and on universal principles of justice and the constitution; whether the person shows compassion and understanding of the human condition in all its complexity; and whether he or she listens to, understands, and appreciates the arguments of even those whom he or she rules against, and appreciates that the judiciary is the ultimate safeguard of rights and liberties for individuals against the massive power of governments and corporations. In other words, does the person have what we might identify as a "judicial temperament"?

If you speak with lawyers, they can often identify those judges whom they respect, even when those judges rule against the lawyers. Identifying what makes judges respected despite their specific opinions on specific cases is what the discussions about selecting a new Supreme Court justice should be all about.

But if the discussion ends up being (as I fear it will) about the nominee's views on the Ten Commandments, abortion, gay marriage, flag burning, and the like, then we will be continuing to cheapen the whole Supreme Court.

The nature of the nominee and the discussion around the nomination will tell us a lot about how we will be viewing the judiciary in the days, and perhaps generations, to come.

POST SCRIPT

Are you interested in having thoughtful discussions on deep topics? Consider attending the Socratease discussions. These are open to anyone and are held on the second Tuesday of each month (the next one is on July 12) at the Nighttown restaurant on Cedar Road in Cleveland Heights (in the Cedar/Coventry area). The discussions are from 7:30-9:30pm. You are under no obligation to order food or drink from the restaurant.

The format is that all the people present who wish to can suggest a topic for the night's discussion, then a vote is taken, and the winning topic becomes the focus for that night.

July 07, 2005

Should one use raw political power or govern by consensus?

The second parallelism I saw between political developments in Sri Lanka and in the US has been the breakdown in the usual rules of behavior regarding building consensus.

To some extent, politics in both Sri Lanka and the US are insider's games. The people in the leadership of the two main parties tend to be members of the same elites, representing two wings of the same political and social class, and the same interests. Hence they tend to adopt 'rules of the game' that are usually civil and polite. One benefit of this civility is that the interests of the minority party at any given time are not completely ignored, because the ruling party understands that it might be in the minority after the next election. The disadvantage is that the two parties do not challenge the basic status quo and the ruling elites, since they are both members of that same class.

Looking only at the positive side, the protection of minority interests in consensus-style governance has the effect of providing some political continuity, especially in parliamentary systems, where the legislature is a collection of individuals, each elected to represent a given geographic area. Under such a system, it is possible for a political party with just a bare majority of voters to have an overwhelmingly large majority in the legislature. (A party that wins 52% of the vote in every district, for example, wins every single seat, as the sketch below illustrates.) In multiparty systems, it is even possible for a party that does not have even a majority of the popular vote to have a majority in the legislature. This can lead to wide swings in legislative majorities after each election (without a corresponding swing in actual votes or voter sentiment), so having rules that enable people to function both in and out of power becomes important.
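
To see just how lopsided this can get, here is a minimal sketch in Python (using hypothetical vote numbers, not data from any actual election) contrasting single-seat districts with proportional allocation:

def fptp_seats(district_votes):
    # Each district's single seat goes to the party with the most votes there.
    seats = {}
    for votes in district_votes:
        winner = max(votes, key=votes.get)
        seats[winner] = seats.get(winner, 0) + 1
    return seats

def pr_seats(district_votes, total_seats):
    # Seats are allocated in proportion to each party's nationwide vote total.
    totals = {}
    for votes in district_votes:
        for party, v in votes.items():
            totals[party] = totals.get(party, 0) + v
    grand = sum(totals.values())
    return {p: round(total_seats * v / grand) for p, v in totals.items()}

# 100 districts, each of which party A wins narrowly, 52 votes to 48:
districts = [{"A": 52, "B": 48}] * 100
print(fptp_seats(districts))     # {'A': 100} - a clean sweep of the seats
print(pr_seats(districts, 100))  # {'A': 52, 'B': 48} - mirrors the vote

A four-point edge everywhere translates into every single seat under the district system, while proportional representation returns a near-even split from the same votes, which is why the constitutional change described below made future two-thirds majorities so unlikely.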

In Sri Lanka, this sense of following unwritten rules that benefited both sides existed until around 1970 and then started falling apart as each of the major parties started using raw political power to ram through policies that tended to ignore the interests of the opposing party. In 1970, one party (the SLFP) got a huge (more than two-thirds) majority and used that majority to change the constitution using a device of somewhat dubious legality. They even used their majority to unilaterally extend the life of the parliament by two years, so that instead of elections falling due in 1975, they were next held in 1977.

But in 1977, there was another big swing of the political pendulum and the opposition party (UNP) came to power with a huge (also more than two-thirds) majority. They too enacted sweeping constitutional changes, a chief one being replacing the old system of individual seats by a proportional representation system. Since the proportional representation system tends to provide a parliament that reflects more accurately the percentage of votes a party receives, it also had the effect of ensuring that future parliaments would be unlikely to have the two-thirds parliamentary majorities needed for undoing the constitutional changes that the UNP had put in place.

But since the huge majority that the UNP obtained in 1977 was still in place until the next election, the UNP was free to make constitutional changes freely, and this they proceeded to do, sometimes in the most self-serving way, making a mockery of democratic principles. For example, the government extended the life of the parliament (with its huge majority) by using a simple-majority referendum.

Another example of using raw power was how political defections were handled. It used to be that governments could be toppled if enough members of parliament switched allegiance from the government party to the opposition. Members had done this in the past for a variety of reasons, for political principles, as a mark of protest, or merely for personal ambition or other similarly ignoble reasons.

The post-1977 constitution eliminated this threat to the government's stability by saying that if a member switched parties, that person automatically lost his or her seat and was replaced by someone from the party being vacated, thus maintaining the party status quo. This was a huge disincentive for any member to switch, since nothing would be gained. But then some members of the opposition said they wanted to switch to the government side. Since the government wanted to encourage this defection for propaganda purposes, the government used its huge majority to make a constitutional change that said that people could switch sides without losing their seats provided parliament voted to approve each such switch. This guaranteed that only party switching that favored the government could take place, since only the government side had the votes to approve the switch.

These kinds of frequent, ad hoc changes to the constitution to serve narrow partisan ends resulted in a devaluation of the respect that a constitution should command. At one time, a joke made the rounds in Sri Lanka about someone entering a bookstore and asking to purchase a copy of the constitution, only to be told: "Sorry, we do not sell periodicals."

While I have observed this trend in Sri Lankan politics over the last three decades or so, its parallel development in the US is much more recent. The fight over the use of the filibuster, the attempts to enshrine a flag-burning amendment in the constitution, the battles over judicial nominees, and the attempt to breach the establishment clause of the first amendment are all signs that using raw political power to gain short-term goals is gaining ground here too. The argument seems to be that political power is there to be used in whatever way possible.

There are two views on this trend. Some disapprove, saying that achieving consensus government is preferable, since that avoids nasty partisan battles and wild swings in policies. They appeal for 'bipartisanship.' Others argue that the problem with consensus politics and bipartisanship is that the politics of the most reactionary elements wins out, since bipartisanship usually results in the most intransigent person or party getting his or her own way. Also, bipartisanship can be a symptom that the two major parties are in fact colluding to protect their common interests at the expense of the excluded classes. Such people argue that at least with the use of raw political power, there is a chance that your side will someday be in the ascendant and able to use it to pursue policies that you like.

This is a tricky question to which there is no simple answer, at least one that I can see.

July 06, 2005

"The Bible says…"

One of the things I benefited most from once being an ordained lay preacher was having to study the Bible in a fairly formal way. The Bible is a fascinating book, and studying it in some depth reveals treasures that might be missed by those who just pick out bits here and there.

For example, I discovered that some of the books of the so-called "minor" prophets of the Old Testament (Jonah and Amos were my particular favorites), when taught by scholars, make for great reading and are full of insights into the human condition. The Bible also has passages that astound you with their poetic beauty and precision of thought. Take, for example, this verse from Ecclesiastes (9:11) that addresses the seeming disconnect between ability and reward, and the general randomness of life:

I returned and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favor to men of skill; but time and chance happeneth to them all.

And we are constantly reminded of how indebted we are to two sources (the Bible and Shakespeare) for so many of the phrases that we use in everyday language.

But another benefit of studying the Bible is that I am immediately on the alert when someone says "The Bible says X" in order to support some position. My first response is "Where exactly does it say it?" Quite often, they cannot quote a supporting verse and you realize that they simply think the Bible should say that, because they strongly believe it. It has become part of folklore.

So when someone says "The Bible says X", always ask for supporting evidence.

The second point is that even when such people actually have a quote to back up their assertion, you can often point to other quotes that contradict their position or put it in a different light or context. This is because the Bible says a lot of things. It is an immense book with many authors, written over a long span of time, in more than one language, and from the perspective of many different cultures. There is also the fact that (as some commenters to this blog have pointed out previously) the translation of ancient Hebrew and Greek and other texts into English involves the introduction of some unavoidable ambiguities. The Bible is by no means a clear statement of beliefs and values that can be easily inserted into modern-day political and ideological battles, and it can be claimed to be so only by deliberately cherry-picking bits and pieces to serve an agenda. When, in The Merchant of Venice (act 1, sc. 3), Shakespeare has Antonio say "The devil can cite Scripture for his purpose," he is right. The Bible can be quoted to support a vast range of positions, some of them truly bizarre, so arguing on the basis of Biblical texts, taken literally, is rarely conclusive.

I remember one time some years ago when Jehovah's Witnesses came to my house to sell their magazine and to try and convert me. I am usually friendly to them, since I admire their devotion to their cause and they are invariably polite (a quality that I like), but I try to tell them as gently as possible I am not interested. But one of them tried to pique my interest by pointing to the feature article in that month's magazine, which argued that AIDS was God's punishment on homosexuals. This definitely got my attention as I happen to think that that is one of the sickest ideas ever conceived, and thus got drawn into an argument. They produced the usual Biblical quotes against homosexuality. I argued that one had to interpret the Bible in the context of when it was written and the mores that existed at that time, and that the Bible's message could change with time.

The Witness flatly rejected my contention, saying that no re-interpretation was possible. The Bible's message was universal in scope and unchanging with time. I then mentioned Paul's letter to Philemon, in which he seems to have urged Philemon's runaway slave to accept his position and return to his master. Did that mean, I asked, that slavery was acceptable? The Witness (who was black, which was why I had chosen this particular story) was taken aback and said that we had to interpret that story in a sophisticated way in order to understand its real message. I then asked why we should do that for slavery and not for homosexuality, and of course, there is really no answer to that. In fact, the Bible asserts that God does and condones the most appalling things, actions that are truly monstrous. There is no way to resurrect a belief in a loving God without some serious textual criticism, re-interpretation, and re-evaluation of these passages.

The third thing you often find about people who glibly assert "The Bible says…" is that they rarely quote from Jesus' actual words, which is odd if you call yourself a Christian. For Christians, Christ's teachings are supposed to be the final word, and yet many Biblical fundamentalists seem to prefer to quote the Old Testament, the letters of Paul, or Revelation. Could this be because Jesus preached a far more tolerant message than many who now confidently claim to speak in his name? Jesus was constantly hanging out with those whom we would consider low-lifes, prostitutes and the like, and was not judgmental about them. He was more likely to be critical of those who sat in judgment on others.

For example, the Plain Dealer in its issue of Saturday, July 2, 2005 (page E3) had one of those inane features that prints the responses of anonymous people to some question. (What is the point of such features? To let random people vent their spleen?) The question this time was: "Would you want your religious leader to bless same-sex unions?" One respondent said no because "the Bible says to speak out against sin, and homosexual relations are a sin (1 Corinthians 6:9…I could never understand how one could be considered a Christian and be an unrepenting homosexual." To this person's credit, he/she gave a citation to one of Paul's letters. (Paul is the go-to guy in the New Testament if one is looking for support for intolerant views.) But if you look up the passage, this is what it says in full (in the authoritative [UPDATE: After the comment by Mark, I realize that I have been guilty of sloppy language and should have used the word 'familiar' instead of 'authoritative' since I am not really a competent judge of the latter] King James version): "Know ye not that the unrighteous shall not inherit the kingdom of God? Be not deceived: neither fornicators, nor idolaters, nor adulterers, nor effeminate, nor abusers of themselves with mankind, Nor thieves, nor covetous, nor drunkards, nor revilers, nor extortioners, shall inherit the kingdom of God." So rather than being a particularly outrageous sin, homosexuality is not even mentioned, but being effeminate is said to be evil. In some translations, 'effeminate' is replaced with 'homosexual', but the two words are clearly not equivalent. (The Living Bible, which is a modern (1971), much looser translation with an evangelical tilt, gives the list as: "idol worshipers, adulterers, male prostitutes, homosexuals, thieves, greedy people, drunkards, abusers, and swindlers." Note how "fornicators" has been dropped and how "effeminate" and "abusers of themselves with mankind" have been changed, showing significant distortions in meaning. For this reason, serious Biblical scholars do not recommend its use.)

Whatever one's religious beliefs, one can learn a lot from the Bible. But what you learn may not quite be what you expect.

POST SCRIPT

Steve Perry, the editor of the Minneapolis/St. Paul weekly newspaper City Pages, is, to my mind, one of the shrewdest observers of the domestic national political scene. Last week's Free Times had a cover story by him (Gagging Dr. Dean) that explains why the Democratic Party seems so reluctant to fight for the kinds of policies that its rank and file might want. For those of you who missed the article, you can read it here.

In an earlier essay written in 2002 titled Spank the Donkey, Perry is more cynical and argues that the Democratic Party may be beyond salvaging, so beholden has it become to its big-money contributors.

July 05, 2005

Politics and religion-3

There is no doubt that people's religious beliefs often have political implications. For example, if your religious beliefs require you to live according to certain principles, and the actions resulting from those principles bring you into conflict with the law, then you have an obligation to work to change the laws. Typically this is done by advocating and lobbying for specific legislation or, in the case of civil disobedience campaigns, by defying the law and taking the consequences in order to show the unjustness of the laws and thus sway public opinion. The latter strategy was used with great effect by Mahatma Gandhi and Martin Luther King. While Gandhi was secular, King was overtly religious and made no secret of the fact that he was driven at least partly by his religious convictions.

The key point is that although King was a deeply religious person, the policies he was advocating (equal rights for all people) were secular. He was not claiming any special rights for his religion. He was not even claiming that his demands should be met simply because they were based on the Bible. He was appealing to universal principles of justice and where religion came in was in the strength he drew from his faith to carry on a personally very difficult struggle.

And this is where people like King differ from the current evangelical movement in the US and the militant Buddhist movement in Sri Lanka. The latter groups assert that their particular religious beliefs carry a special weight and should form the basis of policies. In the US, this takes the form of saying that Biblical teachings should form the basis of policy. In Sri Lanka, it takes the form of saying that Buddhism should be the primary religion in the country.

There has been some media buzz recently about an op-ed in the June 16, 2005 issue of the New York Times, written by John Danforth, Republican Senator from Missouri 1976-1994 and briefly US ambassador to the UN in 2004. He is also an ordained minister in the Episcopal Church. Reader Katie kindly alerted me to Danforth's essay, in which he makes some interesting points, mainly that the public face of Christianity currently presented in the media ignores a huge body of believers whom he calls "moderates." He calls upon those "moderates" to reclaim their place in the public debate.

It is important for those of us who are sometimes called moderates to make the case that we, too, have strongly held Christian convictions, that we speak from the depths of our beliefs, and that our approach to politics is at least as faithful as that of those who are more conservative. Our difference concerns the extent to which government should, or even can, translate religious beliefs into the laws of the state.
People of faith have the right, and perhaps the obligation, to bring their values to bear in politics. Many conservative Christians approach politics with a certainty that they know God's truth, and that they can advance the kingdom of God through governmental action. So they have developed a political agenda that they believe advances God's kingdom, one that includes efforts to "put God back" into the public square and to pass a constitutional amendment intended to protect marriage from the perceived threat of homosexuality.
Moderate Christians are less certain about when and how our beliefs can be translated into statutory form, not because of a lack of faith in God but because of a healthy acknowledgement of the limitations of human beings.
To assert that I am on God's side and you are not, that I know God's will and you do not, and that I will use the power of government to advance my understanding of God's kingdom is certain to produce hostility.
By contrast, moderate Christians see ourselves, literally, as moderators. Far from claiming to possess God's truth, we claim only to be imperfect seekers of the truth. We reject the notion that religion should present a series of wedge issues useful at election time for energizing a political base. We believe it is God's work to practice humility, to wear tolerance on our sleeves, to reach out to those with whom we disagree, and to overcome the meanness we see in today's politics.

This was the kind of Christianity that I admired and still admire. The religious tolerance Danforth speaks about reflects the standard views of the priests and clergymen who taught me when I was young. We were told to be humble, that we did not, and could not, know the will of God for certain. All that we could do was guess at God's intentions using the Bible as a guide, treat other people as well as we could, and hope that if we messed up in some way, God would treat us mercifully.

I always viewed Danforth as a political hack, a dutiful party apparatchik who never seemed to take any noticeably courageous stand. I still view him that way. What is remarkable is that his essay, which represented safe, mainstream views back in the 1980's and earlier, is seen as so unusual now.

July 01, 2005

Politics and religion-2

As I said before, significant Buddhist religious involvement in Sri Lankan politics began with the stunning 1956 landslide parliamentary victory of an underdog candidate who ran on a platform that shrewdly mixed nationalist politics with an appeal to the ethnic-religious Sinhala-Buddhist population, promising that they would receive favorable treatment under his government.

While this resulted in a short-term benefit for the new Prime Minister and his government, they found it hard to meet the raised expectations of their aroused base, and pretty soon things started falling apart. The most serious failure was the government's inability to implement a deal to meet the needs of the minority Tamil population, because of opposition from their more extreme Sinhala-Buddhist supporters, who argued then (and have done so ever since) that almost any concession to Tamil interests was a sell-out of the nation's Sinhala-Buddhist heritage. This was followed in 1958 by a pogrom aimed at Tamils that resulted in many deaths, injuries, and displacements, and in 1959 the Prime Minister himself was assassinated by a Buddhist monk in a plot led by some Buddhist clergy, people who had once been his supporters.

But despite this seriously negative outcome, the die had been cast as far as political appeals to ethnic-religious chauvinism were concerned. Other politicians noted how successful such appeals had been in garnering votes, and soon members of almost all political parties were falling over themselves to pander to the majority religion. Politicians who had not been known for their religious devotion 'got religion' in a big way.

This pandering took the form of public piety, making sure that everyone was aware of how religiously observant they were. They would make public shows of going to Buddhist temples, paying courtesy calls on the major Buddhist clerics, incorporating religious themes into speeches, etc. (Does this seem familiar in the US context?) Even some of the members of Marxist parties started doing these things, such was the pressure to conform to this new standard.

Governments started public funding of temples and the clergy, going so far as to provide temples with Mercedes-Benz limousines to transport the clerics. The irony is that Buddhism itself is a religion in the ascetic tradition, with the Buddha himself (the former prince Gautama) rejecting all worldly goods and attachments, seeing such things as barriers to attaining enlightenment and nirvana.

Perhaps the best example of the extent to which this kind of religious pandering led to absurd policies came in the way the calendar was changed. (You are going to find the following story hard to believe, but it is true. I lived through this.) The Buddhist calendar is based on the lunar cycle. The full moon has always had religious significance for Buddhists because it is believed that the Buddha was born, attained enlightenment, and died on a full moon day. So one government, in its desire to pander to religious sentiment, decided that a weekly calendar with its weekend on Saturday and Sunday was too Christian-centered, and that what was needed was a Buddhist-centered calendar built around the lunar cycle. The full-moon, new-moon, and two half-moon (first and last quarter) days were made holidays (called 'poya' days), as were the days just preceding them (the 'pre-poya' days). The pre-poya and poya days thus became the new weekend, replacing Saturday and Sunday.

Since these days need not coincide with Saturday and Sunday, a new system had to be devised to keep track of weekdays. The weekdays were called P1, P2, P3, P4, and P5, standing for 'first day after poya', 'second day after poya', and so on. The catch is that since the lunar cycle is around 29½ days, every fourth week or so (there was no definite pattern) the week acquired an extra workday, called P6. Keeping track of these things and scheduling future events became a nightmare. Every time a week with the extra day kicked in, the authorities had to decide which of the five weekday schedules to follow on the extra day.
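To see why the extra day had to appear, and why it appeared irregularly, consider a toy model (my own illustration, not anything the authorities published): if poya holidays fall at the four lunar quarter phases, they are spaced at the mean interval of 29.53/4, or about 7.38 days, so the gap between successive poya days is sometimes 7 days (leaving five P-days) and sometimes 8 (leaving six). A minimal Python sketch of this, assuming evenly spaced phases even though the real poya dates were fixed astronomically:

# Toy model, not historical data: assumes poya holidays fall at the four
# lunar quarter phases, spaced at the mean interval of 29.53 / 4 days.
# Real poya dates were set astronomically, so actual gaps varied; this
# only illustrates why P6 weeks had to occur, and occur irregularly.

MEAN_LUNAR_MONTH = 29.53          # mean synodic month, in days
QUARTER = MEAN_LUNAR_MONTH / 4    # mean spacing between poya days

def workdays_per_week(n_weeks=12):
    """Count the P-days between each poya day and the next pre-poya day.

    Each 'week' runs: poya, P1..P5 (sometimes P6), pre-poya, poya.
    A 7-day gap between poya days leaves 5 workdays; an 8-day gap
    leaves 6, producing the dreaded extra P6 day.
    """
    poya = [round(k * QUARTER) for k in range(n_weeks + 1)]
    return [nxt - cur - 2 for cur, nxt in zip(poya, poya[1:])]

for week, count in enumerate(workdays_per_week(), start=1):
    labels = " ".join(f"P{day}" for day in range(1, count + 1))
    print(f"week {week:2d}: {labels}")

Running it prints weeks of five and six P-days mixed together with no tidy pattern, which is exactly the scheduling headache described above.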

It also made interactions with the rest of the world problematic, because the periodic occurrence of the extra-long week meant that the poya days did not have a fixed relationship to the standard days of the week. Since the rest of the world worked on the standard week, people outside Sri Lanka never knew when we were off on our weekends, disrupting international trade.

This was the system in place when I was in middle and high school, and it was confusing for everyone, to put it mildly. It is surprising that it lasted as long as it did (many years), but it finally collapsed because everyone just got sick of it, and Sri Lanka reverted to the standard system without any apparent religious objections. As a sop to the religious wing, the full moon day of every month was retained as a religious holiday, so that Sri Lanka now probably has more public holidays than any other nation.

The point of this story is that once political parties start competing for religious support, there seems to be no end to the ridiculous things that can ensue. The messing around with the weekly calendar was confusing and absurd but relatively benign. More serious is when such actions lead one group to feel that only its religious sentiments matter in the forming of public policy.

In the US, there are already signs of increased public piety among elected officials. They talk about their religion, and their visits to churches are publicized. Religious spokespersons are invited to the White House. "Prayer breakfasts" are held routinely by elected officials. We have official "days of prayer."

It also seems to have become routine for Presidents and other politicians to end their major speeches with the phrase "God Bless America." This is relatively new. When President Kennedy spoke to the nation on the eve of the Cuban Missile Crisis, perhaps the closest the world has come to all-out nuclear war, he ended his speech with a simple "Thank you and good night." This was the same ending used by President Nixon when he spoke to the nation in 1972 about his plans for the war in Vietnam. Although Presidents up to and including the overtly religious Carter occasionally inserted references to God into their speeches, such references were reserved for special rhetorical flourishes, and the phrase did not become a standard ending tagline for speeches until President Reagan. It was his successor George H. W. Bush (the current president's father) who really went over the top, ending his speeches with almost pep-rally-like appeals for God's blessings. (See the article by Jonathan Rauch in the National Journal for a review of God's appearances in presidential speeches. Rauch also makes the astounding claim that seven states even prohibit atheists from holding public office! His article was written in 1999.)

When politicians feel the need for public statements of piety, I think we are going down a dangerous road. There is a gripping ten-minute video clip from the TV program The West Wing that captures this issue very well. The clip is must-see TV. In it, the senator portrayed by Alan Alda, under pressure to make a show of his religion, makes this comment to the press: "If you demand expressions of religious faith from politicians, you are just begging to be lied to…And it will be one of the easiest lies to ever have to tell to get your votes." (To see the video, just click on the still of Alan Alda. You need QuickTime to play it, and that is a free download if you don't already have it.)

I have always believed that the secular state is the most just state. It also fits (I think) with John Rawls's 'justice as fairness' model for society. Many people think that 'secular' means atheist, but that is wrong. A secular state means that laws must be neutral with respect to all religions, and to the absence of religion. The government can neither promote any religion nor deny people the right to practice the religion of their choice. The religion clauses of the First Amendment to the US Constitution pretty much say it best: "Congress shall make no law respecting an establishment of religion, or prohibiting the free exercise thereof."

It would be a pity to undermine such a good idea.