
Entries for May 2006

May 31, 2006

What is science?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

Because of my interest in the history and philosophy of science I am sometimes called upon to answer the question "what is science?" Most people think that the answer should be fairly straightforward. This is because science is such an integral part of our lives that everyone feels that they intuitively know what it is and think that the problem of defining science is purely one of finding the right combination of words that captures their intuitive sense.

But as I said in my previous posting, strictly defining things means having demarcation criteria, which involves developing a set of necessary and sufficient conditions, and this is extremely hard to do even for seemingly simple things like (say) defining what a dog is. So it should not be surprising that it is even harder to do for an abstract idea like science.

But just as a small child is able, based on its experience with pets, to distinguish between a dog and a cat without any need for formal demarcation criteria, so can scientists intuitively sense what is science and what is not science, based on the practice of their profession, without any need for a formal definition. So scientists do not, in the normal course of their work, pay much attention to whether they have a formal definition of science or not. If forced to define science (say for the purpose of writing textbooks) they tend to make up some kind of definition that sort of fits with their experience, but such ad-hoc formulations lack the kind of formal rigor that is strictly required of a philosophically sound demarcation criterion.

The absence of an agreed-upon formal definition of science has not hindered science from progressing rapidly and efficiently. Science marches on, blithely unconcerned about its lack of self-definition. People start worrying about definitions of science mainly in the context of political battles, such as those involving so-called intelligent design creationism (or IDC), because advocates of IDC have been using this lack of a formal definition to try to define science in such a way that their pet idea be included as science, and thus taught in schools as part of the science curriculum and as an alternative to evolution.

Having a clear-cut demarcation criterion that defines science and is accepted by all would settle this question once and for all. But finding this demarcation criterion for science has proven to be remarkably difficult.

To set about trying to find such criteria, we do what we usually do in such cases: we look at all the knowledge that is commonly accepted as science by everyone and see whether we can identify similarities among these areas. For example, I think everyone would agree that the subjects that come under the headings of astronomy, geology, physics, chemistry, and biology, and which are studied by university departments in reputable universities, all come under the heading of science. So any definition of science that excluded any of these areas would be clearly inadequate, just as any definition of 'dog' that excluded a commonly accepted breed would be dismissed as inadequate.

This is the kind of thing we do when trying to define other things, like art (say). Any definition of art that excluded (say) paintings hanging in reputable museums would be considered an inadequate definition.

When we look back at the history of the topics studied by people in those named disciplines and which are commonly accepted as science, two characteristics stand out. The first thing that we realize is that for a theory to be considered scientific it does not have to be true. Newtonian physics is commonly accepted to be scientific, although it is not considered to be universally true anymore. The phlogiston theory of combustion is considered to be scientific though it has long since been overthrown by the oxygen theory. And so on. In fact, since all knowledge is considered to be fallible and liable to change, truth is, in some sense, irrelevant to the question of whether something is scientific or not, because absolute truth cannot be established.

(A caveat: Not all scientists will agree with me on this last point. Some scientists feel that once a theory is shown to be incorrect, it ceases to be part of science, although it remains a part of science history. Some physicists also feel that many of the current theories of (say) sub-atomic particles are unlikely to be ever overthrown and are thus true in some absolute sense. I am not convinced of this. The history of science teaches us that even theories that were considered rock-solid and lasted millennia (such as the geocentric universe) eventually were overthrown.)

But there is a clear pattern that emerges about scientific theories. All the theories that are considered to be science are (1) naturalistic and (2) predictive.

By naturalistic I mean methodological naturalism and not philosophical naturalism. The latter, I argued in an earlier posting where these terms were defined, is irrelevant to science.

By predictive, I mean that all theories that are considered part of science have some explicit mechanism or structure that enables their users to make predictions: to say what one should see if one did some experiment or looked in some place under certain conditions.

Note that these two conditions are just necessary conditions and by themselves are not sufficient. (See the previous posting for what those terms mean.) As such, they can only classify things into "may be science" (if something meets both conditions) or "not science" (if it fails to meet either condition). So these two conditions together do not make up a satisfactory demarcation criterion. For example, the theory that if a football quarterback throws a lot of interceptions his team is likely to lose meets both the naturalistic and predictive conditions, but it is not considered part of science.

But even though we do not have a rigorous demarcation criterion for science, the existence of just necessary conditions still has interesting implications, which we shall explore in later postings.

May 30, 2006

What do creationist/ID advocates want-III?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

It is time to tackle head-on the notion of what is meant by the 'materialism' that the creationist/ID camp find so distasteful. (See part I and part II for the background.)

The word materialism is used synonymously with 'naturalism' and perhaps the clearest formulation of what it means can be found in the writings of paleontologist George Gaylord Simpson who said in Tempo and Mode in Evolution (p. 76.):

"The progress of knowledge rigidly requires that no non-physical postulate ever be admitted in connection with the study of physical phenomena. We do not know what is and what is not explicable in physical terms, and the researcher who is seeking explanations must seek physical explanations only." (Emphasis added)

Simpson was by no means an atheist (as far as I can tell) but he is saying something that all scientists take for granted, that when you seek a scientific explanation for something, you look for something that has natural causes, and you do not countenance the miraculous or the inscrutable. This process is properly called 'methodological naturalism', to be contrasted with 'philosophical naturalism.'

Despite the polysyllabic terminology, the ideas are easy to understand. For example, if you hear a strange noise in the next room, you might wonder if it is a radiator or the wind or a mouse or an intruder and you investigate each possible cause, looking for evidence. For each question that you pose, the answer is sought in natural causes. You would be unlikely to say "The noise in the next room is caused by God knocking over stuff." In general, people don't invoke God to explain the everyday phenomena of our lives, even though they might be quite religious.

Methodological naturalism is just that same idea. Scientists look for natural explanations to the phenomena they encounter because that is the way science works. Such an approach allows you to systematically investigate open questions and not shut off avenues of research. Any scientist who said that an experimental result was due to God intervening in the lab would be looked at askance, because that scientist would be violating one of the fundamental rules of operation. There is no question in science that is closed to further investigation of deeper natural causes.

Non-scientists sometimes do not understand how hard and frustrating much of scientific research is. People work for years and even decades banging their heads against brick walls, trying to solve some tough problem. What keeps them going? What makes them persevere? It is the practice of methodological naturalism, the belief that a discoverable explanation must exist and that it is only their ingenuity and skill that is preventing them from finding the solution. Unsolved problems are seen as challenges to the skills of the individual scientist and the scientific community, not as manifestations of God's workings.

This is what, for example, causes medical researchers to work for years to find causes (and thus possibly cures) for rare and obscure diseases. Part of the reason is the desire to be helpful, part of it is due to personal ambition and career advancement, but an important part is also the belief that a solution exists that lies within their grasp.

It is because of this willingness to persevere in the face of enormous difficulty that science has been able to make the breakthroughs it has. If, at the early signs of difficulty in solving a problem scientists threw up their hands and said "Well, looks like God is behind this one. Let's give up and move on to something else" then the great discoveries of science that we associate with Newton, Darwin, Einstein, Planck, Heisenberg, etc. would never have occurred.

For example, the motion of the perigee of the moon was a well-known unsolved problem for over sixty years after the introduction of Newtonian physics. It constituted a serious problem that resisted solution for a longer time than the problems in evolution pointed to by creationist/ID advocates. Yet no supernatural explanation was invoked, eventually the problem was solved, and the result was seen as a triumph for Newtonian theory.

So when creationist/ID advocates advocate the abandonment of methodological naturalism, they are not trying to ease just Darwin out of the picture. They are throwing out the operational basis of the entire scientific enterprise.

Philosophical naturalism, as contrasted with methodological naturalism, is the belief that the natural world is all there is, that there is nothing more. Some scientists undoubtedly choose to be philosophical naturalists (and thus atheists) because they see no need to have God in their philosophical framework, but as I said in an earlier posting, others reject that option and stay religious. But this is purely a personal choice made by individual scientists and it has no impact on how they do science, which only involves using methodological naturalism. There is no requirement in science that one must be a philosophical naturalist, and as I alluded to earlier, Gaylord Simpson was not a philosophical naturalist although he was a methodological naturalist.

The question of philosophical naturalism is, frankly, irrelevant to working scientists. Scientists don't really care if their colleagues are religious or not. I have been around scientists all my life. But apart from my close friends, I have no idea what their religious beliefs are, and even then I have only a vague idea of what they actually believe. I know that some are religious and others are not. It just does not matter to us. Whether a scientist is a philosophical naturalist or not does not affect how his or her work is received by the community.

But what the creationist/ID advocates want, according to their stated goal of "If things are to improve, materialism needs to be defeated and God has to be accepted as the creator of nature and human beings" is to enforce the requirement that scientists reject both philosophical and methodological naturalism. They are essentially forcing two things on everyone:

  • Requiring people to adopt the creationist/ID religious worldview as their own.
  • Requiring scientists to reject methodological naturalism as a rule of operation for science.

In other words, creationist/ID advocates are not asking us to reject only Darwin or to turn the clock back to the time just prior to Darwin; they want us to go all the way back to before Copernicus and reject the very methods that have enabled science to be so successful. They want us to go back to a time of rampant and unchecked superstition.

This is probably not a good idea…

May 29, 2006

What do creationist/ID advocates want-II?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

We saw in an earlier posting that a key idea of the creationists is that it was the arrival of Darwin, Marx, and Freud that led to the undermining of Western civilization.

The basis for this extraordinary charge is the claim that it was these three that ushered in the age of materialism. These three people make convenient targets because, although they were all serious scientific and social scholars, they have all been successfully tarred as purveyors of ideas that have been portrayed as unpleasant or even evil (Darwin for saying that we share a common ancestor with apes, Marx with communism, Freud with sexuality).

But if you want to blame materialism for society's ills, you have to go farther back than that, at least as far as Copernicus, and possibly earlier. For example, as stated by Thomas S. Kuhn in his book The Copernican Revolution (p. 2):

"[Copernicus'] planetary theory and his associated conception of a sun-centered universe were instrumental in the transition from medieval to modern Western society, because they seemed to affect man's relation to the universe and God…Men who believed that their terrestrial home was only a planet circulating blindly about one of an infinity of stars evaluated their place in the cosmic scheme quite differently than had their predecessors who saw the earth as the unique and focal center of God's creation. The Copernican Revolution was therefore also part of the transition in Western man's sense of values."

Copernicus was central to the development of Western civilization, as were Galileo, Kepler, and Newton after him. All of them sought to explain how the world works in materialistic ways. So if you want to pin the blame for society's ills on those who were influential in promoting materialistic ways of understanding the world, then you cannot pin the blame on Johnny-come-latelies like Darwin, Marx, and Freud.

But creationist/ID advocates do not go after these earlier giants of scientific materialism who justifiably occupy honored places in our history. To do so would be to be immediately labeled as crackpots, on a par with flat-Earthers, UFO believers, and spoon benders. So they try to peel Darwin, Marx, and Freud away from this distinguished line of scientists and treat them as if they started a parallel line of dubious thought, distinct from that of mainstream science.

But that argument just does not make sense. One may argue whether Marxism or Freudian psychoanalysis is scientific, but there is no controversy at all within the scientific community as to whether Darwin's ideas belong firmly in the scientific tradition. Darwin rightly takes his place among the giants of science and drew his materialist inspiration from the scientists who came before him.

The fact that all these scientists sought to explain the world in materialistic ways does not mean that they did not believe in God. For example, it is well known that Newton did believe in a God. He believed that the working of the solar system had a beauty that indicated the existence of God. But that did not stop him from pursuing the laws of motion and gravity that provided a completely material explanation for planetary motions. The residual features that his theories did not explain (such as the stability of the system) and which he ascribed to God were explained later by materialistic means using his own laws, after his death.

The same is true now. What creationist/ID advocates don't seem to grasp is that pursuing materialistic explanations for phenomena does not pose a problem for scientists who are also religious. Surveys conducted in 1996 and 1998 found that about 40% of scientists believe in a personal God as defined by the statement "a God in intellectual and affirmative communication with man … to whom one may pray in expectation of receiving an answer." Despite the explosive growth in science this century, this figure of 40% has remained stable since previous surveys done in 1914 and 1933. (Source: Edward J. Larson and Larry Witham, Scientists are still keeping the faith, Nature, vol. 386, April 1997, page 435.) The figure would undoubtedly be much higher if belief in a non-personal God (some sort of prime mover who acted only through natural laws) were included as well.

So why is it that scientists who are also religious have no trouble with materialism? Stay tuned…

May 26, 2006

What do ID advocates want?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In an earlier posting, I spoke about how those who view Darwin's ideas as evil see it as the source of the alleged decline in morality. But on the surface, so-called 'intelligent design' (or ID) seems to accept much of evolutionary ideas, reserving the actions of a 'designer' for just a very few (five, actually) instances of alleged 'irreducible complexity' that occur at the microbiological level.

This hardly seems like a major attack on Darwin since, on the surface, it seems to leave unchallenged almost all of the major ideas of the Darwinian structure such as the non-constancy of species (the basic theory of evolution), the descent of all organisms from common ancestors (branching evolution), the gradualness of evolution (no discontinuities), the multiplication of species, and natural selection.

So where does ID fit into this attack on evolution? Its role is explicitly outlined in the document that has been labeled the 'Wedge Strategy' or the 'Wedge Document' put out in 1999 by the Center for the Renewal of Science & Culture (now called the Center for Science and Culture) of the Seattle-based Discovery Institute, which is the well-funded 'think-tank' that funds and supports the work of creationists.

In the document it becomes clear that intelligent design is seen as a kind of shock troop that establishes a beachhead on the field of science, before the rest of the creationist army comes in behind and occupies the entire landscape.

Here is an extended passage from the introduction of the document that outlines the issues as seen by them:

The proposition that human beings are created in the image of God is one of the bedrock principles on which Western civilization was built. Its influence can be detected in most, if not all, of the West's greatest achievements, including representative democracy, human rights, free enterprise, and progress in the arts and sciences.
Yet a little over a century ago, this cardinal idea came under wholesale attack by intellectuals drawing on the discoveries of modern science. Debunking the traditional conceptions of both God and man, thinkers such as Charles Darwin, Karl Marx, and Sigmund Freud portrayed humans not as moral and spiritual beings, but as animals or machines who inhabited a universe ruled by purely impersonal forces and whose behavior and very thoughts were dictated by the unbending forces of biology, chemistry, and environment. This materialistic conception of reality eventually infected virtually every area of our culture, from politics and economics to literature and art.
The cultural consequences of this triumph of materialism were devastating. Materialists denied the existence of objective moral standards, claiming that environment dictates our behavior and beliefs. Such moral relativism was uncritically adopted by much of the social sciences, and it still undergirds much of modern economics, political science, psychology and sociology.
Materialists also undermined personal responsibility by asserting that human thoughts and behaviors are dictated by our biology and environment. The results can be seen in modern approaches to criminal justice, product liability, and welfare. In the materialist scheme of things, everyone is a victim and no one can be held accountable for his or her actions.
Finally, materialism spawned a virulent strain of utopianism. Thinking they could engineer the perfect society through the application of scientific knowledge, materialist reformers advocated coercive government programs that falsely promised to create heaven on earth.
Discovery Institute's Center for the Renewal of Science and Culture seeks nothing less than the overthrow of materialism and its cultural legacies.

A little later in the document one comes across the "Governing Goals" of the movement, which are:

  • To defeat scientific materialism and its destructive moral, cultural and political legacies.
  • To replace materialistic explanations with the theistic understanding that nature and human beings are created by God.

So there you have it. In a nutshell, the argument is:

  1. The greatest achievements of Western civilization are largely due to the idea that human beings were created in God's image.
  2. Things were just peachy until a little over one hundred years ago.
  3. Then Darwin, Marx, and Freud dethroned this idea and instead introduced materialist ideas that spread into all areas of science and culture.
  4. Everything pretty much fell apart after that.
  5. If things are to improve, materialism needs to be defeated and God has to be accepted as the creator of nature and human beings.

This is a pretty sweeping line of reasoning. Such broad-brush analyses of society are inherently suspect since the way societies function and form is highly complex and claiming all the good for one belief structure and all the bad for the opposing side is to oversimplify on a massive scale.

I discussed in the previous posting some of the problems with this kind of reasoning.

But what is clear is that the ultimate goal of this movement is to eliminate 'scientific materialism' and bring back God into all areas of life. Getting ID into the science curriculum is just the first step, hence the name 'wedge' strategy.

This is the first of a series on this topic. I will look more closely into what 'scientific materialism' is and the implications of this strategy in future postings.

May 25, 2006

Evolutionary theory and falsificationism

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In response to a previous posting, commenter Sarah Taylor made several important points. She clearly articulated the view that evolutionary theory is a complex edifice that is built on many observations that fit into a general pattern that is largely chronologically consistent.

She also notes that one distinguishing feature of science is that there are no questions that it shirks from, that there are no beliefs that it is not willing to put to the test. She says that "What makes scientific theories different from other human proposals about the nature of the universe are their courage. They proclaim their vulnerabilities as their strengths, inviting attack."

I would mostly agree with this. Science does not shy away from probing its weaknesses, although I would not go so far as to claim that the vulnerabilities are seen as strengths. What is true is that the 'weaknesses' of theories are not ignored or covered up but are seen as opportunities for further research. Since there is no such thing in science as infallible knowledge, there is no inherent desire to preserve any theory at all costs, and the history of science is full of once dominant theories that are no longer considered credible.

But having said all that, it is not necessarily true that finding just one contradiction with a theory is sufficient to overthrow the theory. In the context of the challenge to Darwinian theory by intelligent design (ID) advocates, Sarah's statement that "All that any ID devotee has to do is to show ONE fossil 'out of place', to prove the theory doesn't work. Just one horse shoulder blade in a Cambrian deposit somewhere in the world, and we can say goodbye to Darwin" is a little too strong.

Sarah's view seems to be derived from the model of falsificationism developed by the philosopher of science Karl Popper (see his book Conjectures and Refutations: The Growth of Scientific Knowledge, 1963) who was trying to explain how science progresses. After showing that trying to prove theories to be true was not possible, Popper argued that what scientists should instead do is try to prove theories false by finding a single counter-instance to the theory's predictions. If that happens, the theory is falsified and has to be rejected and replaced by a better one. Hence the only status of a scientific theory is either 'false' or 'not yet shown to be false.'

But historians of science have shown that this model, although appealing to our sense of bravado, does not describe how science actually works. Scientists are loath to throw away perfectly productive theories on the basis of a few anomalies. If they did so, then no non-trivial theory would survive. For example, the motion of the perigee of the moon's orbit disagreed with Newton's theory for nearly sixty years. Similarly the stability of the planetary orbits was an unsolved problem for nearly 200 years.

Good theories are hard to come by and we cannot afford to throw them away at the first signs of a problem. This is why scientists are quite agreeable to treating such seeming counter-instances as research problems to be worked on, rather than as falsifying events. As Barry Barnes says in his T.S. Kuhn and Social Science (1982):
"In agreeing upon a paradigm scientists do not accept a finished product: rather they agree to accept a basis for future work, and to treat as illusory or eliminable all its apparent inadequacies and defects."

Dethroning a useful theory requires an accumulation of evidence and problems, and the simultaneous existence of a viable alternative. It is like a box spring mattress. One broken spring is not sufficient to make the mattress useless, since the other springs can make up for it and retain the mattress's functionality. It takes several broken springs to make the mattress a candidate for replacement. And you only throw out the old mattress if you have a better one to replace it with, because having no mattress at all is even worse. The more powerful and venerable the theory, the more breakdowns that must occur to make scientists skeptical of its value and open to having another theory replace it.

After a theory is dethroned due to a confluence of many events, later historians might point to a single event as starting the decline or providing the tipping point that convinced scientists to abandon the theory. But this is something that happens long after the fact, and is largely a rewriting of history.

So I do not think that finding one fossil out of place will dethrone Darwin. And ID does not meet the necessary criteria for being a viable alternative anyway, since it appeals to an unavoidable inscrutability as a factor in its explanatory structure, and that is an immediate disqualification for any scientific theory.

May 24, 2006

Where was God during the tsunami?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

Last Thursday I moderated a panel discussion (sponsored by the Hindu Students Association and the Religion Department at Case) on the topic of theodicy (theories to justify the ways of God to people, aka "why bad things happen to good people") in light of the devastation wreaked by the tsunami, which killed an estimated quarter million people.

The panel comprised six scholars representing Judaism, Islam, Jainism, Christianity, Hinduism, and Buddhism and the discussion was thoughtful with a good sharing of ideas and concerns.

As the lay moderator not affiliated with any religious tradition, I opened by saying that it seemed to me that events like the tsunami posed a difficult problem for believers in a God because none of the three immediate explanations that come to mind about the role of God are very satisfying. The explanations are:

1. It was an act of commission. In other words, everything that happens is God's will including the tsunami. This implies that God caused it to happen and hence can be viewed as cruel.
2. It was an act of omission. God did not cause the tsunami but did nothing to save people from its effects. This implies that God does not care about suffering.
3. It is a sign of impotence. God does care but is incapable of preventing such events. This implies that God is not all-powerful.

These questions can well be asked even for an isolated tragic event like the death of a child. But in those cases, it is only the immediate relatives and friends of the bereaved who ask such things. The tsunami caused even those not directly affected to be deeply troubled and it is interesting to ask why this is so.

Some possible reasons for this widespread questioning of religion are that the tsunami had a very rare combination of four features:

1. It was a purely natural calamity with no blame attached to humans. Other 'natural' disasters such as droughts and famines can sometimes be linked indirectly to human actions and blame shifted from God.
2. The massive scale of death and suffering.
3. The rapidity of the events, with a large number of deaths occurring on a very short time-scale.
4. The innocence of so many victims, evidenced by the fact that a staggering one-third of the deaths were of children.

Of course, although rare, such combinations of factors have occurred in the past and all the major religions are old enough to have experienced such events before and grappled with the theological implications. It was interesting to see the different ways in which the four theistic religions (Judaism, Hinduism, Christianity, and Islam) and the two non-theistic religions (Buddhism and Jainism) responded. But whatever the religion, it was clear that something has to give somewhere in the image of an all-knowing, all-powerful, benevolent God, whose actions we can comprehend.

As one panelist pointed out, the last feature (of the ability to comprehend the meaning of such events) is dealt with in all religions with an MWC ("mysterious ways clause") that can be invoked to say that the actions of God are inscrutable and that we simply have to accept the fact that a good explanation exists, though we may not know it.

Each panelist also pointed out that each religious tradition is in actuality an umbrella of many strands and that there is no single unified response that can be given for such an event. Many of the explanations given by each tradition were shared by the others as well. In some ways, this diversity of explanations within each tradition is necessary because it is what enables them to hold on to a diverse base of adherents, each of whom will have a personal explanation that they favor and who will look to their religion for approval of that particular belief.

The possible explanations range over the following: that things like the tsunami are God's punishment for either individual or collective iniquity; that they are sent to test the faith of believers (as in the Biblical story of Job); that God created natural laws and lets those laws work their way without interference; that God is "playing" with the world to remind us that this life is transitory and not important; that the tsunami was sent as a sign that the "end times" (when the apocalypse arrives) are near and hence should actually be seen as a joyous event; that it was a sign and reminder of God's power and meant to inspire devotion; it was to remind us that all things are an illusion and that the events did not "really" happen.

All of these explanations posit a higher purpose for the tsunami, and some definitely relinquish the notion of God's benevolence.

The non-theistic religions have as their explanatory core for events the notion of karma. Karma is often loosely thought of as fate but the speakers pointed out that karma means action and carries the implication that we are responsible for our actions and that our actions create consequences. Thus there is the belief in the existence of cause-and-effect laws but there is no requirement for the existence of a law-giver (or God). The karma itself is the cause of events like the tsunami and we do not need an external cause or agent to explain it. The MWC is invoked even in this case to say that there is no reason to think that the ways the karmic laws work are knowable by humans.

The non-theistic karma traditions do not believe in the existence of evil or an evil one. But there is a concept of moral law or justice ("dharma") and the absence of justice ("adharma"), and events like the tsunami may be an indication that totality of dharma in the world is declining. These traditions also posit that the universe is impermanent and that the real problem is our ignorance of its nature and of our transitory role in it.

The problem for the karma-based religions with things like the tsunami is understanding how the karma of so many diverse individuals could coincide so that they all perished in the same way within the space of minutes. But again, the MWC can be invoked to say that there is no requirement that we should be able to understand how the karmic laws work.

(One question that struck me during the discussion was that in Hinduism, a belief in God coexists with a belief in karma and I was not sure how that could work. After all, if God can intervene in the world, then can the karmic laws be overridden? Perhaps someone who knows more about this can enlighten me.)

Are any of these explanations satisfying? Or do events like the tsunami seriously undermine people's beliefs in religion? That is something that each person has to decide for himself or herself.

May 23, 2006

Creationism and moral decay

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In the previous posting, I said that the reason that there is such hostility to the teaching of evolutionary theory by ID advocates and young-Earth creationists is that they feel it implies a lack of special status for human beings, which leads to atheism, which in turn has led to the current state of moral decay in the US from a more wholesome past. They feel that eliminating the teaching of evolution is the first step on the road to moral redemption.

There are many flaws in this line of reasoning but for the moment I want to look at one feature and pose the question as to why such people think that the moral state of America is in worse shape now than it was in the past.

It becomes clear that the reason is that the word 'morality' is being used almost exclusively in relation to sex and nudity. Those who see us as currently living in a moral swamp use sex and nudity as the yardsticks for measurement.

Even taking this narrow view of morality, it is not clear that America is any less moral now than it was, say, fifty or more years ago. On the one hand, there is clearly a lot of public discussion now of sex-related issues and more nudity and sex in films and on television. But all that this might indicate is that things that were done and spoken in private in the past are now more in the open. In other words, we don't have more sex. We simply have less secrecy and hypocrisy.

It is not that public piety and hypocrisy about sex and nudity has disappeared, as can be seen by the ridiculous flap over the Janet Jackson Super Bowl incident, which was portrayed as if it had irreparably damaged the nation's psyche. In fact, America is a curious mass of contradictions when it comes to sex and nudity, publicly deploring it while relishing titillating stories in the media.

But it is hard for me to accept that we are in a worse state of morals than we were in the past when that word is used in a more meaningful and broader sense.

For example, it was only fifty years ago or less that civil rights legislation was enacted giving blacks the legal rights that white people had. Lynchings, beatings, fire hosing of peaceful marchers, Jim Crow laws, open discrimination in all areas of life, are all in the living memory of people. Was that a more moral time to live in?

Similarly, the status of women just one hundred years ago was no picnic either. Women had no vote, few career choices, and little hope for advancement or being taken seriously in the scientific, business, and professional worlds. They were seen as primarily homemakers and mothers and little else. Was that a more moral time to live in?

And one has to only go back to about two hundred years to get to the era of slavery and genocide against Native Americans. Was that a more moral time to live in?

While equality has still not been attained, it is only those who are looking at the past with blinkers who could see golden ages then and wish to return to them.

I think that there is a strong case to be made that in some ways morality has increased over time so that even if one were inclined to make this kind of correlation between morals in a broad sense and the passage of time since the publication of Darwin's On the Origin of Species in 1859, one would have to conclude that morals have actually improved with the advent of evolutionary theory.

May 22, 2006

Natural selection and moral decay

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In a previous posting, I discussed why some religious people found evolutionary theory so upsetting. It was because natural selection implies that human beings were not destined or chosen to be what they are.

While I can understand why this is upsetting to religious fundamentalists who believe they were created specially in God's image and are thus part of a grand cosmic plan, there is still a remaining puzzle and that is why they are so militant in trying to have evolution not taught in schools or its teaching to be undermined by inserting fake cautions about its credibility. After all, if a person dislikes evolutionary theory for whatever reason, all they have to do is not believe it.

I have had students who, after taking my physics courses, say that they cannot believe the theories of the origins of the universe that I taught them because those theories conflict with their religious beliefs, specifically their belief about a young Earth. I don't try to get them to change their views. I tell them that they are perfectly free to believe what they want and that it is not my duty to try and force them to agree with me. I believe that the purpose of science courses is to teach students the scientific paradigms that scientists use so that they will be able to use them in their own work. All I ask of my students is that they demonstrate to me that they understand how the scientific paradigms work and know how to use them within the scientific contexts in which they apply. I do not require them to swear allegiance to the theories themselves.

So it was initially puzzling to me why some people were objecting to the teaching of evolution. Why not let students learn it as best as they can so that they can function effectively in the world of science? After all, evolutionary theory is one of the cornerstones of modern science and to reject it as a framework for research is, frankly, to declare oneself to be a non-scientist.

It is true that some students will like the theory and accept it. Others won't. But those would be their individual choices. What would be the harm in that?

But my conversations with the ID people revealed that they have a much darker view of the consequences of teaching evolution. Let me try and summarize as best as I can their line of reasoning.

Their position is that America is currently in a state of deep moral decay. They look back on the past and see a time when the country was much more morally wholesome and they see the cause of the degeneration as due to people moving away from religious doctrines and towards a more secular outlook. And they see this shift as coinciding with the introduction of widespread teaching of evolution in schools.

They believe that you cannot have a moral sense unless it is rooted in the Bible. Not having the Bible as a basis for absolute moral standards results in the slippery slope of moral relativism and situational ethics, where there are no absolutes and what is a right or wrong choice is determined by the context.

They pin the blame for this shift in morals directly on evolutionary theory. They argue that teaching evolution means teaching that human beings are not God's special creation. This leads to atheism and hence to moral decay.

So the fight against the teaching of evolution is seen by them as a fight for America's very soul and this explains the passion which is expended by them on what, to the rest of us, might seem just another aspect of the science curriculum. It also means that the ultimate goal of the movement is the complete elimination of any teaching of evolution, and that the current push to introduce ID as merely an "alternative theory" is just the first step in a longer-term strategy.

While this line of reasoning can be criticized on very many different levels (and I will do so in a later posting), I was impressed with the sincerity of many of the people at the ID meeting who made it. They are doing what they do because they care about the souls of all of us, and are trying to save us from ourselves. But some of the leaders and spokespersons of the ID movement are not as straightforward as their followers. They hide this broader argument and try to portray what they are doing as purely an issue of science and that they would be satisfied if ID was accepted as an alternative to evolution. (I will discuss this so-called 'wedge strategy' later.)

This is why ID advocates feel they cannot allow the teaching of evolution. For them it is not just a scientific theory they have problems with. They see this as a battle for the soul of America, and the world.

May 19, 2006

What makes us good at learning some things and not others?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

One of the questions that students ask me is why it is that they find some subjects easy and others hard to learn. Students often tell me that they "are good" at one subject (say writing) and "are not good" at another (say physics), with the clear implication that they feel that there is something intrinsic and immutable about them that determines what they are good at. It is as if they see their learning abilities as being mapped onto a multi-dimensional grid in which each axis represents a subject, with their own abilities lying along a continuous scale ranging from 'awful' at one extreme to 'excellent' at the other. Is this how it is?

This is a really tough question and I don't think there is a definitive answer at this time. Those interested in this topic should register for the free public lecture by Steven Pinker on March 14.

Why are some people drawn to some areas of study and not to others? Why do they find some things difficult and others easy? Is it due to the kind of teaching that one receives or parental influence or some innate quality like genes?

The easiest answer is to blame it on genes or at least on the hard-wiring of the brain. In other words, we are born the way we are, with gifts in some areas and deficiencies in others. It seems almost impossible to open the newspapers these days without reading that scientists have found the genes that 'cause' this or that human characteristic so it is excusable to jump to genes as the cause of most inexplicable things.

But that is too simple. After all, although the brain comes at birth with some hard-wired structures, it is also quite plastic and the direction in which it grows is also strongly influenced by the experiences it encounters. But it seems that most of the rapid growth and development occurs fairly early in life and so early childhood and adolescent experiences are important in determining future directions.

But what kinds of experiences are the crucial ones for determining future academic success? Now things get more murky and it is hard to say which ones are dominant. We cannot even say that the same factors play the same role for everyone. So for one person, a single teacher's influence could be pivotal. For another, it could be the parent's influence. The influences could also be positive or negative.

So there is no simple answer. But I think that although this is an interesting question, the answer has little practical significance for a particular individual at this stage of their lives in college. You are now what you are. The best strategy is to not dwell on why you are not something else, but to identify your strengths and use them to your advantage.

It is only when you get really deep into a subject (any subject) and start to explore its foundations and learn about its underlying knowledge structure that you start to develop higher-level cognitive skills that will last you all your life. But this only happens if you like the subject because only then will you willingly expend the intellectual effort to study it in depth. With things that we do not care much about, we tend to skim on the surface, doing just the bare minimum to get by. This is why it is important to identify what you really like to do and go for it.

You should also identify your weaknesses and dislikes and contain them. By "contain" I mean that there is really no reason why at this stage you should force yourself to try and like (say) mathematics or physics or Latin or Shakespeare or whatever and try to excel in them, if you do not absolutely need to. What's the point? What are you trying to prove and to whom? If there was a really good reason that you needed to know something about those areas now or later in life, the higher-level learning skills you develop by charging ahead in the things you like now could be used to learn something that you really need to know later.

I don't think that people have an innate "limit", in the sense that there is some insurmountable barrier that prevents them from achieving more in any area. I am perfectly confident that some day if you needed or wanted to know something in those areas, you would be able to learn it. The plateau or barrier that students think they have reached is largely determined by their inner sense of "what's the point?"

I think that by the time they reach college, most students have reached the "need to know" stage in life, where they need a good reason to learn something. In earlier K-12 grades, they were in the "just in case" stage where they did not know where they would be going and needed to prepare themselves for any eventuality.

This has important implications for teaching practice. As teachers, we should make it our goal to teach in such a way that students see the deep beauty that lies in our discipline, so that they will like it for its own sake and thus be willing to make the effort. It is not enough to tell them that it is "useful" or "good for them."

In my own life, I now happily learn about things that I would never have conceived that I would be interested in when I was younger. The time and circumstances have to be right for learning to have its fullest effect. As Edgar says in King Lear: "Ripeness is all."

(The quote from Shakespeare is a good example of what I mean. If you had told me when I was an undergraduate that I would some day be familiar enough with Shakespeare to quote him comfortably, I would have said you were crazy because I hated his plays at that time. But much later in life, I discovered the pleasures of reading his works.)

So to combine the words from the song by Bobby McFerrin, and the prison camp commander in the film The Bridge on the River Kwai, my own advice is "Don't worry. Be happy in your work."

Sources:

John D. Bransford, Ann L. Brown, and Rodney R. Cocking, eds., How People Learn, National Academy Press, Washington, D.C., 1999.

James E. Zull, The Art of Changing the Brain, Stylus Publishing, Sterling, VA, 2002.

May 18, 2006

Why is evolutionary theory so upsetting to some?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

One of the questions that sometimes occur to observers of the intelligent design (ID) controversy is why there is such hostility to evolutionary theory in particular. After all, if you are a Biblical literalist, you are pretty much guaranteed to find that the theories of any scientific discipline (physics, chemistry, geology, astronomy, in addition to biology) contradict many of the things taught in the Bible.

So what is it about evolution in particular that gets some people's goat?

I had occasion to attend the annual program held by the ID advocates in Kansas a couple of years back, having been invited to be on a panel that was to debate the question of whether ID was a science. I took the opportunity to speak with a lot of the people who were attendees of the program about why they found evolution so offensive. The people I spoke to seemed to be almost all Biblical literalists.

It became clear very quickly that their main concern was that evolution by natural selection implied that human beings had no special status among living things. Natural selection implies that while human beings are quite impressive in the way they are put together, we did not have to be the way we are. Indeed, we did not have to be here at all.

To understand this concern better, here is a somewhat imperfect analogy for how natural selection works.

Think of starting out on a journey by car. At each intersection, we toss a coin and if it is heads, we turn left and if it is tails we turn right. After millions of tosses, we will have ended up somewhere, but it could have been anywhere. It might be San Francisco or it might be in the middle of a cornfield in Kansas. There is no special meaning that can be attached to the end point. We can try and reconstruct our journey starting from the end and working backwards to the beginning (which is what evolutionary biologists do) but the end point of our journey was not predetermined when we began.

The important point is that, according to natural selection, we were not destined to end up as we did. The many small random genetic mutations that occurred over the years are the analog of the coin tosses, and the end point could have been something quite different.
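(As an aside for the programmers among the readers: here is a minimal sketch, in Python, of the coin-toss journey described above. The grid layout, the turn rule, and the 1,000-intersection trip length are assumptions made purely for illustration and are not part of the original analogy; the point is simply that each run of the simulation ends up somewhere different, which is the sense in which the destination is not predetermined.)

import random

def coin_toss_journey(num_intersections=1000):
    """Simulate the analogy: at every intersection toss a coin,
    turn left on heads and right on tails, then drive one block."""
    # Headings: 0 = north, 1 = east, 2 = south, 3 = west
    heading = 0
    x, y = 0, 0
    moves = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # (dx, dy) for each heading
    for _ in range(num_intersections):
        # Coin toss: heads -> turn left, tails -> turn right
        heading = (heading + (3 if random.random() < 0.5 else 1)) % 4
        dx, dy = moves[heading]
        x, y = x + dx, y + dy
    return x, y

# Each run is an independent 'history'; the end point differs every time.
for run in range(1, 4):
    print("Run", run, "ended up at", coin_toss_journey())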

For people who believe that humans are created in God's image, this is pretty tough to take because it is a steep drop in one's self-image. One day you are the apple of God's eye, the next you are the byproduct of random genetic mutations with no underlying plan at all. One can understand why this is so upsetting to those who want to feel that they are special and that their lives have a divine purpose.

Those who adhere to a belief structure labeled 'theistic evolution' strike a middle ground and argue that God created the laws of natural selection but guided the process by working within those laws. This is analogous to intervening only during some or all of the coin tosses to influence the way the coin landed. So what may appear to be random to us may not have been truly so.

Depending on how far one wants to take this, one can argue that God intervened at every coin toss or intervened only sparingly, say to prevent us doing something really stupid like driving off a cliff.

Yet other religious believers say that they are comfortable with God just creating the world and its randomly acting laws and then letting the chips fall where they may, by taking a completely hands off attitude and not intervening in any of the coin tosses.

Where one falls in this spectrum of beliefs depends on what one feels comfortable with. But it is clear that the fact that evolution by natural selection is not goal-directed is what bothers many religious people the most. They dislike the fact that according to the theory of evolution, all we can say is what we have evolved from, and that we cannot say that we are evolving towards anything.

POST SCRIPT: Lisa Simpson defends evolution

When the city of Springfield decrees that only creationism will be taught in schools, Lisa springs into action and gets arrested.

May 17, 2006

Can we ever be certain about scientific theories?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

A commenter to a previous posting raised an interesting perspective that requires a fresh posting, because it reflects a commonly held view about how the validity of scientific theories gets established.

The commenter says:

A scientist cannot be certain about a theory until that theory has truly been tested, and thus far, I am unaware of our having observed the evolution of one species from another species. Perhaps, in time, we will observe this, at which point the theory will have been verified. But until then, Evolution is merely a theory and a model.

While we may have the opportunity to test Evolution as time passes, it is very highly doubtful that we will ever be able to test any of the various theories for the origins of the Universe.

I would like to address just two points: What does it mean to "test" a theory? And can scientists ever "verify" a theory and "be certain" about it?

Verificationism as a concept to validate scientific theories has been tried and found to be wanting. The problem is that any non-trivial theory generates an infinite number of predictions. All the predictions cannot be exhaustively verified. Only a sample of the possible predictions can be tested and there is no universal yardstick that can be used to measure when a theory has been verified. It is a matter of consensus judgment on the part of scientists as to when a theory becomes an accepted one, and this is done on a case-by-case basis by the practitioners in that field or sub-field.

This means, however, that people who are opposed to a theory can always point to at least one particular result that has not been directly observed and claim that the theory has not been 'verified' or 'proven.' This is the strategy adopted by ID supporters to attack evolutionary theory. But using this kind of reasoning will result in every single theory in science being denied scientific status.

Theories do get tested. Testing a theory has been a cornerstone of scientific practice ever since Galileo, but it means different things depending on whether you are talking about experimental sciences like chemistry and condensed matter physics, or historical sciences like cosmology, evolution, geology, and astronomy.

Any scientific theory is always more than an explanation of prior events. It also must necessarily predict new observations and it is these predictions that are used to test theories. In the case of experimental sciences, laboratory experiments can be performed under controlled conditions in order to generate new data that can be compared with predictions or used to infer new theories.

In the case of historical sciences, however, observations are used to unearth data that are pre-existing but as yet unknown. Hence the 'predictions' may be more appropriately called 'retrodictions', in that they predict that you will find things that already exist. For example, in cosmology the retrodictions were the existence of a cosmic microwave background radiation of a certain temperature, the relative abundances of light nuclei, and so forth. The discovery of the planet Neptune was considered a successful 'prediction' of Newtonian theory, although Neptune had presumably always been there.

The testing of a historical science is analogous to the investigation of a crime, where the detective says things like "If the criminal went through the woods, then we should be able to see footprints." This kind of evidence is also historical but is just as powerful as predictions of future events, so historical sciences are not necessarily at a lower level of credibility than experimental sciences.

Theories in cosmology, astronomy, geology, and evolution are all tested in this way. As Ernst Mayr (who died a few days ago at the age of 100) said in What Evolution Is (2001): "Evolution as a whole, and the explanation of particular evolutionary events, must be inferred from observations. Such inferences must be tested again and again against new observations, and the original inference is either falsified or considerably strengthened when confirmed by all of these tests. However, most inferences made by evolutionists have by now been tested successfully so often that they are accepted as certainties." (emphasis added).

In saying that most inferences are 'accepted as certainties', Mayr is exaggerating a little. Ever since the turn of the 20th century, it has been accepted that scientific knowledge is fallible and that absolute certainty cannot be achieved. But scientists do achieve a remarkable consensus on deciding at any given time what theoretical frameworks they have confidence in and will be used to guide future research. Such frameworks have been given the name 'paradigms' by Thomas Kuhn in The Structure of Scientific Revolutions (1970).

When scientists say they 'believe' in evolution (or the Big Bang), the word is being used in quite a different way from that used in religion. It is used as shorthand to say that they have confidence that the underlying mechanism of the theory has been well tested by seeing where its predictions lead. It is definitely not "merely a theory and a model" if by the word 'merely' the commenter implies a theory that is unsupported or untested.

So yes, evolution, like all the other major scientific paradigms, both historical and experimental, has been well tested.

May 16, 2006

Wanted: 'Godwin's Law'-type rule for science

(I will be traveling for the next few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

Mike Godwin coined a law (now known as Godwin's Law) that states: "As an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one."

This makes sense. As the discussion drags on, people start running out of fresh or relevant arguments, begin repeating themselves, lose their tempers, reach for something new to say, and Hitler/Nazi comparisons inevitably follow.
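
Put in terms of elementary probability, the law is nearly a tautology. Here is a minimal sketch (the per-post probability is an invented number, purely for illustration): if each new post in a thread carries even a small, independent chance of containing the comparison, the probability that the thread has seen at least one such post creeps toward one as the thread grows.

```python
# Illustrative only: a toy probabilistic reading of Godwin's Law.
# The per-post probability p is an invented number, not a measurement.
def prob_comparison_by_post(n, p=0.02):
    """Probability that at least one of n posts contains a Hitler/Nazi comparison."""
    return 1 - (1 - p) ** n

for n in (10, 50, 100, 500):
    print(f"{n:>4} posts: {prob_comparison_by_post(n):.2f}")
# As n grows, 1 - (1 - p)^n approaches one, which is all the 'law' asserts.
```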

But Godwin's rule has been extended beyond its original intent and is now used as a decision rule to indicate that a discussion has ceased to be meaningful and should be terminated. In other words, as soon as the Hitler/Nazi comparison is brought into any discussion where it is not relevant, Godwin's rule can be invoked to say that the discussion is over and the person who introduced the Hitler/Nazi motif has lost the argument.

I was thinking that this might be a good model to follow in finding a resolution to the interminable discussions over whether so-called 'intelligent design' theory (ID) is a part of science. My rule would read as follows:

"As soon as the advocates of any theory go to legislative or other non-scientific bodies to get their theory labeled as a science, they have lost the argument and their theory is automatically declared to be not a science."

Why do we need such a rule? Because ID advocates are the latest in a long line of people who have tried to bypass the normal processes of science by going outside the scientific community to implement their agenda.

The historical record of such attempts is not pretty. The Roman Catholic Church attempted in 1616 to ban Copernicus' theory. The Soviet Central Committee tried in 1949 to dismiss Mendelian genetics as pseudoscience. Louisiana and Arkansas passed legislation in the 1980s to force the teaching of so-called 'creation science' in science classes and were overturned by the US Supreme Court. Even more recently Kansas tried to ban the teaching of evolution and failed. All these attempts ended as debacles for their proponents but in the process wasted the time and energy of huge numbers of people.

ID advocates, who like their predecessors have failed to convince the scientific community of the merits of their case, now argue that the scientific community is conspiring to unfairly keep their theory out, and that this is why they need to appeal to legislative or judicial bodies to get their way. In making this argument, they reveal a profound misunderstanding of the way science operates.

The agenda of scientists is not a secret. It is, simply, to have good science. And few will deny that science has delivered the goods in spectacular ways. It has achieved this by allowing the scientific community to achieve consensus as to what is the best paradigm to govern research activity in any given field at any given time.

This does not mean that individual scientists always make the best decisions in any given situation. It does not mean that scientists are incapable of making mistakes. It does not mean that scientists don't have philosophical and scientific prejudices that color their views. It does not mean that scientists cannot be arrogant or pig-headed.

But despite all this, no reasonable person will dispute the point that science has been extraordinarily successful. This happens because scientists, whatever their other views and attributes, need to have good science because that is what is important to the health of their profession. Good science is in their best self-interest.

Good science would not happen if outside bodies were the arbiters of what is science, since they have their own agendas and can thus be pressured to make decisions on political or other grounds. So if ID advocates are successful in their efforts, they would be threatening the very foundations of science's success.

In their opposition to such legislative intrusions, scientists are similar to artists and craftsmen. Would anyone argue that legislatures should decide on what constitutes a good painting?

In the long run, academic communities in scientific disciplines, despite their wide internal divergences, know that they must serve as the judges of what is good for their field and take that responsibility seriously. This is why the elaborate mechanism of peer-review, despite its faults, plays such an important role and why scientists, despite their differences in nationalities, religions, ethnicities, languages, ages, and genders, repeatedly arrive at remarkable levels of worldwide consensus on what is good science and what is bad science, and what is science and what is not science.

As the philosopher of science Barry Barnes says in his T.S. Kuhn and Social Science, (1982): "In science…there is no basis for validation superior to the collective contingent judgment of the paradigm-sharing community itself."

But the proponents of ID, like their predecessors, just don't get this and keep trying to have outside agencies legislate what scientists should and should not do. These discussions, like those on internet discussion boards, can drag on and on and waste the time and energy of everyone concerned since, if history is any guide, the net result is to revert to the original situation where scientists decide what science is.

So let's invoke my rule and declare that ID is not a science. Then we can get on with real work.

May 15, 2006

Evolution III: Scientific knowledge is an interconnected web

(I will be traveling for the next few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In an earlier posting, the question was posed as to whether it was intellectually consistent to reject the findings of an entire modern scientific discipline (like biology) or of a major theoretical structure (like the theory of evolution) while accepting all the other theories of science.

The short answer is no. Why this is so can be seen by examining closely the most minimal of creationist theories, the one that goes under the label of 'intelligent design' or ID.

ID supporters take great pains to claim that theirs is a scientific theory that has nothing to do with religion or God, and hence belongs in the school science curriculum. (This particular question of whether ID can be considered a part of science or of religion will be revisited in a later posting. This is becoming a longer series than I anticipated…)

ID advocates say that there are five specific biochemical systems and processes (bacterial flagella and cilia, blood clotting, protein transport within a cell, the immune system, and metabolic pathways) whose existence and/or workings cannot be explained by evolutionary theory and that hence one has to postulate that such phenomena are evidence of design and of the existence of a designer.

The substance of their arguments is: "You can claim all the other results for evolutionary theory. What would be the harm in allowing these five small systems to have an alternative explanation?"

Leaving aside the many other arguments that can be raised against this position (including those from biologists that these five systems are hardly intractable problems for evolutionary theory), I want to focus on just one feature of the argument. Is it possible to accept that just these five processes were created by a 'designer,' while retaining a belief in all the other theories of science?

No, you cannot. If some undetectable agent had intervened to create the cilia (say), then in that single act at a microscopic level, you have violated fundamental laws of physics such as the law of conservation of energy, the law of conservation of momentum, and (possibly) the law of conservation of angular momentum. These laws are the bedrock of science, and to abandon them is to abandon some of the most fundamental elements of modern science.

So rejecting a seemingly small element of evolutionary theory triggers a catastrophe in a seemingly far-removed area of science, a kind of chaotic 'butterfly effect' for scientific theories.

Scientific theories are so interconnected that some philosophers of science have taken this to the extreme (as philosophers are wont to do) and argued that we can only think of one big scientific theory that encompasses everything. It is this entire system (and not any single part of it) that should be compared with nature.

Pierre Duhem in his The Aim and Structure of Physical Theory (1906) articulated this position when he declared that: "The only experimental check on a physical theory which is not illogical consists in comparing the entire system of the physical theory with the whole group of experimental laws, and in judging whether the latter is represented by the former in a satisfactory manner." (emphasis in original)

Of course, in practical terms, we don't do that. Each scientific subfield proceeds along its own path. And we know that there have been revolutions in one area of science that have left other areas seemingly undisturbed. But this interconnectedness is a reality and explains why scientific theories are so resistant to change. Scientists realize that changing one portion requires, at the very least, making some accommodations in theories that are connected to it, and it is this process of adjustments that takes time and effort and prevents trivial events from triggering changes.

This is why it usually requires a major crisis in an existing theory for scientists to even consider replacing it with a new one. The five cases raised by ID advocates do not come close to creating that kind of crisis. They are like flies in the path of a lumbering evolutionary theory elephant, minor irritants that can be ignored or swatted away easily.

May 12, 2006

Iran's president poses some tough questions for Bush

During the run-up to the invasions of Afghanistan and Iraq, the leaders of those countries tried to open a dialogue with the Bush administration but were summarily rebuffed, since Bush and his neoconservative clique were determined to go to war from the get-go, and all their posturing about preferring diplomacy has been revealed to be just that - posturing. The media was complicit in this dismissal of possibilities for peaceful resolution, hardly ever reporting the full extent of the overtures that those governments made to the US.

We see the same thing being repeated with Iran. The latest example is the way the Iranian President Mahmood Ahmadi-Nejad's letter to Bush is being portrayed and dismissed. I think it is important for people to not depend on selective quotes released by interested parties or the interpretations of the media without also reading the original documents and making judgments for themselves. The full text of the translated letter can be seen here.

The letter is a mixture of politics, philosophy, and religion. On the issue of nuclear technology, Ahmadi-Nejad asks a pertinent question:

Why is it that any technological and scientific achievement reached in the Middle East regions is translated into and portrayed as a threat to the Zionist regime? Is not scientific R&D one of the basic rights of nations.

You are familiar with history. Aside from the Middle Ages, in what other point in history has scientific and technical progress been a crime? Can the possibility of scientific achievements being utilised for military purposes be reason enough to oppose science and technology altogether? If such a supposition is true, then all scientific disciplines, including physics, chemistry, mathematics, medicine, engineering, etc. must be opposed.


He may have suspected that he was overestimating Bush's familiarity with history (despite Bush having majored in it at Yale), so he helpfully provides him with a Cliffs Notes version of Iranian grievances against the US, many of which are likely also unfamiliar to most Americans:

The brave and faithful people of Iran too have many questions and grievances, including: the coup d'etat of 1953 and the subsequent toppling of the legal government of the day, opposition to the Islamic revolution, transformation of an Embassy into a headquarters supporting the activities of those opposing the Islamic Republic (many thousands of pages of documents corroborates this claim), support for Saddam in the war waged against Iran, the shooting down of the Iranian passenger plane, freezing the assets of the Iranian nation, increasing threats, anger and displeasure vis-à-vis the scientific and nuclear progress of the Iranian nation (just when all Iranians are jubilant and collaborating their country's progress), and many other grievances that I will not refer to in this letter.

Ahmadi-Nejad's political analysis can be quite shrewd. For example, he provides a checklist of things by which he thinks a country's leaders should be judged and it is hard to quarrel with it. But he has clearly selected the items in the list to hit Bush at all the points where he knows he is weak.

[M]y main contention - which I am hoping you will agree to some of it - is: Those in power have specific time in office, and do not rule indefinitely, but their names will be recorded in history and will be constantly judged in the immediate and distant futures. The people will scrutinize our presidencies.

Did we manage to bring peace, security and prosperity for the people or insecurity and unemployment?

Did we intend to establish justice, or just supported especial interest groups, and by forcing many people to live in poverty and hardship, made a few people rich and powerful - thus trading the approval of the people and the Almighty with theirs?

Did we defend the rights of the underprivileged or ignore them?

Did we defend the rights of all people around the world or imposed wars on them, interfered illegally in their affairs, established hellish prisons and incarcerated some of them?

Did we bring the world peace and security or raised the specter of intimidation and threats?

Did we tell the truth to our nation and others around the world or presented an inverted version of it?

Were we on the side of people or the occupiers and oppressors?

Did our administration set out to promote rational behaviour, logic, ethics, peace, fulfilling obligations, justice, service to the people, prosperity, progress and respect for human dignity, or the force of guns, intimidation, insecurity, disregard for the people, delaying the progress and excellence of other nations, and trample on people's rights?

And finally, they will judge us on whether we remained true to our oath of office - to serve the people, which is our main task, and the traditions of the prophets - or not?
. . .
If billions of dollars spent on security, military campaigns and troop movement were instead spent on investment and assistance for poor countries, promotion of health, combating different diseases, education and improvement of mental and physical fitness, assistance to the victims of natural disasters, creation of employment opportunities and production, development projects and poverty alleviation, establishment of peace, mediation between disputing states and extinguishing the flames of racial, ethnic and other conflicts, where would the world be today? Would not your government, and people be justifiably proud? Would not your administration's political and economic standing have been stronger? And I am most sorry to say, would there have been an ever increasing global hatred of the American governments?

No wonder the administration is downplaying the letter. It is hard to see Bush being able to answer these questions.

The letter then veers off into religious talk and appeals to the one thing he says they have in common: the way the religious beliefs of Bush and Ahmadi-Nejad influence their political actions. And he is right, he does seem to have a lot in common with Bush and the religious fundamentalists in the US, even to the extent of quoting religious texts in support of public policy. For example,

We believe a return to the teachings of the divine prophets is the only road leading to salvations. I have been told that Your Excellency follows the teachings of Jesus (PBUH), and believes in the divine promise of the rule of the righteous on Earth.

We also believe that Jesus Christ (PBUH) was one of the great prophets of the Almighty. He has been repeatedly praised in the Koran. Jesus (PBUH) has been quoted in Koran as well; [19,36] And surely Allah is my Lord and your Lord, therefore serves Him; this is the right path, Marium.

(Note: PBUH stands for Peace Be Upon Him, and is used by devout Muslims whenever they refer to a revered figure. I don't know what 'Marium' means but it is also probably some religious convention.)

Ahmadi-Nejad even throws in some oblique allusions to the Rapture, speaking of the "belief in the Last Day," and continues:

The day will come when all humans will congregate before the court of the Almighty, so that their deeds are examined. The good will be directed towards Heaven and evildoers will meet divine retribution.

He even shares Bush's disdain for the trappings of liberalism and western style democracy, and enthusiastically trumpets the advantages of a theocratic state:

Liberalism and Western style democracy have not been able to help realize the ideals of humanity. Today these two concepts have failed. Those with insight can already hear the sounds of the shattering and fall of the ideology and thoughts of the liberal democratic systems. We increasingly see that people around the world are flocking towards a main focal point - that is the Almighty God.

(The slight difference between Bush and Ahmadi-Nejad is that Bush still pays lip-service to democracy while undermining it with actions that reflect his belief that he is above the law and can thumb his nose at the courts and the US constitution.)

It is not comforting to think that issues of war and peace are being determined by two leaders who think that liberal democracy is useless, and use religious texts as a source of their policy decision-making.

Read James Wolcott and Justin Raimondo for what the media here left out in their reporting of the Iranian leader's letter.

And if you have the time, check out the Iranian President's official website which has a nice Photoshopped picture of him writing a letter at his desk, while in the same room in the corner Bush is seated in a chair thoughtfully reading the letter. Which immediately raised the thought in my mind: Did Bush actually read the letter? Or did some aide give him a one-paragraph summary or a set of bullet points?

POST SCRIPT: Traveling

For the next two or three weeks I will be traveling to Australia and New Zealand to visit family, friends, and the occasional kangaroo, koala bear, and kiwi.

My access to a computer will be somewhat erratic during these days and I will be unable to post any original material. As a result, until my return I will be re-running some of my favorite posts from the very early days of this blog, which may be original for those who started visiting here later.

May 11, 2006

Burden of proof-3: The role of negative evidence

In my previous post, I suggested that in science, the burden of proof lies with the proponent for the existence of some thing. The default assumption is non-existence. So if you propose the existence of something like electromagnetic radiation or neutrinos or N-rays, then you have to provide some positive evidence that it exists of a kind that others can try to replicate.

But not all assertions, even in science, need meet that positive evidence standard. Sometimes negative evidence, what you don't see, is important too. Negative evidence is best illustrated by the famous Sherlock Holmes story Silver Blaze, in which the following encounter occurs:

Gregory [Scotland Yard detective]: "Is there any other point to which you would wish to draw my attention?"
Holmes: "To the curious incident of the dog in the night-time."
Gregory: "The dog did nothing in the night-time."
Holmes: "That was the curious incident."

There are times when the absence of evidence can be suggestive. This is true with the postulation of universal laws. The substance of such laws (such as that the total energy is conserved) is that they hold in every single instance. But we cannot possibly examine every possibility. We believe these types of laws hold because of negative evidence, what we do not see. If someone postulates the existence of a universal law, the absence of evidence that contradicts it is taken as evidence in support of the law. There is a rule of thumb that scientists use that if something can happen, it will happen. So if we do not see something happening, that suggests that there is a law that prevents it. This is how laws such as baryon and lepton number conservation originated.

Making inferences from absence is different from proving a negative about the existence of something, be it N-rays or god. You can never prove that an entity doesn't exist. So at least at the beginning, it is incumbent on the person who argues for the existence of something to provide at least some evidence in support of it. The case for the existence of entities (like neutrinos or X-rays or god) requires positive evidence. Once that has been done beyond some standard of reasonable doubt, then the burden can shift to those who argue for non-existence, to show why this evidence is not credible.

This rule about evidence was not followed in the run-up to the attack on Iraq. The Bush administration simply asserted that Iraq had weapons of mass destruction without providing credible evidence for it. They then (aided by a compliant media) managed to frame the debate so that the burden of proof shifted to those who did not believe the weapons existed. Even after the invasion, when the weapons did not turn up, Donald Rumsfeld famously said "There's another way to phrase that and that is that the absence of evidence is not the evidence of absence. It is basically saying the same thing in a different way. Simply because you do not have evidence that something does exist does not mean that you have evidence that it doesn't exist." But he was wrong. When you are asserting the existence of an entity, if you have not provided any evidence that it exists, then the absence of evidence is evidence of absence.
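
The point can be put in rough Bayesian terms: if the weapons existed, a thorough search would very likely have turned something up, so finding nothing should lower our estimate that they exist. Here is a minimal sketch (all the probabilities are invented numbers, chosen only to show the direction of the update):

```python
# Illustrative only: why absence of expected evidence counts as evidence of absence.
# All probabilities are invented numbers used to show the direction of the update.
def posterior_given_no_evidence(prior, p_find_if_exists, p_find_if_not=0.0):
    """P(exists | nothing found), computed with Bayes' theorem."""
    p_none_if_exists = 1 - p_find_if_exists
    p_none_if_not = 1 - p_find_if_not
    numerator = p_none_if_exists * prior
    return numerator / (numerator + p_none_if_not * (1 - prior))

# Start at 50/50 and suppose a thorough search would find the weapons 90% of the
# time if they existed; finding nothing drops the probability to about 0.09.
print(posterior_given_no_evidence(prior=0.5, p_find_if_exists=0.9))
```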

It is analogous to criminal trials. People are presumed innocent until proven guilty, and the onus is on the prosecution to first provide some positive evidence. Once that is done, the accused usually has to counter it in some way to avoid the risk that the jury will find the evidence sufficiently plausible to find the accused guilty.

So the question boils down to whether believers in a god have provided prima facie evidence in support of their thesis, sufficient to shift the burden to those who do not believe in god to show why this evidence is not convincing. Personal testimony by itself is usually not sufficient in courts, unless it is corroborated by physical evidence or direct personal observation by other credible sources who have observed the same phenomenon.

One of the common forms of evidence that is suggested is that since many, many people believe in the existence of god, that should count as evidence. My feeling is that that is not sufficient. After all, there have been universal beliefs that have subsequently been shown to be wrong, such as that the Earth was located at the center of the universe.

Has the evidence for god met the standard that we would accept in science or in a court of law? I personally just don't see that it has but that is a judgment that each person must make. Of course, people can choose to not require that the evidence for god meet the same standard as for science or law, and if that is the case, then that pretty much ends the discussion. But at least we can all agree as to why we disagree.

May 10, 2006

Burden of proof-2: What constitutes evidence for god?

If a religious person is asked for evidence of god's existence, the type of evidence presented usually consists of religious texts, events that are inexplicable according to scientific laws (i.e., miracles), or personal testimonies of direct experience of god. Actually, this can be reduced to just two categories (miracles and personal testimonies), since religious texts can be considered either as miraculously created (in the case of the Koran or those who believe in Biblical inerrancy) or as the testimonies of the writers of the texts, who in turn recorded their own testimonies or those of other people, or reported on miraculous events. If one wants to be a thoroughgoing reductionist, one might even reduce it to one category by arguing that reports of miracles are also essentially testimonies.

Just being a testimony does not mean that the evidence is invalid. 'Anecdotal evidence' often takes the form of testimony and can be the precursor to investigations that produce other kinds of evidence. Even in the hard sciences, personal testimony does play a role. After all, when a scientist discovers something and publishes a paper, that is kind of like a personal testimony since the very definition of a research publication is that it incorporates results nobody else has yet published. But in science those 'testimonies' are just the starting point for further investigation by others who try to recreate the conditions and see if the results are replicated. In some cases (neutrinos), they are and in others (N-rays) they are not. So in science, testimonies cease to be considered as such once independent researchers start reproducing results under fairly well controlled conditions.

But with religious testimonies, there is no such promise of such replicability. I recently had a discussion with a woman who described to me her experiences of god and described something she experienced while on a hilltop in California. I have no reason to doubt her story but even she would have thought I was strange if I asked her exactly where the hilltop was and what she did there so that I could try and replicate her experience. Religious testimonies are believed to be intensely personal and unique and idiosyncratic, while in science, personal testimony is the precursor to shared, similar, consistently reproducible experiences, under similar conditions, by an ever-increasing number of people.

The other kind of experience (miracles) again typically consists of unique events that cannot be recreated at will. All attempts at finding either a consistent pattern of god's intervention in the world (such as the recent prayer study) or unambiguous violations of natural laws have singularly failed. All we really have are the stories in religious texts purporting to report on miraculous events long ago or the personal testimonies of people asserting a miraculous event in their lives.

How one defines a miracle is also difficult. It has to be more than just a highly improbable event. Suppose someone is seriously ill with cancer and the physicians have given up hope. Suppose that person's family and friends pray to god and the patient suffers a remarkable remission in the disease. Is that a miracle? Believers would say yes, but unbelievers would say not necessarily, asserting that the body has all kinds of mechanisms for fighting disease that we do not know of. So what would constitute an event that everyone would consider a miracle?

Again, it seems to me that it would have to have the quality of replicability to satisfy everyone. If for a certain kind of terminal disease, a certain kind of prayer done under certain conditions invariably produced a cure where medicine could not, then that would constitute a good case for a miracle, because that would be hard to debunk, at least initially. As philosopher David Hume said: "No testimony is sufficient to establish a miracle unless the testimony be of such a kind that its falsehood would be more miraculous than the fact which it endeavors to establish..." (On Miracles)

But even this is problematical, especially for believers who usually do not believe in a god who acts so mechanically and can be summoned at will. Such predictable behavior is more symptomatic of the workings of as-yet-unknown natural laws than of god. The whole allure of belief in god is that god can act in unpredictable ways, to cause the dead to come back to life and the Earth to stop spinning.

So both kinds of evidence (miracles and testimonies) used to support belief in a god are inadequate for what science requires as evidentiary support.

The divide between atheists and religious believers ultimately comes down to whether an individual feels that all beliefs should meet the same standards that we accept for good science or whether we have one set of standards for science or law, and another for religious beliefs. There is nothing that compels anyone to choose either way.

I personally could not justify to myself why I should use different standards. Doing so seemed to me to indicate that I was deciding to believe in god first and then deciding on how to rationalize my belief later. Once I decided to use the yardstick of science uniformly across all areas of knowledge and see where that leads, I found myself agreeing with Laplace that I do not need the god hypothesis.

In a future posting, I will look at the situation where we can infer something from negative evidence, i.e., when something does not happen.

POST SCRIPT: Faith healing

The TV show House had an interesting episode that deals with some of the issues this blog has discussed recently, like faith healing (part 1 and part 2) and what to make of people who say god talks to them.

Here is an extended clip from that episode that pretty much gives away the entire plot, so don't watch it if you are planning to see it in reruns. But it gets to grips with many of the issues that are discussed in this blog.

House is not very sympathetic to the claims of the 15-year old faith healer that god talks to him. When his medical colleagues argue with House, saying that the boy is merely religious and does not have a psychosis, House replies "You talk to god, you're religious. God talks to you, you're psychotic."

May 09, 2006

Burden of proof

If a religious person asks me to prove that god does not exist, I freely concede that I cannot do so. The best that I can do is to invoke the Laplacian principle that I have no need of hypothesizing god's existence to explain things. But clearly most people feel that they do need to invoke god in order to understand their lives and experience. So how can we resolve this disagreement and make a judgment about the validity of the god hypothesis?

Following a recent posting on atheism and agnosticism, I had an interesting exchange with commenter Mike that made me think more about this issue. Mike (who believes in god) said that in his discussions with atheists, they often were unable to explain why they dismissed god's existence. He says: "I find that when asked why the 'god hypothesis' as Laplace called it doesn't work for them, they often don't know how to respond."

Conversely, Mike was perfectly able to explain why he (and other believers) believed in god's existence:

The reason is that we have the positive proof we need, in the way we feel, the way we think, the way we act, things that can't easily be presented as 'proof'. In other words, the proof comes in a different form. It's not in a model or an equation or a theory, yet we experience it every day.

So yes, we can ask that a religious belief provide some proof, but we must be open to the possibility that that proof is of a form we don't expect. I wonder how often we overlook a 'proof' - of god, of love or a new particle - simply because it was not in a form we were looking for - or were willing to accept.

Mike makes the point (with which I agree) that it is possible that we do not have the means as yet to detect the existence of god. His argument can be supported by analogies from science. We believe we were all bathed in electromagnetic radiation from the beginning of the universe but we did not realize it until Maxwell's theory of electromagnetism gave us a framework for understanding its existence and enabled us to design detectors to detect it.

The same thing happened with neutrinos. Vast numbers of them have been passing through us and the Earth, but we did not know about their existence until the middle of the 20th century, when a theory postulated their existence and detectors were designed that were sensitive enough to observe them.

So electromagnetic radiation and neutrinos existed all around us even during the long period of time when no one had any idea that they were there. Why cannot the same argument be applied to god? It can, actually. But does that mean that god exists? I think we would all agree that it does not, anymore than my inability to prove that unicorns do not exist implies that they do. All that this argument does is leave open the possibility of a hitherto undetected existence.

But the point of departure between science and religion is that in the case of electromagnetic radiation and neutrinos, their existence was postulated simultaneously along with suggestions of how and where anyone could look for them. If, after strenuous efforts, they could still not be detected, then scientists would cease to believe in their existence. But eventually, evidence for their existence was forthcoming from many different sources in a reproducible manner.

What if no such evidence was forthcoming? This has happened in the past with other phenomena, such as in 1903 with something called N-rays, which were postulated and seemed to have some evidentiary support initially, but on closer examination were found to be spurious. This does not prevent people from still believing in the phenomenon, but the scientific community would proceed on the assumption that it does not exist.

In the world of science the burden of proof is always on the person arguing for the existence of whatever is being proposed. If that evidence is not forthcoming, then people proceed on the assumption that the thing in question does not exist (the Laplacian principle). It is in parallel to the legal situation. We know that in the legal context in America, the presumption is that of innocence until proven guilty. This results in a much different kind of investigation and legal proceedings than if the presumption were guilty until proven innocent.

So on the question of god's existence, it seems to me that it all comes down to the question of who has the burden of proof in such situations. Is the onus on the believer, to prove that god exists? Or on the atheist to argue that the evidence provided for god's existence is not compelling? In other words, do we draw a parallel with the legal situation of 'presumed innocent until proven guilty beyond a reasonable doubt' and postulate a principle 'non-existence until existence is proven beyond a reasonable doubt'? The latter would be consistent with scientific practice.

As long as we disagree on this fundamental question, there is little hope for resolution. But even if we agree that the burden of proof is the same for religion as for science, and that the person postulating the existence of god has to advance at least some proof in support, that still does not end the debate. The question then shifts to what kind of evidence we would consider to be valid and what constitutes 'reasonable doubt'.

In the next few postings, we will look at the kinds of evidence that might be provided and how we might evaluate them.

May 08, 2006

Driving etiquette

Now that the summer driving season is upon us, and I am going to be on the highway today, here are some musings on driving.

Driving means never being able to say you're sorry

We need a non-verbal sign for drivers to say "I'm sorry." There have been times when I have inadvertently done something stupid or discourteous while driving, such as changing lanes without giving enough room and thus cutting someone off or accidentally blowing the horn or not stopping early enough at a stop sign or light and thus creating some doubt in the minds of other drivers as to whether I intended to stop. At such times, I have wanted to tell the other driver that I was sorry for unsettling them, but there is no universally recognized gesture to do so.

If we want to thank someone, the raised flat upturned palm works. And there are so many ways to show annoyance at others, ranging from blaring the horn to angry yells and rude gestures. But there is no gesture that says sorry. I think we need one.

Any suggestions?

Car friendliness

When I am walking along the street and pass someone, people almost always make eye-contact, nod, smile, and say "hello" or "how are you?" But when people are in cars, they studiously avoid giving any sign that other people exist. If you stop at a light next to another car, or are cruising along a highway parallel to another car, everyone stares straight ahead. If by chance you make eye-contact, people quickly look away. Why this difference?

It is as if the inside of a car is considered a zone of privacy, although it is almost as public as standing in the street. I am not sure why this is but it does explain why people do things in cars (eat, read, comb their hair, put on makeup, pick their teeth, check for zits, etc.) that they might not normally do in public.

The only exceptions to this rule seem to be if there are friendly-looking dogs or small children in the car. The owners of such dogs tend to welcome attention, and nods and smiles are exchanged. Small children will also wave cheerily to you.

I have been trying a small experiment these last few days. I decided that when I stop at lights or am in a traffic jam, I would glance around and if I make eye-contact with people in adjoining cars, I would smile and nod, just as if I were passing them in the street. Interestingly, only one person so far has made eye contact with me, and we exchanged smiles and nods. Everyone else stares straight ahead, sometimes rapidly turning away after a very brief look.

I hope no one reports me to the police as this weird guy who is smiling at them while driving.

Merging on highways

I am sure everyone has experienced this on highways. You are driving along and see a sign that says your lane is closed ahead and to merge into the adjacent lane. What you will observe is that traffic in your lane will slow down and even stop long before the actual merge point, as drivers seek to blend into the other lane.

It seems to me that the most efficient thing to do is to drive right up to the point where your lane ends and then merge. If you start merging earlier, you are effectively lengthening the stretch of highway that operates with a reduced number of lanes, and thus slowing down your journey even more. But although no one has explicitly told me this, I get the feeling that to do this is impolite, as if I am jumping the queue. So although I feel that the sensible thing to do is to cruise right up to the end and then merge, I succumb to this pressure and merge earlier. Of course, merging early usually increases travel time by just a few minutes, so time is not primarily the issue. The issue is why it seems to be considered impolite to merge late.
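
For those who like to see the arithmetic, here is a minimal sketch of the main effect (the number of cars and the spacing per car are invented for illustration): merging late lets the waiting cars be stored in two lanes instead of one, which roughly halves how far the queue backs up, even though the rate at which cars get through the bottleneck is much the same either way.

```python
# Illustrative only: how far a queue backs up before a lane closure.
# The car count and spacing are invented numbers, chosen for the example.
def queue_length_m(cars_waiting, lanes_used, metres_per_car=8.0):
    """Length of road occupied by the queued cars."""
    return cars_waiting * metres_per_car / lanes_used

cars = 120
print("early merge (one lane):", queue_length_m(cars, lanes_used=1), "m")
print("late 'zipper' merge (both lanes):", queue_length_m(cars, lanes_used=2), "m")
# Both lanes store the same cars in half the distance; the merge-point throughput
# is set by the bottleneck either way, which is why the time saved is modest.
```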

Could we start spreading the word that it is actually more sensible for everyone to merge as late as possible?

May 05, 2006

Madman theory: Bush and god

Recently trial balloons have been floated by the administration that they are seeking to carry out an attack on Iran, even to the extent of using nuclear 'bunker buster' bombs. Seymour Hersh reports in The New Yorker that: "One of the military's initial option plans, as presented to the White House by the Pentagon this winter, calls for the use of a bunker-buster tactical nuclear weapon, such as the B61-11, against underground nuclear sites."

This revelation naturally prompts the question "Are they insane?" And that prompts the further question "Does the administration want people to think that Bush is insane as a means of achieving some goals?" Now it is true that the Pentagon develops contingency plans for all kinds of bizarre scenarios (even involving invading Canada), but Hersh's article seems to indicate that these contingency plans are operational, which implies a greater likelihood of their actually being implemented.

Faking insanity, or at least recklessness, to achieve certain ends has a long history, both in fact and fiction. Hamlet did it. President Nixon, frustrated by the indomitable attitude of the Vietnamese forces opposing the US, tried the same tactic, hoping that it would cause the North Vietnamese to negotiate terms more palatable to the US because of fears that he would do something stupid and extreme, such as use a nuclear weapon. (See here for a review of the use of 'madman theory' to achieve political ends.) Nixon also liked to talk about his religion, but in his case it was to refer to his own Quaker background, to exploit that religious group's reputation for strong ethical behavior at a time when his own ethics were under severe scrutiny.

Bush does have one advantage over Nixon in making his madman theory more plausible in that he has put the word out earlier that god had chosen him to be president. In 2003, a news report says that "Bush believes he was called by God to lead the nation at this time, says Commerce Secretary Don Evans, a close friend who talks with Bush every day." Bush's claims to close links with god have been reported periodically.

More recently, it was revealed that god is so chummy with Bush that he even calls him by his first name. (I mean that god calls Bush by his first name, of course, not the other way around. Bush has probably given god a nickname like he gives everyone else.) During these chats god tells him what to do. In a BBC program, Nabil Shaath who met with Bush as part of a delegation is quoted as saying:

President Bush said to all of us: 'I'm driven with a mission from God. God would tell me, "George, go and fight those terrorists in Afghanistan." And I did, and then God would tell me, "George, go and end the tyranny in Iraq. . ." And I did. And now, again, I feel God's words coming to me, "Go get the Palestinians their state and get the Israelis their security, and get peace in the Middle East." And by God I'm gonna do it.'

What are we to make of something that reads like Tuesdays with God? Those of us who are atheists would say that Bush is either lying about his tete-a-tetes with the almighty to pander to his extremist religious base or suffers from the same kind of delusions that cause some people to see the Virgin Mary in a slice of toast, neither of which is reassuring for those of us who seek a more down-to-earth basis for actions by political leaders, especially those who have the power to cause tremendous damage.

Of course, all of our actions are influenced by our beliefs and values, and for religious people their religious beliefs are bound to be influential in the principles that guide their decision making. That is not the question here. The question is whether even religious people are reassured when Bush says that he took some concrete action because god specifically directed him to do so.

Somehow, even if I were religious, I would still be uneasy about political leaders claiming to be acting under direct instructions from god, because we know that schizophrenics also sometimes think they hear such voices. People who claim to have their actions explicitly directed by god are usually considered to be at best delusional and at worst insane.

But I am curious as to what religious people think of Bush's claims to have this kind of hotline to god. Are they pleased? Or, despite their own religious beliefs, are they uneasy? It would be interesting to survey religious people with this question: "If Bush says god told him to attack Iran, would that be sufficient justification for you to support such an action?"

The basic question for religious people, even if they do not think Bush is lying, is how they judge whether the voices Bush claims to hear are really from the deity or due to some chemical imbalance in his brain.

May 04, 2006

Dover's dominoes-7: The Ohio domino falls

(This is the final installment of the series, which got pre-empted by more topical items. Sorry! See part 1, part 2, part 3, part 4, part 5, part 6.)

The domino effect of the Dover verdict was seen soon after in Ohio where on February 14, 2006 the Ohio State Board of Education reversed itself and threw out the benchmarks in the state's science standards that called for the critical analysis of evolution and the lessons plans that had been based on them. This happened even though the Ohio policy did not explicitly mention intelligent design. However, the move was clearly influenced by the ripples from the Dover trial and it is instructive to see why.

What Ohio had done in 2002 was to include a benchmark in its 9th grade biology standards in the section that dealt with biological evolution that said "Describe how scientists continue to investigate and critically analyze aspects of evolutionary theory." They added additional language that said (in parentheses) "The intent of this benchmark does not mandate the teaching or testing of intelligent design."

The pro-IDC OBE members also inserted people into the lesson plan writing team who drafted a lesson plan called Critical Analysis of Evolution that essentially recycled IDC ideas, again without explicitly mentioning intelligent design.

But on February 14, 2006, the Ohio Board of Education voted 11-4 to reverse itself and eliminate both the benchmark and its associated lesson plan. Why did they do so when, as some pro-IDC members of the Board pointed out, they should have had nothing to fear from the Dover decision, since they had carefully avoided requiring the teaching of IDC?

Again, Judge Jones' ruling indicates why. In his ruling, he said that what determines whether a law passes constitutional muster is how an informed observer would interpret the law. He said (Kitzmiller, p. 15):

The test consists of the reviewing court determining what message a challenged governmental policy or enactment conveys to a reasonable, objective observer who knows the policy's language, origins, and legislative history, as well as the history of the community and the broader social and historical context in which the policy arose.

In the case of challenges to evolutionary theory, he looked at precedent and especially (p. 48) at:

a factor that weighed heavily in the Supreme Court's decision to strike down the balanced-treatment law in Edwards, specifically that "[o]ut of many possible science subjects taught in the public schools, the legislature chose to affect the teaching of the one scientific theory that historically has been opposed by certain religious sects."

He went on (p. 57):

In singling out the one scientific theory that has historically been opposed by certain religious sects, the Board sent the message that it "believes there is some problem peculiar to evolution," and "[i]n light of the historical opposition to evolution by Christian fundamentalists and creationists[,] . . . the informed, reasonable observer would infer the School Board's problem with evolution to be that evolution does not acknowledge a creator."

Notice that the standard used for judging is what an 'informed, reasonable observer' would infer from the action. IDC advocates tend to implement their strategy by carefully choosing words and sentences so that it meets the letter of the law and thus hope it will pass constitutional scrutiny. But what Judge Jones says is that it is not merely how the law is worded but also how a particular kind of observer, who is assumed to be much more knowledgeable about the issues than your average person in the street, would interpret the intent of the law:

The test consists of the reviewing court determining what message a challenged governmental policy or enactment conveys to a reasonable, objective observer who knows the policy's language, origins, and legislative history, as well as the history of the community and the broader social and historical context in which the policy arose. (emphasis added)

And this is the most damaging part of the verdict to the ID case. Their strategy has always been to single out evolutionary theory in science for special scrutiny in order to undermine its credibility. They have never called for 'teaching the controversy' in all the other areas of science. Judge Jones said that since an 'informed, reasonable observer' would know that Christians have had long-standing objections to evolutionary theory on religious grounds, singling it out for special treatment is tantamount to endorsing a religious viewpoint.

In a further telling statement that has direct implications for the Discovery Institute's 'teach the controversy' strategy, he said (p. 89):

ID's backers have sought to avoid the scientific scrutiny which we have now determined that it cannot withstand by advocating that the controversy, but not ID itself, should be taught in science class. This tactic is at best disingenuous, and at worst a canard. The goal of the IDM [Intelligent Design Movement] is not to encourage critical thought, but to foment a revolution which would supplant evolutionary theory with ID.

There is no way to see the Dover ruling as anything but a devastating blow to the whole stealth strategy promoted by the Discovery Institute. After all, their strategy had precisely been to single out evolutionary theory for special treatment. They have resolutely opposed any attempt to call for 'critical analysis' and 'teaching the controversy' in all areas of science.

What will they do in response? It is hard to say. My guess is that they will put all their efforts into supporting the policy adopted by the Kansas school board, which was done according to their preferences, unlike the ham-handed efforts of the people of Dover, El Tejon, and Kirk Cameron's friend and the banana. (I had not known who Kirk Cameron was before this. I have been informed that he used to be a TV sitcom actor before he saw the light.)

The next domino is the science standards adopted by Kansas's Board of Education. I have not looked too closely at what the school board decided there, so will defer commenting on it until I do so. But it is likely to end up in the courts too.

POST SCRIPT: More on The Israel Lobby article

A few days ago, I wrote about the stir created by the Mearsheimer and Walt article on The Israel Lobby and the petition started by Juan Cole to defend them against charges of anti-Semitism.

In the May 15, 2006 issue of The Nation, Philip Weiss has a good analysis titled Ferment Over 'The Israel Lobby' on the personalities of the authors and the other people involved, what went on behind the scenes of the article prior to and after its publication, and why it had the effect it did.

May 03, 2006

Stephen Colbert crashes the party

Some of you may have heard of Stephen Colbert's speech at the annual White House Correspondents Association Dinner on Saturday, April 29, 2006. This is the annual occasion where the President and other members of his administration and the journalists who cover them plus assorted celebrities get together for an evening of schmoozing, eating, and drinking.

(See here for a report on the dinner. You can see Colbert's full speech here or here. Or, if you prefer, you can read the transcript.)

This occasion serves to reinforce a peculiar myth (perpetuated by the media) about the way that journalism works in Washington. We (i.e., the outsiders) are expected to believe that there is an antagonistic, even hostile, relationship between the administration and journalists, and that this dinner is the one occasion in the year when they can laugh at themselves and each other, to show that they are all good sports and there are no hard feelings.

The reality is that there is an extremely cozy and almost incestuous relationship among four groups of people in Washington: the administration, congress, lobbyists, and the journalists who cover them and who are supposedly acting on our behalf. Peel back the covers and you find that there is a dense tangle of personal, professional, and financial relationships that bind them all together. They go to one another's parties, vacation in the same places, live in the same neighborhoods, move among the same group of friends, marry and have romances with each other. They all are upper-middle class or wealthy people who share the same concerns and values and class interests. They have little connection or identification with the lives of the 99% of people who are outside that circle.

Understanding this explains a lot about the way that the media performs. Even a fairly casual observer can see that in so-called press conferences or news shows, the journalists rarely ask the kinds of questions of powerful people that might elicit useful information. They also have the same people over and over again on these programs. In general the journalists are extremely deferential to those in power. Partly this is because modern-day journalists tend to focus more on access journalism (where you use prominent people as sources) rather than the slower, more expensive, and time-consuming investigative journalism (where you dig up records, closely examine documents, follow paper and money trails, examine history, etc.).

Access journalism requires you to keep on the good side of those from whom you seek information or exclusive interviews, and thus you are completely at their mercy. If Tim Russert of NBC's Meet the Press were to really grill Bush or Cheney on his show about all their lies, his access to administration officials and their sympathizers would disappear overnight.

This is why journalists love fairly trivial disagreements between administration members, like the current disagreement between Colin Powell and Condoleezza Rice about whether Powell argued for more troops in Iraq. This enables the journalists to play the role of tough reporter while essentially engaging in trivial 'gotcha' games, and thus mask the underlying coziness that prevails. (See my two earlier posts on The Questions Not Asked here and here.)

The bonhomie so apparent at the annual White House correspondents dinner is not a break from an otherwise hostile relationship. It reveals the normal state of affairs. The only thing that is different about it is that we, the outsiders, for once get to see what actually goes on all the time.

It took a comedian, Stephen Colbert (host of the Comedy Central show The Colbert Report), to shatter this facade. For those not aware of his show, his character is a parody of Bill O'Reilly: loud, pompous, overbearing, patronizing, and grandstanding. Colbert was completely in character at the dinner and gave a biting, satirical, and funny speech that hit all the points journalists avoid because raising them might ruffle the feathers of the President and his people and cost them future interviews. Colbert provided a non-stop litany of backhanded compliments to Bush and backhanded praise to the assembled media. In one brilliant stroke he took the trademark sycophancy that the media displays towards the powerful and, by carrying it to the extreme, managed to simultaneously skewer both the media and the president. This was extremely skilful satire, not the kind that brings loud guffaws but the kind that evokes an internal 'yes!' because someone is at last saying what many of us want to say but never get the opportunity to.

He effectively said to the president, seated just two places away, what the administration should hear (but never does) from journalists. And he charged the journalists in the audience with ignoring the big questions: "Over the last five years you people were so good -- over tax cuts, WMD intelligence, the effect of global warming. We Americans didn't want to know, and you had the courtesy not to try to find out. Those were good times, as far as we knew." And he urged them to "Write that novel you got kicking around in your head. You know, the one about the intrepid Washington reporter with the courage to stand up to the administration. You know - fiction!"

The nervous laughter mixed with stunned silence of the assembled journalists, and the strained expressions of George and Laura Bush, gave a good indication of how much his jokes touched a nerve. These pampered people never hear such things said to their faces. Those who disliked Colbert's message took the lack of uproarious laughter as a sign that his act had "bombed." They are missing the point. The people at the dinner were not his intended audience. They were his target. We, the people who live outside that privileged bubble, were the audience.

As blogger Billmon points out:

Colbert used satire the way it's used in more openly authoritarian societies: as a political weapon, a device for raising issues that can't be addressed directly. He dragged out all the unmentionables -- the Iraq lies, the secret prisons, the illegal spying, the neutered stupidity of the lapdog press -- and made it pretty clear that he wasn't really laughing at them, much less with them. It may have been comedy, but it also sounded like a bill of indictment, and everybody understood the charges.

As you can imagine, the press would not take kindly to having their inadequacies openly derided. (See here for a round-up of the media reaction.) Clearly Colbert does not care if he is never invited to this kind of event again, and it is very likely that he will not be. But his career does not depend on currying favor with politicians in order to get a crumb or two from them in return. So oddly enough, it took a comedian to act like a real journalist. It is for this same reason that Jon Stewart's The Daily Show is so successful. Neither Colbert nor Stewart really needs to be pals with the pols in order to do their jobs. (See Stewart's and Colbert's post-mortem of the evening here.)

James Wolcott sums things up in his own review of the Colbert speech:

Colbert was cool, methodical, and mercilessly ironic, not getting rattled when the audience quieted with discomfort (and resorting to self-deprecating "savers," as most comedians do), but closing in on the kill, as unsparing of the press as he was of the president. . .The we-are-not-amused smile Laura Bush gave him when he left the podium was a priceless tribute to the displeasure he incurred. To me, Colbert looked very relaxed after the Bushes left the room and he greeted audience members, signed autographs. And why wouldn't he be? He achieved exactly what he wanted to achieve, delivered the message he intended to deliver. Mission accomplished.

I have always felt that there should be no social relationship between journalists and the people they cover. The proper role of journalists is to keep their distance from politicians, lobbyists, and other powerful people. I am in total agreement with that great journalist I. F. Stone, who wrote:

It's just wonderful to be a pariah. I really owe my success to being a pariah. It is so good not to be invited to respectable dinner parties. People used to say to me, 'Izzy, why don't you go down and see the Secretary of State and put him straight.' Well, you know, you're not supposed to see the Secretary of State. He won't pay any attention to you anyway. He'll hold your hand, he'll commit you morally for listening. To be a pariah is to be left alone to see things your own way, as truthfully as you can. Not because you're brighter than anybody else is -- or your own truth so valuable. But because, like a painter or a writer or an artist, all you have to contribute is the purification of your own vision, and add that to the sum total of other visions. To be regarded as nonrespectable, to be a pariah, to be an outsider, this is really the way to do it. To sit in your tub and not want anything. As soon as you want something, they've got you!

Colbert, like Stone, will be treated as a pariah, both by the administration and by the beltway journalists. He should regard that as a high honor.

POST SCRIPT: The case for seat belts

In case you are ever tempted to drive without putting on a seat belt, take a look at this video that shows what happened to someone who was not wearing one, fell asleep at the wheel, and was involved in an accident. (For the squeamish: It is startling but NOT gruesome.)

May 02, 2006

About SAGES -3: The difficult task of changing education

It is a natural human trait to confuse 'is' with 'ought,' to think that what currently exists is also how things should be, especially with long-standing practices. The same is true with teaching methods. Once a way of teaching reaches a venerable stage, it is hard to conceive that things could be any different.

This post will consist largely of excerpts from an excellent article titled Making the Case by David A. Garvin, from the September-October 2003 issue of Harvard Magazine (Volume 106, Number 1, p. 56), showing how hard it is to change the way we teach. It takes as its example the way legal education changed into what it is today, the approach now called the case method. Although this has become the 'standard' way law schools operate, initial efforts to introduce the method faced enormous resistance from students, faculty, and alumni. This is because all of us tend to be most comfortable doing what we have always done and fear that change will be for the worse.

The article suggests that to succeed, the changes must be based on a deep understanding of education and require support and commitment over the long haul.

Christopher Columbus Langdell, the pioneer of the case method, attended Harvard Law School from 1851 to 1854 - twice the usual term of study. He spent his extra time as a research assistant and librarian, holed up in the school's library reading legal decisions and developing an encyclopedic knowledge of court cases. Langdell's career as a trial lawyer was undistinguished; his primary skill was researching and writing briefs. In 1870, Harvard president Charles William Eliot appointed Langdell, who had impressed him during a chance meeting when they were both students, as professor and then dean of the law school. Langdell immediately set about developing the case method.

At the time, law was taught by the Dwight Method, a combination of lecture, recitation, and drill named after a professor at Columbia. Students prepared for class by reading "treatises," dense textbooks that interpreted the law and summarized the best thinking in the field. They were then tested - orally and in front of their peers - on their level of memorization and recall. Much of the real learning came later, during apprenticeships and on-the-job instruction.

Langdell's approach was completely different. In his course on contracts, he insisted that students read only original sources - cases - and draw their own conclusions. To assist them, he assembled a set of cases and published them, with only a brief two-page introduction.

Langdell's approach was much influenced by the then-prevailing inductive empiricism. He believed that lawyers, like scientists, worked with a deep understanding of a few core theories or principles; that understanding, in turn, was best developed via induction from a review of those appellate court decisions in which the principles first took tangible form. State laws might vary, but as long as lawyers understood the principles on which they were based, they should be able to practice anywhere. In Langdell's words: "To have a mastery of these [principles or doctrines] as to be able to apply them with consistent facility and certainty to the ever-tangled skein of human affairs, is what constitutes a true lawyer…."

This view of the law shifted the locus of learning from law offices to the library. Craft skills and hands-on experience were far less important than a mastery of principles - the basis for deep, theoretical understanding
. . .
This view of the law also required a new approach to pedagogy. Inducing general principles from a small selection of cases was a challenging task, and students were unlikely to succeed without help. To guide them, Langdell developed through trial and error what is now called the Socratic method: an interrogatory style in which instructors question students closely about the facts of the case, the points at issue, judicial reasoning, underlying doctrines and principles, and comparisons with other cases. Students prepare for class knowing that they will have to do more than simply parrot back material they have memorized from lectures or textbooks; they will have to present their own interpretations and analysis, and face detailed follow-up questions from the instructor.

Langdell's innovations initially met with enormous resistance. Many students were outraged. During the first three years of his administration, as word spread of Harvard's new approach to legal education, enrollment at the school dropped from 165 to 117 students, leading Boston University to start a law school of its own. Alumni were in open revolt.

With Eliot's backing, Langdell endured, remaining dean until 1895. By that time, the case method was firmly established at Harvard and six other law schools. Only in the late 1890s and early 1900s, as Chicago, Columbia, Yale, and other elite law schools warmed to the case method - and as Louis Brandeis and other successful Langdell students began to speak glowingly of their law-school experiences - did it diffuse more widely. By 1920, the case method had become the dominant form of legal education. It remains so today.

What we see being tried in SAGES has interesting parallels with what Langdell was trying to achieve. The idea is for students, rather than receiving the distilled wisdom of experts and teachers and being told directly what they should know, to study something in depth and inductively argue their way to an understanding of basic but general principles. The Socratic format of the instructor interrogating students is not used in SAGES; it is replaced by the somewhat gentler method of peer discussions mediated by the instructor.

Taking a long view of past educational changes enables us to keep a sense of proportion. The way we teach now may seem natural, even the only possible way, but when we look back we usually find that it was deliberately introduced, often over considerable opposition, because of some development in our understanding of the nature of learning. As time goes by and that understanding changes and deepens, it is natural to re-examine the way we teach as well.

I believe that SAGES takes advantage of what we now understand to be important new insights into learning. But we need to give it a chance to take root. Abandoning it at the first sign of trouble would be absurd, because all innovations invariably run into difficulty at the beginning as everyone struggles to adapt to the new ways.

POST SCRIPT: 'Mission Accomplished' by the numbers

As usual, I am tardy in my recognition of anniversaries. But here is a sobering reminder of what has transpired since the infamous photo-op three years ago yesterday on the aircraft carrier.

May 01, 2006

About SAGES -2: Implementation issues

When I talk about the SAGES program (see here for a description of the program and how it came about) to faculty at other universities they are impressed that Case put into place something so ambitious. They immediately see how the program addresses the very same problems that all universities face but few attempt to tackle as comprehensively as we have sought to do.

Of course, the very ambitiousness of the program meant that there would be challenges in implementation. Some of the challenges are institutional and resource-related. Creating more small classes meant requiring more faculty to teach them, more classrooms (especially those suitable for seminar-style interactions), more writing support, and so on. This necessarily imposed some strain on the system.

But more difficult than these resource issues was that the SAGES program was taking both students and faculty away from their comfort zones. Faculty and students tend to know how to deal with the more traditional knowledge-giver/knowledge-receiver model of teaching. They have each played these roles for a long time and can slip comfortably into them. Now both were being asked to shift into different modes of behavior in class.

Students were being asked to play a more active role in creating knowledge, in participating in class, and in taking more responsibility for their own learning outside of class. Faculty were being asked to play a facilitator role: talking far less than they were used to, learning how to support students as they struggled to learn how to learn on their own, and generating and sustaining focused discussions.

It should have come as no surprise to anyone that, when the program was made fully operational, there would be many problems as both faculty and students adjusted to these changes. What surprises me is that both faculty and students seem to have had unrealistic expectations of how smooth the transition would be, and when there were breakdowns, as was inevitable, tended to take these as signs of the program's inadequacy rather than as the necessary growing pains of any change or bold innovation.

In my own teaching over these many years, I have tried all kinds of teaching innovations. The one feature common to all of them is that they rarely worked well (if at all!) the first time I tried them. Even though I read the literature on the methods and planned the changes carefully, I always made mistakes in implementation, because the unexpected always occurs and, until one has some experience with a new method of teaching, one does not always respond well to surprises. So even though I was an enthusiastic supporter of the seminar teaching mode, it still took me some time to work out the major kinks and become comfortable with it. But even now, I keep thinking of ways to improve it the next time I teach it.

I believe that teaching is a craft, like woodworking. One can and should learn the theory behind it but one only becomes good by doing it, and one has to anticipate that the first attempts are not going to be smooth. But during the period of transition from the old to the new, people tend to compare an old system that has been refined over many years with a new system that has not had its rough edges smoothed out. It was to be expected that faculty teaching in a new way would encounter situations that they had not anticipated and flounder a bit, even if they attended the preparatory sessions that were held for all faculty on how to teach in a seminar format.

I have noticed an odd feature whenever teachers are asked to try a new method of teaching. If the new method does not work perfectly right out of the box, it is jettisoned as a failure. But that is not the methodology that should be used. The actual comparison that should be made is not to some standard of perfection but whether the new method works better than whatever we are currently doing.

For example, I remember when I first introduced cooperative groups in my large (about 200 students) lecture classes and had students work in groups on problems in class. Some colleagues asked me whether it wasn't possible that some students were discussing other things during that time, instead of the physics problem I had assigned. The answer is that of course some do, and I knew that. But they could just as easily do those other things if I lectured non-stop. In fact it would be easier, since when lecturing I would be busier at the blackboard and thus even more oblivious to what was going on in the auditorium. I felt that the active learning methods I introduced increased the amount and quality of student engagement compared with a pure lecture, and that was the relevant yardstick to measure by, not whether I had perfect results. I would never now go back to teaching such large classes without groups.

The SAGES implementation problems, from the faculty point of view, arise from their being uncomfortable with not having complete control of the flow of information and discussion, uneasy about not constantly imparting authoritative knowledge, worried that students will learn incorrect things from their peers, concerned about time 'wasted' in discussions, uncomfortable with silences, and reluctant to trust students to be responsible for their own learning.

As a result of these concerns, faculty can succumb to the temptation to fall back into lecture mode, and students take their cue and relapse into listener mode. This leaves both dissatisfied. Faculty (especially those in research universities, who tend to be highly specialized) also sometimes worry that students in seminars will talk and write about topics in which the faculty are not experts, and that they will thus not have the 'answers' at their fingertips.

Another major source of concern, especially for faculty in the sciences and engineering, was their feeling that they were not really competent to judge writing and give good feedback to their students on how to improve since they had had little prior experience with essay assignments.

Faculty also do not realize that it takes quite a bit of planning and organization on their part in order to create a free-flowing, substantive, and seemingly spontaneous discussion. Running good discussion seminars actually takes more preparatory work than giving a lecture. It involves a lot more than strolling into class and saying "So, what did you think about the readings?"

The problems that students had with SAGES again stem from discomfort with an unfamiliar mode of teaching. In seminar-discussion classes, much of the learning has to occur outside the classroom, in the form of reading and writing by the student. The classroom discussions stimulate interest and provide focus, but students have to do a lot on their own. Some first-year students may not be able to handle this responsibility yet. After all, many of them have been very successful by simply going to class, listening to what the teacher said, and doing the assignments. It is natural for such students to prefer to be told what to do and how to do it, and this new responsibility thrust upon them may make them uneasy. For some who are shy, speaking in class is difficult, if not an ordeal, and making a formal presentation may be quite terrifying. Reassuring such students and making them comfortable with these different types of classroom behavior is also a role that faculty may not know how to play.

In the long run, I think both faculty and students will grow from this experience. Personally, I have found teaching the SAGES seminars to be a profoundly rewarding and transformative experience. I have got to know all the students in my classes much better, and that has been delightful. I have learned a lot from the research topics they have selected (for their essays and presentations) in areas that are unfamiliar to me. And I have learned a lot about what makes for good writing and how to provide the kinds of feedback and structure that help students learn to write better.

Of course, there is still a lot more that I need to learn in order to run seminar classes better. But that is part of the fun of teaching: you are always learning along with your students. As I said, teaching is a craft, and it is characteristic of craftsmen, like (say) a violin maker, that one just gets better with time, learning from one's mistakes and acquiring new skills and techniques.

In time, I am confident that faculty and students in SAGES will shed their nervousness about it and embrace the seminar method of teaching as well. But it will require patience and perseverance.

In the next posting, I will look back in history to see how legal education was transformed in the US. The transition to the present system was extremely rough, even though the current mode is now seen as 'natural' and even inevitable.

POST SCRIPT: Mearsheimer and Walt Petition

As some of you may be aware, Professors John Mearsheimer (University of Chicago) and Stephen Walt (Harvard University) have written an article entitled The Israel Lobby and American Foreign Policy where they argue that this lobby has had too great an influence on American foreign policy. As a result of this, they have been subject to attacks and charges of anti-Semitism. You can see their article in the London Review of Books here and their longer and more detailed working paper on the same subject here, and read about the controversy generated by it here.

Professor Juan Cole (University of Michigan) has organized a petition to defend Mearsheimer and Walt from what he calls "baseless charges of anti-Semitism." Cole says "I feel it is time for teachers in higher education to stand up and be counted on this issue of the chilling of academic inquiry through character assassination. At a time when the use of congressional funding to universities to limit and shape curricula and research is openly advocated, all of us academics are on the line. And if scholars so eminent as Mearsheimer and Walt can be cavalierly smeared, then what would happen to others?"

You can read Cole's discussion of why he created the petition, its contents, and the signatories so far by going here. Cole is requesting that signatories be from those affiliated with universities, because of the way the petition is worded.