Entries in "Science"

January 08, 2012

Guess I won't be invited to write for The Huffington Post

They have started a new science section and Arianna Huffington says this of her hopes for it:

I'm particularly looking forward to HuffPost Science's coverage of one of my longtime passions: the intersection of science and religion, two fields often seen as contradictory -- or at least presented that way by those waging The War on Science. A key part of HuffPost Science's mission will be to cut through the divisions that have resulted from that false war.

Rather than taking up arms in those misguided, outdated battles, HuffPost Science will work in the tradition of inquisitive minds that can accommodate both logic and mystery. It's a tradition exemplified by Brown University biology professor Kenneth Miller, who, when I visited with him last year, told me that he sees Darwin not as an obstacle to faith but as "the key to understanding our relationship with God."

Ah, yes, the old "accommodate both logic and mystery" ploy, as Inspector Clouseau would say. Expect to see full-bore accommodationism that tells you that magical thinking is perfectly compatible with science, as long as you throw in sexy sciency words such as 'quantum' and 'indeterminacy' to mask the woo that lurks beneath. I don't know why they don't call it the 'Deepak Chopra section' and be done with it.

January 06, 2012

My brain is already falling apart

A new study says that people start losing their brain powers as early as 45 years of age.

The results of the tests show that cognitive scores declined in all categories except vocabulary - and there was a faster decline in older people.

The study found a 9.6% decline in mental reasoning in men aged 65-70 and a 7.4% decline for women of the same age.

For men and women aged 45-49, there was a 3.6% decline.

Since my work involves mainly words, the lack of decline in vocabulary skills may be masking my decrepitude.

The study can be read here.

January 04, 2012

The wonder of science

One of the common criticisms that one hears against us science-based atheists is that our search for naturalistic explanations of hitherto mysterious phenomena, coupled with a relentless assault on irrational and unscientific thinking, results in all the wonder being drained from life. We are told, for example, that to explain that the rainbow is the product of the refraction and internal reflection of sunlight by water droplets in the air is to somehow detract from its beauty, or that when gazing at the thousands of twinkling stars on a beautifully clear cloudless night, to be aware that their light is the product of nuclear fusion reactions deep in their cores is to reduce their grandeur.

I must say that I don't understand the criticism. For me at least, understanding how these things come about actually enhances my sense of wonder about the universe. The more I learn about how the universe works and how the impersonal forces of nature created everything around us, the more I am impressed.

To illustrate my point, I am now going to show you something that I think is incredibly beautiful. It is the equation:

T = 2 tanh⁻¹(√ΩΛ) / (3 H0 √ΩΛ)

So what is so great about this equation? It is the equation that tells us the age of the universe. Note that the age T depends on just two quantities, H0 and ΩΛ, both of which are measured. H0 is the value of the Hubble constant at the present time and is given by the slope of the straight line obtained when one plots the speed of distant galaxies (on the y-axis) versus the distance to those galaxies (on the x-axis). ΩΛ is the ratio of the density of dark energy in the universe to the total energy density of the universe.
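
To make the 'slope of a straight line' point concrete, here is a minimal sketch in Python with invented galaxy data (real surveys use thousands of galaxies and carefully calibrated distances); it is only meant to illustrate how H0 falls out of a velocity-versus-distance fit.

```python
import numpy as np

# Hypothetical galaxy data, invented for illustration only:
# distances in megaparsecs, recession speeds in km/s.
distances = np.array([50.0, 120.0, 300.0, 550.0, 800.0])        # Mpc
speeds = np.array([3600.0, 8300.0, 21500.0, 38200.0, 56800.0])  # km/s

# Hubble's law says speed = H0 * distance, so H0 is the slope of a
# straight-line least-squares fit through the origin.
H0 = np.sum(distances * speeds) / np.sum(distances ** 2)
print(f"Estimated H0 ~ {H0:.1f} km/s/Mpc")  # ~70 for these made-up numbers
```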

As with all scientific results, there are some basic theoretical assumptions that go into obtaining them. This particular one requires that the universe be governed by Einstein's equations of general relativity and that its current state is 'matter dominated' (i.e., the energy contribution of pure radiation is negligible) and 'flat' (i.e., the total density of the universe is at its critical value so that space is neither positively nor negatively curved). These 'assumptions' are supported by other measurements, so they are not arbitrary.

The values of H0 and ΩΛ are obtained using satellite probes that collect a vast body of data from stars and galaxies, and scientists then do a best fit to those data for multiple parameters, of which these are just two. The current values were obtained in 2009 by the WMAP (Wilkinson Microwave Anisotropy Probe) satellite launched in 2001, and are given by H0=70.5 km/s/Mpc and ΩΛ=0.726. Insert these values into the above equation (with the appropriate units) and you get that the age of the universe is 13.7 billion years.

Why do I think this equation is a thing of extraordinary beauty? Just think about the implications of that equation for a moment. We humans have been around for just an infinitesimally small period of time in history and occupy an infinitesimally small part of the universe. And yet we have been able, using pure ingenuity and by steadily building upon the scientific achievements of our predecessors, to not only figure out the large-scale structure of the vast universe we happen to occupy but to determine, in a simple equation, its actual age! That is truly incredible. If that does not strike you with wonder, then I don't know what will.

Furthermore, note how simple the equation is. The tanh⁻¹ function (which represents the inverse of the hyperbolic tangent) may be intimidating for some but it is such a standard mathematical function that it can be found on any scientific hand calculator. If a news report states that new satellite data have given revised best fit values for H0 and ΩΛ, anyone can calculate the revised age of the universe themselves in a few minutes.
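
In that spirit, here is a minimal Python sketch of the calculation using the WMAP values quoted above; the only fiddly step is converting H0 from km/s/Mpc into inverse seconds.

```python
import math

# WMAP best-fit values quoted above
H0 = 70.5             # Hubble constant, km/s/Mpc
omega_lambda = 0.726  # dark energy density / critical density

# Convert H0 to inverse seconds (1 Mpc = 3.0857e19 km)
H0_per_sec = H0 / 3.0857e19

# Age of a flat, matter-plus-dark-energy universe:
#   T = 2 tanh^-1(sqrt(omega_lambda)) / (3 H0 sqrt(omega_lambda))
root = math.sqrt(omega_lambda)
T_seconds = 2.0 * math.atanh(root) / (3.0 * H0_per_sec * root)

# Convert to billions of years (1 year ~ 3.156e7 seconds)
T_gyr = T_seconds / 3.156e7 / 1e9
print(f"Age of the universe: {T_gyr:.1f} billion years")  # prints ~13.7
```

Swapping in newer best-fit values for H0 and ΩΛ gives the revised age in exactly the same way.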

But as this xkcd cartoon captures accurately, it is not that scientists lose their sense of wonder but that they find wonder in learning about the universe, and do not need to invoke mystery to sense it.

[xkcd cartoon: Beauty]

December 31, 2011

Antonio Damasio on the quest to understand consciousness

December 26, 2011

New particle state discovered at CERN

While a lot of the science media attention has focused on the search for the Higgs boson, we should not forget that that is not the sole purpose of the Large Hadron Collider at CERN. Its high energies allow it to do more conventional work and there is now a report of the discovery of an excited state of a bound bottom quark-antiquark pair, a consequence of the standard model of particle physics. The preprint of the paper can be read here.

December 07, 2011

The factors that drive obedience and conformity

There was an old TV program called Candid Camera that used hidden cameras to capture what people did when confronted with awkward or unexpected situations. While the aim of the program was humorous, usually at the expense of the hapless person who happened to be caught on camera, some of the episodes serve as useful experiments on human behavior.

One particularly revealing one involved the desire of people to conform to powerful norms of behavior that we all follow without even thinking about it. For example, when people get into an elevator, they space themselves as far as possible from others, immediately turn around and face the front, and do not make eye contact or speak, apart from sometimes a quick nod of greeting upon entering. But in this episode, the camera noted what happens when the norms seem to suddenly change.

Although the above experiment is amusing, psychologist Philip Zimbardo, the person behind the famous Stanford Prison Experiment (SPE), reflects on it and the Milgram obedience study, and says that the Candid Camera elevator experiment reveals how the strong desire to conform to the norms of the people around us can lead to behaviors that are evil, something he calls 'the Lucifer effect'. (Zimbardo has written a book titled The Lucifer Effect: Understanding How Good People Turn Evil that I have bought and plan to read and write about soon.)

Zimbardo points out an interesting feature of the Milgram obedience and SPE studies concerning the role that religion plays in the willingness to obey authority and inflict pain on others even when one's own moral instincts are repulsed by the idea.

The large, diverse cast of ordinary characters in the obedience studies and the normal, healthy, intelligent cast in the prison study also serve to make vivid the tragic conclusion that we all hate to acknowledge: The goodness of Everyman and of Everywoman can be transformed and overwhelmed by an accumulation of small forces of evil. The character transformation seen in many of the participants in both studies represents "The Lucifer Effect" in action. Both studies teach us lessons about authority; the obedience research teaches us to question authority when it is excessive and unjust, while the SPE teaches us the dangers of too little responsible authority when it is needed to perform oversight of the behavior of individuals within its agency.

Religious upbringing also comes to play in a complex way, leading both to unquestioning obedience to doctrinal beliefs as well as a profound caring for one's fellows. The first values should lead to greater obedience to authority in the Milgram paradigm, while the second should lead to less obedience to such authority. Support for the first prediction comes from a Milgram-like study that compared participants with various measured levels of religious orientation in the extent to which they obeyed one of three authority figures: neutral, scientific, or religious. The results reveal that the shock scores elicited in this experiment were highest for the most religious participants, less for those moderately religious, and lowest for the least religious. Among those highly and moderately religious, the scientific and religious authorities were more effective than the neutral authority in eliciting the most obedience. Those who scored lowest on the religious measures, that centered around beliefs that one's life is under divine control, tended to reject any authority, be it religious or scientific. [My emphasis]

There is no question that scientific figures carry authority which is why scientific malpractice or fraud is taken so seriously. It is perhaps not hard to see why being religious or having a religious authority figure makes people more likely to be persuaded to go along with cruel acts. Religious people have usually been indoctrinated from childhood to believe that god is the ultimate authority figure and that unquestioning obedience to god's commands constitutes a virtue that will be rewarded. Their religious texts also have countless examples of the most appalling atrocities that their god has done or commanded people to do and which are supposed to serve a greater good. The appalling doctrine known as 'divine command theory' justifies such actions by saying that whatever god commands has to be good, even if it goes against every norm of humane behavior. Such beliefs can be a powerful force that can overcome the scruples that come with normal feelings of empathy towards other living things.

As a side note, a few months ago, I wrote about people who get lost in Death Valley and have even died because they followed the instructions of their GPS system even when it erroneously instructed them to take roads that barely existed. I wonder if that is another symptom of this phenomenon. After all, an assured and confident disembodied voice telling them what to do is somewhat like what they imagine some god-like authority figure would do, and they follow blindly.

December 05, 2011

Sleep

I like to sleep. I need a minimum of eight hours a night. But it is not just the good feeling that comes with resting that I find attractive. I really enjoy sleeping, the sensation of drowsing off, and usually have no difficulty doing so anywhere at any time, even on cramped airline seats on long flights. On weekends, I take a long nap after lunch and sometimes take a short nap seated up at my desk during the weekday.

I used to worry that this was a sign that I was lazy but learned later that most people don't get enough sleep and that this can really be harmful.

Here is a 60 Minutes report on the importance of getting enough sleep every day.

Now comes a new study that suggests that the variations in sleep needs can be traced to the influence of a specific gene.

I learned from the news report that Einstein needed 11 hours of sleep per night, which makes me a real slacker in the sleep department.

November 30, 2011

Inequality makes us less happy

Via reader Norm, I learned about a new study using brain scans that suggests that people aren't nearly as self-interested as some might think and that inequality makes people unhappy. "The scientists speculate that people have a natural dislike of inequality. In fact, our desire for equal outcomes is often more powerful (at least in the brain) than our desire for a little extra cash. It's not that money doesn't make us feel good — it's that sharing the wealth can make us feel even better."

November 29, 2011

Curiosity landing

The Mars explorer named Curiosity was launched successfully on Saturday and is expected to land on the planet on August 6, 2012. Because Curiosity is a much larger object than previous explorers, engineers needed to develop a new way of giving it a soft landing and this new technique is causing some anxiety to mission scientists about whether the rover can survive the landing. Some of them refer to the final stages of the landing as 'six minutes of terror'.

You can see an animation (made back in 2005) of what the landing should look like.

Here is a test run of the final stage done in the laboratory.

November 17, 2011

Where does our morality come from?

For reasons that are not clear to me, some religious people seem to think that the moral sense that we possess is evidence for god. In fact, some of them (such as Francis Collins in his book The Language of God) go so far as to claim that this is a really powerful argument for god. They point to the fact that there are quite a few moral impulses that seem to be universal and claim that this must mean that they were implanted in us by god.

This is a specious argument. In my series of posts on the biological basis for justice and altruism (part 1, part 2, part 3, and part 4), I discussed how our ideas of justice and our altruistic impulses can be traced to biological origins. What science is making abundantly clear is that the foundations of our moral sense are also evolutionary in origin and that culture builds on those basic biological impulses to create moral systems of increasing generality.

Paul Bloom has studied this question by looking at what we can learn about the moral thinking of babies, and in his article The Moral Life of Babies (New York Times, May 5, 2010) he writes:

The notion at the core of any mature morality is that of impartiality. If you are asked to justify your actions, and you say, "Because I wanted to," this is just an expression of selfish desire. But explanations like "It was my turn" or "It's my fair share" are potentially moral, because they imply that anyone else in the same situation could have done the same. This is the sort of argument that could be convincing to a neutral observer and is at the foundation of standards of justice and law. The philosopher Peter Singer has pointed out that this notion of impartiality can be found in religious and philosophical systems of morality, from the golden rule in Christianity to the teachings of Confucius to the political philosopher John Rawls's landmark theory of justice. This is an insight that emerges within communities of intelligent, deliberating and negotiating beings, and it can override our parochial impulses.

The aspect of morality that we truly marvel at — its generality and universality — is the product of culture, not of biology. There is no need to posit divine intervention. A fully developed morality is the product of cultural development, of the accumulation of rational insight and hard-earned innovations. The morality we start off with is primitive, not merely in the obvious sense that it's incomplete, but in the deeper sense that when individuals and societies aspire toward an enlightened morality — one in which all beings capable of reason and suffering are on an equal footing, where all people are equal — they are fighting with what children have from the get-go.

Babies possess certain moral foundations — the capacity and willingness to judge the actions of others, some sense of justice, gut responses to altruism and nastiness. Regardless of how smart we are, if we didn't start with this basic apparatus, we would be nothing more than amoral agents, ruthlessly driven to pursue our self-interest. But our capacities as babies are sharply limited. It is the insights of rational individuals that make a truly universal and unselfish morality something that our species can aspire to.

There is a nice video of the experiments that Bloom has done with babies.

This is why science and religion are at loggerheads. As science advances, religion simply has less room to exist. This is true in all areas of knowledge and, in particular, in the area of morality. We now realize that evolution has given us two great gifts: basic moral instincts and the capacity to reason. The latter has enabled us to build on the former and create the complex moral systems that currently exist. God is entirely superfluous.

November 16, 2011

General relativity versus modified Newton theories of gravity

For the large-scale structure of the universe, the dominant paradigm is that its dynamics are governed by the theory of general relativity, augmented by the postulated existence of dark matter and dark energy. The classical Newtonian theory of gravity is not believed to hold because it cannot explain many features of galaxies.

But in science, one can always come up with alternative theories to the dominant paradigm to explain any phenomenon, and there have been efforts to develop what are known as MOND theories (standing for MOdified Newtonian Dynamics), which would dispense with general relativity and revert to Newtonian gravity with slight modifications in order to explain the properties of the universe. Via blog reader Hunter, I came across this article that says that researchers have tested one form of the MOND hypothesis and found that it cannot explain the measured gravitational redshift of galaxy clusters, while general relativity and dark matter can.

This does not definitively rule out MOND theories since any theory can always be tweaked to accommodate any experimental result. But such negative results do make them less plausible to scientists.

November 02, 2011

Richard Feynman on science

He makes a good analogy for how scientists go about their work.

October 30, 2011

What use is half a wing?

Creationists like to challenge the theory of evolution by asking how features can evolve incrementally when, in their early stages, they seem to lack their final functionality. They pose questions like "What is the use of half an eye or half a wing?" Of course, scientists have long explained this. They have shown how the eye could have evolved by tiny changes and, in fact, almost the full spectrum of intermediate stages of eye development can be seen in existing species right now.

They have also pointed out that it is a mistake to assume that the final functionality of a feature was its only functionality all along, and that features may have had other functions in their early stages and only later became adapted to their final use.

Carl Zimmer had a nice article earlier this year in National Geographic about the evidence that feathers might have evolved for a different purpose long before flight occurred. More recently, he reports on new research results that add to our knowledge of what purpose those non-flying feathers in primitive wing forms might have served.

October 25, 2011

Climate change skeptic changes mind

Global-warming deniers eagerly embrace anyone who supports their cause, however much of a crank that person may be. So any respectable scientist who expresses skepticism about global warming, or who criticizes the work of those scientists who have warned us about it, makes them delirious with joy.

They were particularly pleased when Richard Muller did so because he is a physicist at the University of California-Berkeley and thus comes with good credentials. Based on preliminary work he had done, Muller had said that he thought the previous studies that said global warming was happening were wrong. Republicans invited him to testify to Congress and in 2010 many right wing groups, including the Koch brothers, were willing to fund his Berkeley Earth Surface Temperature (BEST) project, which aimed to do a new and independent study as a check on all the other global warming studies, no doubt expecting him to contradict them.

But things didn't go quite according to plan. In a press release announcing the first set of four papers that they have submitted to journals, Muller says, "Our biggest surprise was that the new results agreed so closely with the warming values published previously by other teams in the US and the UK." In an op-ed in the Wall Street Journal titled The Case Against Global-Warming Skepticism: There were good reasons for doubt, until now, Muller reinforced that message, adding:

When we began our study, we felt that skeptics had raised legitimate issues, and we didn't know what we'd find. Our results turned out to be close to those published by prior groups. We think that means that those groups had truly been very careful in their work, despite their inability to convince some skeptics of that. They managed to avoid bias in their data selection, homogenization and other corrections.

Global warming is real. Perhaps our results will help cool this portion of the climate debate.

One has to be a bit concerned that Muller announced his results in a press release and in a newspaper op-ed and not after the papers had undergone peer review. Bypassing the normal processes of science and going straight to the public tends not to have good results.

The problem for climate change skeptics when they try to co-opt real scientists to their cause is that real scientists deal with the data they have and not the data they wish they had. Whatever the private beliefs of scientists, they cannot go outside the bounds allowed them by the data, unless they are dishonest and suppress or fabricate them.

As Kevin Drum comments:

The BEST report is purely an estimate of planetary warming, and it makes no estimate of how much this warming is due to human activity. So in one sense, its impact is limited since the smarter skeptics have already abandoned the idea that warming is a hoax and now focus their fire solely on the contention that it's man-made. (And the even smarter ones have given up on that, too, and now merely argue that it's economically pointless to try to stop it.) Still, the fact that climate scientists turned out to be careful and thorough in their basic estimates of temperature rise surely enhances their credibility in general. Climategate was always a ridiculous sideshow, and this is just one more nail in its coffin. Climate scientists got the basic data right, and they've almost certainly gotten the human causes right too.

Deniers like James M. Taylor of the Heartland Institute, who had earlier embraced Muller as one of them, are now disowning him, calling these new results "meaningless" and attacking his credibility, saying that he might have the "intent of deceiving casual observers about the true nature of the global warming debate." Other deniers are also edging away from their earlier embrace of Muller.

Global warming deniers will probably still give a platform to people like the Briton Lord Monckton, who has made quite a name for himself talking about this subject even though he has no expertise whatsoever in this area and makes outrageous statements such as calling an Australian government climate adviser a Nazi. The Australian comedy show The Chasers interviews Monckton and he clearly has no suspicions until the very end that his leg is being pulled and that he is being made to look a fool.

October 23, 2011

Siri and the Turing test

I don't have an iPhone of any kind but was intrigued by reports of the latest one, which has the voice recognition software known as Siri and seems to have a conversational ability reminiscent of HAL in 2001: A Space Odyssey, as can be seen from this compilation of a conversation.

I am not sure if this is a hoax but the person who put up the video assures skeptics that this is real and says that anyone can test it by getting hold of a Siri-enabled iPhone. I am curious if any blog reader who has it can confirm.

As an aside, I am a bit bothered by Siri referring to the user as 'Master'. I know it is not a real person but the feudal overtone is jarring.

Taking his claims at face value, it seems as if Siri is able to pass at least a low-level Turing test.

October 21, 2011

When did humans arrive in the Americas?

It used to be thought that they came 13,000 years ago across the then-existing land bridge connecting Siberia and Alaska, during what is known as the 'Clovis' period.

A paper published today in the journal Science has used new techniques to measure with high precision the age of a mastodon fossil bone, found in 1970, with a weapon point embedded in it. The bone turns out to be 13,800 years old, with an uncertainty of only 20 years, suggesting that humans were here earlier than thought and supporting other evidence that there was human hunter activity here as early as 15,000-16,000 years ago.

A large number of mammals (mastodons, woolly mammoths, sabre-toothed cats, giant sloths, camels) disappeared rapidly around 12,700 years ago and it was thought that this must have been due to rapid climate change as the Ice Age ended, since Clovis hunters were not thought to have been around for that long.

But the new earlier date for humans in the Americas suggests that mammal extinction may have been accelerated by humans hunting them with weapons.

October 18, 2011

Scientific responsibility

Science occupies a unique role because of the growing recognition that it is the source of authoritative and reliable knowledge. But that carries with it a great burden to make sure that the public's trust is not abused. Via Machines Like Us, I learned that the General Assembly of the International Council for Science (ICSU) issued a statement last month on "The Principle of Universality (freedom and responsibility) of Science" that spelled out what the responsibilities of scientists are.

The free and responsible practice of science is fundamental to scientific advancement and human and environmental well-being. Such practice, in all its aspects, requires freedom of movement, association, expression and communication for scientists, as well as equitable access to data, information, and other resources for research. It requires responsibility at all levels to carry out and communicate scientific work with integrity, respect, fairness, trustworthiness, and transparency, recognising its benefits and possible harms.

This followed up on the second World Conference on Research Integrity held in Singapore in July 2010 that issued a statement that "emphasizes the need for honesty in all aspects of research, accountability in the conduct of scientific research, professional courtesy and fairness in working with others, and good stewardship of research on behalf of others."

Scientists have to be vigilant in maintaining these standards.

October 01, 2011

Carl Sagan

I never met Carl Sagan but in addition to being a good scientist, prolific writer, great popularizer and advocate for science, he had the reputation of being a really nice person, which is probably why so many of us mean and nasty new atheists are urged to be more like him.

Neil deGrasse Tyson relates an anecdote that reinforces that last characteristic.

The true character of a person is revealed in the way they treat people who, by the usual standards of society, are of no importance to them whatsoever.

September 29, 2011

Lioness saves her cub

I am a sucker for animal stories that have happy endings.

Another example of altruism in the animal kingdom.

September 28, 2011

Reading your brains

A new study reports that fMRI machines can roughly reconstruct the images of film clips that test subjects have been viewing.

What I found interesting was that the reconstructed images, while retaining the general shape of the original, seemed to replace the details with what to me seemed like the details of another image.

September 26, 2011

Narrowing the search for the Higgs particle

It looks like the search for the elusive Higgs particle is getting close. The so-called Standard Model of particle physics led to the existence of the Higgs being proposed in 1964 as an explanation of how elementary particles get their mass, and it is the last particle of the model yet to be directly detected. If it is not found, that would require us to re-think some important theories of particle physics.

They are hoping for something definite to emerge within the next year. But if the Higgs is not found by then, the search may drag on longer because concluding that something is not there is more difficult than concluding that it is.

September 23, 2011

Faster-than-light neutrinos?

I came across this BBC report about some observations at CERN that suggested that neutrinos may be traveling faster than the speed of light. If this is true, it would mean that one of the pillars of modern science, the theory of special relativity, would have to undergo serious scrutiny.

I personally was not too excited by the news and was not even planning to comment on it but it seems to be causing a media sensation and several blog readers sent me clippings from various sources and asked for my opinion, so here it is.

I think that this result is unlikely to hold up and so am not too excited. The reason that I am underwhelmed is that I have been around long enough to recall many previous sightings of tachyons (the technical term for faster-than-light particles) that turned out to be false alarms. They are like Elvis sightings in that there is an initial flurry of excitement that then fades under closer scrutiny. The scientists who reported the recent events are aware of this history and are understandably cautious about making any grandiose claims. They can depend on the media to do that. If other research groups study this in some detail and the results hold up, then there will be cause for excitement. This will likely take a couple of years. Until then, I treat this with considerable skepticism.

So my present attitude is captured in this xkcd cartoon that I saw via Jeff at Have Coffee Will Write.

Sorry to be such a downer but if the history of science teaches us anything it is that the great and enduring theories of physics are never overthrown on the basis of a single experiment.

September 22, 2011

The scientific basis for justice and altruism-part 4

(An expanded version of a talk given at CWRU's Share the Vision program, Severance Hall, Friday, August 26, 2011 1:00 pm. This program is to welcome all incoming first year students. My comments centered on the ideas in the common reading book selection Justice: What's the right thing to do? by Michael Sandel. See part 1, part 2, and part 3.)

In the previous post, I pointed out that experiments with babies suggested that although the theory of evolution supports the idea that the desire for justice and fairness is part of our genetic makeup, it is also limited in that it seems to stop with our relatives and immediate community or nation. It is not entirely limited, though. There are many examples in evolution of characteristics that evolved to serve one purpose but then get used for other purposes. Sex is a good example. The pleasure it gives served the purpose of encouraging procreation but now people indulge in it for pleasure alone. Similarly, although the desire for justice may have evolved within the domain of kin and the immediate community to benefit the propagation of genes, it can still drive our relationships with the broader community even when there is no genetic benefit.

But there is another important evolutionary development that extends the drive for justice and fairness. What ethicist Peter Singer points out in an excellent book titled The Expanding Circle (2009) is that evolution has also given us the power of reasoning and it is the use of this power that has enabled us to build upon our biological sense of justice to encompass more and more people within our sphere of concern. In other words, our reasoning power has enabled us to go far beyond the initial biological impulse to seek justice only for our relatives and local community and has helped us to develop the idea of impartiality, which is a core feature of the desire for justice.

The way this happens is that while biology might instill in us a desire to treat just our own relatives fairly, our sense of reason tells us that there is nothing particularly special about our families, that ours is just one among many families and that all of them are equally worthy of being treated as fairly as our own. It is then a natural extension to realize that our own community or nation is also just one among many communities and nations and that they deserve fairness and justice too. Once we start reasoning along those lines, the advance is inexorable and we start increasing the size of the circle that encompasses our concern. Reason can overcome parochialism.

As a result of this process, over time we can see that the circle of concern has expanded greatly. We now think that discrimination towards anyone based on gender, race, ethnicity, national origin, sexuality, etc. is wrong. We are also expanding the circle to include non-human animals, with the realization that they too should have many of the rights that we take for granted. As a result we see the rise of animal rights movements, the increased adoption of vegan and vegetarian diets, the drive to eliminate factory farming to ensure that animals are treated humanely, much stricter controls on animal research, and so on.

So while the basic drive for justice and fairness is innate in us, in the sense that it is hardwired into our genetic makeup as a result of our evolutionary history, it required the further evolutionary development of the sense of reason to bring it to fruition, where we seek to maximize justice for everyone, not just our own group.

In his essay Morals Without God?, primatologist Frans de Waal said that Charles Darwin foresaw that this expanded concept of morality would follow naturally in any species that developed social instincts along with sophisticated intellectual powers:

Charles Darwin was interested in how morality fits the human-animal continuum, proposing in The Descent of Man: "Any animal whatever, endowed with well-marked social instincts … would inevitably acquire a moral sense or conscience, as soon as its intellectual powers had become as well developed … as in man."

As Sandel makes clear in the book, it is not always clear or obvious how to decide what is just in any given situation. What is clear is the importance of developing our ability to reason, so that we can break free of, and rise above, the tribal instincts that make us want to give special privileges and favors to our own group that we deny to others.

This is where all of you are particularly fortunate. For the next four years, you will be in an environment at Case Western Reserve University that is dedicated almost exclusively to helping you develop your sense of reason and all the other critical thinking skills. During this period of your education you will have access to the finest teachers and scholars, incredible knowledge resources in the library, and most importantly, like-minded and concerned fellow students. You should take maximum advantage of this opportunity to equip yourself with the knowledge and reasoning powers to overcome the challenges you will undoubtedly face in your lifetime.

Such a deep education will also enable you to better judge what is the right thing to do. It is important to do so because the quality of our entire civic life depends on having people work for justice. The writer H. L. Mencken put it well when he said, "If you want peace, work for justice."

September 21, 2011

Earth seen from the ISS

This time-lapse film of the Earth as viewed from the International Space Station is nice to see.

It also shows that the ISS and the shuttles did not fly as far out in space as people often think, being on average just about 225 miles up. So they are quite close to the Earth.

The scientific basis for justice and altruism-part 3

(An expanded version of a talk given at CWRU's Share the Vision program, Severance Hall, Friday, August 26, 2011 1:00 pm. This program is to welcome all incoming first year students. My comments centered on the ideas in the common reading book selection Justice: What's the right thing to do? by Michael Sandel. See part 1 and part 2.)

There is considerable evidence that the desire for justice and fairness is innate in us. In an article titled The Moral Life of Babies (New York Times, May 5, 2010) child development psychologist Paul Bloom describes how very young children have a strong sense of justice.

A growing body of evidence, though, suggests that humans do have a rudimentary moral sense from the very start of life. With the help of well-designed experiments, you can see glimmers of moral thought, moral judgment and moral feeling even in the first year of life. Some sense of good and evil seems to be bred in the bone.

He reports on experiments in which babies were presented with puppets who either helped or hindered other puppets.

In the end, we found that 6- and 10-month-old infants overwhelmingly preferred the helpful individual to the hindering individual. This wasn’t a subtle statistical trend; just about all the babies reached for the good guy.

We found that, given a choice, infants prefer a helpful character to a neutral one; and prefer a neutral character to one who hinders. This finding indicates that both inclinations are at work — babies are drawn to the nice guy and repelled by the mean guy. Again, these results were not subtle; babies almost always showed this pattern of response.

Sometimes the babies were quite emphatic about their preferences.

Not long ago, a team of researchers watched a 1-year-old boy take justice into his own hands. The boy had just seen a puppet show in which one puppet played with a ball while interacting with two other puppets. The center puppet would slide the ball to the puppet on the right, who would pass it back. And the center puppet would slide the ball to the puppet on the left . . . who would run away with it. Then the two puppets on the ends were brought down from the stage and set before the toddler. Each was placed next to a pile of treats. At this point, the toddler was asked to take a treat away from one puppet. Like most children in this situation, the boy took it from the pile of the "naughty" one. But this punishment wasn’t enough — he then leaned over and smacked the puppet in the head.

The toddlers also watched pairs of puppets in which one puppet did a good or bad thing and the other puppet rewarded or punished the first. Of the four possible combinations of actions and consequences, toddlers overwhelmingly preferred the puppets that rewarded good acts and punished bad acts over puppets that rewarded bad acts and punished good acts. This showed that the babies were not basing their preferences on what they perceived as good or bad actions but viewed the actions in the context of the purpose they served. This is pretty sophisticated thinking about crime and punishment and justice.

The desire for justice is strong and biological but is limited. For example, toddlers tend to prefer people of their own races, who speak their own language and share their taste in food. Bloom writes that:

3-month-olds prefer the faces of the race that is most familiar to them to those of other races; 11-month-olds prefer individuals who share their own taste in food and expect these individuals to be nicer than those with different tastes; 12-month-olds prefer to learn from someone who speaks their own language over someone who speaks a foreign language. And studies with young children have found that once they are segregated into different groups — even under the most arbitrary of schemes, like wearing different colored T-shirts — they eagerly favor their own groups in their attitudes and their actions.

So are babies and little children racists? If you waggle your finger and go "kitchy-coo" at a baby of a different racial group, will it bite you? It might, but the babies are not making conscious decisions to prefer their own, which is the real mark of racism. They are simply reacting instinctively based on their biology. So biology seems to strongly suggest that our desire for justice, though it is biologically based on our long history of evolution, is also limited to our in-group. This difference in the way we treat in-group members versus the way we view those who are 'out-group' members can and does lead to all manner of strife and tribal behavior between communities, religions, castes, and nations.

So does the theory of evolution say that our biological desire for justice stops with our relatives and immediate community or nation? In the next and final post in this series, I will look at how we overcome that kind of parochialism.

September 20, 2011

The biological basis for justice and altruism-part 2

(An expanded version of a talk given at CWRU's Share the Vision program, Severance Hall, Friday, August 26, 2011 1:00 pm. This program is to welcome all incoming first year students. My comments centered on the ideas in the common reading book selection Justice: What's the right thing to do? by Michael Sandel. See part 1 here.)

The primatologist Frans de Waal in his excellent book The Age of Empathy (2009) provides case study after case study of animals displaying a keen sense of justice and fairness, offering convincing evidence that these impulses are innate in us and arise from our common evolutionary history with other animals. In a newspaper article titled Morals Without God? he writes about his observations:

Chimpanzees and bonobos will voluntarily open a door to offer a companion access to food, even if they lose part of it in the process. And capuchin monkeys are prepared to seek rewards for others, such as when we place two of them side by side, while one of them barters with us with differently colored tokens. One token is "selfish," and the other "prosocial." If the bartering monkey selects the selfish token, it receives a small piece of apple for returning it, but its partner gets nothing. The prosocial token, on the other hand, rewards both monkeys. Most monkeys develop an overwhelming preference for the prosocial token, which preference is not due to fear of repercussions, because dominant monkeys (who have least to fear) are the most generous.

It is not only humans who are capable of genuine altruism; other animals are, too. I see it every day. An old female, Peony, spends her days outdoors with other chimpanzees at the Yerkes Primate Center's Field Station. On bad days, when her arthritis is flaring up, she has trouble walking and climbing, but other females help her out. For example, Peony is huffing and puffing to get up into the climbing frame in which several apes have gathered for a grooming session. An unrelated younger female moves behind her, placing both hands on her ample behind and pushes her up with quite a bit of effort, until Peony has joined the rest.

We have also seen Peony getting up and slowly move towards the water spigot, which is at quite a distance. Younger females sometimes run ahead of her, take in some water, then return to Peony and give it to her. At first, we had no idea what was going on, since all we saw was one female placing her mouth close to Peony's, but after a while the pattern became clear: Peony would open her mouth wide, and the younger female would spit a jet of water into it.

Such observations fit the emerging field of animal empathy, which deals not only with primates, but also with canines, elephants, even rodents. A typical example is how chimpanzees console distressed parties, hugging and kissing them, which behavior is so predictable that scientists have analyzed thousands of cases. Mammals are sensitive to each other's emotions, and react to others in need.

A few years ago Sarah Brosnan and I demonstrated that primates will happily perform a task for cucumber slices until they see others getting grapes, which taste so much better. The cucumber-eaters become agitated, throw down their measly veggies and go on strike. A perfectly fine food has become unpalatable as a result of seeing a companion with something better.

We called it inequity aversion, a topic since investigated in other animals, including dogs. A dog will repeatedly perform a trick without rewards, but refuse as soon as another dog gets pieces of sausage for the same trick. Recently, Sarah reported an unexpected twist to the inequity issue, however. While testing pairs of chimps, she found that also the one who gets the better deal occasionally refuses. It is as if they are satisfied only if both get the same. We seem to be getting close to a sense of fairness.

Can we assume that the human species has also inherited this biological predisposition to justice? Yes, because we are all linked by the great tree of life to all other species. If we go back far enough in our lineages, we will find a common ancestor for all of us, which makes us all effectively cousins, and so you can treat this occasion, where all of us have gathered together in this magnificent concert hall, as a family reunion where you are meeting long-lost relatives. In fact, if you and your pet dog or cat trace your lineages back about a hundred million years, you will find that you have a common ancestor, which is a nice thing to realize.

So given that the desire for justice is so widespread among so many different species, it is very likely that we have inherited the desire for justice from deep evolutionary times. In his book, de Waal concludes that studies in the fields of anthropology, psychology, biology, and neuroscience reveal that we are essentially group animals: "highly cooperative, sensitive to injustice, sometimes warmongering, but mostly peace-loving. A society that ignores these tendencies cannot be optimal." (p. 5)

But is there any direct evidence that humans have a biological predisposition that makes them favor justice and fairness? Yes there is, and I will explore that in the next (and last) post of this series.

September 19, 2011

The biological basis for justice and altruism-part 1

(An expanded version of a talk given at CWRU's Share the Vision program, Severance Hall, Friday, August 26, 2011 1:00 pm. This program is to welcome all incoming first year students. My comments centered on the ideas in the common reading book selection Justice: What's the right thing to do? by Michael Sandel.)

This year's common reading book assumes that there is something fundamental about justice that makes its desirability self-evident. What the book discusses are three approaches to justice: the first based on the greatest happiness for the greatest number, the second on respect for the freedom of choice of individuals, and the third on the cultivation of virtue and the common good.

In this talk, I want to examine the very premise that justice is something desirable. What makes us think that people want or seek justice as an end in itself and that the only problem is how to implement that ideal in specific situations? For example, John Rawls's model of justice (as elucidated in his book A Theory of Justice) assumes that when people are given the opportunity to design a society under the veil of ignorance so that no one knows what situation in life they personally will be placed in, they will create one that is based on the idea of 'justice as fairness'. Is Rawls justified in assuming that? Is it self-evident that justice is such an obvious good thing that people will want to use it as a central organizing principle?

We may think that it is obvious but one of the characteristics of academia is to not accept things just because they seem obvious and instead look for underlying reasons.

It turns out that there is a solid scientific basis for the desire that humans have for justice and it arises from the theory of evolution. This may come as a surprise to those who think of evolution as based on fierce competition for survival in which justice and fairness play no role. But in fact, not just justice but also altruism, which can be roughly defined as an act that benefits someone else at a personal cost to us, has been studied extensively and we think we know how it originated biologically.

In his landmark book On the Origin of Species, Charles Darwin carefully avoided all discussions of human evolution, limiting it to just one statement near the end: "Light will be thrown on the origin of man and his history." That has to rank as one of the greatest understatements ever. It turns out that the theory of evolution, in addition to providing explanations for the physical features of all life, is increasingly explaining our morality as well. The basic desire for justice is ingrained in us as a result of biological evolution.

The reason for this, as was developed nearly fifty years ago and summarized by Richard Dawkins in his classic book The Selfish Gene (1989), is that while natural selection acts on the whole organism (whether human or fish or snake), the fundamental unit of evolution is not the whole organism but the individual gene, and evolution can be understood as the means by which individual genes try to maximally propagate themselves. But while organisms are unique in the particular combination of genes they possess, each individual gene is shared by many people, with the closer the relationship, the greater the number of genes being shared. So each one of us shares exactly half our genes with our parents and (on average) half with our siblings, one-eighth of them with our first cousins, and so on, with the fraction shared becoming lower the more distant the kin. As biologist W. D. Hamilton showed, as a result there are circumstances in which it can be beneficial for a gene if the organism in which it exists sacrifices its own needs to benefit its relatives. When the eminent population geneticist J. B. S. Haldane, who pioneered a lot of the mathematical studies in this area, was asked if he would give his life to save his brother, he jokingly replied, "No, but I would to save two brothers or eight cousins." In short, the mathematics of genes can favor a limited form of self-sacrifice among relatives and so we should not be surprised if that gene is widely present.
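
The arithmetic behind Haldane's quip can be spelled out using the relatedness fractions above. Here is a minimal sketch of that bookkeeping (Hamilton's rule in its simplest form, with a made-up benefit of one life's worth per relative saved), not a model of any real population:

```python
# Expected fraction of genes shared by common descent (figures from the text above)
relatedness = {
    "sibling": 1 / 2,
    "first cousin": 1 / 8,
}

def self_sacrifice_pays(relation, count, benefit_each=1.0, cost=1.0):
    """Hamilton's rule: a gene for self-sacrifice can spread when the
    relatedness-weighted benefit to kin exceeds the cost to the actor,
    i.e. r * B > C.  Here the cost is one life (C = 1) and each relative
    saved counts as one life's worth of benefit."""
    r = relatedness[relation]
    return r * count * benefit_each > cost

print(self_sacrifice_pays("sibling", 1))       # False: one brother is not enough
print(self_sacrifice_pays("sibling", 2))       # False: 2 x 1/2 = 1, exactly break-even
print(self_sacrifice_pays("first cousin", 8))  # False: 8 x 1/8 = 1, also break-even
print(self_sacrifice_pays("sibling", 3))       # True: now the gene comes out ahead
```

Haldane's 'two brothers or eight cousins' is exactly the break-even point, which is the joke.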

But this kind of altruism, known as kin altruism, is just one form of it. Another important form that was shown by Robert Trivers to be biologically based is reciprocal altruism whereby an organism will do a favor for another that is not a relation in the expectation that in its own time of need the favor will be returned. Take vampires, which seem to have grabbed the public's imagination for some reason and are now all over popular culture. Vampire bats need to drink some blood every day or they will die. But in bat colonies it has been observed that those who return after having obtained a good meal will regurgitate some of the blood to a less fortunate unrelated bat and in return will receive blood from that bat on the days that they are unlucky.

This kind of behavior has been observed in a wide range of animals, and is another source of the idea that our desire for justice has biological roots. Reciprocal altruism only works if people carry out their obligation to return favors. If cheating or other forms of selfishness occur, the system breaks down and so it should not be surprising that quite elaborate structures have evolved in the animal kingdom, of which we are a part, to monitor behavior so as to reward good citizens and punish cheaters, so that the community as a whole benefits.

This sense of fairness and justice even extends to larger groups. For example, there is a remarkable video of penguins in the Antarctic, where temperatures can reach minus 50 degrees Fahrenheit with wind speeds greater than 100 miles per hour. How do they survive in such bitter conditions? They do so by large groups of thousands of them huddling together very closely at a density of about 2 animals per square foot. The temperatures in the inner regions of the group can rise up to our human body temperatures, which is nice and pleasant. Of course, the penguins on the outer rim of the group will be cold but what the video shows is that the density is just sufficient to provide warmth while at the same time allowing for a constant shuffling around. The penguins all face in roughly the same direction and penguins enter at the rear and then slowly work their way to the front and then return to the rear. As a result, each penguin spends a small time on the cold outer rim in return for much longer times in the warmth inside and thus everyone benefits.

Similar cooperative behavior is seen in locusts and fish schools. It is quite remarkable how widespread such practices are in nature.

Next: More evidence from nature

August 30, 2011

A national weight problem?

A new study suggests that obesity is increasing in the US:

Currently, figures from the U.S. Centers for Disease Control and Prevention put the prevalence of overweight and obesity in adults at about 66 percent. But lead study author Dr. Youfa Wang of the Johns Hopkins Bloomberg School of Public Health in Baltimore says that if current overweight and obesity trends continue, 86 percent of Americans could be overweight or obese by the year 2030.

The standard measure used is the body mass index (BMI), which is obtained by dividing your mass (measured in kilograms) by the square of your height (measured in meters). This website calculates it for those who use pounds and feet and inches. A BMI of 30 or over indicates obesity while 25 or over means overweight. The 'normal' (i.e., supposedly desirable) range lies between 18.5 and 25.
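
For anyone who prefers to skip the website, here is a minimal sketch of the same arithmetic in Python, with the pounds-and-inches conversion done explicitly (the example weight and height are made up):

```python
def bmi_metric(weight_kg, height_m):
    # Body mass index: mass in kilograms divided by the square of height in meters
    return weight_kg / height_m ** 2

def bmi_imperial(weight_lb, height_in):
    # Convert pounds and inches to kilograms and meters, then use the metric formula
    return bmi_metric(weight_lb * 0.4536, height_in * 0.0254)

def category(bmi):
    if bmi >= 30:
        return "obese"
    if bmi >= 25:
        return "overweight"
    if bmi >= 18.5:
        return "normal"
    return "underweight"

# Hypothetical example: 180 pounds, 5 feet 10 inches (70 inches)
b = bmi_imperial(180, 70)
print(f"BMI = {b:.1f} ({category(b)})")  # BMI = 25.8 (overweight)
```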

The study's authors also say that, "By 2048, all American adults would become overweight or obese." I tend to be wary of this kind of extrapolation, especially when it involves human behavior. A self-correction usually sets in at some point.

Another study released around the same time projects figures that are not quite as high:

If obesity rates continue to climb in the U.S. as they've done in the past, about half of all men and women could be obese in 20 years, adding an extra 65 million obese adults to the country's population.

The current figure of 66% of overweight and obese adults surprised me. Can it really be that two out of every three people are like that or is the cut-off for being overweight too low? One common comment I hear from overseas visitors is their initial surprise at the number of overweight people they see in the US. Have I simply got used to thinking of larger people as the norm after living in the US for so long?

One of the peculiar features of the coverage of people's weight in the media is the appearance of headless torsos accompanying the stories. News stories on obesity will be accompanied by photos and videos of people from the neck down, an indication of the stigma associated with being overweight. In fact, overweight people are often subjected to gratuitously rude comments and made to feel as if they have some kind of moral failing.

Some are fighting back, saying that they do not see obesity as a disease or even a problem, and definitely not anything to be ashamed of or have to apologize for. They say that that is simply who they are and the rest of the population simply has to deal with it. They have rejected the idea that the word fat is some kind of slur requiring the use of euphemisms to soften it, and have embraced it and made it their own, the way that the gay community did with the word queer. They are fat and proud of it.

The Daily Show had a segment on the coverage of obesity some time ago, and interviewed some who see the campaigns against obesity and the drive to eat healthier as a sign of creeping fascism.

(The Daily Show segment "Chubby Chasers" is at www.thedailyshow.com.)

August 29, 2011

Hurricane Irene

Cleveland was not in the path of Irene, so we just observed it from afar, but I am puzzled by those who now claim that it was over-hyped merely because it caused less damage than expected.

It is quite extraordinary that the National Hurricane Center is able to predict the track of a swirling storm five days out with pretty good precision, enabling cities and people to take safety precautions. David Kurtz points out that there have been huge gains recently in the ability to predict the track of hurricanes, but less progress in our ability to predict their intensity, as was the case with Irene.

But it was still quite an impressive feat for which the people at the NHC deserve a lot of credit.

August 14, 2011

Tests of the existence of other universes

When Louis de Broglie first proposed in 1924 that particles had wavelike properties, the technological challenges to investigating the idea were so immense that the prospects for testing it seemed to lie far in the future, if testing were possible at all. But one of the features of science is that however incredible an idea may seem when it is first proposed, if it gains credibility and acceptance from the scientific community as a whole, it will only be a matter of time before someone finds an ingenious way to try and test it. So it was with de Broglie's idea. It was so beautiful in the way that it unified waves and particles symmetrically in quantum mechanics that it spurred creative thinking, and within just three years C. J. Davisson and L. Germer were able to construct an experiment that confirmed it, resulting in de Broglie receiving the Nobel Prize in 1929, an incredibly rapid pace of advance.
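To give a sense of the scale involved (the numbers below are the standard textbook figures for the Davisson-Germer setup, not taken from anything above), de Broglie's relation assigns a wavelength

\[
\lambda = \frac{h}{p} = \frac{h}{\sqrt{2 m_e E}}
\approx \frac{6.63\times10^{-34}\ \mathrm{J\,s}}{\sqrt{2\,(9.11\times10^{-31}\ \mathrm{kg})(54\times1.60\times10^{-19}\ \mathrm{J})}}
\approx 1.7\times10^{-10}\ \mathrm{m}
\]

to an electron accelerated through about 54 volts. That is comparable to the spacing between atomic planes in a nickel crystal, which is why electrons scattered off nickel produced the diffraction pattern that Davisson and Germer observed.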

So it is with the multiverse idea, that entire universes can be created spontaneously from the vacuum and thus our own universe may be just one of an enormous number (as many as 10^500) of universes, each having its own laws and structure. This idea not only does not violate the laws of science, it is not even a new theory, being in fact a prediction of other theories.

As with de Broglie's hypothesis, when the multiverse idea was initially proposed there seemed to be no way to test it. But now people have come along with suggestions of how to do it, by looking for disk-like patterns in the cosmic microwave background that may be the telltale relics of collisions of other universes with our own.

Science is such fun.

August 13, 2011

Radioactive heating of the Earth

Recent measurements show that about half of the 40 trillion watts of heat radiated continuously by the Earth comes from radioactivity taking place in its mantle and crust, while the remainder is due to the primordial heat that was created at the formation of the Earth and is located mainly in the core.

Historians of science are aware of the importance of the discovery of radioactivity as an ongoing source of the heating of the Earth. Before the immense amount of heat associated with radioactive decay was discovered around 1903, physicists like Lord Kelvin had calculated the age of the Earth by treating it as an initially hot body that was steadily cooling. They concluded that it could not be older than 100 million years and might be as young as 20 million years. This posed a serious problem for the theory of evolution by natural selection, because evolution is a slow process that requires long time scales. The result was seized upon by religious people to argue against evolution and in favor of the special creation of species by god. (See my series on the age of the Earth for a more detailed discussion of this.)

The discovery of radioactivity had two revolutionary impacts. It created an awareness that radioactive decay continuously heats the Earth, which undermined all the earlier calculations of Kelvin and others, and it provided an important new tool for measuring time that opened the gates to new discoveries, rapidly pushing the age of the Earth to more than four billion years and giving plenty of time for evolution to take place.

August 05, 2011

Is there anything that makes humans special?

Primatologist Frans de Waal's latest book The Age of Empathy (2009) argues against the idea that we humans have some special quality that separates us from all the other animals. Some people, especially those who are religious, seem to be very reluctant to accept the idea that other animal species share pretty much all the same basic physical and emotional characteristics that we humans have.

There is an interesting passage in the book (p. 206-208) where he argues that this mistaken idea took hold in Christianity, Judaism, and Islam because the part of the world in which those religions originated did not contain our closest non-human relatives.

For the Darwinist, there is nothing more logical than the assumption of emotional continuity. Ultimately, I believe that the reluctance to talk about animal emotions has less to do with science than religion. And not just any religion, but particularly religions that arose in isolation from animals that look like us. With monkeys and apes around every corner, no rain forest culture has ever produced a religion that places humans outside of nature. Similarly, in the East, surrounded by native primates in India, China, and Japan, religions don't draw a sharp line between humans and other animals. Reincarnation occurs in many shapes and forms: A man may become a fish and a fish may become God. Monkey gods, such as Hanuman, are common. Only the Judeo-Christian religions place humans on a pedestal, making them the only species with a soul. It's not hard to see how desert nomads might have arrived at this view. Without animals to hold up a mirror to them, the notion that we're alone came naturally to them. They saw themselves as created in God's image and as the only intelligent life on earth. Even today, we're so convinced of this that we search for other such life by training powerful telescopes on distant galaxies.

It's extremely telling how Westerners reacted when they finally did get to see animals capable of challenging these notions. When the first live apes went on display, people couldn't believe their eyes. In 1835, a male chimpanzee arrived at London Zoo, clothed in a sailor's suit. He was followed by a female orangutan, who was put in a dress. Queen Victoria went to see the exhibit, and was appalled. She called the apes "frightful, and painfully and disagreeably human." This was a widespread sentiment, and even nowadays I occasionally meet people who call apes "disgusting." How can they feel like this unless apes are telling them something about themselves that they don't want to hear? When the same apes at the London Zoo were studied by the young Charles Darwin, he shared the queen's conclusion but without her revulsion. Darwin felt that anyone convinced of man's superiority ought to go take a look at these apes.

All of this occurred in the not too distant past, long after Western religion had spread its creed of human exceptionalism to all corners of knowledge. Philosophy inherited the creed when it blended with theology, and the social sciences inherited it when they emerged out of philosophy. After all, psychology was named after Psykhe, the Greek goddess of the soul. These religious roots are reflected in continued resistance to the second message of evolutionary theory. The first is that all plants and animals, including ourselves, are the product of a single process. This is now widely accepted: also outside biology. But the second message is that we are continuous with all other life forms, not only in body but also in mind. This remains hard to swallow. Even those who recognize humans as a product of evolution keep searching for that one divine spark, that one "huge anomaly" that sets us apart. The religious connection has long been pushed to the subconscious, yet science keeps looking for something special that we as a species can be proud of.

When it comes to characteristics that we don't like about ourselves, continuity is rarely an issue. As soon as people kill, abandon, rape, or otherwise mistreat one another we are quick to blame it on our genes. Warfare and aggression are widely recognized as biological traits, and no one thinks twice about pointing at ants or chimps for parallels. It's only with regard to noble characteristics that continuity is an issue, and empathy is a case in point. Toward the end of a long career, many a scientist cannot resist producing a synopsis of what distinguishes us from the brutes. American psychologist David Premack focused on causal reasoning, culture, and the taking of another's perspective, while his colleague Jerome Kagan mentioned language, morality, and yes, empathy. Kagan included consolation behavior, such as a child embracing his mother, who has hurt herself. This is indeed a great example, but of course hardly restricted to our species. My main point, however, is not whether the proposed distinctions are real or imagined, but why all of them need to be in our favor. Aren't humans at least equally special with respect to torture, genocide, deception, exploitation, indoctrination, and environmental destruction? Why does every list of human distinctiveness need to have the flavor of a feel-good note?

This is one of the fundamental reasons that the Abrahamic religions find it so hard to reconcile their beliefs with science. They have locked themselves into a dogma that human beings are special in some discontinuous way from all other animals, when science is increasingly revealing that all species lie on a continuum with no sharp boundaries. These religions simply cannot live with the idea that what makes us human is just that we have different amounts of the same things that are possessed by other animal species.

Religious people keep searching for that one spark of divine fire that reassures them that they are unique and that their god really does care for them in a special way. But they keep repeatedly failing in their quest because the 'soul' (for want of a better term) is like the rainbow, an illusion that keeps receding. It is kind of sad that they never seem to be able to come to terms with their true place in the universe.

I myself find it enormously uplifting to think that I am part of all of life, that I can connect myself to every single thing that lives and has ever lived by tracing a path through the great tree of life. What could be more magnificent than that?

July 27, 2011

How yogis 'levitate'

Hindu mystics have long been claiming that they can, by sheer will and/or the intervention of god, levitate off the ground. Here is one way it is done.

A good rule of thumb is that if something violates the laws of science, it is not a miracle, it is not by 'harnessing the energy field' or some such Deepak Chopraesque mumbo-jumbo, and it is not due to a god. It is merely a trick. The only question to be explored is how the trick is carried out.

July 11, 2011

World Cup soccer final for robots

Exciting!

July 10, 2011

The 44 chromosome man

Almost all human beings have 46 chromosomes (23 pairs) and being born with an extra or missing one usually signifies that the person will have serious medical problems such as Down syndrome.

On the other hand, our close relatives the chimpanzees have 48 chromosomes (24 pairs). Chimpanzees and humans shared a common ancestor about 6-8 million years ago. So how did we end up with fewer? About a million years ago, two of the 24 chromosomes in one of our ancestors fused together end-to-end to form a single longer chromosome. Since the crucial genetic information in each chromosome was preserved by this fusion process, the organism could survive. The evidence suggests that it was the ancestral counterparts of chimpanzee chromosomes #12 and #13 that fused to form the present human chromosome #2.

The interesting question is how that mutation might have occurred and why it took hold in the human population so that 46 chromosomes is now the standard.

In this fascinating article (sent to me by reader Fu DaYi), Barry Starr of Stanford University describes a recent discovery in China of a man who seems to have undergone a similar reduction process with chromosomes #14 and #15 becoming fused, and now has just 44 chromosomes (22 pairs). His case sheds light on how the chromosome reduction process might have occurred in our own ancestors.

July 08, 2011

Heart with no heartbeat

NPR had an interesting story on a new type of artificial heart. Older designs tried to replicate the human heart with its pumping mechanism, but none has worked without problems for long periods.

This new heart is radically different in that it foregoes the pumping action and has motors that continuously drive blood through the body. This makes for a much simpler design with less chance of breakdown. It seems as if the pumping action is not essential for the working of the body, though it is still early days and we do not have long-term data on the effects.

If the results hold up and a beating heart is not really necessary, it means that the beating heart is a product of evolution that is functional but not optimal. This would illustrate once again that the processes of evolution do not necessarily produce the best design but merely a design that works. It would not be the first time that assuming nature's design is the best, and trying to copy it, has sent us in the wrong direction. Early experiments with flight tried to emulate the flapping wing action of birds, with little luck.

What is kind of weird is that with this new artificial heart, there will be no heartbeat, no pulse, and the EKG signal will be a flat line. So the most common markers we currently use to see if someone is dead or alive would indicate that the person is dead.

June 30, 2011

Early eyes

A new article published today in Nature reports fossil evidence that fairly sophisticated eyes had evolved as early as 515 million years ago, around the time known as the Cambrian explosion.

There were no fossil bodies found attached to the eyes, but the eyes probably belonged to a shrimp-like creature.

June 21, 2011

Myths about the Golden Ratio

Take a straight line. How should one divide its length into two parts such that the ratio of the length of the whole line to the longer segment is equal to the ratio of the longer segment to the shorter one? A little algebra gives you the result that the longer segment should be 0.618 times the length of the whole line, and thus the ratio of the full line to the longer segment is 1.618 (=1/0.618).
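For those who want to see that little bit of algebra, here is a quick sketch, taking the whole line to have length 1 and the longer segment to have length x:

\[
\frac{1}{x} = \frac{x}{1-x} \;\Longrightarrow\; x^2 + x - 1 = 0 \;\Longrightarrow\; x = \frac{\sqrt{5}-1}{2} \approx 0.618,
\qquad \frac{1}{x} = \frac{1+\sqrt{5}}{2} \approx 1.618.
\]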

The number 1.618 is known as the 'Golden Ratio' and folklore ascribes deep significance to it and claims a ubiquity for it that far exceeds the reality.

Mathematician Keith Devlin tries to set the record straight.

June 16, 2011

Who am I?

In yesterday's post, I wrote about the fact that different parts of our bodies keep regenerating themselves periodically. This fact alone should make nonsense of the belief of some religious people that our bodies become physically reconstituted after death in the afterlife, because if so, the resurrected body of a person who died at the age of 70 would be unrecognizably grotesque, consisting of around 70 livers and 7 full skeletons, all surrounded by hundreds, maybe thousands, of pounds of skin.

But leaving that aside, there is an interesting question raised by this constant regeneration of the body: how do we retain a sense of having a single identity over our full life spans even as individual parts of us get replaced periodically? The average age of the cells in my body is around 7 to 10 years, and yet I have the strong sense of continuity, that I am in some fundamental sense the same person I was as a child, even though almost none of those cells have stayed with me over that time. How is it that we retain a strong sense of permanence in our identity while being so transient in our bodies?

The answer may lie in the fact that our brain seems to be the most permanent of our organs, undergoing little or no regeneration. In the same article in the New York Times that I referred to yesterday, Nicholas Wade says:

Dr. Frisen, a stem cell biologist at the Karolinska Institute in Stockholm, has also discovered a fact that explains why people behave their birth age, not the physical age of their cells: a few of the body's cell types endure from birth to death without renewal, and this special minority includes some or all of the cells of the cerebral cortex.

The cerebral cortex is the thin sheet that forms the outer layer of the brain and is divided into several zones with different functional roles. If the cortex were removed and smoothed out to eliminate all the creases and folds, it would look like a dinner napkin. It is gray in color, the origin of the popular term 'gray matter'. The network of nerve cells (called neurons) in the brain determines how the brain functions.


While the brain seems to be the most enduring part of the body, even here there is variation. The cerebellum contains non-neuronal cells whose age is close to the individual's birth age (within three years or so), while the cerebral cortex (which is responsible for our cognitive capabilities and is thus most closely identified with our sense of self) has a slightly greater turnover of non-neuronal cells. But the researchers did not turn up any evidence of neurons being generated after birth, at least in the region known as the occipital cortex.

It was long believed that the number of neuronal connections in the brain grew rapidly during the first year or two of life and then got pruned, and that this was how our lives shaped our brains without new neurons being created. In 1999, research appeared to find that new neurons were being created in the cerebral cortex of adult monkeys, suggesting that it could happen in adult humans too. This would complicate things somewhat as to how we retain a permanent sense of self, but it would also provide hope that brains could regenerate. But this summary of later research (much of it by the same Karolinska group that I referred to yesterday) that appeared in the Proceedings of the National Academy of Sciences says that this does not happen with the neurons in the human cerebral cortex. (The neocortex referred to in the paper is the most recently evolved part of the cortex, which handles the 'higher' functions; its neurons are "arranged in six layers, within which different regions permit vision, hearing, touch, the sense of balance, movement, emotional responses and every other feat of cognition.")

The results show that the average age of the neurons (with respect to the age of the individual) is 0.0 ± 0.4 years, i.e., the same as the age of the individual. In contrast, the nonneuronal cells have an average birth date of 4.9 ± 1.1 years after the birth of the individual.

Both of the experiments of Bhardwaj et al. indicate that there are no new neurons, either long-lived or transient, produced in the adult human for the neocortex. Importantly, these experiments are quantitative and indicate a theoretical maximum limit of 1% on the proportion of new neurons made over a 50-year period.

Bhardwaj et al. settle a hotly contested issue, unequivocally. The two-pronged experimental approach clearly establishes (i) that there is little or no continuous production of new neurons for long-term addition to the human neocortex and (ii) that there are few if any new neurons produced and existing transiently in the adult human neocortex. Importantly, the results are quantitatively presented, and a maximum limit to the amount of production of the new neurons can be established from the data presented. The data show that virtually all neurons (i.e., >99%) of the adult human neocortex are generated before the time of birth of the individual, exactly as suggested by Rakic, and the inescapable conclusion is that our neocortical neurons, the cell type that mediates much of our cognition, are produced prenatally and retained for our entire lifespan. [My italics]

So basically, even though every other part of us gets sloughed off and replaced at different points in time, for good or bad we are pretty much stuck with the brains that we have at birth. This may be crucial to our ability to retain a sense of a permanent identity that lasts all through our lives, although this is not yet established. Even if new research shows that new neurons can be generated over time to replace older ones, the brain may turn out to do this seamlessly and preserve cognitive continuity, just the way our other organs give us the illusion of being permanent even though they are not.

It seems like our brains are our essential selves with the rest of our bodies just superstructure. Rene Descartes famously said "I think, therefore I am." We could also say, "My brain is who I am."

June 15, 2011

Amazing robots

Now that computers have beaten us at chess, robots are turning their attention to pool.

(Via Machines Like Us.)

How old are you?

In an article in the New York Times, Nicholas Wade points out that our bodies are younger than we think, because there is a discrepancy between our birth age and the age of the cells that make up our bodies.

Whatever your age, your body is many years younger. In fact, even if you're middle aged, most of you may be just 10 years old or less.

This heartening truth, which arises from the fact that most of the body's tissues are under constant renewal, has been underlined by a novel method of estimating the age of human cells. Its inventor, Jonas Frisen, believes the average age of all the cells in an adult's body may turn out to be as young as 7 to 10 years.

He quotes the work of Spalding, Bhardwaj, Buchholz, Druid, and Frisén of the Karolinska Institute, who use the radioactive isotope carbon-14 to determine the age of the cells in our bodies. Their paper appeared in the July 15, 2005 issue of Cell.

The carbon that forms organic matter is largely obtained from the atmosphere. Plants, for example, take in carbon dioxide from the air and give off oxygen as part of photosynthesis. Hence the proportion of carbon-14 found in living organic matter is the same as that in the ambient atmosphere at the time the carbon was absorbed. The level of radioactive carbon-14 in the atmosphere is fairly constant because its rate of production is balanced by its rate of decay. Once a plant dies, it takes in no new carbon, so the decay of the carbon-14 it had at the moment of death leaves a steadily smaller proportion of the isotope, and that deficit can be used to measure how long it has been dead. The half-life of carbon-14 is 5,730 years, so this method can be used to determine the age of dead organic matter up to about 50,000 years, a range that happens to be very convenient for archeological dating.
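To make the decay arithmetic concrete, here is a minimal sketch (the function names and sample values are mine; only the 5,730-year half-life comes from the text above):

```python
import math

HALF_LIFE_C14 = 5730.0  # years

def fraction_remaining(age_years):
    """Fraction of the original carbon-14 left after a given time."""
    return 0.5 ** (age_years / HALF_LIFE_C14)

def age_from_fraction(fraction):
    """Invert the decay law: how long ago did the organism stop taking in carbon?"""
    return -HALF_LIFE_C14 * math.log2(fraction)

# A sample with 25% of the atmospheric carbon-14 level died two half-lives ago:
print(age_from_fraction(0.25))    # ~11,460 years

# Beyond ~50,000 years less than about 0.3% remains, which is why the method tops out there:
print(fraction_remaining(50000))  # ~0.0024
```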

The way that Frisén and his co-workers used this knowledge to measure the age of cells in humans is quite clever. Carbon-14 is produced by cosmic rays, and its level in the atmosphere should be roughly constant; that is why we can normally tell how long something has been dead but not when it was 'born', i.e., when the organic matter was created. But in the 1950s and 1960s, there was a sharp spike in carbon-14 levels because of the atmospheric testing of nuclear weapons. Once atmospheric test ban treaties came into being, that surge of carbon-14 gradually diffused through the atmosphere as it spread over the globe, and so average carbon-14 levels have been steadily declining ever since. It is this decline that makes it possible to tell when the carbon-14 was absorbed to create organic matter.


The amount of carbon-14 in the genomic DNA can thus be used to determine when the DNA in a cell was created. The technique was checked against tree rings, whose ages are known, since each ring records the carbon-14 level of the atmosphere in the year it was laid down during photosynthesis. Their results and those of others show that different parts of the body get replaced after different durations, whose approximate values are given below. (I have included results from both the Wade newspaper article and the Frisén paper.)

Stomach lining: five days
Surface layer of skin: two weeks
Red blood cells: three months
Liver: one year
Skeleton: 10 years
Intestine: 11 years
Rib muscles: 15 years

This explains why our bodies seem so durable and able to withstand considerable abuse.
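As a rough back-of-the-envelope illustration of what those renewal times imply over a lifetime (the arithmetic is mine and ignores the slowdown of renewal with age), here is a small sketch:

```python
# Roughly how many times each tissue would have been replaced over a 70-year life,
# if the approximate renewal times listed above held constant (they don't, but it
# gives a feel for the numbers behind the earlier remark about 70 livers and 7 skeletons).

renewal_years = {
    "stomach lining": 5 / 365,
    "skin surface layer": 14 / 365,
    "red blood cells": 0.25,
    "liver": 1,
    "skeleton": 10,
    "intestine": 11,
    "rib muscles": 15,
}

lifespan = 70
for tissue, years in renewal_years.items():
    print(f"{tissue}: replaced about {lifespan / years:.0f} times")
```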

So why do we die if parts of us keep getting regenerated? It seems as if the ability of stem cells to keep reproducing declines with age. In other words there seems to be a limit to the number of times that cells can reproduce and once we reach that limit, the ability of the body to regenerate itself ceases. What causes this limit is still an open question. As Wade writes:

Some experts believe the root cause is that the DNA accumulates mutations and its information is gradually degraded. Others blame the DNA of the mitochondria, which lack the repair mechanisms available for the chromosomes. A third theory is that the stem cells that are the source of new cells in each tissue eventually grow feeble with age.

Frisen thinks his research might be able to shed some light on this question, especially the third option, saying "The notion that stem cells themselves age and become less capable of generating progeny is gaining increasing support."

June 14, 2011

Patenting DNA and genetic tests

In an article titled Patently Unjust in the June 2010 issue of The Progressive (not available online), Kari Lydersen describes a similar issue to the one involving Henrietta Lacks, where private companies are making a bundle out of publicly funded research. In this case, the publicly funded Human Genome Project has made freely available the full human genome but some private companies have obtained patents over individual genes.

The particular case that Lydersen deals with involves the genes known as BRCA1 and BRCA2. Certain mutations in these genes are predictors of breast and ovarian cancer: women with such mutations are five times more likely to develop breast cancer and ten to thirty times more likely to develop ovarian cancer. We are now able to test whether a woman has these mutations, in which case she has to make difficult decisions about whether to preemptively remove her breasts and ovaries. These genes were discovered as part of the genome project.

It turns out that a single company named Myriad Genetics holds several patents on the genes and as a result claims exclusive rights to the tests they developed to detect the mutations. They charge about $3,000 for the test, which prices many women out of the market. They claim that if companies could not make money, they would not have the incentive to develop the tests. There is some truth in this but it is also true that a huge amount of federal (i.e. public) research funding went into the research that provided the basis for the company's work, which should also be a factor. If the public funds something, the public should also benefit.

The reason given by the company's founder for the high price they charge for the test is revealing about why medical costs are so high in the US. He says, "In the U.S. what you charge for a test is a complex equation of what it costs you to do it and what people will pay" (my italics). This is part of the problem in a system with employer-based private health insurance coupled with monopoly providers. Well-to-do groups with power can pressure their insurance companies to cover the costs of tests, which enables the testing companies to charge higher prices than they need merely to cover costs and provide a reasonable profit. The price then becomes prohibitive for those without insurance and drives up the cost of health care. I have written about this before.

As Lydersen writes, this is a widespread problem.

Myriad is far from the only patent holder on human genes; about 20 percent of the human genome is patented. This basically means that only the patent holder can offer testing and other services related to a specific gene. Patents currently cover genes related to other diseases, including Alzheimer’s, asthma, colon cancer, muscular dystrophy, and spinal muscular atrophy, a hereditary disease that kills children at a young age.

What is worse, because the company claims exclusive rights to the genes, women cannot get a second opinion on such a major question. At a minimum, more than one company should be able to provide these services so that they can compete with each other. Giving private companies monopoly power over the use of research results that were largely publicly funded seems wrong.

The intricacies of patent law are too subtle for me to get into, but on the surface the U.S. Patent Office seems to have been too generous in allowing companies to patent genes. It is illegal to patent a product of nature, but the Patent Office has granted Myriad and similar outfits patents on genes on the grounds that they were able to isolate them from their natural state and purify them. Others argue that this is far too expansive a view. After all, developing a technique to highly purify gold (say) should not enable you to claim a patent on gold itself. I can understand patents being awarded for the purifying process, because that is something the company did develop. That would reward their intellectual contribution while still preserving the right of other companies to invent alternative methods of purifying the same gene and thus develop competing tests.

The right of private companies to patent genes was litigated, and Lydersen writes that in March of 2010 US District Judge Robert W. Sweet ruled that Myriad's claims did not meet the test of what makes something derived from nature patentable and invalidated the patents, saying in his ruling:

"The patents issued by the USPTO are directed to a law of nature and therefore were improperly granted," Sweet wrote. "DNA represents the physical embodiment of biological information, distinct in its essential characteristics from any other chemical found in nature…. DNAs existence in an 'isolated' form alters neither this fundamental quality of DNA as it exists in the body nor the information it encodes. Therefore, the patents at issue directed to 'isolated DNA containing sequences found in nature are unsustainable as a matter of law and are deemed unpatentable subject matter."

Patents are valuable things and protect the rights of inventors and other creative people but the Patent Office should be wary of taking the claims of private companies too much at face value, especially when it comes to patenting things in nature like bits of DNA.

Myriad has appealed the ruling to the US Court of Appeals and much hangs in the balance.

June 13, 2011

Who should own the rights to one's tissues?

People generally do not think about what happens to the blood and tissue samples they give as part of medical tests, assuming that they are eventually discarded in some way. Many are not aware that their samples may be retained for research or even commercial purposes. Once you give a sample away, you lose all rights to what is subsequently done with it, even if your body parts have some unique property that can be used to make drugs and other things that can be marketed commercially.

The most famous case of this is Henrietta Lacks, a poor black woman in Baltimore who died of cervical cancer in 1951. A researcher who, like others, had been trying unsuccessfully to get human cells to reproduce in the test tube received a sample of her cells. It turned out that her cancer cells, unlike other cells, could reproduce endlessly in test tubes, providing a rich and inexhaustible source of cells for research and treatment. Her cells, called HeLa, have taken on a life of their own and have travelled the world long after she died. Her story is recounted in the book The Immortal Life of Henrietta Lacks by Rebecca Skloot.

The issue of whether one's cells should be used without one's permission and whether one should be able to retain the rights to one's tissues is a tricky one for law and ethics.

"Science is not the highest value in society," [Lori Andrews, director of the Institute for Science, Law, and Technology at the Illinois Institute of Technology] says, pointing instead to things like autonomy and personal freedom. "Think about it," she says. "I decide who gets my money after I die. It wouldn't harm me if I died and you gave all my money to someone else. But there is something psychologically beneficial to me as a living person to know I can give my money to whoever I want. No one can say, 'She shouldn't be allowed to do that with her money because that might not be most beneficial to society.' But replace the word money in that sentence with tissue, and you've got precisely the logic many people use against giving donors control over their tissues." (Skloot, p. 321)

It does seem wrong somehow for private companies to hugely profit from the lives and bodies of others without owing them anything. In the case of Henrietta Lacks, her family remained very poor and lacked health insurance and proper medical care even while her cells became famous and they bitterly resented this. They did not even know about the widespread use of her cells until two decades later.

On the other hand, it would put a real crimp in research if scientists had to keep track of whose tissues they were working on. Since we all benefit (or should benefit) from the results of scientific research, one can make the case that the tissues we give up are like the trash we throw away, things for which we have voluntarily given away our rights. If the tissues are used for medical research done by public institutions like the NIH or universities, and the results are used not for profit but to benefit the general public, this would, I believe, remove many of the objections to the uncredited use of tissues.

You can see why scientists would prefer to have the free use of tissues, but what I don't understand is why some scientists go overboard in making special exceptions for religion.

David Korn, vice president for research at Harvard University, says: "I think people are morally obligated to allow their bits and pieces to be used to advance knowledge to help others. Since everybody benefits, everybody can accept the small risks of having their tissue scraps used in research." The only exception he would make is for people whose religious beliefs prohibit tissue donation. "If somebody says being buried without all their pieces will condemn them to wandering forever because they can't get salvation, that's legitimate, and people should respect it," Korn says. (Skloot, p. 321)

This is another case where religions try to claim special privileges denied to everyone else. Why is that particular claim legitimate? Why should religious superstitions get priority over other irrational beliefs? Our bodies are in a constant state of flux. They shed cells all the time in the normal course of our daily lives, which is why DNA testing has become such a valuable forensic tool for solving crimes. Since we are losing old cells and gaining new cells all the time, it is a safe bet that hardly any of the cells that were part of me as a child are still in my body. So the whole idea that the afterlife consists of 'all of me' is absurd, since that would require bringing together all the cells that I have shed during my life, resulting in me having multiple organs and limbs, like some horror-fiction monster.

Rather than pandering to this fantasy, we should educate people that our bodies are in a constant state of flux, that our seemingly permanent bodies are actually transient entities.

June 10, 2011

Atheism is a byproduct of science

Science is an atheistic enterprise. As the eminent population geneticist J. B. S. Haldane said:

My practice as a scientist is atheistic. That is to say, when I set up an experiment I assume that no god, angel or devil is going to interfere with its course; and this assumption has been justified by such success as I have achieved in my professional career. I should therefore be intellectually dishonest if I were not also atheistic in the affairs of the world.

While not every scientist would apply the highly successful atheistic methodology to every aspect of their lives as Haldane did, the fact that intellectual consistency requires it, coupled with the success of science, has persuaded most scientists that leaving god out of things is a good way to proceed. Hence it should not be surprising that increasing awareness of science correlates with increased levels of atheism.

But it would be wrong to conclude that scientists have atheism as a driving concern in their work or that they actively seek out theories that deny the existence of god. God is simply irrelevant to their work. The negative implications of scientific theories for god are a byproduct of scientific research rather than its principal aim. Non-scientists may be surprised that discussions about god are almost nonexistent at scientific meetings and even in ordinary interactions among scientists. We simply take it for granted that god plays no role whatsoever.

For example, the idea of the multiverse has torpedoed the arguments of religious people that the universe must have had a beginning or that its parameters seem to be fine-tuned for human life, both of which they argue are evidence for god. They seem suspicious that the multiverse idea was created simply to eliminate god from two of the last three refuges in which he could be hiding. (The third refuge is the origin of a self-replicating molecule that was the precursor of life.) In his article titled Does the Universe Need God?, cosmologist Sean Carroll dismisses that idea.

The multiverse is not a theory; it is a prediction of a theory, namely the combination of inflationary cosmology and a landscape of vacuum states. Both of these ideas came about for other reasons, having nothing to do with the multiverse. If they are right, they predict the existence of a multiverse in a wide variety of circumstances. It's our job to take the predictions of our theories seriously, not to discount them because we end up with an uncomfortably large number of universes.

Carroll ends with a nice summary of what science is about and why god really has no reason to be postulated into existence. This is similar to the points I made in my series on why atheism is winning.

Over the past five hundred years, the progress of science has worked to strip away God's roles in the world. He isn't needed to keep things moving, or to develop the complexity of living creatures, or to account for the existence of the universe. Perhaps the greatest triumph of the scientific revolution has been in the realm of methodology. Control groups, double-blind experiments, an insistence on precise and testable predictions – a suite of techniques constructed to guard against the very human tendency to see things that aren't there. There is no control group for the universe, but in our attempts to explain it we should aim for a similar level of rigor. If and when cosmologists develop a successful scientific understanding of the origin of the universe, we will be left with a picture in which there is no place for God to act – if he does (e.g., through subtle influences on quantum-mechanical transitions or the progress of evolution), it is only in ways that are unnecessary and imperceptible. We can't be sure that a fully naturalist understanding of cosmology is forthcoming, but at the same time there is no reason to doubt it. Two thousand years ago, it was perfectly reasonable to invoke God as an explanation for natural phenomena; now, we can do much better.

None of this amounts to a "proof" that God doesn't exist, of course. Such a proof is not forthcoming; science isn't in the business of proving things. Rather, science judges the merits of competing models in terms of their simplicity, clarity, comprehensiveness, and fit to the data. Unsuccessful theories are never disproven, as we can always concoct elaborate schemes to save the phenomena; they just fade away as better theories gain acceptance. Attempting to explain the natural world by appealing to God is, by scientific standards, not a very successful theory. The fact that we humans have been able to understand so much about how the natural world works, in our incredibly limited region of space over a remarkably short period of time, is a triumph of the human spirit, one in which we can all be justifiably proud.

Religious believers misuse this fundamental feature of scientific inquiry, that all conclusions are tentative and that what we believe to be true is a collective judgment made by comparing theories and determining which one is best supported by evidence. They use it to make the misleading case that unless we have proved one single theory to be true, other theories (especially the god theory) merit serious consideration. This is wrong. While we may not be able to prove which theories are right and which are wrong, we do know how to judge which ones are good and which ones are bad.

God is a terrible theory. It fails utterly to deliver the goods, and so should be abandoned like all the other failed theories of the past. In the film Love and Death, Woody Allen's character says, "If it turns out that there is a god, I don't think that he's evil. I think that the worst you can say about him is that basically he's an underachiever." He is right.

June 09, 2011

God is not the 'simplest' explanation for the universe

Believers in god (especially of the intelligent design variety) like to argue that a god is a 'simpler' explanation than any of the alternatives for many natural phenomena. But they seem to equate simple with naïve, in the sense that what makes something simple is that it should be understandable by a child. For example, if a child asks you why the sun rises and sets every day, an explanation in terms of the laws of gravity, Newton's laws of motion, and the Earth's rotation about its own axis is not 'simple'. A child would more likely understand an explanation in which there is a man whose job it is to push the sun around in its daily orbit. This is 'simpler' because the concepts of 'man' and 'push' are familiar to a child, requiring no further explication. But this apparent simplicity is an illusion, because it ignores enormously complicating factors such as how the man got up there, how strong he must be, why we don't see him, and so on. It is because such issues are swept under the rug that this explanation appears to be simple.

In his article titled Does the Universe Need God?, cosmologist Sean Carroll points out that introducing a new ad hoc element like god into a theory actually makes things enormously complicated. The erroneous idea that simplicity is linked to the number of entities involved is based on a misconception of science.

All else being equal, a simpler scientific theory is preferred over a more complicated one. But how do we judge simplicity? It certainly doesn't mean "the sets involved in the mathematical description of the theory contain the smallest possible number of elements." In the Newtonian clockwork universe, every cubic centimeter contains an infinite number of points, and space contains an infinite number of cubic centimeters, all of which persist for an infinite number of separate moments each second, over an infinite number of seconds. Nobody ever claimed that all these infinities were a strike against the theory.

The simplicity of a theory is a statement about how compactly we can describe the formal structure (the Kolmogorov complexity), not how many elements it contains. The set of real numbers consisting of "eleven, and thirteen times the square root of two, and pi to the twenty-eighth power, and all prime numbers between 4,982 and 34,950" is a more complicated set than "the integers," even though the latter set contains an infinitely larger number of elements. The physics of a universe containing 10^88 particles that all belong to just a handful of types, each particle behaving precisely according to the characteristics of its type, is much simpler than that of a universe containing only a thousand particles, each behaving completely differently.

At first glance, the God hypothesis seems simple and precise – an omnipotent, omniscient, and omnibenevolent being. (There are other definitions, but they are usually comparably terse.) The apparent simplicity is somewhat misleading, however. In comparison to a purely naturalistic model, we're not simply adding a new element to an existing ontology (like a new field or particle), or even replacing one ontology with a more effective one at a similar level of complexity (like general relativity replacing Newtonian spacetime, or quantum mechanics replacing classical mechanics). We're adding an entirely new metaphysical category, whose relation to the observable world is unclear. This doesn't automatically disqualify God from consideration as a scientific theory, but it implies that, all else being equal, a purely naturalistic model will be preferred on the grounds of simplicity.

Religious people think that god is a 'simpler' theory because they give themselves the license to assign their god any property they wish in order to 'solve' any problem they encounter, without making the answer given in one area consistent with an answer given elsewhere. But the very fact that the god model is so malleable is what makes it so useless. For example, religious people will argue (as they must) that the way that the world currently exists, despite the suffering, disasters, and catastrophes that seem to afflict everyone indiscriminately, is evidence for a loving god. A colleague of mine who is a very thoughtful and sophisticated person told me recently that when he looks at the world, he sees one that is consistent with the existence of god.

This raises two questions. The first is whether the world that he sees is also consistent with the non-existence of god. If yes, how does he decide which option to believe? If no, what exactly is the source of the inconsistency?

The second question is what the world would need to look like for him to conclude that there is no god. Carroll gives a thought experiment that illustrates the shallowness of those who argue that the evils and misfortunes and calamities that bestride this world are actually evidence for god.

In numerous ways, the world around us is more like what we would expect from a dysteleological set of uncaring laws of nature than from a higher power with an interest in our welfare. As another thought experiment, imagine a hypothetical world in which there was no evil, people were invariably kind, fewer natural disasters occurred, and virtue was always rewarded. Would inhabitants of that world consider these features to be evidence against the existence of God? If not, why don't we consider the contrary conditions to be such evidence?

It is not hard to understand why the concept of god could only have arisen in primitive, or at least pre-modern, times.

Consider a hypothetical world in which science had developed to something like its current state of progress, but nobody had yet thought of God. It seems unlikely that an imaginative thinker in this world, upon proposing God as a solution to various cosmological puzzles, would be met with enthusiasm. All else being equal, science prefers its theories to be precise, predictive, and minimal – requiring the smallest possible amount of theoretical overhead. The God hypothesis is none of these. Indeed, in our actual world, God is essentially never invoked in scientific discussions. You can scour the tables of contents in major physics journals, or titles of seminars and colloquia in physics departments and conferences, looking in vain for any mention of possible supernatural intervention into the workings of the world.

The concept of god is a relic of our ancient history, like the vestigial elements of animal physiology such as the leg bones of some snakes, the small wings of flightless birds like the kiwi, the eyes of the blind mole rat, and the tailbone, ear muscles, and appendix of humans. It will, like them, eventually disappear for the same reason: it has ceased to be of use.

June 08, 2011

The failure of fine-tuning arguments for god

When I ask people why they believe in god, their response almost invariably comes down to them being impressed with the complexity of the world and thinking that it could not have come about without some intelligent agent behind it. It is highly likely that this 'reason' is not the actual cause of their belief but a later rationalization for beliefs that they unthinkingly adopted as part of their childhood indoctrination into religion. When people become adults, they realize that saying they believe something because they were told it as children is likely to expose them to ridicule, and so they manufacture a superficially more rational answer.

The more sophisticated among them, who like to consider themselves modernists accepting of science, argue that the laws of science and the inanimate matter that make up the universe seem to have just the right properties to make life possible, and that this implies that god must have chosen those values in order to enable the emergence of humans. This is what is known as the fine-tuning argument for god. (See also the discussion in the comments on yesterday's post.)

In his article titled Does the Universe Need God?, cosmologist Sean Carroll elaborates on it.

In recent years, a different aspect of our universe has been seized upon by natural theologians as evidence for God's handiwork – the purported fine-tuning of the physical and cosmological parameters that specify our particular universe among all possible ones. These parameters are to be found in the laws of physics – the mass of the electron, the value of the vacuum energy – as well as in the history of the universe – the amount of dark matter, the smoothness of the initial state. There's no question that the universe around us would look very different if some of these parameters were changed. The controversial claims are two: that intelligent life can only exist for a very small range of parameters, in which our universe just happens to find itself; and that the best explanation for this happy circumstance is that God arranged it that way.

I have argued before that this makes no logical sense. It seems to imply that god was somehow locked into a blueprint for what humans should be like, and then had to carefully retro-engineer the evolution of the entire universe so that the humans determined by that blueprint could emerge and survive. But this seems pointlessly Rube Goldbergish. The simpler thing for an omnipotent designer god to do would be to first create the universe and then design humans to fit into whatever emerged. After all, a god can presumably do anything and could have designed us to live in the vacuum of deep space or in the Sun or on any planet in the universe under any conceivable conditions.

But even if we take the fine-tuning argument of religious people on their own terms, we are by no means forced to the conclusion that a god is necessary. In fact, Carroll lists other possible alternatives:

  1. Life is extremely robust, and would be likely to arise even if the parameters were very different, whether or not we understand what form it would take.
  2. There is only one universe, with randomly-chosen parameters, and we just got lucky that they are among the rare values that allow for the existence of life.
  3. In different regions of the universe the parameters take on different values, and we are fooled by a selection effect: life will only arise in those regions compatible with the existence of life.
  4. The parameters are not chosen randomly, but designed that way by a deity.

So postulating a god is only one of many options to explain fine-tuning and by no means the most plausible one. It is not even the most attractive one.

Carroll then addresses the position that religion supplies the answers to the 'why' questions that science cannot.

These ideas all arise from a conviction that, in various contexts, it is insufficient to fully understand what happens; we must also provide an explanation for why it happens – what might be called a "meta-explanatory" account.

It can be difficult to respond to this kind of argument. Not because the arguments are especially persuasive, but because the ultimate answer to "We need to understand why the universe exists/continues to exist/exhibits regularities/came to be" is essentially "No we don't." That is unlikely to be considered a worthwhile comeback to anyone who was persuaded by the need for a meta-explanatory understanding in the first place.

Granted, it is always nice to be able to provide reasons why something is the case. Most scientists, however, suspect that the search for ultimate explanations eventually terminates in some final theory of the world, along with the phrase "and that's just how it is." It is certainly conceivable that the ultimate explanation is to be found in God; but a compelling argument to that effect would consist of a demonstration that God provides a better explanation (for whatever reason) than a purely materialist picture, not an a priori insistence that a purely materialist picture is unsatisfying.

There is no reason, within anything we currently understand about the ultimate structure of reality, to think of the existence and persistence and regularity of the universe as things that require external explanation. Indeed, for most scientists, adding on another layer of metaphysical structure in order to purportedly explain these nomological facts is an unnecessary complication.

It is hard for religious people to accept that there need not be an answer to every 'why' question. What is laughable is that after insisting that the why questions must have answers, religious people simply make up stuff, however preposterous or implausible it may be, without any evidence or even an attempt at justification, and then proudly proclaim that they have solved the problem. It is better to accept that some things are just the way they are than to make up an answer that has no evidence or reason behind it.

June 07, 2011

Why a god is not necessary to create the universe

In an article titled Does the Universe Need God?, cosmologist Sean Carroll provides a rejoinder to those who would try to squeeze god in as an answer to what they perceive as unexplained gaps in our knowledge. It is a long article that is worth reading in full but for those who lack the time, I will excerpt some of the key points.

He starts by making the same point that I made in the series Why atheism is winning, that the long-term outlook for religion is extremely bleak because science and its associated modernistic outlook are making it irrelevant in ways that are hard to ignore even by the most determined religionist.

Most modern cosmologists are convinced that conventional scientific progress will ultimately result in a self-contained understanding of the origin and evolution of the universe, without the need to invoke God or any other supernatural involvement. This conviction necessarily falls short of a proof, but it is backed up by good reasons. While we don't have the final answers, I will attempt to explain the rationale behind the belief that science will ultimately understand the universe without involving God in any way.

Those who want to insert god somewhere, to show that he/she/it is necessary in some way, need to realize that they have at most a window of one second just after the Big Bang to work with.

While we don't claim to understand the absolute beginning of the universe, by the time one second has elapsed we enter the realm of empirical testability. That's the era of primordial nucleosynthesis, when protons and neutrons were being converted into helium and other light elements. The theory of nucleosynthesis makes precise predictions for the relative abundance of these elements, which have passed observational muster with flying colors, providing impressive evidence in favor of the Big Bang model. Another important test comes from the cosmic microwave background (CMB), the relic radiation left over from the moment the primordial plasma cooled off and became transparent, about 380,000 years after the Big Bang. Together, observations of primordial element abundances and the CMB provide not only evidence in favor of the basic cosmological picture, but stringent constraints on the parameters describing the composition of our universe.

He then clarifies what it means to talk about the Big Bang event, a singular event in time, as distinct from the Big Bang model that is the working out of the aftermath of that event.

One sometimes hears the claim that the Big Bang was the beginning of both time and space; that to ask about spacetime "before the Big Bang" is like asking about land "north of the North Pole." This may turn out to be true, but it is not an established understanding. The singularity at the Big Bang doesn't indicate a beginning to the universe, only an end to our theoretical comprehension. It may be that this moment does indeed correspond to a beginning, and a complete theory of quantum gravity will eventually explain how the universe started at approximately this time. But it is equally plausible that what we think of as the Big Bang is merely a phase in the history of the universe, which stretches long before that time – perhaps infinitely far in the past. [My italics] The present state of the art is simply insufficient to decide between these alternatives; to do so, we will need to formulate and test a working theory of quantum gravity.

The problem with "creation from nothing" is that it conjures an image of a pre-existing "nothingness" out of which the universe spontaneously appeared – not at all what is actually involved in this idea. Partly this is because, as human beings embedded in a universe with an arrow of time, we can't help but try to explain events in terms of earlier events, even when the event we are trying to explain is explicitly stated to be the earliest one. It would be more accurate to characterize these models by saying "there was a time such that there was no earlier time."

To make sense of this, it is helpful to think of the present state of the universe and work backwards, rather than succumbing to the temptation to place our imaginations "before" the universe came into being. The beginning cosmologies posit that our mental journey backwards in time will ultimately reach a point past which the concept of "time" is no longer applicable. Alternatively, imagine a universe that collapsed into a Big Crunch, so that there was a future end point to time. We aren't tempted to say that such a universe "transformed into nothing"; it simply has a final moment of its existence. What actually happens at such a boundary point depends, of course, on the correct quantum theory of gravity.

The important point is that we can easily imagine self-contained descriptions of the universe that have an earliest moment of time. There is no logical or metaphysical obstacle to completing the conventional temporal history of the universe by including an atemporal boundary condition at the beginning. Together with the successful post-Big-Bang cosmological model already in our possession, that would constitute a consistent and self-contained description of the history of the universe.

Nothing in the fact that there is a first moment of time, in other words, necessitates that an external something is required to bring the universe about at that moment. [My italics]

The Big Bang event itself does not necessarily imply that the universe had a beginning in time, and even if it should turn out that it did, that does not imply a beginner. This strikes at the heart of the arguments of religious apologists, who need a beginning in order to make their claim that a beginning necessarily implies a beginner. That argument is weak to begin with, but it is the main one they have for god.

Religious people know that this conclusion is a devastating one for them. After all, if no god is required to create the universe, then he is truly an unnecessary concept. So they will fight it, ignore it, or obfuscate the point with theological jargon.

May 29, 2011

Inattentional deafness

I have long been intrigued by the fact that when I am absorbed in reading, I completely miss what people have said, even if they have been speaking directly to me. This can be embarrassing but in my case people tend to indulgently excuse it because of the stereotype of the 'absent minded professor'. Being a theoretical physicist also helps since we are considered to be a little weird anyway.

But since I was in the same room as the speaker, the sound waves must have entered my ears and reached my brain; yet I have absolutely no memory of hearing anything. It is as if the sound never even entered my head. This article explains why.

The researchers believe this deafness when attention is fully taken by a purely visual task is the result of our senses of seeing and hearing sharing a limited processing capacity. It is already known that people similarly experience 'inattentional blindness' when engrossed in a task that takes up all of their attentional capacity – for example, the famous Invisible Gorilla Test, where observers engrossed in a basketball game fail to observe a man in a gorilla suit walk past. The new research now shows that being engrossed in a difficult task makes us blind and deaf to other sources of information.

So it seems like we never really 'hear' anything until the brain has actually processed the incoming sound waves to register as sound. If the part of my brain responsible for this task is otherwise occupied, I haven't really 'heard' it.

This has happened to me at times other than when reading, such as when I am merely thinking about something and have tuned the speaker out. I am sure everyone has had the experience of daydreaming and missing what was said. This adds to the evidence that certain kinds of multitasking are impossible at a basic cognitive level.

May 19, 2011

The McGurk effect

Blog reader Henry sent me the link to this clip from the BBC program Horizon about what is known as the McGurk effect, which shows that when the brain receives two conflicting inputs, one aural and one visual, it forces you to register just one. Lawrence Rosenblum of the University of California, Riverside explains the effect and demonstrates how in this particular case the visual input overrides the sound.

If we cannot do such a simple act of multitasking, imagine how unlikely it is that we can do more complex and challenging multitasking.

May 18, 2011

The motives of the Templeton Foundation

The June 21, 2010 issue of The Nation has a good article by Nathan Schneider titled God, Science and Philanthropy that looks at the work of this wealthy foundation, which dangles generous grants and awards an annual cash prize larger than the Nobel prize, one that goes, as Richard Dawkins says, "usually to a scientist who is prepared to say something nice about religion."

Along with supporting politically right-wing organizations, the foundation seems intent on luring scientists into signing on to the idea that science and religion are compatible. Nobel prize-winning chemist Harold Kroto is one of those fighting back against it and says of the foundation that "They are involved in an exercise that endangers the fundamental credibility of the scientific community."

The myth of multitasking

Since I work at a university and am around young adults all the time, I have long been aware that young people today are avid consumers of multimedia, who are adept at emailing, texting, listening to mp3 players, surfing the web, checking up on Facebook, etc. It seems like they are quite proficient at multitasking.

I have always been a poor multitasker. I cannot read or do any work that requires serious thinking if I can hear conversation or loud noises in the background. I have found that I cannot even listen to music in the background when reading. But I know people who seem to thrive on that kind of ambient sound and even deliberately go to coffee shops to do work such as grading papers or writing, things that would be impossible for me.

I had thought that my lack of ability to multitask was partly due to being old and not acquiring these skills while young, similar to my slow reaction time when playing video games (which results in being destroyed when playing them with my children) and my inability to manipulate my thumbs dexterously enough to use the small keys on cell phones without making numerous mistakes.

I thought my poor multitasking skills may also be due to a cognitive disability, similar to the one that prevents me from ever seeing the hidden 3-D images in those so-called autostereogram ('Magic Eye') pictures that were such a rage a few years ago. The Sunday papers used to have one and my daughters would look briefly at it and say, "Oh, look at the dolphins" or whatever it was that day whereas, despite my strenuous efforts at staring using all the recommended tricks, all I saw were colored dots and wiggly lines. I later learned that some people never see the hidden image, due to some feature of their visual-cognitive brain function. It was not reassuring to discover that I have a defective brain, and that there is no warranty.

But a study by Stanford researchers Eyal Ophir, Clifford Nass, and Anthony D. Wagner titled Cognitive control in media multitaskers, published in 2009 in the Proceedings of the National Academy of Sciences, seems to indicate that hardly anyone can really multitask, and that those who think they can are only deluding themselves.

In an interview with the PBS program Frontline, lead researcher Clifford Nass said that it is possible to multitask certain things if those tasks require different parts of the brain. For example, one might be able to cook and keep an eye on the children, do gardening while listening to music, or drive while talking. But classical psychology says that when it comes to doing more than one task that requires similar cognitive abilities, the brain simply cannot do it. What people actually do is try to rapidly switch their attention from one task to the next.

Nass and his colleagues hypothesized that carrying out successful multitasking of this latter sort requires three distinct skills. One is the ability to filter, to detect irrelevancy, to be able to quickly distinguish between the things that are important and those that are not. The second is the ability to switch rapidly from one task to the next. The third is the ability to sort and organize information in the brain so as to keep track of the results of the different tasks.

The researchers expected to find that people who were 'high multitaskers', i.e., people who tend to do multiple things at once, would be very good in at least one of those areas compared to the 'low multitaskers', i.e., people like me who have to do things sequentially. What they were surprised to find was that the high multitaskers were terrible in all three areas.

So we know, for example, that people's ability to ignore irrelevancy -- multitaskers love irrelevancy. They get distracted constantly. Multitaskers are very disorganized in keeping their memory going so that we think of them as filing cabinets in the brain where papers are flying everywhere and disorganized, much like my office.

And then we have them being worse at switching from one task to another. ... It's very troubling. And we have not yet found something that they're definitely better at than people who don't multitask.

There is a serious cost to this. The researchers say that trying to multitask leads to deficiencies in analytical reasoning because people don't stick to one thing long enough to think it through but instead shift to another task, thus thinking in fragments.

We worry about it, because as people become more and more multitaskers, as more and more people -- not just young kids, which we're seeing a great deal of, but even in the workplace, people being forced to multitask, we worry that it may be creating people who are unable to think well and clearly.

And it seems as if simply telling them that trying to multitask is bad does not have any effect.

One would think that if people were bad at multitasking, they would stop. However, when we talk with the multitaskers, they seem to think they're great at it and seem totally unfazed and totally able to do more and more and more.

[V]irtually all multitaskers think they are brilliant at multitasking. And one of the big new items here, and one of the big discoveries is, you know what? You're really lousy at it. And even though I'm at the university and tell my students this, they say: "Oh, yeah, yeah. But not me! I can handle it. I can manage all these".

One of the biggest delusions we hear from students is, "I do five things at once because I don't have time to do them one at a time." And that turns out to be false. That is to say, they would actually be quicker if they did one thing, then the next thing, then the next. It may not be as fun, but they'd be more efficient.

One interesting finding in the study was that there were no gender differences, which goes against the myth that women are either naturally good multitaskers or become so because of the multiple roles imposed on them by society, such as caregiver, housekeeper, breadwinner, etc. This may be an illusion that arose from the fact that the multiple tasks that they have traditionally had to do (keeping an eye on the children while cooking or cleaning the house and listening to the radio) largely involved different parts of the brain and thus did not pose any serious cognitive conflicts.

The big challenge will be how to wean people away from thinking they can multitask. We are not doing them any favors by letting them continue to delude themselves.

May 11, 2011

Living sculptures

These wind-powered sculptures by Theo Jansen are amazing to behold.

You can see more of Jansen's creations at his website.

(Via Why Evolution is True.)

May 10, 2011

Altruism

Jerry Coyne has a nice post about the various forms of altruism and what biology and genetics do, or do not, have to do with them.

May 08, 2011

How the face evolved

Your Inner Fish is a book by Neil Shubin, the leader of the team that in 2006 discovered Tiktaalik, the 375-million-year-old transitional fossil between fish and land animals. The book shows how the basic morphology (i.e., form and structure) of human bodies can be traced back to our fishy ancestors.

The BBC has a nice report (with a short video) on how some of our features, especially the face, came about. In particular, it explains the presence of the philtrum, the little groove on our upper lip just below the nose that has no obvious function.

April 26, 2011

Participants needed for brain study on morality

A reader of this blog told me that he had participated in a study on morality and that they are looking for more people.

Study Name: Moral Boundaries
Location: CCIR at University Hospital (in Cleveland)
Researcher: Megan Norr

Detailed Description:

This study consists of a 2.5 hour research appointment which takes place at the Case Center for Imaging Research at University Hospital. This study seeks to define which brain areas are responsible for moral judgment processing and to determine how they are working with other parts of the brain when we make moral judgments. By using behavioral questionnaires to gather information about individual attitudes on morality and fMRI to examine brain activation in response to a variety of stimuli, we hope to shed some light on the neural representation of human morality. During the appointment, participants will complete a computer-based questionnaire which takes roughly 1 hour and participate in an MRI scan which will take 1 hour and 10 minutes. The MRI session consists of a variety of unique tasks, including viewing of photos and video, listening to stories, reading text, and responding to opinion questions. Some stimuli in this study may be morally challenging or alarming. All participants will have the opportunity to view sample stimuli prior to beginning the study. Participation is voluntary. Participants will be compensated a flat rate of $50. If you are a medical doctor, medical student, or professional in the fields of biology or medicine, you are ineligible for this study.

I believe they are looking for people in the 30-40 year old range but they may not be too rigid about the boundaries.

The blog reader who participated said this about his experience:

In short, It's a morality study that uses MRI and behavioral measures to examine human morality. They investigate brain areas responsible for moral judgment and moral attitudes. It was a fun experience, asked many thought-provoking questions that revealed many subtleties about myself after some self-reflection and makes for interesting conversation amongst friends over drinks. Would love to give examples, but I don't want to influence the test in any way if you participate. So neat!.. O and the frosting and cherry on top: they give you a 3D movie of your brain on CD when you are done!

If you are interested you can register and schedule an appointment online or contact Megan Norr at megan.norr@gmail.com.

April 15, 2011

How the eye evolved

Richard Dawkins gives a clear explanation.

April 10, 2011

Prostate cancer tests

Older men like me are routinely given a PSA test for prostate cancer as part of our check-ups. My numbers fluctuated from year to year. Some years my number would rise slightly and my physician would alert me to it, but the next year it would drop. I never did anything about it since I was not convinced that the tests were conclusive enough. Now a new study seems to indicate that my skepticism was justified, since the PSA test appears to have a high rate of false negatives and an even higher rate of false positives.

This latest study was carried out in Norrkoping in Sweden. It followed 9,026 men who were in their 50s or 60s in 1987.

Nearly 1,500 men were randomly chosen to be screened every three years between 1987 and 1996. The first two tests were performed by digital rectal examination and then by prostate specific antigen testing.

The report concludes: "After 20 years of follow-up, the rate of death from prostate cancer did not differ significantly between men in the screening group and those in the control group."

The favoured method of screening is the prostate specific antigen (PSA) test.

However, around 15% of men with normal PSA levels will have prostate cancer and two-thirds of men with high levels of PSA do not in fact have prostate cancer.

One study has suggested that to prevent one death from prostate cancer you would have to screen 1,410 men and treat 48 of them. (My italics)
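To see how numbers like those can arise, here is a minimal Bayes-style sketch. The prevalence, sensitivity, and specificity values below are purely illustrative assumptions that I have chosen to roughly reproduce the quoted figures; they are not taken from the study.

    # Illustrative calculation (assumed numbers, not from the study) of how base
    # rates turn a screening test into many false positives and false negatives.

    prevalence = 0.20    # assumed fraction of screened men who actually have prostate cancer
    sensitivity = 0.45   # assumed fraction of cancers that produce a high PSA reading
    specificity = 0.75   # assumed fraction of cancer-free men with a normal PSA reading

    true_pos = prevalence * sensitivity
    false_neg = prevalence * (1 - sensitivity)
    false_pos = (1 - prevalence) * (1 - specificity)
    true_neg = (1 - prevalence) * specificity

    # Fraction of high-PSA men who do NOT have cancer: ~0.69, roughly two-thirds
    print(false_pos / (true_pos + false_pos))

    # Fraction of normal-PSA men who DO have cancer: ~0.15, roughly 15%
    print(false_neg / (false_neg + true_neg))

The point is not the particular numbers but the structure: when most of the people being screened do not have the disease, even a moderately specific test generates a flood of false positives.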

April 06, 2011

Scientific American on evolution education

Five years after the Dover trial, Scientific American looks at the state of teaching evolution.

(via Machines Like Us.)

March 03, 2011

Ron Paul thinks that evolution is only a theory

February 27, 2011

A new planet in our Solar system?

I was stunned recently by this report that there may be a massive new planet that we did not know about in our very own Solar system. I thought this must be a hoax report but apparently it is being considered as a serious possibility.

The hunt is on for a gas giant up to four times the mass of Jupiter thought to be lurking in the outer Oort Cloud, the most remote region of the solar system. The orbit of Tyche (pronounced ty-kee), would be 15,000 times farther from the Sun than the Earth's, and 375 times farther than Pluto's, which is why it hasn't been seen so far.

But scientists now believe the proof of its existence has already been gathered by a Nasa space telescope, Wise, and is just waiting to be analysed.

You would have thought that our knowledge of our own Solar system was complete, but apparently not. The suggestion that Tyche exists was first made as far back as 1999, but not everyone is persuaded.
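As a rough sense of scale (my own back-of-the-envelope aside, not something from the report): Kepler's third law says that an object orbiting the Sun with a semi-major axis of a astronomical units takes about a^(3/2) years to complete one orbit, so at the quoted distance of 15,000 AU the period would be roughly

P ≈ (15,000)^(3/2) years ≈ 1.8 million years

for a single trip around the Sun.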

We should know with greater certainty either way by 2012. This is what makes science so much fun. There are always new discoveries to look forward to.

February 11, 2011

Solar sail vessel unfurled

The idea that electromagnetic radiation can exert pressure is one that I taught in my physics courses. The idea of using the pressure from solar radiation to propel a spacecraft has been around for a long time, and I used to give it as a homework problem.

It looks like the idea has finally come to fruition. Japan used one to fly by Venus in 2010 and now NASA has deployed one to orbit the Earth. Plans are underway to use one to fly to Jupiter later in the decade.
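For anyone curious, here is a back-of-the-envelope version of the kind of homework problem I have in mind. It is only a sketch: the sail is assumed to be a perfect reflector with an area of 20 m × 20 m and the spacecraft mass is an assumed round number, not the specification of either mission; for a perfect reflector the force is F = 2SA/c, with S the solar flux at Earth's distance.

    # Radiation-pressure force on an ideal, perfectly reflecting solar sail at 1 AU.
    # The sail size and spacecraft mass are assumed round numbers for illustration.

    S = 1.36e3        # solar flux at Earth's distance, W/m^2
    c = 3.0e8         # speed of light, m/s
    A = 20.0 * 20.0   # assumed sail area, m^2

    F = 2 * S * A / c          # force on a perfect reflector, newtons
    print(F)                   # ~3.6e-3 N: a few millinewtons

    m = 100.0                  # assumed spacecraft mass, kg
    a = F / m                  # ~3.6e-5 m/s^2
    print(a * 86400 * 365)     # ~1.1 km/s of speed gained per year

The force is minuscule, but because sunlight never runs out, it accumulates into a respectable change in speed over a year, which is the whole appeal of the idea.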

solarsail.jpeg

(via Machines Like Us.)

January 14, 2011

The story of life

(via Machines Like Us.)

January 06, 2011

How the case against the MMR vaccine was fixed

Some of you may be aware that many parents are not giving their children the MMR (measles, mumps, rubella) vaccine out of fear that it may cause autism. These fears were generated by a paper by Andrew Wakefield and others, published in 1998 in the British medical journal The Lancet, suggesting such a link. The findings were challenged, but the journal only withdrew the paper in 2010.

The British Medical Journal has now published a detailed investigation and concludes that all of the twelve original cases reported had had their data misreported or altered in order to make the link.

The Lancet paper was a case series of 12 child patients; it reported a proposed "new syndrome" of enterocolitis and regressive autism and associated this with MMR as an "apparent precipitating event." But in fact:

  • Three of nine children reported with regressive autism did not have autism diagnosed at all. Only one child clearly had regressive autism

  • Despite the paper claiming that all 12 children were "previously normal," five had documented pre-existing developmental concerns

  • Some children were reported to have experienced first behavioural symptoms within days of MMR, but the records documented these as starting some months after vaccination

  • In nine cases, unremarkable colonic histopathology results—noting no or minimal fluctuations in inflammatory cell populations—were changed after a medical school "research review" to "non-specific colitis"

  • The parents of eight children were reported as blaming MMR, but 11 families made this allegation at the hospital. The exclusion of three allegations—all giving times to onset of problems in months—helped to create the appearance of a 14 day temporal link

  • Patients were recruited through anti-MMR campaigners, and the study was commissioned and funded for planned litigation

None of the families of the children were aware that Wakefield was involved in a lawsuit that would benefit from showing the link he purportedly discovered.

As the editors of the BMJ say:

Who perpetrated this fraud? There is no doubt that it was Wakefield. Is it possible that he was wrong, but not dishonest: that he was so incompetent that he was unable to fairly describe the project, or to report even one of the 12 children's cases accurately? No. A great deal of thought and effort must have gone into drafting the paper to achieve the results he wanted: the discrepancies all led in one direction; misreporting was gross.

Meanwhile the damage to public health continues, fuelled by unbalanced media reporting and an ineffective response from government, researchers, journals, and the medical profession. Although vaccination rates in the United Kingdom have recovered slightly from their 80% low in 2003 they are still below the 95% level recommended by the World Health Organization to ensure herd immunity. In 2008, for the first time in 14 years, measles was declared endemic in England and Wales. Hundreds of thousands of children in the UK are currently unprotected as a result of the scare, and the battle to restore parents’ trust in the vaccine is ongoing.

What Wakefield set in motion was a monstrous crime, playing on parents' great fear that some well-meaning action on their part might harm their children, something for which they would never forgive themselves. Fortunately for me, my own children were vaccinated well before this scare arose; otherwise I too would have agonized over what to do.

Because so many parents have withheld the vaccine out of these fears, all children have been put at risk; many have suffered from these diseases and some have died. Despite this new report, it will be hard to convince die-hard vaccine skeptics to change their minds.
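For context on the 95% figure quoted above: a standard epidemiological rule of thumb is that the herd immunity threshold is 1 − 1/R0, where R0 is the average number of people one infected person would go on to infect in a fully susceptible population. Measles is among the most contagious diseases known, with R0 commonly estimated at somewhere between 12 and 18, so the threshold works out to

threshold = 1 − 1/R0 ≈ 1 − 1/15 ≈ 93%

which is why vaccination coverage in the low 80s leaves so much room for outbreaks.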

(via Balloon-Juice)

January 02, 2011

The story of the whale

Of all the arguments that are used by religious people against evolution, the most fraudulent is that there are no transitional forms between species. People who say this either willfully ignore the evidence that does exist or think that a transitional form must be a hybrid between two currently existing species.

Do you think that no one could be that stupid? Behold the infamous crocoduck argument.

Yes, some creationists like Kirk Cameron are so ignorant of the theory of evolution that they will actually go on national TV and make fools of themselves in that way.

Fossilization can occur only under very special conditions, which is why fossils are so rare and why the discovery of transitional forms like Tiktaalik is so notable.

But if someone should raise this argument with you, point them to whale evolution. Since 1978, we have pieced together, step by step, how a mammal that lives in the ocean came about, and it is a truly remarkable and well-evidenced story. The short video showing how the story was pieced together is fascinating.

It's the tale of an ancient land mammal making its way back to the sea, becoming the forerunner of today's whales. In doing so, it lost its legs, and all of its vital systems became adapted to a marine existence -- the reverse of what happened millions of years previously, when the first animals crawled out of the sea onto land.

Some details remain fuzzy and under investigation. But we know for certain that this back-to-the-water evolution did occur, thanks to a profusion of intermediate fossils that have been uncovered over the past two decades. (My italics)

Starting with wolf-sized carnivores that existed between 60 and 37 million years ago, we see in the fossil record the steady evolution of features that were once suitable for living on the land becoming adapted to water.

None of these animals is necessarily a direct ancestor of the whales we know today; they may be side branches of the family tree. But the important thing is that each fossil whale shares new, whale-like features with the whales we know today, and in the fossil record, we can observe the gradual accumulation of these aquatic adaptations in the lineage that led to modern whales.

The story of the whale likely will not convince your hardcore creationists because they have learned how not to see things they don't want to see. And the truly loony will say that god deliberately planted these fossils to give us the impression that evolution occurred.

But the story of the whale cannot fail to have an impact on those who are genuine seekers of truth.

December 17, 2010

Plenty of time for evolution to occur

Critics of evolution sometimes try to argue that the mechanism of natural selection works too slowly to produce the world we now have in the time that was available. P. Z. Myers shows why that argument is wrong.

December 11, 2010

How bacteria talk to each other...

... and can thus work together for good or bad. Fascinating.

November 02, 2010

Why do so many birds die by flying into power lines?

This was a puzzle and attempts to make the power lines more visible failed. Apparently the answer is that birds have blind spots in their field of vision that make the power lines 'invisible' to them, due to the way they have evolved to become successful foragers.

Although the heavy bustard differs greatly in general body shape from the delicate crane and stork, the birds share a foraging technique - visually guiding their bill to take food items.

This technique requires excellent vision at the end of the bill, resulting in a narrow field of vision and wide "blind spots".

"Once we saw the wisdom of looking at the problem through birds' eyes rather than human eyes, it all made sense," says Professor Graham Martin.

"These birds can see straight ahead in flight but they only need to pitch their heads forward by a small amount and they will be blind in the direction of travel."

Many species of bird have been observed looking down during flight, possibly to locate fellow birds and suitable foraging and nesting sites.

Narrow binocular fields combined with birds' tendencies to look down effectively means certain species cannot see power lines until it is too late.

It is sad that there seems to be nothing we can do about it.

November 01, 2010

Alcohol more harmful to society than heroin?

The former chief drug advisor to the UK government, who was sacked from that post in 2009, has published a study that examines the harm to the individual and to society of various drugs.

alcohol.gif

Heroin, crack cocaine, and crystal meth are the most harmful to individual users but the widespread use (and abuse) of alcohol is what makes it the most harmful to society, followed by heroin and crack cocaine.

October 21, 2010

Physicists and climate change

In 2007, the American Physical Society issued a short but strong statement declaring that the evidence for global warming is incontrovertible. It is no secret that there is a very small but vocal minority within the APS membership who dispute the idea that global warming has a significant human cause and who were upset by the APS's strong stand. Because of the fuss they created, the APS issued a longer clarifying statement in 2010 providing some context and the basis of its reasoning. Both statements can be read here.

A minor kerfuffle has now broken out because a physicist named Hal Lewis has resigned from the American Physical Society in protest at its stance on climate change. (Thanks to Chaz for the link.)

I am not sure why it is significant when a retired 87-year-old physicist whose work during his research career had nothing to do with climate change resigns from the APS in protest. He is not a 'top' physicist: although I do not doubt that he is competent in his specialized field and known within it, I would guess that most physicists have not heard of him. The claim in some global warming skeptic circles that Lewis's resignation letter is the equivalent of Martin Luther nailing his theses to the church door and sparking the Protestant Reformation is laughable. I predict that it will not cause even a ripple within the physics community.

Lewis is not like Freeman Dyson, for example, another 87-year-old physicist who is also a global warming skeptic. Although Dyson too has no background in climate science, at least he is very well known among physicists, and any theoretical physicist in any field around the world would likely know his name and have some awareness of his work.

I agree with Lewis that money is having a negative effect in general in that it may be distorting the direction of research, but there is no evidence to support his charge that it has influenced the APS's stance on climate change.

The APS has issued a statement in response to the Lewis resignation.

October 17, 2010

Benoit Mandelbrot

The mathematician who founded the discipline of fractal geometry has died at the age of 85. To see some of the beautiful patterns generated by fractals, see here.

October 09, 2010

DNA coiling and replicating

Via Machines Like Us, here is a wonderful animation of DNA coiling and replicating.

October 06, 2010

The Sound of Science

(via Why Evolution is True.)

September 07, 2010

Stephen Hawking on the universe and god

Recently religious apologists have taken to harping on the question "How can something come from nothing?" because they think that science cannot explain how the universe came into existence. Of course, their own answer that "God must have done it!" is not an answer at all since it merely shifts the problem to that of how god could come into being from nothing.

Stephen Hawking has recently published a book that says that we can indeed understand how the universe came into being without invoking god. The idea itself has been known for some time, but when Hawking says it, it generates a lot of media attention. Cosmologist Sean Carroll explains Hawking's ideas in a three-minute video.

In short, science has not proved that there is no god (because such proofs are impossible), but it has shown that there is no need for one.

November 18, 2009

Transitional forms

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

In the previous post, I said that one thing that keeps creationists from 'seeing' the truth of evolution is that their teleological viewpoint makes them think that species in their current form are the aim of creation. If that is the case, why would god bother making anything else? Hence ancestral forms of current species that are unlike anything that currently exist simply have no place in their model.

Another mental block that prevents them from seeing transitional forms for what they are also arises from this teleological viewpoint. Here they are misled by the very word 'transitional', which suggests something less than perfect and on the way to perfection.

In an online debate with Eugenie Scott, the head of the National Center for Science Education, Ray Comfort makes the following jaw-dropping statement, in which he illustrates this misconception by pointing to what he thinks is the weakness of the theory of evolution:

Nothing we have in creation is half evolved. The cow has a working udder to make drinkable milk. The bee has working apparatus to make edible honey. We don't find a half-evolved cow or bee. None of the 1.4 million species on the Earth has half an eye. All have the necessary functioning equipment, from the brain, to the teeth, to the eye, to limbs, to reproductive necessities. Everything that we see in creation is in full working order—from the sun, to the mixture of the air, to the seasons, to fruit trees and vegetables, to the animal kingdom—from the tiny ant right up to the massive elephant.

But not only do we see this mature completion in creation; we see it displayed in the fossil record. It reveals that each animal was complete.

I went to the Smithsonian to see the fossils galore, and they were there—millions of fossils that were evidence of special creation. The Smithsonian didn't have any transitional fossils that proved evolution (staunch believers claim that they have them, but not on display). I also visited the evolution museum in Paris (Grande Galerie de L'Evolution). I took a camera crew, and we spent an hour looking for the evolution exhibit. It didn't have one. All it had were millions of fossils of fully formed animals that God created (my italics).

This is a perfect example of creationists not 'seeing' the evidence for evolution that the rest of us see. It reveals the creationist teleological belief that everything we have now is in its final form and is functioning as designed. The very use of the phrase 'half evolved' reveals the deep misconceptions originating from a teleological viewpoint, because that phrase is meaningless unless one sees current species as being in their final, perfectly functioning forms.

In this view, a 'transitional' form must be something less than perfectly functioning. What Comfort thinks evolution predicts is that transitional forms should consist of animals malformed in weird ways, like cows with udders that do not produce milk or bees that have not figured out yet how to make honey or human beings with only one leg. This displays a staggering ignorance of the most basic elements of how evolution works. But because Comfort has a teleological view that starts from the end, he cannot see that all of us, even though we are fully functioning and adapted to our present environment, are also at the same time transitional forms even though we don't know how we will evolve in the future.

Evolution tells us what we evolved from, not what we are evolving to. Every species that lives now or has ever lived is both 'fully evolved' (in that it is the result of successful adaptations to its past environments) and a transitional form (in that it will evolve in the future as a result of new environmental pressures). There is no such thing as being 'fully evolved' in the Comfort sense of having reached unchanging perfection.

There are only three reasons I can think of for people making the kinds of extraordinary statements that Comfort makes above.

One is, of course, outright stupidity, coupled with ignorance. One should never rule that out.

Another reason is dishonesty, in that they know they are spreading falsehoods about what evolution is but think that saving souls for Jesus compensates for lying to them. One cannot rule that out either. The ranks of religious liars and charlatans are legion.

The third and most charitable explanation, which is what I am suggesting in this series of posts, is that they simply haven't been able to make the Gestalt-type switch from the old teleological and Platonic worldview to the modern scientific one. While scientists can look at living organisms and fossils and see them as both fully functioning and transitional, creationists can see only a 'fully evolved' object. This is an almost perfect example of what happens when you cannot make the Gestalt switch to see two images while viewing a single object. While scientists can look at the image below and see both a duck and a rabbit, for creationists the duck is still only a duck, and as a consequence, the two pointy things on the left can only be its bill.

Duck-Rabbit_illusion.jpg

It is quite sad.

POST SCRIPT: Here's a 'fully evolved' ape

From the BBC comedy show Not the Nine O'Clock News.

November 17, 2009

Why creationists do not 'see' evolution

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

One specific creationist religious belief whose origins I have been curious about is the bizarre argument that is advanced by anti-evolution religious people about how the lack of transitional fossils is undermining the theory of evolution. This argument mystifies scientists because of course there are huge numbers of such fossils. The evidence is incontrovertible. In fact, every living or fossilized organism can also be considered a transitional form, since change is constant. It should also be borne in mind that Darwin arrived at his theory without having the wealth of fossils that are now available, basing his arguments largely on biogeography, the similarities in body patterns of animals, embryology, and the existence of vestigial organs. Nowadays, the fossils that keep being found and the relationships that have been discovered between the DNA molecules of species have sealed the case for evolution.

Fossils are extra evidence and the case for evolution would be strong even without them. So why do creationists keep harping on transitional fossils? One reason is that they think it is their strongest point. They also know that fossils seem more persuasive to the general public because we can actually see them with the naked eye.

But it may be that they are possessed of a deep misconception (like those involving electric current) about how evolution works that prevents them from actually 'seeing' the evidence the way that scientists see it. Changing that deep misconception requires a Gestalt-type switch but may prove as hard as getting people to understand that electric current flows in closed loops and is not used up.

A few weeks ago, I had quite a bit of fun with Ray Comfort's banana argument and with Kirk Cameron's belief that a transitional fossil is a weird hybrid between two existing species, the latter giving as an example an animal with the head of a crocodile and the body of a duck, which he cleverly calls a 'crocoduck'. But it appears that I was wrong in crediting him with originating this inspired piece of idiocy. It apparently goes back much further, to at least Duane Gish, one of the founders of 'modern creationism' (now there's an oxymoron for you). Biologist Jerry Coyne says he heard Gish give a talk where he showed a cartoon of what he expected a transitional form between a fish and a mammal to look like: an animal whose front half was a cow and whose rear half was a fish. (Jerry Coyne, Why Evolution is True, 2009, p. 47.) Gish's message, like Kirk Cameron's, is "Ha! Ha! These wacky evolutionists may be willing to believe such crazy things but we are too smart for that."

In this case, I think that these religious people have a wrong view of species that dooms them from the start. Like the pre-Galilean theorists of motion who thought that the end point of motion was what was important, or the Platonic idealists who focused on the essential unchanging nature of things, they too make the mistake of starting from the end point.

In the case of biology, this translates into a teleological view that sees all the current species as the end point, the convergence if you will, of a grand plan. Hence the word 'transitional' does not mean to them an ancestor of a current species that looks different from anything that we currently see, because such things are inconceivable in their teleological model which sees everything as purpose-driven. For them, such an organism would be unnecessary, not serving any purpose. As long as they have a teleological view of the world with its current life forms being representatives of a Platonic ideal, the very word 'evolution' will mean something very different to them from what it means to the rest of us.

So what can a transitional form mean to people with that view? The only transitional forms that they can conceive of are the curious hybrids they keep coming up with, like the crocoduck and the cow-fish. Unfortunately, as I said yesterday, even some of the visual images that we have of the process of evolution, such as the one that draws it as fish→amphibian→monkey→human (with the drawing of each showing what a current typical specimen looks like), reinforce this misconception by suggesting that evolution consists of transitions between forms that currently exist.

When these creationists claim there is lack of fossil evidence of transitional forms, they mean the absence of fossils of these bizarre hybrids. It is clear that people like Gish, Comfort, and Cameron are 'seeing' the theory of evolution in a very different way from the way that scientists see it, and this explains why they will keep coming up with theories so outlandish that we are often at a loss to know how to even start to refute them.

Until they make that Gestalt switch and see evolution and transitional forms the way that scientists see it, they are hopelessly lost. The duck, for them, will remain a duck.

Another obstacle to creationists 'seeing' evolution will be discussed in the next post.

POST SCRIPT: Science vs. religion debate

Thanks to Machines Like Us, you can see the entire recent debate between Christopher Hitchens, Sam Harris, and Daniel Dennett on the one hand and Dinesh D'Souza (Roman Catholic), Shmuley Boteach (Orthodox Jewish rabbi), and Robert Wright (whom I have labeled a religious atheist) on the other. It was held at La Ciudad de las Ideas in Mexico.

November 16, 2009

The power of subconscious theories

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

The existence and history of religion tells us that people are willing to believe things for which there is no evidence and that they will fight to hold on to them even in the face of overwhelming evidence and arguments to the contrary. But when those beliefs collapse, as they sometimes do, the switch to disbelief can often be quite sudden. I know that in my case, I had been struggling (unsuccessfully) to reconcile my scientific ideas with that of a god for some time. The realization that everything made a lot more sense if there was no god hit me like a Gestalt switch.

One specific creationist religious belief whose origins I have been curious about is the bizarre argument that is advanced by anti-evolution religious people about how the lack of transitional fossils is undermining the theory of evolution. This argument mystifies scientists because it is so palpably wrong and the fossil evidence is so strong. So where does this weird idea come from? And why does it persist?

As much of research in science education has shown, robust misconceptions are often not simply bits of false knowledge (like thinking that Portland is the capital of the state of Maine) that can be easily corrected, but instead are the manifestations of elaborate theories that emerge from a deeply rooted but fundamentally flawed premise. As long as that flawed premise remains intact and unexamined, the misconceptions that flow from it will reappear even if countered in specific cases.

I have seen this phenomenon in my own teaching of electricity to people without a science background. One of the strong misconceptions that people have about electric current is that it emerges from a source (a battery or an electrical outlet), flows through the wire, and is then 'used up' by the radio or light or whatever device it is connected to. They also think that a battery always supplies the same amount of current. Based on this model of electricity, they will then make wrong predictions about how current will flow in more complicated circuits, say by connecting two or more devices to the same source of current.

In actuality, current is never used up. It just flows around in a circuit. Current flows out of one end of the battery (or other source), goes through one wire to the device, passes through the device, and then flows back through another wire into the other end of the battery. The amount of current flowing out of the battery at one end is exactly equal to the amount of current flowing into it at the other end. But it is extraordinarily hard to persuade novice learners of this model, even when they want to learn about electricity and have no reason to resist it. After all, the Bible does not say anything about electricity. When I tell them how current really behaves, they believe me because I am an authority figure. And yet the misconceptions persist.

If you teach the right model of current to people and then ask them a direct question about how current flows, they will give back the right answer. But when they are asked something indirect, like giving them a circuit and asking them to predict how current will flow, very often they will come up with an answer that is at variance with how it really will behave. If you trace the reasoning of the wrong answer back to its source, you will find that it arises from their original misconception of current being used up and the battery producing a fixed amount of current, even though they consciously thought they had rejected that old way of thinking. When you point this out, they will think that this time they have definitely overcome the misconception. But when they are given a yet more complicated circuit, very often they will make a wrong prediction again, based on the same underlying misconception.
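To make the kind of prediction exercise concrete, here is a minimal sketch using nothing but Ohm's law for an idealized fixed-voltage battery and simple resistive bulbs. The voltage and resistance values are arbitrary illustrative choices, not taken from any particular lesson.

    # An idealized battery of fixed voltage driving simple resistive bulbs (Ohm's law).
    # Two things the 'used up' model gets wrong:
    #   1. the current leaving the battery equals the current returning to it, and
    #   2. the battery does not supply a fixed current; the circuit determines it.

    V = 9.0    # assumed battery voltage, volts
    R = 30.0   # assumed resistance of a single bulb, ohms

    one_bulb = V / R                            # 0.30 A out of, through, and back into the battery
    two_bulbs_series = V / (R + R)              # 0.15 A: a second bulb in series *reduces* the current
    two_bulbs_parallel = V / (R * R / (R + R))  # 0.60 A: parallel bulbs *increase* the battery current

    print(one_bulb, two_bulbs_series, two_bulbs_parallel)

In each case the current flowing out of the battery equals the current flowing back into it; nothing is consumed along the way, and the battery delivers whatever current the circuit as a whole demands.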

It is only after the important role that their deep underlying misconception plays in their surface thinking has been repeatedly pointed out to them that they switch to seeing the current as flowing in a circuit. Once they make that switch in their basic misconception, there is no going back. They cannot imagine that they could have ever thought otherwise.

The reason this particular misconception about current is so deeply held is that people have constructed it on their own. Most of them are not even aware that they have this underlying theory of electricity because they have not consciously thought about it. The theory is built intuitively. Nobody taught it to them; they just 'picked it up' because it makes sense. After all, they know that appliances have a power cord that must be connected to an electrical supply in order to work. They know that electrical devices 'use up' power because batteries eventually die. The power cord looks like a single tube, like a garden hose, so electricity seems like something that can flow in only one direction. All these things make sense under their simple model. Most people do not look more closely and wonder why the plug has two prongs, nor do they break open the wires or their devices to find that there are incoming and outgoing pathways for the current.

The theories that people intuitively create for themselves are the hardest to refute because they are buried deep in their thinking and are not consciously articulated. These misconceptions often lead to erroneous conclusions, but if we correct only the conclusions without understanding and addressing the source, we will find the same misconception rearing its head each time a novel situation is encountered.

The misconceptions about how evolution works are of the same kind. They are created deep in the minds of people at an early age, often by well meaning, science-supporting adults who tell their children that 'we evolved from monkeys' and by some of the visual images that we have of the process of evolution, such as the one that draws it as fish→amphibian→monkey→human (with the drawing of each showing what a current typical specimen looks like).

Once these misconceptions about evolution take root at an early age through a process of intuitive thinking, they become, just like the false models of electricity, hard to dislodge in adulthood, even when people are confronted with the clearest reasoning.

As Jonathan Swift said, "You cannot reason a person out of a position he did not reason himself into in the first place."

Next: The role that deep misconceptions play in evolution

POST SCRIPT: How not to stalk off an interview

It is not uncommon for guests on TV or radio to get miffed about something, throw a fit, and stalk off the set. Some may even do it deliberately as a strategy, knowing it will get them publicity. But it sometimes doesn't work out well, with some forgetting to take off either the earpiece or the mike and getting yanked, resulting in a less-than-impressive exit.

But the award for the worst interview termination must surely go to Carrie Prejean. Remember her? Here are some keywords to jog your memory: Miss California who was stripped of her title, supporter of 'opposite marriage', breast implants, topless photos, Donald Trump, lawsuit against Miss USA pageant, sex video.

While on a tour promoting her book, she was irked by a question posed by Larry King of all people, who is notorious for his softball questions. So she removes her mike but instead of then walking off the set, she just sits there, talks to someone off-camera, and smiles at the camera as if she was competing in a pageant, leaving King baffled as to what is going on. Watch.

November 13, 2009

The key steps in adopting evolution

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

Making a Gestalt-type switch is often aided by nudges from outside sources, and in the case of evolution, two such factors came into play: the age of the Earth and concerns about the effects of human population growth.

Darwin was fortunate to live at a time when advances in other areas of knowledge, such as the idea of uniformitarianism in geology, were emerging just as he was pondering all the things he was observing on his voyage on the Beagle. The first edition of the first volume of Charles Lyell's highly influential book The Principles of Geology was published in 1830 and was given to Darwin to read on the voyage, which began in 1831. Its argument that small changes (such as erosion) can accumulate over long periods of time to produce major geological features such as mountains and gorges had an impact on him.

By measuring the rates of erosion and sedimentation occurring in his own time and calculating how long it would take at those rates to produce the existing rivers and canyons, Lyell concluded that the Earth must be hundreds of millions of years old. Furthermore, Lyell's books discussed some of the fossil evidence that existed at the time, because he used fossils as aids in arriving at the ages of rocks, although Lyell himself believed in special creation.
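The arithmetic behind such estimates is simple. To take purely illustrative numbers of my own (not Lyell's): a gorge 500 metres deep, cut at a rate of a tenth of a millimetre per year, requires

500 m ÷ 0.0001 m/year = 5,000,000 years,

and thick sequences of sedimentary rock, treated the same way, give far larger numbers still.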

The fact that the Earth was now possibly hundreds of millions of years old, rather than merely thousands, created an intellectual environment that was more open to acceptance of the idea that new species can gradually evolve from old ones, because that needed long time spans too.

Darwin (and also Wallace) had a Gestalt-type switch when he was struggling to find the mechanism that causes species to evolve in ways that seemed to indicate directionality. The trigger was Thomas Malthus's Essay on the Principle of Population (1798), which argued that populations would grow exponentially were it not for the fact that they encounter limited resources, which restrict growth through starvation and premature death. This gave Darwin the idea that natural selection could serve as the mechanism he was looking for. In The Autobiography of Charles Darwin 1809-1882 (Nora Barlow (ed.), 1958, page 120), he describes his epiphany in ways that suggest a Gestalt-type switch:

In October 1838, that is, fifteen months after I had begun my systematic enquiry, I happened to read for amusement Malthus on Population, and being well prepared to appreciate the struggle for existence which everywhere goes on from long-continued observation of the habits of animals and plants, it at once struck me that under these circumstances favourable variations would tend to be preserved, and unfavourable ones to be destroyed. The result of this would be the formation of new species. Here, then, I had at last got a theory by which to work. (my italics)

Darwin and Wallace saw that if there are variations, then some variations are more likely than others to survive to adulthood and produce more offspring. If an advantageous property is heritable and passed on to offspring then, over time, that particular variation will come to dominate the population. And by a very long series of such small changes, new species would emerge.
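The logic is mechanical enough to capture in a few lines. Here is a toy sketch, not anything Darwin or Wallace wrote down, with an arbitrarily chosen one percent advantage and starting frequency, showing how even a small heritable edge lets a variation take over a population:

    # Toy model of natural selection: a variant whose carriers leave, on average,
    # (1 + s) times as many surviving offspring gradually takes over the population.
    # The 1% advantage and the starting frequency are arbitrary illustrative choices.

    s = 0.01     # assumed selective advantage of the favourable variation
    p = 0.001    # assumed initial frequency: one carrier in a thousand

    for generation in range(1000):
        p = p * (1 + s) / (p * (1 + s) + (1 - p))

    print(p)     # ~0.95: after 1000 generations the variation dominates the population

A one-in-a-thousand rarity with a one percent edge ends up in about ninety-five percent of the population within a thousand generations, an eyeblink on geological timescales.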

Once Darwin saw the world in this new way, there was no going back. And the rest, as they say, is history.

I have argued that the kinds of switches in viewing the world that Darwin and Wallace experienced are like Gestalt switches in perception. When one changes one's perspective, suddenly things fall into place and new patterns emerge. What seemed inexplicable, mysterious, and even impossible before suddenly seems clear and even obvious. And once the new way of seeing things is pointed out to others, they immediately see it as obvious too. As Thomas Huxley said after learning how the theory of evolution worked, "How extremely stupid not to have thought of that!" As a result, the new view spreads like wildfire.

But even when told what to look for, not everyone makes the switch. There are some people who never see the new pattern, either because of a rigidity of attitude or, as we will see in the next posting in the case of evolution, because they cannot bear to give up the old pattern and so do not want to see the new one. For them the duck remains a duck and they never see a rabbit.

Next: The mental block of creationists

POST SCRIPT: Well, that didn't take long!

On Tuesday, I wrote about the atheist billboard campaign in Ohio, putting up three billboards near Cleveland, Columbus, and Cincinnati. Some godly people in the Cincinnati area have already taken offense and threatened violence, requiring the billboard to be moved to another location.

See here for more details.

November 12, 2009

Gestalt switches in evolution

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

After Darwin published his book On the Origin of Species in 1859, large numbers of people were convinced in a very short time by his arguments, although full acceptance of the mechanism of natural selection took longer. But the idea of evolution had been in the air for some time. Why didn't people before him see what Darwin and his co-discoverer Alfred Russel Wallace saw, since they had access to much of the same evidence that he had?

A possible reason is that the theory of evolution also required a Gestalt-type switch. People had been viewing the world through a prism of Platonic ideal forms. In the Platonic view, real objects are approximations to their ideal forms and it is only the ideal forms that matter and from which we get true information. So, for example, for any triangle that we draw on paper, the angles will not add up to exactly 180 degrees because of the inevitable imperfections of our drawing and the inaccuracies of our measuring instruments. But the angles of all ideal triangles (which we can only conceive of in our minds) will always add up to 180 degrees, and it is the properties of that ideal form that are important to understand, not our real-life approximations.

While this way of looking at things is perfectly suited for mathematics, it leads people hopelessly astray when applied to biology. In the case of biological organisms, the Platonic model translates into thinking of each species as having an ideal form and of real organisms as just approximations that can and do deviate from the ideal in unimportant ways. So real chickens, with all their variety, are just imperfect manifestations of the ideal, perfect chicken that we can only conceive of in our minds. It is this perfect chicken that we need to study to understand what makes a chicken a chicken, the essence of chickenhood.

But the problem is that the ideal perfect chicken must necessarily always remain the same and cannot evolve into anything else, just as a triangle will not become a square, nor will the sum of its angles slowly change with time. Platonic thinking rules out change, but it is perfectly consistent with the idea of a god creating each species as a perfect, unchangeable being that is part of a grand plan.

Darwin and Wallace both realized that it is the real forms of organisms that are important, not their idealized versions, and furthermore that there are no ideal forms in biology. There is no idealized chicken. The variations found in real chickens, rather than being a nuisance detracting from our understanding of the ideal chicken, actually contain the key to understanding the nature of chickens and how they and other things can change. This shift in perception made the variations in a species central to our understanding, not peripheral.

A likely reason that Darwin and Wallace were able to make the switch is that they spent time traveling to other parts of the world and saw much more of the variety of life than those who stayed pretty much in one locality. Darwin's voyage on the Beagle confronted him with so much new information about the diversity of life in so many new locations that it forced him into new ways of thinking. Alfred Russel Wallace also had his epiphany while travelling through Asia collecting biological specimens that were exotic and new to him.

Once Darwin and Wallace had made this switch, things started falling into place. They realized that if one adds up these small variations cumulatively over a long time, then even though each one is so small that it cannot be observed with the naked eye or even in one's lifetime, it can add up to huge changes, resulting in the emergence of new species, something that was ruled out by Platonic thinking.

Two things stood in the way of making such an idea workable. It seemed to require an inordinate amount of time, much longer than people at that time thought the Earth had existed, and it lacked a plausible mechanism for species change. An obvious objection that they needed to answer was this: why should the variations in organisms add up cumulatively to produce large changes? Why could they not simply vary randomly, leaving, on average, no net change?

This is where other factors can play a role in making a Gestalt switch in perception.

Next: The key steps in 'seeing' evolution

POST SCRIPT: Jon Stewart parodies Glenn Beck

This clip has been all over the political blogs but it is well worth seeing. Utterly hilarious.

[Video: The Daily Show, "The 11/3 Project" (www.thedailyshow.com)]

November 11, 2009

Perception changes in physics

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here. You can also listen to the podcast of the interview on WCPN 90.3 about the book.)

In an earlier post, I suggested (following Thomas Kuhn) that Gestalt-type switches can play an important role in the creation and adoption of new theories in science. Today I want to look at specific examples of such changes.

Take the case of a simple pendulum, made by hanging a small weight from a fixed point by a string and setting it in motion by pulling it back and releasing it. What had been observed from time immemorial is the weight swinging back and forth with decreasing amplitude before finally coming to rest at the lowest point in its trajectory. People used to interpret this motion as the pendulum weight, when released, 'seeking' (to use anthropomorphic language) to get to its final resting place at the lowest point in its trajectory, but initially overshooting the mark, trying again to get to the lowest point, overshooting again by a smaller amount, and so on, until it finally reaches its destination and stays there.

Viewed this way what the pendulum is 'trying' to do is to come to rest at the bottom but is prevented from doing so by overshooting it due to its motion. Hence the time taken from the instant of release to the final resting point would be the significant thing to measure to see if there are any patterns in this data. But we now know that the time taken to reach the lowest point in its trajectory is not a useful parameter and this is why this approach did not lead to any interesting results.

It took a Galileo to observe the same pendulum motion as everyone else but see it in a different way. He saw the fundamental aspect as an oscillation. In that view, what the pendulum is 'trying' to do is keep oscillating forever with the same amplitude but other factors prevent it from doing so, bringing it to rest. In this view, it makes sense to measure the period of oscillation (i.e. the time taken to go through one cycle) and this data does yield useful patterns, such as that the period is independent of the weight or the amplitude of motion (within certain limits), but does depend in a precise way on the length of the string.
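
For small swings, the pattern that Galileo's way of measuring uncovers can be summarized in the now-familiar formula T = 2π√(L/g), where T is the period, L is the length of the string, and g is the acceleration due to gravity. The weight of the bob does not appear at all, and (to a good approximation) neither does the amplitude, which is exactly what the measurements show.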

The point is that how one views a phenomenon will determine what one chooses to measure. And what one measures determines what one will discover.

In the case of theories of motion in a straight line, the ancient Greeks saw the motion of bodies as headed towards something. In such a view, the key distance is the distance of the object from its final destination. It was only when the worldview was reversed, and the distance traveled from the starting point and the elapsed time since the start were taken as the parameters worth measuring, that useful patterns of relationships emerged, patterns that eventually culminated in Newton's laws of motion.

Once someone had made this Gestalt-type switch and was able to articulate the new view to others, others quickly started seeing the same thing. What had been seen as a duck was now a rabbit; what had been seen as two faces was now a vase. But not everyone will see the world in the new way. Those who are strongly wedded to the old way of looking at the world will resist making the switch. It may not be that they see the rabbit and are consciously rejecting it in favor of the duck. It may actually be that they do not even 'see' the rabbit. For them, the duck remains a duck and never becomes a rabbit.

In the actual case of the rabbit and the duck image, it has been my experience that everyone sees both shapes within moments of them being pointed out, although there are small differences in the time taken for the realization to hit. But there are other examples of switches where people struggle for a long time. (These are taken from this site where you can see even more examples.)

A popular one that some have a hard time seeing is the one below. People initially tend to see either one image or the other but not both. Once they have locked onto one image, they find it hard to switch until they are told what to look for and specific features are pointed out.

[Image: the classic ambiguous figure that can be seen as either a young woman or an old woman]

The next one is even harder. It is not two images but requires one to see a single image instead of seemingly randomly scattered blobs. I initially could not see anything. Even after I was told what to look for, I still did not see it for some time until it suddenly 'appeared'. Now that I have seen it, it seems obvious.

[Image: a field of scattered black blobs that resolves into a Dalmatian dog]

In both cases, most people do not see the picture on their own but need someone else to point out to them what they should be seeing before they suddenly see it for themselves. This was the particular genius of people like Copernicus, Galileo, Newton, Einstein, and (as I shall argue in the next post) Darwin. They looked at the same world that others did but saw it in a new way. And they were able to persuade others to see what they saw.

Next: Gestalt switches in evolution

POST SCRIPT: Buster Keaton film shorts

One of the funniest comics of the silent era was Buster Keaton. The Cleveland Cinematheque will show a series of his short films on Friday, November 13 at 7:30 pm. The films will be introduced by Robert Spadoni, professor of film studies at Case Western Reserve University. Accompanying the films will be live music, with pianist Shuai Bertalan-Wang playing the ragtime music of Scott Joplin.

For more details on location, admission prices, etc. see here. There is also a Facebook page about it.

November 09, 2009

Gestalt switches in science

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

In the history of science, we have often seen a theory being accepted and used over a long period and then replaced with a new one, with the transition occurring over a relatively short time. Sometimes the new theory is fairly simple and we marvel as to why people did not think of it before. For example, the Copernican heliocentric model is not a complicated idea when compared to the previous geocentric model. Similarly Newtonian mechanics can be formulated in terms of laws that are very simple mathematically and easy to understand. The essential ideas of Darwin's theory of evolution by natural selection can also be stated in a few simple sentences.

All three of these major new theories are of the kind that, if we had lived in the times when their inventors articulated them, we would have reacted exactly like T. H. Huxley, an early convert to Darwin's theory of evolution, who once he understood how natural selection worked, said "How extremely stupid not to have thought of that!"

So why did it take so long for people like Copernicus, Newton, and Darwin to come along with these new ideas? After all, the ancient Greek and Arab and Chinese civilizations were scientifically advanced. Why did it take over a millennium for us to develop modern science, which can arguably be said to have begun with Galileo?

This is the topic of study of historians and philosophers of science and they have come up with many factors to explain this phenomenon.

One explanation is, of course, the appearance of new evidence and data. If the new evidence is hard to reconcile with the old paradigmatic theory and causes serious problems for it, that can create an openness to new ideas and trigger the search for new theories. People try to see things in new ways.

Then there are the influences of developments in other areas. Advances in technology often lead to new data that were inaccessible before. The invention of telescopes, for example, allowed for the detection by Galileo of the moons orbiting Jupiter and dealt a serious blow to the geocentric model that said that every celestial body orbited the Earth. It became clear that other celestial objects could be the center of an orbit and thus the heliocentric idea became less outlandish.

Similarly, changes in the political, social, and intellectual climate may make communities more open to ideas that were unthinkable before. The period we know as the Enlightenment was more open to new ideas and less wedded to religious dogma. Societies that are repressive in general are unlikely to be sources of great new intellectual discoveries.

One also has to take into account the individual genius needed to create the new theory, though the way such people contributed is often misunderstood. These geniuses often did not come up with completely new ideas but were able to recognize that the ideas swirling around them, the same ones available to everyone else, actually fit into a new pattern. Once they articulated that new pattern, others could almost immediately identify it as the right way to see things. But what enabled the pioneers to make that particular leap that eluded others who had access to the same ideas and knowledge?

Thomas Kuhn has argued, especially in his classic work The Structure of Scientific Revolutions, that what happened with these people is similar to the phenomenon known as the Gestalt switch, familiar to all of us in those visual puzzles where we can look at a single image and see it switching between a duck and a rabbit, or between a vase and two people facing each other.

[Image: the duck-rabbit illusion]

[Image: the two faces/vase illusion]

What happens with some scientific revolutions is that what everyone sees as a duck, one person suddenly sees as a rabbit. When they point out to others the new way of seeing the world, the reaction of others is similar to the reaction you get from people who initially saw only the duck (say) but now almost immediately see the rabbit. After the revelation, it is hard for people to imagine how they could not have seen it before because it seems so obvious.

Next: Specific examples of Gestalt-like switches in science

POST SCRIPT: Radio interview about my book

On Tuesday, November 10, I will be interviewed on the Cleveland NPR affiliate station WCPN 90.3 from 9:00-10:00 am on its program The Sound of Ideas. This was rescheduled from last Thursday.

The topic will be my latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom. You can listen online live on its webcast or listen to the podcast after the show.

It is a call-in show: local 216-578-0903 or toll-free 866-578-0903.

That same evening at 7:00 pm I will be speaking to the Center for Inquiry–Northeast Ohio in the second floor reading room of the Maple Heights library 5225 Library Lane, Maple Heights, OH 44137-1291. The event is open and free.

October 22, 2009

The interconnectedness of science

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

Even the most die-hard religious person will concede that scientific knowledge is extremely powerful. In thinking about evolution alone and the arguments presented for evolution by natural selection in Richard Dawkins's new book The Greatest Show on Earth, questions that might occur to the reader are: Why is science so powerful? What is it about its structure that has made it so successful?

This is a question that people have been grappling with for a long time and the answer is surprisingly hard to come by. The facile answer that science works so well because it produces truth is not easy to justify because great scientific theories in the past that were thought to be true have fallen by the wayside and there is little reason to think that we are better judges of the truth of theories than our predecessors were.

As long ago as 1906, Pierre Duhem in his book The Aim and Structure of Physical Theory laid out the difficulties that a scientist faces in determining whether a particular theory is true, by drawing an analogy between how a watchmaker and a doctor go about diagnosing the source of a malfunction in their respective areas of expertise:

People generally think that each one of the hypotheses employed in physics can be taken in isolation, checked by experiment, and then, when many varied tests have established its validity, given a definitive place in the system of physics. In reality, this is not the case. Physics is not a machine which lets itself be taken apart; we cannot try each piece in isolation and, in order to adjust it, wait until its solidity has been carefully checked. Physical science is a system that must be taken as a whole; it is an organism in which one part cannot be made to function except when the parts that are most remote from it are called into play, some more so than others, but all to some degree. If something goes wrong, if some discomfort is felt in the functioning of the organism, the physicist will have to ferret out through its effect on the entire system which organ needs to be remedied or modified without the possibility of isolating this organ and examining it apart. The watchmaker to whom you give a watch that has stopped separates all the wheelworks and examines them one by one until he finds the part that is defective or broken. The doctor to whom a patient appears cannot dissect him in order to establish his diagnosis; he has to guess the seat and cause of the ailment solely by inspecting disorders affecting the whole body. Now, the physicist concerned with remedying a limping theory resembles the doctor and not the watchmaker.

All of science is an interconnected web of theories. It is not like a set of independent modules where you can pluck one out and replace it with another. It is more like the way that the box springs in a mattress are all linked together. This is why it is so hard to replace one theory with another: all the other theories to which it is linked work to prevent the change.

This is why people who think that they can replace just evolution with some creationist idea du jour stumble badly. The theory of evolution gets its strength from the fact that it meshes well (though not perfectly, since science is always progressing and never finished) with the other theories of biology and chemistry and physics and geology and astronomy, as Dawkins so tellingly demonstrates. Creationist ideas go against all these other theories to various degrees. So when you reject the theory of evolution, you are pretty much rejecting all of science. Trying to replace evolution with the theory of intelligent design in a few cases is like (to switch analogies for the moment) trying to replace just one of the fuel injectors in a modern car with a carburetor from an older car. It just will not work.

An obvious objection to the above description is that it implies that all theories are locked in place forever, which is obviously false since we know that scientific revolutions have occurred in the past in single areas of science. How could that have happened? If you examine closely the history of how scientific revolutions occur, you see that they are preceded by extended periods of crisis, when a theory comes under increased critical scrutiny and suspicion because of perceived weaknesses. Those periods correspond to the weakening, and even the slow removal, of the links connecting the theory in question to the rest of science. The other theories slowly adapt to the fact that one member of the web is suspect. This enables the suspect theory to be decoupled from the rest and replaced by the new theory.

Initially the new theory will work somewhat imperfectly because it will have few connections to the rest of the scientific theory web. But if it is a good theory that performs its own functions well and has at least some good working connections to other theories, the other areas of science will adapt to the new theory and new links will be forged, so that the end result will once again be a strong interconnected web of theories, but a different one from what existed earlier.

What religious people do not realize is that the theory of evolution is nowhere close to being in crisis and is firmly embedded in the fabric of science. In attempting to discredit it, they are taking on all of science. This is why they have failed so far and will continue to fail.

POST SCRIPT: Stephen Colbert gets ready for the end times

[Video: The Colbert Report, "Yahweh or No Way - Legislation Prayers & Fake Shroud of Turin" (www.colbertnation.com)]

October 16, 2009

On quoting scientists-5: Religious scientists' beliefs about god

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

When scientists who are also religious believers are quoted as to why they believe in god, their reasons almost always fall into one of two classes. (I am excluding those who believe in the literal truth of their religious texts and, in my opinion, have effectively rejected science altogether.)

One is the ever-popular Argument from Personal Incredulity. This goes as follows:

1. There is no positive evidence for god.
2. But X (insert your preferred natural phenomenon here) is amazing.
3. I don't understand how X could have come about by natural processes.
4. Hence god must have done it.
5. Hence god exists.

The other is a self-serving circular argument that is driven by emotional needs:

1. There is no positive evidence for god.
2. But I want/need to believe in god.
3. Hence god must be acting in ways that preclude leaving any evidence.
4. Hence the absence of credible evidence for god is evidence for my belief that god chooses to act in ways that do not leave any evidence.
5. Hence god exists.

New atheists suggest that the following reasoning is simpler and makes more sense:

1. There is no positive evidence for god.
2. Hence there is no reason to believe in god.

It is in essence the advice that Bertrand Russell gave in his book Skeptical Essays, vol. I (1928):

I wish to propose for the reader's favourable consideration a doctrine which may, I fear, appear wildly paradoxical and subversive. The doctrine in question is this: that it is undesirable to believe a proposition when there is no ground whatever for supposing it is true. I must, of course, admit that if such an opinion become common it would completely transform our social life and our political system; since both are at present faultless, this must weigh against it.

I must say that I find the willingness of those few scientists to express belief in anything more than a Slacker God somewhat surprising, because it so fundamentally contradicts the basic assumptions under which science operates. The population geneticist J.B.S. Haldane (1892-1964), who did so much to advance the theory of evolution by natural selection by placing it on a firm mathematical footing, explained that he was an atheist simply as a result of his desire for consistency:

My practice as a scientist is atheistic. That is to say, when I set up an experiment I assume that no god, angel or devil is going to interfere with its course; and this assumption has been justified by such success as I have achieved in my professional career. I should therefore be intellectually dishonest if I were not also atheistic in the affairs of the world.

But this kind of desire to have a unified and consistent worldview is surprisingly rare. What religious scientists do is tacitly compartmentalize their thinking into two worlds: their scientific world where god does not act, and their religious world where god lives and acts. The word 'tacitly' is important. As long as you do not specify how this two-world system actually operates, you can ignore the huge contradictions that exist.

What I would like to ask the scientists who believe in god is the following question: Are you an atheist when you do scientific experiments, not allowing the hypothesis of god's action entering at all? If so, why do you have one set of beliefs when doing science and another set for all the other areas of your life?

The only way to make sense of this double standard is to assume that god thinks as follows:

If I feel like it, I may once in a while cure a sick person, while most of the time letting them die, sometimes cruel and horrible deaths. Once in a while I may avert a hurricane or tsunami from a populated area though most of the time I will let it destroy thousands of homes and people. I may save a few people in a plane crash just for the hell of it while killing off the rest. I may allow one baby to live and be rescued days after an earthquake that killed off its entire family and town, because I know my followers get a kick out of things like that and will rejoice in the 'miracle'. I will let an insane killer mow down many people in a crowded building just so that those whom he misses think that I picked them out to save. I will allow child rapist-murderers to get away with these and other horrendous crimes. I will create diseases that kill millions of people.

But I will never, ever, interfere with a scientist's experiments and mess up their search for scientific laws.

Because that would be wrong.

A physicist colleague of mine, a well-regarded scientist, is also an observant Jew. I once asked him how he reconciled his scientific work, which excludes supernatural intervention and explanations, with his belief in the Bible with all its stories of god messing around with the laws of the universe. He suggested that he thought that god used to do miracles and then decided around 2,000 years ago to not do any more.

"Why?" I asked.
"He must be having his reasons" he replied.

By invoking that ad hoc stratagem, he was able to believe in the truth of the Bible and also avoid having to deal with the god hypothesis in his research. I think all religious scientists in the end adopt similar self-serving views. They just compartmentalize things differently and idiosyncratically, depending on their personal beliefs and needs and preferences.

This is why I think Oxford University scientist Peter Atkins was exactly right when he said: "You clearly can be a scientist and have religious beliefs. But I don't think you can be a real scientist in the deepest sense of the word because they are such alien categories of knowledge."

POST SCRIPT: Interview

I was interviewed recently about an article that I had published called Death to the Syllabus! where I argued that our classrooms and syllabi had become too authoritarian and controlling, and that we needed to try to create a more collegial atmosphere in our classes if we were to achieve true learning. You can find the 25-minute podcast of the interview here.

October 15, 2009

On quoting scientists-4: God as metaphor

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

If one looks at the quotes of scientists used by religious believers, one sees that they fall into a familiar pattern. One is to take the metaphorical use of the word god by some scientists and treat it as implying belief in a real god. One of the most common examples is the popularity amongst religious people of a statement in Stephen Hawking's best-selling book A Brief History of Time that is often quoted this way: "[I]f we discover a complete theory…then we should know the mind of God". It has been seized upon by religious people to imply that Hawking believes in god, and is a prime example of the practice of 'quote mining'.

But Hawking, like Albert Einstein, is using god as a metaphor for complete knowledge, as can be seen in the full passage from which the quote is taken:

If we discover a complete theory, it should be understandable in broad principle by everyone, not just a few scientists. Then we shall all, philosophers, scientists and just ordinary people, be able to take part in the discussion of the question of why it is that we and the universe exist. If we find the answer to that, it would be the ultimate triumph of human reason-for then we would know the mind of God. (my emphasis)

In a BBC interview, he was asked to further clarify his statement that we might one day know the mind of god and his answer clearly indicates that his idea of god is nothing like the god that religious people believe in.

It seems that the universe is governed by a set of scientific laws. One might say that these laws were the work of god but it would be an impersonal god who did not intervene in the universe apart from setting the laws. What I meant when I said we would know the mind of god was that if we discovered the complete set of laws and understood why the universe existed we would be in the position of god… One could define god as the embodiment of the laws of nature. However, this is not what most people would think of as god. They mean a human-like being with whom one can have a personal relationship. When you look at the vast size of the universe and how insignificant and accidental human life is in it, that seems most implausible. (my emphasis)

Einstein was someone else who loved to use god as a metaphor in the same way as Hawking, and people have similarly seized on those quotes as evidence for at least a Slacker God. But Einstein viewed belief in god as a "childish superstition". In a letter written just a year before his death, he said:

The word god is for me nothing more than the expression and product of human weaknesses, the Bible a collection of honourable, but still primitive legends which are nevertheless pretty childish. No interpretation no matter how subtle can (for me) change this… For me the Jewish religion like all others is an incarnation of the most childish superstitions. And the Jewish people to whom I gladly belong and with whose mentality I have a deep affinity have no different quality for me than all other people.

Some scientists throw god into their statements because it is a sure-fire way of drawing media interest. Physicists in particular seem to be prone to gratuitously using god as a metaphor. Leon Lederman gave his 1994 book the title The God Particle, which was his idea of a cute name for the Higgs boson, a particle that is predicted to play a crucial role in the standard model of particle physics but has not been detected as yet. Then there was this statement last week by two physicists speculating about why the Higgs boson (which is what the newly constructed massive Large Hadron Collider is designed to create) has been so hard to detect.

"It must be our prediction that all Higgs producing machines shall have bad luck," Dr. Nielsen said in an e-mail message. In an unpublished essay, Dr. Nielson said of the theory, "Well, one could even almost say that we have a model for God." It is their guess, he went on, "that He rather hates Higgs particles, and attempts to avoid them."

One can be sure that some religious people will seize on this statement as evidence for those scientists' belief in god.

But what Hawking or Einstein or Darwin or Dawkins or whomever believes about god is ultimately irrelevant. Unlike some religious people who unquestioningly accept what the Pope or other religious people or the authors of their religious texts say, atheists reject belief in god because there is no evidence for it and not because of any authority. That's it. Nothing more. If Richard Dawkins were to suddenly announce that he had had a vision of god and become a Christian, that would no doubt cause considerable surprise, shock even, but would not change anything about the existential status of god unless Dawkins could provide evidence that what he had experienced was not just a delusion or a psychotic episode but really was credible evidence of god's existence.

POST SCRIPT: Stephen Colbert on Democratic opponents of the public option

[Video: The Colbert Report, "Send Your Medical Bills to Max Baucus" (www.colbertnation.com)]

October 14, 2009

On quoting scientists-3: What about statements about god?

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

I have said in the previous two posts that we should take scientists seriously when they talk about science (even outside their immediate fields of study) because they have their reputation for credibility at stake and they value that more than almost anything else professionally.

But what about when scientists go even farther afield and infer from what they know about science to what they believe about god? Then the strength of their case rests only on the quality of the argument they make and the nature of the inferential reasoning they use. It does not rest on their scientific expertise, except insofar as the truth claims of the science on which they base their arguments are concerned. This affects the way we should use and evaluate such quotes.

The only reason for using quotes in these cases is that the author has said something very succinctly or pithily and one wants to use their words in order to give them proper credit for expressing an idea. The quote by itself is never evidence either for or against the existence of god and the supernatural, but it is evidence as to the beliefs of the person who made the statement about the phenomenon. So a quote about what Darwin believed and said about god would not be evidence for or against god. But when it comes to the issue of Darwin's views on the existence of god, what he actually said would be relevant and well worth quoting.

Religious people tend to misunderstand this. They sometimes comb through the writings of famous dead scientists to find quotes that seem to suggest a belief in god, and use them as if they strengthened the case for god. This is a waste of time because they don't. For example, Charles Darwin died not believing in god. While there is no doubt whatsoever that his theory of evolution has made god increasingly redundant and strengthened the case for atheism, his disbelief by itself is not evidence against the existence of god.

Darwin's disbelief bothers some religious people and they think that if they could show that he was a believer in god, that discovery would undermine atheism. Such people sometimes even repeat the thoroughly debunked story of him having had a deathbed conversion to Christianity or make a big deal about the fact that Darwin explicitly rejected the label of atheist and embraced the term agnostic. They are misguided in their efforts. Neither of those things is relevant to the point that the theory of evolution seriously undermines belief in the existence of god.

Even if Darwin actually had made a deathbed conversion to Christianity, it would not prove anything about god either way. All it would have shed light on is Darwin's state of mind as he lay dying. After all, his co-discoverer of the theory of natural selection, Alfred Russel Wallace, later in life seemed to embrace some forms of mysticism. Even the great scientist Isaac Newton believed in god in some form. But all that such stories tell us is what those people believed about those phenomena. By themselves they are not evidence for or against god or the supernatural.

One can sometimes use the consensus views of scientists about religion as evidence for some propositions about religion. As an example, suppose we take the new atheists' statement that science and religion are incompatible. The basis of this claim is that advances in science have made the god hypothesis increasingly redundant, that there is simply no need to believe in the existence of such an entity, and to invoke it is to turn one's back on methodological naturalism which is a foundational principle of modern science.

One consequence of this argument is that, as science advances even more, we would expect the number of disbelieving scientists, especially those who are leaders in their fields and thus more intimately familiar with the frontiers of scientific research, to increase with time. As Oxford University scientist Peter Atkins said: "You clearly can be a scientist and have religious beliefs. But I don't think you can be a real scientist in the deepest sense of the word because they are such alien categories of knowledge."

As a result we might expect some circumstantial evidence in support of the claim that increasing depth of knowledge about science leads to greater disbelief. And there is. In medieval times or earlier there is no evidence that many scientists were disbelievers, unless they were keeping it secret. This is possible since death was a common punishment for heretics. But we have no way of really knowing the situation back then.

But with the Enlightenment, things began to change for the better. As Edward Larson and Larry Witham reported in a study published in Nature in 1998, at least in the 20th century there has been a steep drop, from nearly 28% to 7%, in the number of leading scientists who believe in a 'personal god', while the number of disbelievers and doubters rose from nearly 74% to 93%. If the numbers had gone the other way, so that as science learned more and more about how the world worked the number of religious scientists increased, then that would cast some doubt on the claim of the new atheists, although such data, depending as it does on people's beliefs, can never be conclusive about the truth or falsity of any proposition.

Next: God as metaphor

POST SCRIPT: No, let's not leave it there

Jon Stewart on the vapidity of the cable news shows.

[Video: The Daily Show, "CNN Leaves It There" (www.thedailyshow.com)]

October 13, 2009

On quoting scientists-2: When is a quote evidence, and for what?

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

I myself use direct quotes quite often and attribute them to the source whenever I can. Why?

One reason is simply style. Using quotes makes for livelier reading. Inserting quotes set off differently from the rest of the text breaks up the visual monotony of the page, the way that dialogue does in fiction, and introducing the different rhythm of a new writer keeps the reader on her toes.

Another reason is to acknowledge the source of an idea that I am using. In writing a scholarly paper, one is obliged to track down the original source of an idea, not merely the person who brought it to one's attention, but in blog writing it is acceptable to quote secondary sources.

Another reason is to introduce readers to other writers whom they may not have heard of before.

A fourth reason is that there are a lot of good writers out there who often express what I want to say much better than I can, so I use their words. I prefer to give direct quotes whenever possible rather than paraphrase because that leaves less room for unintentionally distorting their views. I cite the source whenever I can so that readers can check for themselves the full context of the quote if they think I am misinterpreting the words.

Why is it that famous people are quoted more often than unknown people? It may often seem as if the authors of the quotes are being used as authority figures merely because of their fame, and as if the quotes themselves are evidence for some point of view, as if the beliefs of famous people carry more weight. This is not necessarily so. It is more likely that people who are prolific and/or well-known and/or good writers get quoted more often because they have written more and are read more than obscure or poor writers.

Does the fame of the author give them more credibility? Yes, sometimes, but only insofar as what they say reflects their detailed knowledge of their subject. For example, when I make assertions about fields about which I have no direct knowledge, I like to quote the words of scholars or people who, I am confident, have actually studied the issue and have a reputation for presenting their subject with appropriate scholarly caution. This naturally skews the quotes in favor of well-known scholars, since then I do not have to go through the dreary exercise of first establishing the quoted person's credentials in the field. Quotes by Richard Dawkins on evolution and Albert Einstein on physics have to be taken very seriously. Dawkins on physics and Einstein on evolution, not so much. Sarah Palin on evolution or physics, not at all.

Why do we take the words of scientists and other academic scholars seriously when they are talking about their own fields? Because academia works by peer review. The peers of scientists who are in a position to independently check their work would strongly challenge them if they were saying wrong things about the science, and in the absence of such critiques one can assume that they are expressing the consensus views of their field, even if there are some scientists who disagree with them.

The fact that there are some scientists outside the consensus does not weaken the consensus claims unless the theory really is experiencing a crisis, and it is usually fairly obvious when that is the case. As an example, in physics there are still some scientists who dispute the theory of relativity or the big bang, but those theories remain the consensus views of the community. There is no crisis there. When the consensus view among physicists is that the structure of the entire physical universe has the potential to be explained and understood using mathematical laws without any supernatural intervention, one has to take this view seriously, unless one can provide evidence against those consensus views. Assertions by religious people and theologians of the existence of supernatural forces simply do not carry anywhere near the same weight.

So when Charles Darwin or Richard Dawkins or any working biologist describes biological phenomena and the science behind it, their words definitely have greater credibility than those of non-biologists. The consensus view amongst biologists is that all the biological complexity that we see around us could easily have come about mainly by natural selection without any hidden mechanisms or supernatural intervention. As physicist Sean Carroll says:

Go to a biology conference, read a biology journal, spend time in a biology department; nobody is arguing about the possibility that an ill-specified supernatural "designer" is interfering at whim with the course of evolution. It's not a serious idea. It may be out there in the public sphere as an idea that garners attention — but, as we all know, that holds true for all sorts of non-serious ideas.

It is because of this consensus amongst biologists that we take the idea of evolution seriously, and discount supernatural explanations.

But we take academic scholars somewhat seriously even when they venture a little further afield, outside their narrow fields of expertise. The reason for this is that the most important thing to a working scholar is his or her credibility in the eyes of other scientists, and the more well known they are, the more effort they put into protecting that. This makes most scientists cautious about saying things about any subject that will earn them the scorn of their peers.

So serious scientists who need to express an opinion in a field outside their own specialty will usually check with scientists in that field to make sure they are getting the science right. I am currently reading Richard Dawkins's latest book The Greatest Show on Earth where he marshals all the evidence in favor of evolution. In the process he talks about radiometric dating and continental drift, which lie in the fields of physics and geology and are outside his range of direct expertise. But it was clear to me that he had consulted knowledgeable people in those fields before he used those arguments as evidence, because it would be embarrassing for a scientist to err about any area of science.

Next: What about when scientists talk about god?

POST SCRIPT: Jon Stewart on the Democrats messing up health care reform

[Video: The Daily Show, "Democratic Super Majority" (www.thedailyshow.com)]

October 12, 2009

On quoting scientists-1: The numbers game

(My latest book God vs. Darwin: The War Between Evolution and Creationism in the Classroom has just been released and is now available through the usual outlets. You can order it from Amazon, Barnes and Noble, the publishers Rowman & Littlefield, and also through your local bookstores. For more on the book, see here.)

I recently received an email the subject line of which said, "Some leading and Nobel prize winner scientists view [sic] on God." The contents of the email consisted solely of 25 brief quotes, all in support of the existence of god, with no further explanation.

I am not sure what the point of this kind of exercise is, since the email author did not explain. Is it to show that there are scientists who are also religious? If so, there is no need to make the case because no atheist denies that fact, so producing such lists serves no purpose other than identifying some of the religious scientists by name.

In fact, one should be able to find even more than 25. The National Academy of Sciences is widely recognized as comprising only leading scientists. It currently has about 2100 members. In response to a survey, 7% of NAS members said they believe in a personal god defined by the statement "a God in intellectual and affirmative communication with man ... to whom one may pray in expectation of receiving an answer." This is a far more active deity than the Slacker God of some accommodationists, so the email writer should have been able to dig up about 150 members of the NAS who have nice things to say about god.

If the point of the exercise is to impress atheists with the number of scientists who are religious, then this is the wrong way to go, since there are far more skeptics than believers in the NAS. About 72% are outright nonbelievers and another 21% are doubtful or agnostic. So if it comes down to a numbers game, believers lose by a landslide.

This reminds me of the time when the Discovery Institute, the organization that was behind intelligent design, issued a list of 103 people with doctorates in any field who had signed on to the following statement: "We are skeptical of claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for Darwinian theory should be encouraged." They even placed an ad touting the list as an argument against the theory of evolution.

In response, the National Center for Science Education started Project Steve, consisting of a list of scientists who were willing to sign on to the following statement:

Evolution is a vital, well-supported, unifying principle of the biological sciences, and the scientific evidence is overwhelmingly in favor of the idea that all living things share a common ancestry. Although there are legitimate debates about the patterns and processes of evolution, there is no serious scientific doubt that evolution occurred or that natural selection is a major mechanism in its occurrence. It is scientifically inappropriate and pedagogically irresponsible for creationist pseudoscience, including but not limited to "intelligent design," to be introduced into the science curricula of our nation's public schools.

The gimmick was that the signatories were limited to scientists who had names that were variations on some form of Stephen, such as Steve, Stephanie, Stefan, and so on. They got 367 scientists (including Stephen Hawking) to sign which, since the name Steve only represents 1% of the population, can be extrapolated to suggest that 36,700 scientists support the statement.

The whole point of Project Steve was to make fun of the idea that the number of scientists behind a proposition is by itself an argument for anything, and to show that if someone does think so, the numbers game is a definite loser for religious beliefs.

But the email made me think about the uses of quotes by scientists in general. I myself use direct quotes quite often and attribute them to the source whenever I can. Why do I use them? What purposes do they serve?

Next: When do quotes serve as evidence for anything?

POST SCRIPT: Tuesdays with Moron?

Bill Maher speculates on the other ghostwriters who were considered for Sarah Palin's book and the titles they suggested.

September 23, 2009

Using placebos as part of treatments

Nowadays, the testing of new drugs often involves comparisons not only with placebos but also with older established drugs in three-way double-blind tests. What is emerging from these trials is that the placebo effect seems to be getting stronger, which means that new drugs in clinical trials are having a harder time showing that they are better than the placebo. Another consequence of stronger placebo responses is that some well-known drugs used in the trials as the older standard (and that had beaten the placebo in earlier tests) seem not to be able to do so now.

As Steve Silberman in Wired Magazine says:

Some products that have been on the market for decades, like Prozac, are faltering in more recent follow-up tests. In many cases, these are the compounds that, in the late '90s, made Big Pharma more profitable than Big Oil. But if these same drugs were vetted now, the FDA might not approve some of them. Two comprehensive analyses of antidepressant trials have uncovered a dramatic increase in placebo response since the 1980s. One estimated that the so-called effect size (a measure of statistical significance) in placebo groups had nearly doubled over that time.

It's not that the old meds are getting weaker, drug developers say. It's as if the placebo effect is somehow getting stronger.

But why would the sugar-pill placebos be having a stronger effect now? One possibility is that we are getting better at doing double-blind tests, thus eliminating spurious effects that escaped detection earlier. For example, certain assumptions used in drug testing (such as that geography does not matter) are now known to be invalid. Not only does the placebo response of the patient vary from place to place, so do the ratings by trial observers, leading to the unfortunate possibility that drug companies may 'placebo-shop', choosing for their clinical tests those areas where the placebo response is low in order to make their drugs seem more effective.

But the more interesting thing that Silberman points out is that the rising strength of the placebo response may be telling us something valuable about the power of the brain to influence our biochemical processes. The placebo effect may be more of a physiological response than a psychological one, and something that can be harnessed in favor of better treatments. Many of these effects are related to pain-reducing compounds called opioids that are produced by the brain. Placebos can act like catalysts, triggering the release of these opioids.

Researcher Fabrizio Benedetti at the University of Turin finds that:

Placebo-activated opioids, for example, not only relieve pain; they also modulate heart rate and respiration. The neurotransmitter dopamine, when released by placebo treatment, helps improve motor function in Parkinson's patients. Mechanisms like these can elevate mood, sharpen cognitive ability, alleviate digestive disorders, relieve insomnia, and limit the secretion of stress-related hormones like insulin and cortisol.

What seems to be going on is that our expectations of what the future will be like seem to play a significant role in how our brain influences our body. If we feel that a good result will ensue from a treatment, the brain releases chemicals that assist in creating that result. What placebos seem to be doing is manipulating those expectations.

It also works in reverse. There are things called 'nocebos' that work opposite to placebos, suppressing the beneficial brain functioning. "Cancer patients undergoing rounds of chemotherapy often suffer from debilitating nocebo effects—such as anticipatory nausea—conditioned by their past experiences with the drugs."

This has led to a revision in attitudes towards placebos, shifting them from a problem to be overcome to viewing them as an additional form of treatment that should be better harnessed. Of course, there are limits to what placebos and the brain can do. As Silberman says, a placebo "can ease the discomfort of chemotherapy, but it won't stop the growth of tumors."

The success of modern medicine in treating many ailments may have strengthened the placebo effect by instilling greater confidence in patients that their treatment will work, triggering the release of opioids and dopamine. Furthermore, drug companies also advertise heavily these days, promoting the benefits of their products to relieve all manner of ailments and associating taking them with good things in life, such as beautiful sunsets, playing with children, enjoying the outdoors, sex, sports, etc. So placebos may be getting stronger because people believe that the drugs will give them a better future.

As a result, the very success of drugs in the past may be working against the drug companies now by increasing the expectations of drugs and thus creating a stronger placebo response. Furthermore,

Existing tests also may not be appropriate for diagnosing disorders like social anxiety and premenstrual dysphoria—the very types of chronic, fuzzily defined conditions that the drug industry started targeting in the '90s, when the placebo problem began escalating. The neurological foundation of these illnesses is still being debated, making it even harder for drug companies to come up with effective treatments.

What all of these disorders have in common, however, is that they engage the higher cortical centers that generate beliefs and expectations, interpret social cues, and anticipate rewards. So do chronic pain, sexual dysfunction, Parkinson's, and many other ailments that respond robustly to placebo treatment. To avoid investing in failure, researchers say, pharmaceutical companies will need to adopt new ways of vetting drugs that route around the brain's own centralized network for healing.

It seems like there need to be developments in two areas. One is to find better ways to test for the true effectiveness of drugs that go even beyond current double-blind testing. What may be necessary is to incorporate 'open/hidden' tests in which the test subjects don't know when they are being given any treatment at all, whether it be placebo or drug. This would remove the placebo effect of expectations, giving a better measure of the effectiveness of the drugs themselves.

The second development is to learn how to better use the brain-based nature of the placebo response as part of therapy. A judicious combination of truly effective drugs and the placebo response may be an important part of the future of medicine.

POST SCRIPT: This Modern World

Tom Tomorrow's comic strip imagines how the health insurance industry would have operated in medieval times if it behaved the way it does now.

September 22, 2009

The placebo effect

In the previous post, I described the practice of homeopathy and explained why it should no longer be taken seriously. Now that we know that its originator Samuel Hahnemann was basically treating his patients with water, what made him think his treatment was effective? There is no evidence that he was a fraud or charlatan, foisting on his patients something he knew was bogus in order to take their money. He was probably genuine in his belief in the efficacy of his treatment.

It is likely that he was misled by the placebo effect, where patients recover from an illness due to any number of factors that have nothing to do with the treatment provided by the doctor. People who want to believe seize on these random events and see patterns that don't exist. For example, since colds get better after a few days, it is possible to get gullible people to believe that practically anything is a cure for colds, since if you take it soon after the onset of symptoms, presto, the cold disappears in a couple of days.

Steve Silberman in Wired Magazine describes how the placebo effect was discovered.

The roots of the placebo problem can be traced to a lie told by an Army nurse during World War II as Allied forces stormed the beaches of southern Italy. The nurse was assisting an anesthetist named Henry Beecher, who was tending to US troops under heavy German bombardment. When the morphine supply ran low, the nurse assured a wounded soldier that he was getting a shot of potent painkiller, though her syringe contained only salt water. Amazingly, the bogus injection relieved the soldier's agony and prevented the onset of shock.

Returning to his post at Harvard after the war, Beecher became one of the nation's leading medical reformers. Inspired by the nurse's healing act of deception, he launched a crusade to promote a method of testing new medicines to find out whether they were truly effective.

In a 1955 paper titled "The Powerful Placebo," published in The Journal of the American Medical Association, Beecher described how the placebo effect had undermined the results of more than a dozen trials by causing improvement that was mistakenly attributed to the drugs being tested. He demonstrated that trial volunteers who got real medication were also subject to placebo effects; the act of taking a pill was itself somehow therapeutic, boosting the curative power of the medicine. Only by subtracting the improvement in a placebo control group could the actual value of the drug be calculated.
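
To make the arithmetic of that last point concrete, here is a minimal sketch in Python. All the numbers are invented purely for illustration; the only point is that the drug-specific effect is estimated by subtracting the average improvement in the placebo arm from the average improvement in the drug arm.

    # Toy illustration of subtracting out the placebo response.
    # All improvement scores below are hypothetical.
    drug_arm = [7, 5, 6, 8, 4, 6]      # patients who received the real medication
    placebo_arm = [4, 3, 5, 2, 4, 3]   # patients who received an inert pill

    def mean(values):
        return sum(values) / len(values)

    placebo_effect = mean(placebo_arm)           # improvement from expectation alone
    total_effect = mean(drug_arm)                # improvement from drug plus expectation
    drug_specific_effect = total_effect - placebo_effect

    print("Placebo arm improvement:", round(placebo_effect, 2))
    print("Drug arm improvement:   ", round(total_effect, 2))
    print("Estimated drug effect:  ", round(drug_specific_effect, 2))

In a real trial the comparison is of course statistical rather than a simple difference of means, but the logic is the same.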

The placebo effect explains why so many medical procedures that are now viewed with horror were standard treatments in the past. Bloodletting, bleeding with leeches, attaching maggots, and dousing with cold water were among the treatments once recommended. Charles Darwin suffered from all manner of undiagnosed ailments that included frequent vomiting, and he subjected himself to various uncomfortable water treatments in the belief that they helped him. His beloved daughter Annie died of an unknown illness after receiving similar water treatments.

In my own building on the third floor is a small museum of medical history that contains all manner of gruesome-looking medical devices that no one thinks of using today but once were believed to be effective, even state-of-the-art. As long as the physician and patient had confidence in the treatment, it must have seemed to work.

Because of the repeated discrediting of medical treatments that were once considered effective, it has been suggested that the history of medicine is actually the history of the placebo effect, with new placebos replacing the old, leading to the uncomfortable suggestion that our current treatments, however sophisticated they may seem, are merely the latest placebos.

But there is reason to think that we now have a much better idea of what really works and what is a placebo because Beecher's work led to the invention of the practice of double-blind experimental testing, where neither the patient nor the researcher collecting the data and doing the analyses knows who is receiving the experimental treatment and who is receiving the placebo.

By 1962, the government had started requiring drug companies to perform clinical tests with placebos in order to get approval, and this has helped eliminate much outright quackery from medicine. Without such precautions, people can, even with the best of intentions, subtly distort the results to get the outcome they want or expect.
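
As a rough sketch of how the blinding itself can be arranged (this is a simplified illustration of the general idea, not a description of any actual trial protocol), each participant is randomly assigned to drug or placebo, the assignment is hidden behind an opaque code, and the people dispensing the pills and analyzing the outcomes see only the codes until the analysis is complete.

    import random

    # Hypothetical participants; in a real trial these would be enrolled patients,
    # and the randomization would be balanced rather than simple coin flips.
    participants = ["P01", "P02", "P03", "P04", "P05", "P06"]

    random.seed(42)  # fixed seed only so the example is reproducible

    # The key mapping each participant to drug or placebo is generated once and
    # locked away; clinicians and analysts never see it during the trial.
    blinding_key = {p: random.choice(["drug", "placebo"]) for p in participants}

    # What everyone else sees: anonymous kit codes only.
    kit_codes = {p: "KIT-{:03d}".format(i) for i, p in enumerate(participants)}

    # ... the trial runs and outcomes are recorded against the kit codes ...

    # Unblinding happens only after the outcome data have been finalized.
    for p in participants:
        print(kit_codes[p], "->", blinding_key[p])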

As a result of the widespread adoption of double-blind testing, there is good reason to think that our current practices are significantly better than those of the past, and that we are no longer so easily fooled by placebos.

Next: Using placebos as part of treatment.

POST SCRIPT: How double blind tests work

Double-blind tests are useful not only in medicine. Richard Dawkins shows what happens when it is used to test the claims of people who think they can detect the presence of water by dowsing.

It is interesting that when the tests show the dowsers that the "powers" they thought they had are non-existent, they make up stuff to enable them to continue believing. Does that remind you of anything?

April 03, 2009

The stem cell issue-2: The ethics

Yesterday, I discussed the science involved in stem cell research. Today I want to discuss the ethics.

The ethical problems associated with stem cell research arise because the fertilized eggs were created not for research but to help infertile couples. Since the method of in vitro fertilization for the treatment of infertility has not been perfected, more eggs are fertilized than can actually be used to generate pregnancies, and the question of what to do with these extra frozen embryos is problematic.

If the extra ones are not needed for future implantation in a womb, then the options are to destroy them, preserve them forever, or use them for research. Those favoring stem cell research argue that preserving them forever is not realistic, that they will have to be thrown away eventually, and that using them for research is better than destroying them without any benefit being obtained, even though the blastocyst must be destroyed in order to produce the stem cell lines.

Those opposed to stem cell research (and abortion) have a simple and clear argument: Life begins at the instant when an egg is fertilized, and no human action is permissible thereafter to prevent that egg from being eventually born. So once an egg is fertilized, whether in the uterus or outside, then we have a human life and using a blastocyst for research is effectively destroying life. This is a secular argument, even though many, or even the majority, of those who support it may have religious reasons for their stand, such as the idea that god inserts the soul at the moment of conception when the egg is fertilized. They argue that if such a position requires the preservation of unused embryos indefinitely, then we should do so, however impractical that might be.

Those who support a woman's right to terminate a pregnancy and/or the use of embryonic stem cells for research have more difficulty in justifying their position because drawing a clear line as to when 'life' begins or a clump of cells becomes 'human' is hard. One thing they are agreed upon is that a human being is much more than a fertilized egg or a bunch of cells such as a blastocyst. But where does one draw the line?

One line is that until such time as the fetus can exist independently outside the womb, it is not a human being. Right now that time corresponds roughly to the third trimester of the pregnancy. But as technology improves, that is likely to shift to earlier times. Others argue that any organism (human or otherwise) must have some higher level of capacity, such as a brain, before its life becomes worthy of protection from harm. After all, when it comes to questions of death, society seems to have decided that when the brain stops functioning one is effectively dead and one no longer needs to take steps to keep the body alive. And as the Terri Schiavo case tragically illustrated, what we mean by a functioning brain is more than just brain stem functions that maintain basic body processes and some reflexes. It means that the part of the brain that gives us our personality and makes us who we are, such as memory and cognition, must be functioning. Once a person has reached the stage of being in what is known as a 'persistent vegetative state', that person is considered to be effectively dead.

In this debate, both sides usually ignore the need for consistency across species. Why should only human life be so valued? What makes us superior and worthy of special consideration? If life is precious and life begins with a fertilized egg or with higher brain function, then what about the lives of other species? After all, we kill animals, even though they are fully functioning living things with a level of brain function that we would undoubtedly value if a human had it. We even think nothing of eating them after killing them. Why should we have one standard for humans and another for nonhuman animals?

One can take a speciesist position and simply assert as a given that human beings are superior to others and so we have a right to do what we like to other animal forms while treating human life as sacrosanct. But that is hard to justify on general moral or ethical grounds. There is no clear marker that justifies treating humans as special, unless you throw in ideas such as that humans have a soul and other animals do not. This is an argument based on a particular religious viewpoint and should have no place in determining public policy, which should always be based on secular arguments.

In my opinion, the position taken by ethicists such as Peter Singer is the most consistent moral and ethical one, because it does not give humans special privileges. They take a utilitarian position, that what one should seek is the minimization of suffering. Since suffering involves sentience, this requires that an organism must have at least some primitive brain function and the development of a nervous system before it can be said to have the possibility of suffering. So it would be acceptable to destroy any system of cells (whether from a human or non-human animal) as long as it has not yet reached the stage where it has the ability to suffer, or it has passed that stage at the end of life.

Even if we do not achieve the high level of consistency that it requires of us, the utilitarian argument that says that what we should aim for is a net reduction of global suffering seems to me to be a workable ethical principle on which to base decisions like these. Hence it is ethically allowable to use embryonic stem cells from a blastocyst (before the cells themselves have reached the capacity to suffer) in order to do research to reduce the suffering of actual living organisms.

Of course, this raises other potential problems that are sure to come down the road. Is it ethical, for example, to deliberately produce blastocysts purely for the purpose of research, as opposed to using those that are the by-products of infertility treatments? If, for example, one wanted to study the early development of a disease that had a genetic basis, would it be ethical to take an egg and sperm from people who have that disease and create a fertilized egg purely in order to study the early onset of that disease or to develop treatments for it?

These are very tough questions but ones that are going to come at us thick and fast in the near future as science and technology inexorably advance.

POST SCRIPT: God will decide if and when and how the world will end

Two days ago, I suggested that religious people make unreliable allies in the battle to save the environment because of their belief in god's plan. Right on cue, we have a member of the US Congress during hearings last week on cap-and-trade policies to reduce carbon emissions, quoting the Bible (Genesis 8:21,22 and Matthew 24:31) to support his belief that the future of the Earth is part of god's plan. Yes, god has our back, based on what he supposedly told Noah after the flood. So don't worry, burn those fossil fuels because Jesus has it covered!

April 02, 2009

The stem cell issue-1: The science

The decision by the Obama administration to reverse the Bush-era policy of banning the use of federal funds for stem cell research has created some controversy. The earlier policy had led to some frustration in the scientific community.

Bush's policy was intended to be a compromise: it banned the use of federal funds for the creation of new embryonic stem-cell lines while allowing scientists to study 21 lines that had already been created. But researchers say those lines aren't diverse enough and they have been eager to study hundreds of other lines, some of which contain specific genetic mutations for diseases like Parkinson's. There have been practical challenges as well. The restrictions forced scientists to use different lab equipment for privately funded and government-funded research; some even built entirely separate lab space. One of the most disconcerting aspects, researchers say, has been the negative effect on collaboration, a hallmark of the scientific process. Researchers supported by private money haven't been able to team up with scientists funded by the government, potentially holding back new insights and advances.

Stem cells are those that have three properties. Unlike specialized cells such as muscle or blood or nerve cells, (1) they are capable of replicating themselves for a long period (making them a valuable source to regenerate the body by replacing cells that die), (2) they are unspecialized, and (3) when they reproduce they can produce either more stem cells or become specialized cells like muscle or nerve or bone (a process known as differentiation). The National Institutes of Health has an informative FAQ page on this topic.

The two main kinds of stem cells are the embryonic ones and the non-embryonic ones. The embryonic ones can proliferate for a year or more in the laboratory without differentiating while the non-embryonic ones cannot do so for very long, but the reasons for this difference are not known as yet. The embryonic stem cells are capable of eventually differentiating into any type of specialized cell, and are called pluripotent. Such pluripotent cells are valuable because they can be used to repair tissue in any part of the body as needed. But eventually they need to differentiate into specialized cells in order to perform the functions that those specialized cells carry out in the body. The process by which stem cells differentiate is still not fully understood, but part of it involves interaction with the external environment in which the stem cell finds itself.

Adult stem cells are one form of non-embryonic cells and are found amongst the differentiated cells that make up the tissues of the body, such as the brain and heart and bone marrow, and they are the cells that are used to maintain and repair those tissues by differentiating when needed to produce new tissues. Some adult stem cells seem to have the capacity to differentiate into more than one type of specialized cell though the range is limited, unlike in the case of embryonic stem cells. Such cells are called multipotent.

For example, some multipotent stem cells found in the bone marrow can generate bone, cartilage, fat, and connective tissue. Stem cells taken from umbilical cord blood and the placenta seem to also have multipotent properties and thus in the future it may become routine that a stock of umbilical or placental cells will be taken after every birth and preserved for possible future use. Adult stem cells have some uses but working with them is much more difficult since they are harder to obtain and are less flexible.

To understand the ethical issues involved in using embryonic stem cells, one should be aware that creating embryonic stem cell lines for research requires extraction of cells from the blastocyst. This is the stage reached by a fertilized egg after about three to five days when, after repeated cell division and duplication, there are about 70-100 identical cells in the shape of a hollow ball containing an inner clump of cells. The inner clump becomes the embryo and the outer hollow ball becomes the placenta. When this occurs in the uterus, this stage is reached before this collection of cells gets implanted in the uterus wall. Sometimes implantation does not occur, in which case the pregnancy is spontaneously terminated.

This video explains what stem cells are and how they work.

The embryos from which embryonic stem cells are taken are produced during treatment for infertility when a woman's egg is taken from her body and fertilized and grown to blastocyst stage in a culture outside the woman's body. In the very early days after the egg is fertilized and the cell starts splitting and reproducing itself, all the cells are identical. Embryonic stem cells are obtained from that inner clump of cells and thus the blastocyst has to be destroyed in the process. The cells from a single blastocyst can be used to generate millions of embryonic stem cells that can be divided among researchers, and these are the stem cell 'lines' that are referred to. The cells in a single line are all genetically identical.

While there are promising new ways of creating embryonic-like stem cells from adult skin cells (called induced pluripotent stem cells), they have their own ethical issues.

Since tissues created from a person's stem cells have the same genetic information as the host, the host body will not reject the implanted tissues as a foreign body, thus overcoming one of the biggest hurdles in organ transplants. While the possibility of growing tissues and entire organs for transplant purposes is often publicized as the biggest potential benefit of using stem cells, there are other more immediately realizable potential uses for embryonic stem cells.

One is that it enables the process by which cells differentiate into their specialized forms to be studied. Another is that by creating cells that have a particular disease, say Parkinson's or Lou Gehrig's, one can observe under a microscope even the earliest stages of the progression of the disease and thus hope to develop better treatments. Another use is to test the effects of drugs on cells before testing them on a real person. That would enable you to see if they are toxic to a particular individual, creating a level of personalized medicine that we do not currently have.

The potential benefits of embryonic stem cells in research are clear, even though it is very early days yet and there is still a long way to go before we can hope to even begin realizing those benefits. The key question is how to balance the ethical concerns involved in using such cells with the benefits.

This question will be examined in the next post.

POST SCRIPT: The Daily Show on stem cells

(Embedded video: The Daily Show segment "Stem Sell," via comedycentral.com)

August 25, 2008

Why Darwin scares people

(The text of a talk given at CWRU's Share the Vision program on Friday, August 22, 2008 at 1:00 pm in Severance Hall. This annual program is to welcome all incoming first year students. My comments centered on this year's common reading book selection The Reluctant Mr. Darwin by David Quammen.)

Welcome to Case Western Reserve University!

You are fortunate that in your first year here you are going to be part of a big year-long celebration, organized by this university, to mark the 200th anniversary of the birth of Charles Darwin and the 150th anniversary of his groundbreaking book On the Origin of Species.

In my opinion, Darwin is the greatest scientist of all time. You have no idea how hard it is for me to say that because I am a physicist and had long thought that the only competitors for that exalted title were Isaac Newton and Albert Einstein. But the more that I have learned about the theory of evolution over the last decade, the more I have to concede that Darwin has had the most impact on our thinking.

As you have heard today, the Share the Vision program at Case is part of the university's commitment to create a welcoming and unifying environment for people from all backgrounds. Darwin's ideas should be warmly welcomed by those who share those goals because one important implication of his work is that all of us are biologically linked because we all share common ancestors.

If any two of you in this auditorium could trace your ancestors back in time, it will not be long before you find that you share a common ancestor. In fact, we would find that everyone who lives in the world now shares at least one common ancestor who lived only as far back as around 1500 AD. So around the time of Copernicus and the Renaissance, someone was walking around who is the common ancestor of each and every one of us.

If that doesn't boggle your mind, then listen to this. If you go back to just around 3,000 BC, of all the people who lived then, about 20% have no living descendants. Their lines died out. But the remaining 80% are the shared, common ancestors of all of us. Think about that for a minute. This is quite amazing. We are all, literally, part of one big family. We are all cousins under the skin.

It gets even better. If we go back even further, we find that we are cousins with all the nonhuman animals as well, and going back further still, with all the plants and even bacteria, all of us tracing our ancestors back to possibly a single ancestral organism. All of life that presently exists and ever existed is connected by this tree of life.

No wonder that Darwin was moved by this stupendous insight to end his book On the Origin of Species, by saying, "There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved."

But, sadly, not everyone is as delighted as I am with the idea that worms are our cousins, and that we are both part of one big family with every organism that ever lived. Those who want to believe that humans possess some unique and special quality not possessed by other animals have found Darwin's idea deeply disturbing, and this is the source of much of the antagonism to him. Even the cautious Darwin himself, aware of this problem and the hostility it would arouse, only obliquely hinted at the linkage of humans to all other species in On the Origin of Species, leaving a full treatment to a subsequent book The Descent of Man published twelve years later.

Darwin's theory of natural selection and the tree of life is not only eminently plausible, but has been put on a rigorous mathematical footing and has abundant evidence in support of it. So why does the theory still arouse such strong opposition?

The superficial answer is that Darwin's theory goes against the religious belief that each species, and especially humans, was the result of a special act of creation by god. That idea seemed plausible at a time when it seemed obvious that every complex thing needed an even more complex designer to create it. But with Darwin, for the first time we had a scientific theory that showed how complex things could emerge from simpler things, without any outside intervention or agent or intelligence or design. Once the first primitive replicator, an early ancestor of DNA, had been created in the primeval soup, it multiplied and diverged, under the action of purely physical and algorithmic laws acting mindlessly, to eventually become the wide array of life we have now.

What is even more unnerving to some is that Darwin's theory reaches into every aspect of existence. As philosopher Daniel Dennett says, it is like an immensely powerful acid that once created cannot be contained by any boundaries because it can eat through any wall. People first tried to restrict it to nonhuman life but it broke through that barrier. They then tried to restrict it only to the human body but it broke through that too. Darwin's theory is now being applied to explain the origins of language and altruism and morality and other aspects of behavior, and to the workings of the brain and mind and consciousness.

Even intelligence, the feature that humanity prides itself on and which had been thought to be a precursor to creation, we now know occurred much later in life's evolution and came into being as a result of the same non-intelligent, undirected, natural selection mechanism that produced our arms and legs.

There seems to be no quality that we humans possess that could not have come into existence by the evolutionary processes described by Darwin and his successors.

Darwin's theory has extended even to what used to be considered purely philosophical questions. Paleontologist George Gaylord Simpson said that all attempts before the publication of On the Origin of Species to answer the question of what does it mean to be human were worthless and that we would be better off if we ignored them completely. Such is the significance of Darwin's work.

People who are wedded to the idea that human beings must possess some unique, non-material, and possibly divine quality, and that there must be some externally imposed purpose to their lives and the universe, are made highly uncomfortable by these developments. As cognitive scientist Steven Pinker says, "People desperately want Darwin to be wrong . . . because natural selection implies there is no plan to the universe, including human nature."

But the fact that the theory of evolution causes unease for some is hardly grounds for its rejection. The test of validity of a scientific theory is not whether it is perfect or whether it explains everything or whether it makes us feel happy or satisfies some deep emotional need, but whether it works better than any of its competitors. And there is nothing that comes even close to replacing the neo-Darwinian synthesis as the explanation of life's diversity.

As you will have read in the book, Darwin was nervous about where his ideas were taking him, even though he was increasingly convinced that he was right. He knew that in science just having a good idea isn’t enough, however beautiful the idea is. You had to have evidence to support it and to that end he doggedly spent most of his life, observing, experimenting, and collecting data from all over the world, despite ill health and recurring headaches and vomiting attacks and personal tragedy.

Since his death, the evidence in favor of his theory has increased with other revolutionary discoveries like genes and DNA and continental drift and fossils. The evidence in support of Darwin's theory of natural selection and the resulting interconnectedness of all life now exists in abundance.

This has not stopped the critics, though they have been reduced to merely pointing to problems as yet unsolved by the theory of evolution, because no alternative theory has been able to produce the kinds of evidence necessary to be taken seriously as a competitor. But as Herbert Spencer pointed out as long ago as 1891, "Those who cavalierly reject the Theory of Evolution as not being adequately supported by facts, seem to forget that their own theory is supported by no facts at all."

This year, you will all be able to be part of the Darwin celebration as eminent scientists, philosophers, and legal scholars from all over the world come to Case to discuss all the ramifications of his work. You have a unique opportunity to be part of that exciting year and I hope you take full advantage of it.

POST SCRIPT: Teaching evolution in high schools

Florida has just introduced evolution explicitly into its science standards. This story illustrates one teacher's efforts to teach it to his high school students.

July 22, 2008

Scientific consistency and Conservapedia loopiness

One of the drivers of scientific research is the desire to seek a greater and greater synthesis, to unify the knowledge and theories of many different areas. One of the most severe constraints that scientists face when developing a new theory is the need for consistency with other theories. It is very easy to construct a theory that explains any single phenomenon. It is much, much harder to construct a theory that does not also lead to problems with other well-established results. If a new theory conflicts with existing theories, something has to give in order to eliminate the contradiction.

For example, Darwin's theory describes evolution as a slow process, incompatible with the young Earth creationist theory of a 6,000-year old Earth. The acceptance of Darwin's theory was only made possible with the almost concurrent emergence of geological theories that argued that the Earth was far older than that. Creationists, on the other hand, want to go in the opposite direction and seek to discredit evolution so that they can hold on to a young Earth.

But while the scientific search for overall consistency results in more logical and satisfying theories and new breakthroughs, the parallel religious attempt to build consistency around a 6,000 year Earth leads to greater and greater loopiness, to the construction of an alternative reality that one can only marvel at.

Take, for example, the fascinating response of some religious people to reports of Richard Lenski's interesting evolution experiment I wrote about yesterday. Andrew Schlafly (son of Phyllis Schlafly, a conservative icon) is the founder of Conservapedia, a religious alternative started to counter what they perceive as the anti-Christian, liberal agenda of Wikipedia. Conservapedia views everything through a Christian, right-wing, America-centered lens. It gives a lot of prominence to arguments in favor of a 6,000-year old Earth.

The anti-evolution crowd contains many people who combine ignorance of science with arrogance and Schlafly exemplifies this. Even though he is not a microbiologist, he challenged Lenski's work with extraordinarily rude letters implying that there was shady work afoot and demanding to see the raw data, leading to a back-and-forth correspondence. You can read all the gory details here. Lenski's second reply to Schlafly is a masterpiece, combining a lesson in how to get slapped around politely with a good scientific explanation of his experiment.

One benefit of Schlafly's crusade is that Lenski's experimental results became elevated from something that just his biology subcommunity knew about to an internet phenomenon, widely discussed in the wider science and religion world. I myself heard about Lenski's work only because of the fuss that Andrew Schlafly created, so thanks Andy!

If you have not yet experienced the goofiness of Conservapedia, you are missing a treat. Take this gem from its article on the theory of relativity.

A prevailing theory among creation scientists such as physicist Dr. John Hartnett believe that the Earth was once contained in a time dilation field, which explains why the earth is only 6,000 years old even though cosmological data (background radiation, supernovae, etc.) set a much older age for the universe. It is believed that this field has since been removed by God, which explains why no such time dilation has been experienced in modern times. (my italics)

That is a typical religious explanation for phenomena – god did it and then hid the evidence that he did it. It always amazes me that these people claim to know exactly what god does and what god wants but plead ignorance as to why.

Take, as another example, Conservapedia's article on kangaroos. These marsupials are found only in Australia and the scientific understanding of how this happened involves theories of changes in ocean levels, the splitting apart of continents, and the speciation that results when animal populations get separated geographically and evolve independently from their ancestral forms, and thus diverge from their cousins on other continents.

After devoting just one line to the evolutionary explanation for the origin of kangaroos in Australia, Conservapedia expansively discusses the creationist explanation:

According to the origins theory model used by young earth creation scientists, modern kangaroos are the descendants of the two founding members of the modern kangaroo baramin that were taken aboard Noah's Ark prior to the Great Flood. It has not yet been determined by baraminologists whether kangaroos form a holobaramin with the wallaby, tree-kangaroo, wallaroo, pademelon and quokka, or if all these species are in fact apobaraminic or polybaraminic.

After the Flood, these kangaroos bred from the Ark passengers migrated to Australia. There is debate whether this migration happened over land with lower sea levels during the post-flood ice age, or before the super-continent of Pangea broke apart.

The idea that God simply generated kangaroos into existence there is considered by most creation researchers to be contra-Biblical.

Notice that this article disparages the notion that god created kangaroos out of nothing in Australia, but finds perfectly plausible the idea that god created the kangaroos out of nothing earlier, saved just a pair of them in Noah's Ark, and then after the flood had them hopping over to Australia to raise a family and start a new life, like homesteaders in old Western films.

One would think that once one allowed that kangaroos could be created out of nothing, Ockham's razor would prefer the former theory. The only reason not to do so is to conform to Biblical myths. The Noah's Ark bottleneck has to be preserved at all costs.

It is a long journey from Mount Ararat in Turkey (where the Ark supposedly finally ended up) to Australia and this theory requires that the pair of kangaroos from the Ark either live long enough to get to Australia before they started breeding or that all their offspring produced along the way stuck with the family for the entire journey (can you imagine how maddening their cries of "Are we there yet?" would become) or that the successor lines of all the ones that were left behind along the way became extinct, leaving no fossil record anywhere else in the world. Or maybe they were raptured early.

Another possibility (which I just thought up or maybe it was god revealing the truth to me, undeserving heathen though I am) is that Noah's Ark was less like an emergency lifeboat and more like a round-the-world cruise ship, and that different animals left the liner at different ports of call: kangaroos at Sydney, koalas at Auckland, penguins in the Antarctic etc. This theory actually explains a lot about the geographic diversity of species and I offer it free to the creators of Conservapedia to add to their site.

Since Conservapedia, like Wikipedia, is a fairly open system that allows almost anyone to edit its entries, some suspect that much of the site's content consists of subtle parodies by people pulling the legs of Schlafly and his co-religionists, and that they have not cottoned on to it yet. For example, I found the above passage about relativity just last week but today noticed that the passage has been changed, to be replaced by the briefer "Prevailing theories among creation scientists such as physicists Dr. Russell Humphreys and Dr. John Hartnett are time dilation explains why the earth is only 6,000 years old even though cosmological data (background radiation, supernovae, etc.) set a much older age for the universe." Was the original a parody that the site editors discovered and scrubbed? Is the kangaroo explanation a parody? It is hard to tell.

It is a sad reflection on your credibility when readers cannot tell when the material has been created in good faith and when it is a hoax.

POST SCRIPT: Platypus

Steve Benen points out that new research mapping the genome of the platypus causes yet more headaches for creationists.

July 21, 2008

Seeing evolution in real time

Evolution opponents tend to try and dismiss the evidence in its favor, often falling back, as a last resort, on the argument that no one has actually seen evolution occurring and a new species emerging, with all the intermediate stages clearly identified. One reason for this is, of course, that evolutionary change occurs very slowly, not visible in the transition from one generation to another. The emergence of a new species is almost always a retrospective judgment, made long after the fact, of a process that often takes thousands, or tens of thousands, of generations. By that time, most of the intermediate forms have become extinct and left no trace, since fossilization is such a rare event.

This is why researchers are finding bacteria and other microbes, organisms that can go through multiple generations in a single day, to be valuable subjects for study, allowing them to see evolutionary change and speciation within the span of a human lifetime.

In a truly remarkable piece of work, Richard Lenski of Michigan State University, starting from a single E. coli bacterium in 1988, has kept breeding its descendants in environments with a limited supply of food to see how they would adapt to their situation.

The experiment ran as follows:

He created 12 identical lines of E. coli and then fed them a meager diet of glucose. The bacteria would run out of sugar by the afternoon, and the following morning Dr. Lenski would transfer a few of the survivors to a freshly supplied flask.

From time to time Dr. Lenski also froze some of the bacteria from each of the 12 lines. It became what he likes to call a “frozen fossil record.” By thawing them out later, Dr. Lenski could directly compare them with younger bacteria.

Within a few hundred generations, Dr. Lenski was seeing changes, and the bacteria have been changing ever since. The microbes have adapted to their environment, reproducing faster and faster over the years. One striking lesson of the experiment is that evolution often follows the same path. “We’ve found a lot of parallel changes,” Dr. Lenski said.

The clever part of this experiment was that by freezing samples every 500 generations or so along the way, Lenski could go back in time if necessary and identify when specific changes occurred. He now has over 40,000 generations of bacteria and has thus been able to track closely the way that random mutation and natural selection, the fundamental mechanisms of evolution, work. What these and other similar experiments do is show evolution occurring in real time.
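
To get a feel for the structure of the serial-transfer protocol (and only the structure; the population size, mutation rate, and fitness effects below are invented for illustration and bear no relation to the real experiment's parameters), here is a toy simulation in Python: a population grows, a random sample is carried over to the next 'flask', and a snapshot is frozen every 500 generations so earlier states can be revisited.

    import random

    random.seed(0)
    GENERATIONS = 5000
    FREEZE_EVERY = 500
    TRANSFER_SIZE = 100      # how many cells are carried over to the fresh flask
    MUTATION_RATE = 0.001    # chance per offspring of a small beneficial mutation

    # Each "cell" is represented only by its relative fitness; all start identical.
    population = [1.0] * TRANSFER_SIZE
    frozen_record = {}       # generation -> snapshot (the "frozen fossil record")

    for gen in range(1, GENERATIONS + 1):
        # Growth phase: fitter cells leave more offspring (selection), and an
        # offspring occasionally picks up a mutation that raises its fitness.
        offspring = []
        for fitness in population:
            for _ in range(max(1, round(fitness * 2))):
                child = fitness * 1.05 if random.random() < MUTATION_RATE else fitness
                offspring.append(child)

        # Daily transfer: only a random sample reaches the next flask (drift).
        population = random.sample(offspring, min(TRANSFER_SIZE, len(offspring)))

        # Freeze a sample at regular intervals, so we can "go back in time" later.
        if gen % FREEZE_EVERY == 0:
            frozen_record[gen] = list(population)

    print("Mean fitness now:", sum(population) / len(population))
    print("Snapshots frozen at generations:", sorted(frozen_record))

Comparing a present-day sample against an earlier frozen snapshot is, in miniature, what lets the real experiment pinpoint when a change such as citrate-eating first appeared.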

One result of his experiments is that the bacteria are now twice as big as their common ancestor and reproduce 75 percent faster.

But the more dramatic result that Lenski observed was that after 33,127 generations, suddenly one of the colonies of the E. coli bacteria evolved the ability to absorb citrate, a nutrient found in abundance in the broth in which the bacteria are cultured. One of the signature marks of standard or 'wild' E. coli is their inability, unlike many other microbes, to absorb citrate.

Science reporter Carl Zimmer, who has been following these experiments, reports on the analysis they did of what happened.

[Lenski's graduate student Zachary] Blount took on the job of figuring out what happened. He first tried to figure out when it happened. He went back through the ancestral stocks to see if they included any citrate-eaters. For the first 31,000 generations, he could find none. Then, in generation 31,500, they made up 0.5% of the population. Their population rose to 19% in the next 1000 generations, but then they nearly vanished at generation 33,000. But in the next 120 generations or so, the citrate-eaters went berserk, coming to dominate the population.

This rise and fall and rise suggests that the evolution of citrate-eating was not a one-mutation affair. The first mutation (or mutations) allowed the bacteria to eat citrate, but they were outcompeted by some glucose-eating mutants that still had the upper hand. Only after they mutated further did their citrate-eating become a recipe for success.

So we see the clear emergence of a new form of E. coli, able to live on citrate in a way that 'wild' E. coli are not found to be able to do. The fact that these bacteria developed the ability to switch their diet from the meager glucose to the abundantly available citrate is a significant evolutionary step, showing how an organism can adapt to its environment in ways that make it better able to survive.

This really is a beautiful experiment, illustrating once again how much of science depends on painstaking, long-term, careful study.

Next: Religious anti-evolutionists attack Lenski's work.

POST SCRIPT: Comedian Dave Allen on the story of Genesis

June 27, 2008

The difference between human and other animal communication

In his book The Language Instinct (1994) Steven Pinker pointed out two fundamental facts about human language that were used by linguist Noam Chomsky to develop his theory about how we learn language. The first is that each one of us is capable of producing brand new sentences never before uttered in the history of the universe. This means that:

[A] language cannot be a repertoire of responses; the brain must contain a recipe or program that can build an unlimited set of sentences out of a finite list of words. That program may be called a mental grammar (not to be confused with pedagogical or stylistic "grammars," which are just guides to the etiquette of written prose.)

The second fundamental fact is that children develop these complex grammars rapidly and without formal instruction and grow up to give consistent interpretations to novel sentence constructions that they have never before encountered. Therefore, [Chomsky] argued, children must be innately equipped with a plan common to the grammars of all languages, a Universal Grammar, that tells them how to distill the syntactic patterns out of the speech of their parents. (Pinker, p. 9)

Children have the ability to produce much greater language output than they receive as input but it is not done idiosyncratically. The language they produce follows the same generalized grammatical rules as others. This leads Chomsky to conclude that (quoted in Pinker, p. 10):

The language each person acquires is a rich and complex construction hopelessly underdetermined by the fragmentary evidence available [to the child]. Nevertheless individuals in a speech community have developed essentially the same language. This fact can be explained only on the assumption that these individuals employ highly restrictive principles that guide the construction of grammar.

The more we understand how human language works, the more we begin to realize how different human speech is from the communication systems of other animals.

Language is obviously as different from other animals' communication systems as the elephant's trunk is different from other animals' nostrils. Nonhuman communication systems are based on one of three designs: a finite repertory of calls (one for warnings of predators, one for claims of territory, and so on), a continuous analog signal that registers the magnitude of some state (the livelier the dance of the bee, the richer the food source that it is telling its hivemates about), or a series of random variations on a theme (a birdsong repeated with a new twist each time: Charlie Parker with feathers). As we have seen, human language has a very different design. The discrete combinatorial system called "grammar" makes human language infinite (there is no limit to the number of complex words or sentences in a language), digital (this infinity is achieved by rearranging discrete elements in particular orders and combinations, not by varying some signal along a continuum like the mercury in a thermometer), and compositional (each of the finite combinations has a different meaning predictable from the meanings of its parts and the rules and principles arranging them). (Pinker, p. 342)
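
Here is a toy sketch of what a 'discrete combinatorial system' means in practice (the grammar and vocabulary below are my own trivial inventions, not a serious model of English): a handful of rewrite rules over a finite word list can generate an unbounded number of distinct sentences, because a sentence can contain another sentence inside it.

    import random

    random.seed(3)

    # A tiny context-free grammar: finite rules, finite words, unbounded output.
    GRAMMAR = {
        "S":  [["NP", "VP"]],
        "NP": [["the", "N"], ["the", "N", "that", "VP"]],
        "VP": [["V", "NP"], ["V", "that", "S"]],
        "N":  [["dog"], ["cat"], ["linguist"]],
        "V":  [["saw"], ["believes"], ["chased"]],
    }

    def generate(symbol="S", depth=0):
        """Expand a symbol into a list of words using randomly chosen rules."""
        if symbol not in GRAMMAR:                 # a terminal word
            return [symbol]
        rules = GRAMMAR[symbol]
        # Beyond a depth limit, always take the first (least recursive) rule,
        # just to keep this toy example from nesting indefinitely.
        rule = rules[0] if depth > 6 else random.choice(rules)
        words = []
        for part in rule:
            words.extend(generate(part, depth + 1))
        return words

    for _ in range(3):
        print(" ".join(generate()))

Even with this handful of rules, no fixed list of sentences could exhaust the output, which is the sense in which the system is infinite, digital, and compositional.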

This difference between human and nonhuman communication is also reflected in the role that different parts of the brain play in language as opposed to other forms of vocalization.

Even the seat of human language in the brain is special. The vocal calls of primates are controlled not by their cerebral cortex but by phylogenetically older neural structures in the brain stem and limbic systems, structures that are heavily involved in emotion. Human vocalizations other than language, like sobbing, laughing, moaning, and shouting in pain, are also controlled subcortically. Subcortical structures even control the swearing that follows the arrival of a hammer on a thumb, that emerges as an involuntary tic in Tourette's syndrome, and that can survive as Broca's aphasic's only speech. Genuine language . . . is seated in the cerebral cortex, primarily in the left perisylvian region. (Pinker, p. 342)

Rather than view the different forms of communication found in animals as a hierarchy, it is better to view them as adaptations that arose from the necessity to occupy certain evolutionary niches. Chimpanzees did not develop the language ability because they did not need to. Their lifestyles did not require the ability. Humans, on the other hand, even in the hunter-gatherer stage, would have benefited enormously from being able to share the kind of detailed information about plants and animals and the like, and thus there could have been an evolutionary pressure that drove the development of language.

The development of human language was tied to the evolution of the physical apparatus that enables complex sound production, along with the associated brain adaptations, though the causal links between them are not fully understood. Did the brain increase in size to cope with rising language ability, or did the increasing use of language drive brain development? We really don't know yet.

The argument against a linguistic hierarchy in animals can be seen in the fact that different aspects of language can be found to be best developed in different animals.

The most receptive trainee for an artificial language with a syntax and semantics has been a parrot; the species with the best claim to recursive structure in its signaling has been the starling; the best vocal imitators are birds and dolphins; and when it comes to reading human intentions, chimps are bested by man's best friend, Canis familiaris. (Pinker, PS20)

It seems clear that we are unlikely to ever fully communicate with other species the way we do with each other. But the inability of other animals to speak the way we do is no more a sign of their evolutionary backwardness than our nose's lack of versatility compared to the elephant's trunk, or our inability to use our hands to fly the way bats can, are signs that we are evolutionarily inferior compared to them.

We just occupy different end points on the evolutionary bush.

POST SCRIPT: But isn't everyone deeply interested in golf?

If you want yet more reasons why TV news is not worth watching . . .

June 26, 2008

Can animals talk?

One of the most interesting questions in language is whether animals can talk or at least be taught to talk. Clearly animals can communicate in some rudimentary ways, some more so than others. Some researchers are convinced that animals can talk and have spent considerable effort trying to teach them, but with very limited results. In the comments to an earlier post, Greg referred to the efforts by Sue Savage-Rumbaugh (and Duane Rumbaugh) to train the bonobo chimpanzee Kanzi to speak, and Lenen referred to the development of spontaneous language in children who had been kept in a dungeon. There have been other attempts with chimps and gorillas named Washoe, Koko, Lana, and Sarah.

One thing that is clear is that humans seem to have an instinctive ability to create and use language. By instinctive, I mean that evolution has produced in us the kinds of bodies and brains that make learning language easy, especially at a young age. It is argued that all humans are born possessing the neural wiring that contains the rules for a universal grammar. The five thousand different languages that exist today, although seeming to differ widely, all have an underlying grammatical similarity that is suggestive of this fact. For example, this grammar affects things like the subject-verb-object ordering in sentences. In English, we would say "I went home" (subject-verb-object) while in Tamil it would be "I home went" (subject-object-verb).

What is interesting is that of all the grammars that are theoretically possible, only a very limited set is actually found in existence. We do not find, for example, languages where people say "Home went I" (object-verb-subject). What early exposure to language does is turn certain switches on and off in the universal grammar wiring in our brains, so that we end up using the particular form of grammar of the community we grow up in. This suggests that language structures are restricted and not infinitely flexible, indicating a biological limitation.
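
As a toy illustration of the word-order point (the little function below is my own invention for demonstration, not anything from the linguistics literature), the same subject, verb, and object can be linearized under different order settings, only some of which correspond to the grammars actually found in human languages.

    # Hypothetical illustration: linearize the same (subject, verb, object)
    # triple under different word-order settings.
    def linearize(subject, verb, obj, order):
        slots = {"S": subject, "V": verb, "O": obj}
        return " ".join(slots[letter] for letter in order)

    sentence = ("I", "went", "home")

    print("SVO (English-like):", linearize(*sentence, order="SVO"))
    print("SOV (Tamil-like):  ", linearize(*sentence, order="SOV"))
    # An ordering the post describes as not occurring in practice:
    print("OVS:               ", linearize(*sentence, order="OVS"))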

The instinctive nature of language can be seen in a natural experiment that occurred in Nicaragua. There used to be no sign language at all in that country because the children were isolated from one another. When the Sandinistas took over in 1979, they created schools for the deaf. Their efforts to formally teach the children lip reading and speech failed dismally. But because the deaf children were now thrown together in the school buses and playgrounds, the children spontaneously developed their own sign language that developed and grew more sophisticated and is now officially a language that follows the same underlying grammatical rules as other spoken and sign languages. (Steven Pinker, The Language Instinct, 1994, p. 24)

What about animals? Many of us, especially those of us who have pets, would love to think that animals can communicate. As a result, we are far more credulous than we should be of claims (reported in the media) by researchers that they have taught animals to speak. But others, like linguist Steven Pinker, are highly skeptical. When looked at closely, the more spectacular elements of the claims disappear, leaving just rudimentary communication using symbols. The idea that some chimps can be taught to identify and use some symbols or follow some simple spoken commands does not imply that they possess underlying language abilities comparable to humans. The suggestion that animals use sign 'language' mistakenly conflates the sophisticated and complex grammatical structures of American Sign Language and other sign languages with a few suggestive gestures.

The belief that animals can, or should be able to, communicate using language seems to stem from two sources. One lies in a mistaken image of evolution as a linear process in which existing life forms can be arranged from lower to higher and more evolved forms. One sees this in posters in which evolution is shown as a sequence: amoebas→ sponges→ jellyfish→ flatworms→ trout→ frogs→ lizards→ dinosaurs→ anteaters→ monkeys→ chimpanzees→ Homo sapiens. (Pinker, p. 352) In this model, humans are the most evolved and it makes sense to think that perhaps chimpanzees have a slightly less evolved linguistic ability than we do but that it can be nudged along with some human help. Some people are also convinced that to think that animals cannot speak is a sign of a deplorable species superiority on our part.

But that linear model of evolution is wrong. Evolution is a branching theory, more like a spreading bush. Starting from some primitive form, it branched out into other forms, and these in turn branched out into yet more forms and so on, until we had a vast number of branches at the periphery. All the species I listed in the previous paragraph are like the tips of the twigs on the canopy of the bush, except that some (like the dinosaurs) are now extinct. Although all existing species have evolved from some earlier and more primitive forms, none of the existing species is more evolved than any other. All existing species have the same evolutionary status. They are merely different.

In the bush image, it is perfectly reasonable to suppose that one branch (species) may possess a unique feature (speech) that is not possessed by the others, just like the elephant possesses a highly useful organ (the trunk) possessed by no other species. All that this signifies is that that feature evolved after that branch separated from the rest of the bush and hence is not shared by others. The fact that nonhuman animals cannot speak despite extensive efforts at tutoring them is not a sign that they are somehow inferior or less evolved than us.

Some efforts to teach animals language skills seem to stem from a sense of misguided solidarity. It is as if the more features we share with animals, the closer we feel we are to them and the better we are likely to treat them. It is undoubtedly true that the closer we identify with some other living thing, the more empathy we have for it. But the solution to that is to have empathy for all living creatures, and not try to convince ourselves that we are alike in some specific ways.

As Pinker says:

What an irony it is that the supposed attempt to bring Homo sapiens down a few notches in the natural order has taken the form of us humans hectoring another species into emulating our instinctive form of communication, or some artificial form we have invented, as if that were a measure of biological worth. The chimpanzees' resistance is no shame to them; a human would surely do no better if trained to hoot and shriek like a chimp, a symmetrical project that makes about as much scientific sense. In fact, the idea that some species needs our intervention before its members can display a useful skill, like some bird that could not fly until given a human education, is far from humble! (p. 351)

While any animal lover would dearly love to think that they can talk with animals, we may have to reconcile ourselves to the fact that it just cannot happen, because they lack the physical and perhaps cognitive apparatus to do so.

Next: The differences between animal and human communication.

POST SCRIPT: Superstitions

One of the negative consequences of religious beliefs is that it leads to more general magical thinking, one form of which is superstitions. Steve Benen lists all the superstitions that John McCain believes in.

It bothers me when political leaders are superstitious. Decision-makers should not be influenced by factors that have no bearing whatsoever on events.

June 25, 2008

When did language originate?

Trying to discover the origins of language is a fascinating scientific problem but the evidence is necessarily indirect. Clearly our bodies' physical capacity to articulate sounds is a biological development. Language had to be preceded by the evolution of the physical organs responsible for vocalization. Those organs must have co-evolved with those parts of the brain that can process language. But this evolutionary history is hard to reconstruct since the voice organs and brain are made of soft tissue and are thus unlikely to fossilize. Even if we could get an accurate fix on when the actual physical ability to speak came into being, this could only be used to set a limit on the earliest time at which language could have arisen, but it tells us nothing about when it actually did.

Since humans have these language organs and our closest existing cousins the chimpanzees do not, and since our branch of mammals split off from chimpanzees about 5-7 million years (or about 350,000 generations) ago, it is theoretically possible for language to be that old and still be consistent with only humans being able to speak.

At the other end, the discovery in Europe of cave art created by Cro-Magnon humans in the Upper Paleolithic era about 35,000 years ago, consisting of depictions of animals and humans in carved, painted, and sculpted forms, indicates the kind of complex symbolic thinking that suggests the presence of language, and so sets a limit on the latest time for its origin.

But the span from 35,000 years to 5-7 million years is a huge interval, and various approaches have been tried to get a more precise fix on the origin of language. One avenue comes from linguistics: the study of languages themselves and how they evolved. Another is to look at the physiological development of the human body. A third is to look at the development of lifestyles, to discern levels of complexity that suggest the kinds of social organization that would require language. A fourth is to look at the use of tools, to see if there is sophistication and uniformity over a wide area, suggesting that knowledge was being shared and transmitted to distant locales.

While these are all promising avenues of research, unfortunately the lines of evidence from these different approaches currently do not converge on a single time, suggesting that we still have a long way to go in determining when language might have arisen.

Starting with linguistics, the family structure of languages is closely analogous to the biological tree of living organisms. Just as the fossil and DNA evidence points to all living things being descended from a common ancestor, the approximately five thousand languages that currently exist exhibit relationships of grammar and vocabulary strongly suggesting that they are all derived from a single proto-language that existed long ago and then evolved and split into branches, much as living organisms did. By tracing that linguistic tree back in time, we may be able to place narrower bounds on the date of origin of that proto-language.

Steven Pinker argues that since modern humans (Homo sapiens) first appeared about 200,000 years ago and spread out of Africa about 100,000 years ago, and since all modern humans have identical language abilities along with a universal grammar, it seems likely that language appeared concurrently with the first appearance of modern humans. (Steven Pinker, The Language Instinct, 1994, pp. 363-364) Furthermore, there was more than a tripling of brain size (from 400cc to 1350cc) during the period between the first appearance of the genus Homo (in the form of Homo habilis) about two million years ago and the appearance of Homo sapiens, suggesting that the brain developed in that period partly to accommodate the new language centers. Pinker suggests that since Homo sapiens are us, it seems reasonable that language came into being as much as 200,000 years ago.

As for biological development, Richard Leakey explains what it is about the human body that enables speech (The Origin of Humankind, 1994):

Humans are able to make a wide range of sounds because the larynx is situated low in the throat, thus creating a large sound-chamber, the pharynx, above the vocal cords . . . the expanded pharynx is the key to producing fully articulate speech . . . In all mammals except humans the larynx is high in the throat, which allows the animal to breathe and drink at the same time. As a corollary, the small pharyngeal cavity limits the range of sounds that can be produced. . . Although the low position of the larynx allows humans to produce a greater range of sounds, it also means that we cannot drink and breathe simultaneously. We exhibit the dubious liability for choking.

Human babies are born with the larynx high in the throat, like typical mammals, and can simultaneously breathe and drink, as they must during nursing. After about eighteen months, the larynx begins to migrate down the throat, reaching the adult position when the child is about fourteen years old. (p. 130)

The unique position of the larynx in humans suggests that if we were able to identify when it became lowered to its present position, we might be able to determine when we first had the ability to speak. The problem is that those parts of the body are made of soft tissue and do not fossilize easily. However, the shape of the bottom of the skull, called the basicranium, is arched in humans and essentially flat in other mammals, and this part of the skull is an indicator of how well its owner could articulate sounds. "The earliest time in the fossil record that you find a fully flexed basicranium is about 300,000 to 400,000 years ago, in what people call archaic Homo sapiens." (Leakey, p. 132)

But of course that does not mean that language developed simultaneously with the basicranium. Leakey says that it is unlikely that language was fully developed among archaic Homo sapiens.

The brain is another indicator of possible language origins. The part of the brain known as Broca's area, a raised lump near the left temple, is associated with language and the use of tools. Furthermore, the left hemisphere of the brain (which is associated with language) is larger than the right. So if we can find fossilized skulls that indicate the presence of either of these features, that would also indicate the onset of possible linguistic ability. A fossil skull nearly two million years old seems to show just such features. Combined with the evidence of tool-making around the same time, this leads Leakey to think it possible that it was with the advent of Homo habilis (the 'handy man') about two million years ago that language first started to appear, at least in a very crude form. (Leakey, p. 129)

Another strategy is to look at the various tools and other artifacts that humans created and see whether there is an increase in sophistication and a wider spread of similar designs, which would suggest the sharing of knowledge and ideas and thus speech. The more complex the social structures in which people lived, the greater the need for language. As for tools, although they started being made about two million years ago, the earliest kinds were opportunistic in nature. More conscious tool making began about 250,000 years ago but then stayed static for about 200,000 years. The kinds of ordering of tools that are really suggestive of language do not seem to occur until about 35,000 years ago, coinciding with the sudden spurt in cave art in the Upper Paleolithic period. (Leakey, p. 134)

So basically the situation is confused. While it is possible that language began to appear in some primitive form as early as two million years ago, it seems more likely that real language skills began about 200,000 years ago. Also it is not clear whether language evolved gradually since that time or whether it remained in a low and more-or-less static state before suddenly exploding about 35,000 years ago into the complex language structures that we now have.

Next: Can animals talk?

POST SCRIPT: Fred and Wilma? Who knew?

The most unforgettable act of the 1969 Woodstock festival was Joe Cocker's rendition of the Beatles' With a Little Help from My Friends, a gentle song sung by Ringo Starr, which Cocker turned into an over-the-top, weird, air-guitar-playing, frenzied, incoherent performance that looked like he was having some kind of seizure. Throughout it, you kept wondering what the hell he was singing, since the lyrics seemed to bear only a passing resemblance to the original.

Some helpful soul has now provided captions for Cocker's words. It all makes sense now. Or maybe not.

(Thanks to Jesus's General.)

June 24, 2008

The power of language

One of the things that makes some people uneasy about the theory of evolution is its implication that humans are just one branch in the tree of life, connected to every other living thing through common ancestors, and thus not special in any mysterious way. It is surely tempting to think that we must be somehow unique. Look at the art and culture and science and technology we have produced and for which nothing comparable exists by any other species. How can we explain that if we are not possessed of some quality not present in other species?

One doesn't have to look far to find one feature that distinguishes the human species from all its cousins in the evolutionary tree of life. It is language. Somehow, at some point, we developed the capacity to speak and communicate with each other through well-articulated sounds and that has had a profound impact on our subsequent development. Although the number of phonemes (units of sound) that humans can make (about fifty) is not vastly greater than the number available to apes (about a dozen), we can use them to generate an average vocabulary of about 100,000 words. "As a consequence, the capacity of Homo sapiens for rapid, detailed communication and richness of thought is unmatched in the world of nature." (Richard Leakey, The Origin of Humankind, 1994, p. 122)
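To get a feel for why a modest difference in the number of phonemes translates into such an enormous difference in expressive capacity, here is a small illustrative calculation in Python. It is only a sketch: it counts every possible string of phonemes of a given length, using the approximate figures of 50 and 12 quoted above, and it ignores the phonotactic rules that make most such strings unpronounceable, so the numbers are loose upper bounds on distinct sound sequences rather than real vocabularies.

    # Illustrative sketch only: counts all possible phoneme strings of a given
    # length, ignoring phonotactic constraints, so these are loose upper bounds
    # on distinct sound sequences, not actual vocabulary sizes.
    HUMAN_PHONEMES = 50  # approximate figure quoted above
    APE_PHONEMES = 12    # approximate figure quoted above

    for length in range(1, 6):
        print(f"length {length}: humans {HUMAN_PHONEMES ** length:>12,}"
              f" vs apes {APE_PHONEMES ** length:>9,}")

Even at strings of only four or five phonemes the human inventory allows millions of distinct combinations, far more than enough to support a vocabulary of 100,000 words, while the smaller inventory falls short by orders of magnitude.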

Without language, the knowledge of animals is restricted to what they are born with as a result of their evolutionary development (i.e., their instincts) and what they acquire during their own lifetimes. That knowledge is necessarily limited, and each generation essentially starts life at the same point in knowledge space as the previous one.

But with language, all that changes. Knowledge can now be passed on from generation to generation and we can learn from our ancestors. Knowledge becomes cumulative, and the process accelerated with the invention of writing about 6,000 years ago, which made it possible to store and retrieve knowledge across great spans of time and distance.

I have sometimes wondered why religious people, always on the lookout for a sign that humans are special in god's eyes and possessed of some quality that could not be accounted for evolutionarily, have not seized on language as that which makes us uniquely human. Why don't intelligent design advocates suggest that it was god's intervention that enabled us to develop the ability to speak?

One advantage for religious people of treating the emergence of language as a mysterious sign of god's actions is that it is hard to pin down exactly when and how language started, which makes it difficult to explain scientifically and thus an even better candidate for a religious explanation than the bacterial flagellum or even the origin of life. Language was a significant development in our evolutionary history, but how it came about is murky because spoken language leaves no trace.

Of course, the fact that we humans possess a unique feature does not necessarily imply that we are special. After all, elephants can also boast of a uniquely useful organ, the trunk, that can do truly amazing things. It is strong enough to uproot trees and stack them carefully in place. It is delicate enough that it can pick a thorn, draw characters on paper with a pencil, or pick up a pin. It is dexterous enough that it can uncork a bottle and unbolt a latch. It is sensitive enough to smell a python or food up to a mile away. It can be used as a siphon and a snorkel. And it can do many more things, both strong and delicate. (Steven Pinker, The Language Instinct, 1994, p. 340)

Why did only elephants evolve this extremely useful organ compared to which the human nose seems so inadequate? It presumably developed according to the laws of natural selection, just like everything else. But if elephants were religious, they might well be tempted to argue that having a trunk was a sign from god that they were special and made in god's image, and thus that god must have a trunk too.

So uniqueness alone doesn't imply that we are possessed of some spiritual essence. But even if the ability to speak does not confer on us a mystical power, the question of when and how humans developed this profound and incredibly useful ability is well worth studying.

Next: When did language originate?

POST SCRIPT: George Carlin on language

I had written this post on language last week but then learned that comedian George Carlin died yesterday at the age of 71. He pushed the boundaries of comedy and many of his riffs dealt with the hypocritical use of language. A 1973 radio broadcast of his famous routine "Seven words you can't say on TV" ended up as a case before the Supreme Court, which ruled in 1978 that the government did have the right to limit the words used in broadcasts.

That routine is below. As is to be expected, there is extensive and repeated use of the seven naughty words, so don't watch if such language offends you.

Bonus video: George Carlin was also an atheist who poked fun at the lack of logic underlying religious beliefs.


June 23, 2008

Cloning and stem cell research

(This series of posts reviews in detail Francis Collins's book The Language of God: A Scientist Presents Evidence for Belief, originally published in 2006. The page numbers cited are from the large print edition published in 2007. The complete set of these posts will be archived here.)

In the Appendix of his book The Language of God: A Scientist Presents Evidence for Belief (2006), Francis Collins gives a very clear and brief exposition of the issues involved in stem cell research and cloning, which are not the same thing despite popular impressions.

A human being starts out as a single cell formed by the union of an egg and a sperm. The nucleus of this cell contains the contributions of DNA from each of the two parents and thus all the genetic instructions, while the region outside the nucleus, called the cytoplasm, contains the nutrients and signaling mechanisms that enable the cell to do whatever it is meant to do.

The single cell starts multiplying by copying itself, a process known as mitosis. In the very early stages, all the cells are identical and capable of eventually becoming any specialized cell, such as a liver cell or a blood cell. Such cells are called 'pluripotent' because of their ability to become any of the tissues that make up the body, and it is these cells, known as embryonic stem cells, that are at the center of the ethical debate.

Soon these embryonic cells begin to specialize and differentiate into cells that begin to form different organ tissues. They do this as switches in their DNA turn particular genes on and off. Some of these cells, such as those found in limited numbers in bone marrow, become what are known as adult stem cells: they still have the ability to differentiate further, but only into a much more limited variety of adult tissues. Such stem cells are called 'multipotent'.

The promise of stem cell research is that one can use a person's own stem cells to regenerate tissues lost or damaged by all kinds of diseases. Since these cells are not perceived as foreign matter, this would not trigger the body's immune mechanism that rejects foreign tissues, as occurs currently with transplants. At present, this immune response has to be suppressed with powerful drugs, leaving the patient vulnerable to other infections.

The ethical problem is that although adult stem cells can be obtained from an adult and used without harming that person, they have only very limited flexibility. Pluripotent cells are preferred, but at present obtaining them results in the loss of the embryos from which they are taken, and this immediately raises the ethical issue of whether, by destroying an embryo, we are destroying life.

Currently pluripotent stem cell lines are created during the process of in-vitro fertilization, by taking an egg from a woman, fertilizing it in a petri dish with sperm from a man, and growing the resulting cell in a solution containing the necessary nutrients for its growth. After about five days, what is called a 'blastocyst' is formed, consisting of about 70-100 cells: an outer wall of cells surrounding a hollow cavity, with an inner clump of about 30 cells (called the inner cell mass) at one end of the cavity. It is the inner cell mass that eventually turns into the tissues that make up the growing fetus, while the outer wall becomes the placenta.

In-vitro fertilization is done to assist childless couples. The selected blastocyst is implanted in the uterus of either the person who donated the egg (the biological mother) or a surrogate, and once it adheres to the wall of the uterus, it receives oxygen and other nutrients from the mother and develops as any other fetus.

The ethical dilemma arises because the process is not 100% certain, so many more fertilized eggs and blastocysts are created this way than are used to generate actual pregnancies. This has resulted in hundreds of thousands of unused fertilized eggs, which are currently kept frozen.

Researchers suggest that these fertilized eggs be used (with the donors' permission) to generate embryonic stem cell lines that can be used for research purposes. To do this, the inner cell mass is extracted from the blastocyst and transferred into a dish containing a culture that enables it to grow. When this is done, the blastocyst is effectively destroyed and cannot be used to create a human.

Opponents of embryonic stem cell research say that even a single fertilized egg cell is a human life and thus the blastocyst created this way should never be destroyed. Others argue that a blastocyst has none of the qualities that we associate with being human and thus that destroying it is not taking a life.

This dilemma created by scientific advances may be resolved by further scientific advances.

One possible compromise arises from the discovery of the process by which animals have been cloned, starting with the famous cloned sheep Dolly. This process is known as somatic cell nuclear transfer (SCNT). What happened with Dolly is that a single cell was taken from the udder of an adult sheep and its nucleus (containing all the genetic information) was extracted. Then an egg cell was taken and its nucleus removed and replaced with the nucleus that had been extracted from the udder cell.

What one might have expected to create was a cell specialized for udder tissue, since the cell had been taken from the udder of an adult and by that time should have become specialized for just that purpose. It was once thought that this process of specialization was irreversible, i.e., that once a pluripotent embryonic stem cell becomes an adult stem cell or a specialized adult cell, there is no going back to its unspecialized state.

What researchers found to their amazement was that when the udder cell nucleus was inserted into the egg cell that had had its nucleus removed, the nucleus seemed to effectively go back in time and become like the original embryonic cell that had eventually resulted in the sheep from which the udder cell was obtained. When this was then implanted in a sheep, it grew as if from a single fertilized egg and gave rise to a new sheep (Dolly) that had genes identical to those of the sheep from which the original udder cell was taken.

This process has now been repeated with other mammals such as horses, cows, dogs, and cats. Although the Raelians made the spectacular claim that they had used this technique to clone a human being, that claim appears to have been a hoax.

As a result of this research, it should be possible to take a nucleus from (say) a skin cell of an adult human, insert it into an egg cell that has had its nucleus removed, and thus create cells that have all the properties of embryonic stem cells. In this way blastocysts could be created in the laboratory without originating in the fusion of sperm and egg, the way children are traditionally conceived. These stem cells would have DNA identical to that of the adult from whose skin cell the nucleus was taken, rather than a fusion of maternal and paternal DNA, the way an embryo is normally formed.

Of course, if this cell is implanted in a uterus, one could potentially create a cloned human being but no one is suggesting that that be done. In fact, there is strong worldwide opposition to such an act. But if the cell is grown in a petri dish, then it could generate the equivalent of embryonic stem cells for both research and therapeutic purposes.

Would the process of SCNT be considered sufficiently different from the usual process of creating a fertilized egg to be considered not a potential human and thus overcome the ethical problems of stem cell research? That remains to be seen.

POST SCRIPT: Tough times

We know that the troubled economy is hurting many people. The Daily Show looks at how it is affecting the people of Beverly Hills.


June 20, 2008

Bioethical dilemmas

(This series of posts reviews in detail Francis Collins's book The Language of God: A Scientist Presents Evidence for Belief, originally published in 2006. The page numbers cited are from the large print edition published in 2007. The complete set of these posts will be archived here.)

In the Appendix of his book The Language of God: A Scientist Presents Evidence for Belief (2006), Francis Collins tackles the difficult ethical issues raised by advances in science and medicine, especially in the field of molecular biology. His own major contributions to the mapping of the human genome have undoubtedly made him acutely conscious of these issues. Collins describes the science and the issues arising from it very clearly, and this Appendix is well worth reading.

Having mapped out the entire human genome, scientists are now in the position of being potentially able to identify the presence of genes that may predispose people to certain diseases or behaviors long before those things have manifested themselves in observable ways. This ability has, of course, some obvious advantages in the prevention and treatment of diseases.

For example, breast cancer has a hereditary component that can be identified by the presence of a dangerous mutation in the gene BRCA1 on chromosome 17. This mutation, which also creates a greater risk for ovarian cancer, can be carried by fathers as well, even though they themselves may not have the disease. In those families in which breast cancer is prevalent, knowing who has the mutated gene and who hasn't may influence how closely they are monitored and what treatments they might be given.

As time goes by, our genetic predisposition to more and more hereditary diseases will be revealed. But is this an unqualified good thing?

On the plus side, having this knowledge may enable those people at risk to take steps (diet, exercise, preventative treatment) that can reduce their risk of actually contracting the disease. After all, genes are usually not the only (or even the main) factor in causing disease, and we often have some degree of control over the other risk factors for conditions such as diabetes or blood-clotting disorders.

We may also be able to treat more genetic diseases by actually changing an individual's genes, although currently the only changes being made are to the genes in the somatic cells (the ones that make up our bodies) and not the ones in the 'germ' line cells (the ones that are passed on to children via the egg and sperm). At present, there is a scientific and medical consensus that influencing the genes of future generations by changing the germ line is not something we should do.

Furthermore, our bodies' reaction to drugs is also often affected by our genes. That knowledge can be used to individualize treatment, to determine which drug should be given to which patient, and even to design drugs that take maximum advantage of an individual's genetic makeup. This kind of personalized medicine lies in our future.

But there are negatives to this brave new world of treatment. Should everyone have their DNA mapped to identify potential risk factors? And who should have access to a person's genetic information?

Some people may prefer not to know which diseases they are likely predisposed to, especially in those cases where nothing much can be done to avert the disease, or where what would need to be done would diminish the individual's quality of life too much. Furthermore, they may fear that this information could be used against them. If they have a predisposition for a major disease and this knowledge reaches the health insurance companies, the latter may charge them higher premiums or even decline to cover them at all. After all, the profit-making basis on which these companies run makes them want to insure only the pool of healthy people and deny as much coverage as possible to those who actually need it.

It works the other way too. If someone knows they have a potential health problem but the insurance companies don't, they may choose health (and life) insurance policies that work to their advantage.

So genetic information can become a pawn in the chess game played between the individual and the health (and life) insurance agencies.

This is, by the way, another major flaw of the current employer-based private health insurance schemes in the US. If we had a single-payer, universal health care system as is the case in every other developed country, and even in many developing countries, this problem regarding genetic knowledge would not even arise. Everyone would be covered automatically irrespective of their history, the risk would be spread over the entire population, and the only question would be the extent to which the taxpayers wanted to fund the system in order to cover treatment. That would be a matter determined by public policy rather than private profit. There would still be ethical issues to be debated (such as on what basis to prioritize and allocate treatment) but the drive to minimize treatment to maximize private profit would be absent, and that is a huge plus.

There are other issues to consider. What if we find a gene that predisposes its bearer to commit crimes or other forms of antisocial behavior? Would it be wrong to use this knowledge to preventively profile and incarcerate people? It has to be emphasized that our genes are almost never determinants of behavior; at best they provide small probabilistic estimates. But as I have written before, probability and statistics are not easy to understand, and the knowledge that someone has a slightly greater chance of committing a crime can, if publicly known, be a stigma that person can never shake, however upstanding and moral a person he or she tries to be.

There is also the question of what to do with people who want to use treatments that were developed for therapeutic purposes in order to make themselves (or their children) bigger, taller, stronger, faster, better-looking, and even smarter (or so they think) so that they will have an advantage over others. The thought-provoking film Gattaca (1997) envisions a future where parents create many fertilized eggs, examine the DNA of each, and select only those that contain the most advantageous genetic combinations to implant in the uterus. Collins points out that while this is theoretically possible, in practice it cannot be used to select for more than two or three genes. Even then, there is no guarantee that environmental influences as the child grows up will not swamp the effects of the carefully selected genes. (p. 354)
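Collins's two-or-three-gene limit is easy to appreciate with a back-of-the-envelope calculation. The sketch below rests on a simplifying assumption of my own (not one Collins spells out): that each desired gene variant is inherited independently with probability 1/2. Under that assumption the chance that any single embryo carries all of the desired variants is (1/2) raised to the number of genes, so the number of embryos needed grows exponentially as more genes are selected for.

    # Hypothetical back-of-the-envelope sketch: assumes each desired gene
    # variant is inherited independently with probability 1/2.
    def chance_of_suitable_embryo(num_embryos, num_genes, p_variant=0.5):
        p_all = p_variant ** num_genes         # one embryo carries every desired variant
        return 1 - (1 - p_all) ** num_embryos  # at least one such embryo in the batch

    for genes in (2, 3, 5, 10):
        p = chance_of_suitable_embryo(num_embryos=10, num_genes=genes)
        print(f"{genes} genes: {p:.1%} chance among 10 embryos")

With a realistic batch of around ten embryos, selecting for two or three genes is usually feasible under this assumption, but by five genes the odds of finding a suitable embryo drop to roughly one in four, and by ten genes they are negligible, which is consistent with the practical limit Collins describes.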

Collins argues, and I agree with him, that these are important ethical decisions that should not be left only to scientists but should involve the entire spectrum of society. He appeals to the Moral Law as general guidance for dealing with these issues (p. 320). In particular he advocates four ethical principles (formulated by T. L. Beauchamp and J. F. Childress in their book Principles of Biomedical Ethics, 1994) that we might all be able to agree on in making such decisions. They are:

  1. Respect for autonomy – the principle that a rational individual should be given freedom in personal decision making, without undue outside coercion.
  2. Justice – the requirement for fair, moral, and impartial treatment of all persons.
  3. Beneficence – the mandate to treat others in their best interest.
  4. Nonmaleficence – "First do no harm" (as in the Hippocratic Oath).

These are good guidelines, though many problems will undoubtedly arise when such general secular ethical principles collide with the demands of specific religious beliefs and cultural practices. When supposedly infallible religious texts become part of the discussion, it becomes almost impossible to find underlying unifying moral and ethical principles on which to base judgments.

POST SCRIPT: Brace yourself

Matt Taibbi warns that this presidential election is going to be very rough.

April 18, 2008

The changing problems of science and religion

(I will be away on travel this week so will be reposting an old series, edited and updated, that discusses the nature of science and the difference between science and religion. New posts start again on Monday, April 21, 2008.)

In the previous posting, I discussed some of the problems that arise in reconciling science and religion. These problems change with time as our understanding of science changes and the explanatory powers of science encompass more and more phenomena.

For example, in the pre-Copernican era one could hold a plausible model of god that became much harder to sustain in the light of post-Copernican scientific developments. This was because the cosmos was then seen as consisting of a spherical Earth located at the center of a finite universe, surrounded by a concentric rotating sphere in which the stars were embedded. (See Thomas Kuhn's The Copernican Revolution for a detailed history.) People thought that the stars were very small objects, and thus the outer sphere containing them could be quite nearby.

In that model, it was possible to think of the heavens as lying beyond this outer sphere, and this provided a home for god and angels and so on. There were no major conceptual problems in believing this model, and it enabled people to envision without much difficulty how god could intervene in events on Earth. All that was required was to imagine god as having pretty much the same powers as human beings, only more extensive: more refined senses, better sight and hearing, greater power, faster travel, and so on. It was not hard to think of god in heaven actually seeing and hearing what was going on on Earth, sending thunderbolts or other signals from heaven to Earth, or even making a quick trip (either personally or by sending angels) to Earth. Believing that god intervened in everyday events was not that hard to conceive within the framework of a pre-Copernican cosmology.

But Copernicus' introduction of a heliocentric universe, and the more precise astronomical observations made possible by the invention of the telescope, caused some serious problems for such early models, although the theological implications seem to have taken some time to sink in.

As Kuhn points out (on page 193):

When it was taken seriously, Copernicus' proposal raised many gigantic problems for the believing Christian. If, for example, the earth were merely one of six planets, how were the stories of the Fall and of the Salvation, with their immense bearing on Christian life, to be preserved? If there were other bodies essentially like the earth, God's goodness would surely necessitate that they, too, be inhabited. But if there were men on other planets, how could they be descendents of Adam and Eve, and how could they have inherited the original sin, which explains man's otherwise incomprehensible travail on an earth made for him by a good and omnipotent deity? Again, how could men on other planets know of the Savior who opened to them the possibility of eternal life? Or, if the earth is a planet and therefore a celestial body located away from the center of the universe, what becomes of man's intermediate but focal position between the devils and the angels? If the earth, as a planet, participates in the nature of celestial bodies, it cannot be a sink of iniquity from which man will long to escape to the divine purity of the heavens. Nor can the heavens be a suitable abode for God if they participate in the evils and imperfections so clearly visible on a planetary earth. Worst of all, if the universe is infinite, as many of the later Copernicans thought, where can God's Throne be located? In an infinite universe, how is man to find God or God man?

Most of those new problems are metaphysical. The last point mentioned by Kuhn is the one I want to focus on because it represents a physical problem, and the one that is of most interest to me as a physicist. If the universe is infinite, then where does god exist? Since telescopes can now observe vast sections of the universe, it strains the imagination to think of god occupying some part of the physical universe, because if god is made of the same kinds of stuff as other things in the universe, then how is it that our telescopes and other devices don't detect anything?

I am not sure (not being an expert on the history of theology) but it may be that it was to solve this problem that popular ideas about god being a non-material entity (and hence undetectable by telescopes) who is everywhere began to gain ground. That way, it was possible to overcome the time and space problems associated with having a material god who would necessarily have to occupy the same physical space as us.

But this raises yet other problems. If god is non-material and occupying a non-material space that co-exists with our more familiar material world, then how can he/she interact with the material world to influence it? After all, if (say) god intervenes to change the course of natural events, then it must involve changing the behavior of tangible physical objects and this requires the application of forces to those tangible objects, and such forces fall within the realm of the physical world.

One solution is to forego all interventions by god except in the form of changing people's minds, and postulate that human beings possess a mind that is independent of the body, and thus occupies a space similar to or identical with that occupied by god. Thus communication within this 'spirit world' can take place between god and people. Such models allow for the concept of an after-life.

But this just shifts the problem one step away; it does not solve it. We now have the problem of understanding the mind-body relationship of each person, which has all the difficulties associated with the god-people relationship. If the mind exists independently of the body, then where does it exist? If the mind is a non-material entity, then how does it influence the body (which is material)? And so on. Such concerns were articulated by the mathematician-scientist-philosopher Rene Descartes (1596-1650). Note that Descartes posed these concerns after Copernican ideas had taken hold and the potentially vast size of the universe became better appreciated, giving such problems a sense of urgency.

The way that I have formulated these questions obviously reveals my physics background. I treat space and time as meaningful physical entities and so cannot easily absorb platitudinous statements like "god is everywhere" without further exploration of what that statement actually means. I am guessing that most people do not consciously consider these questions, either because the questions do not occur to them or because they shy away from them owing to the discomfort they can cause.

So how does one resolve all these problems created by the assumption of god's existence in the light of modern scientific knowledge about a vast universe? I think once again people have to resort to Ockham's razor and each person will choose a position that satisfies him or her. I found that using Ockham's razor resulted in my dispensing with the idea of god altogether.

Assuming the existence of god creates a vast number of contradictions and complications that can only be dealt with by pleading ignorance and invoking an inscrutable deity, neither of which is very satisfying.

April 17, 2008

Science, religion, and Ockham's razor

(I will be away on travel this week so will be reposting an old series, edited and updated, that discusses the nature of science and the difference between science and religion. New posts start again on Monday, April 21, 2008.)

A few days ago I was working in my backyard when I noticed that the outdoor thermometer that I had fixed to a fence had disappeared. The mountings were still there but had been pulled away slightly. I thought that maybe the wind had blown it off, so I looked at the ground underneath, but the thermometer was not there. There is a bed of pachysandra close by and I looked in it too, but no luck. I was baffled.

I pondered the various options for explaining the missing thermometer. One was that the wind had been strong enough to rip the thermometer from its mounting and blow it farther away into the pachysandra. The other was that it had fallen to the ground below and had then been taken away by squirrels or the neighbor's cat. The third was that neighborhood children had borrowed it without permission for some experiment. The fourth was that the International Outdoor Thermometer Cartel (IOTC) had raised the price of these thermometers to such a high value that organized crime gangs were stealing them and selling them on the black market. The fifth option was that aliens had taken it away as a souvenir of their clandestine visit to Earth.

Given these options, I decided that #1 was the most likely one and looked in the pachysandra over a larger area and, sure enough, I found it.

The reason for this anecdote is that it illustrates something we all use all the time (whether we are consciously aware of it or not): Ockham's razor, which we use to make choices among competing theories.

According to the Encyclopaedia Britannica, the principle behind Ockham's razor (also called the law of economy or the law of parsimony) was stated by the scholastic William of Ockham (1285–1347) as "Plurality should not be posited without necessity." The principle is also expressed as "Entities are not to be multiplied beyond necessity." Ockham did not himself use the word 'razor'; that was attached to his name later by others.

The principle gives precedence to simplicity, but there are two ways it can be used. In the first case (which is more closely aligned with Ockham's intent), it says that you should not postulate more elements for anything than the minimum required. For example, in the case of my missing thermometer, if one theory was that a cat had taken it and a competing theory was that a cat with a striped tail and a scar on its forehead had taken it, then in the absence of any extra information, the former theory is to be preferred. The latter theory just adds elements that contribute no necessary information to the explanation. The application of this version of the principle is fairly straightforward: one seeks the smallest set of elements of a theory that provides an adequate explanation of whatever one is trying to explain.

The more problematic (but more common) use of Ockham's razor is when you try to apply it to a situation in which two competing theories either share no common elements, or at least some necessary elements of one theory are not possessed by the other. We commonly interpret Ockham's razor in those situations as requiring us to choose the simpler of the two theories. But simplicity may well lie in the eye of the beholder, and it may not be easy to get agreement.

So, for example, in the case of the thermometer that was found some distance away from its mountings, the simpler explanation (for me at least) was that of the wind. If called upon, I could cite Bernoulli's Principle and the laws of motion to support my preference. That explanation is enough to satisfy me.

But this may not be true for someone else. For someone who believes in the existence of UFOs and space aliens, a theory that alien vandals landed in my garden, tore the thermometer from its moorings, threw it away in the pachysandra, and left in their spaceship might be the "simpler" explanation. After all, it does not involve the use of calculus.

That is exactly the problem in many of the science and religion discussions. For everyone apart from those who reject science altogether, integrating science and religion into one coherent philosophical framework is one of the most difficult challenges, and there is no simple solution to it. And all of us use Ockham's razor to resolve it, even though the results are not the same for everyone.

A belief in the existence of god implies that there must be at least some phenomena caused by the intervention of god that lie outside the purview of science. (I am not considering the point of view that god created the world and its laws in one instant of time long ago and then has had a completely hands-off policy since then.)

For example, Biblical literalists will start with the assumption that the Bible is a historical document and that the events described in it (the world was created in six days and is only 6,000 years old, Joshua caused the Sun to stand still, Noah's flood did occur, etc.) all actually occurred. They will then painstakingly and tortuously try and reinterpret all evidence to be consistent with these axioms. The website Answers in Genesis goes to extraordinary lengths to try and answer questions such as "Where did Cain find his wife?" and "Did dinosaurs live alongside humans?" These are questions that do not trouble anyone who does not treat the Bible as an authoritative source for science and history.

But even those who take the Bible less literally have to confront difficult questions, because at some point the issue arises of where to draw the line and ascribe something to the actions of god. Each person will draw that line between god's actions and the workings of natural laws differently, depending on their personal level of comfort with the explanation.

This is something that believers in any theistic religion have to confront. Some will believe that any event that does not have a ready explanation to hand (a death in the family, an escape from injury, an unexpected recovery from a serious illness) is directly due to god's intervention to change the course of events. In order to deal with the existence of evil in the presence of an omnipotent and loving god, believers usually end up having to postulate that god's actions are inscrutable and that we cannot know the reasons for at least some of the events that occur in the world.

At the other end, others might believe that god does not actually cause a change in the natural sequence of events but instead exerts his/her influence by working through people. In other words, people are the agents of god's actions and the sole mechanism by which he/she influences events. So people are cured of illnesses because god inspires researchers and physicians, and so on.

There is also an infinite number of intermediate positions between those two extremes. For example, the biochemist Michael Behe, an intelligent design advocate and author of the book Darwin's Black Box, accepts natural explanations for everything except a few selected phenomena at the biochemical level (such as the blood-clotting mechanism or the creation of the bacterial flagellum) that he feels are unlikely to have been created by natural processes. (See the New Yorker article by H. Allen Orr for a clear description of what Behe's argument is. Cory also sent me a link to a nice article written by John Rennie, editor of Scientific American, that addresses some of the key points raised by ID advocates.)

Or one can decide that there is no god (or supernatural entity of any kind), and that all that exists is the material world. This is the position of philosophical naturalism or atheism. (I am treating the two terms as effectively synonymous, although professional philosophers might disagree.)

So we are left with only Ockham's razor with which to make a decision but in this case, it is a very personal razor whose use will satisfy only us. I personally find that assuming no god exists makes everything simpler and much more meaningful.

But those who are committed to believing in the existence of god despite the lack of evidence for his/her existence will not agree with me that this is the simplest explanation. They will likely say that having an inscrutable god who for some reason allows unspeakable cruelties is a 'simpler' way of understanding the world.

Which position one ends up taking is thus largely determined by which option seems 'simpler' to believe, and that usually means which belief structure one finds personally enriching and meaningful, since there is no unambiguous measure of simplicity for incommensurable theories.

April 16, 2008

Why scientific theories are more than just explanations

(I will be away on travel this week so will be reposting an old series, edited and updated, that discusses the nature of science and the difference between science and religion. New posts start again on Monday, April 21, 2008.)

At its heart, the main strategy of intelligent design creationism (IDC) advocates is to find phenomena that are not (at least in their eyes) satisfactorily explained by evolutionary theory and to argue that natural selection is therefore a failed theory. They say that adding the postulate of an 'intelligent designer' (which is clearly a pseudonym for god) as the cause of these so-called unexplained phenomena means that they are no longer unexplained. This, they claim, makes IDC the better 'explanation'. Some (perhaps for tactical reasons) do not go so far and instead say that it is at least a competing explanation and thus on a par with evolution.

As I discussed in an earlier posting, science does purport to explain things. But a scientific explanation is more than that. Scientific explanations also always carry within themselves the seeds of new predictions, because whenever a scientist claims to explain something using a new theory, the first challenge that is thrown invariably takes the form "Ok, if your theory explains X under these conditions, then it should predict Y under those conditions. Is the prediction confirmed?"

If the prediction Y fails, then the theory is not necessarily rejected forever but the proponent has to work on it some more, explain the failure to predict Y, and come back with an improved theory that makes better predictions.

Even if the prediction Y is borne out, the theory is still not automatically accepted but it gains a little bit of credibility and may succeed in attracting some people to work on it. Theories become part of the scientific consensus when their credibility increases by these means until they are seen by the scientific community as being sufficiently strong and robust that they become the exclusive framework, or 'paradigm', for future investigations.
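One common way of making this incremental gain in credibility precise (this is my own gloss, not something the sources quoted in this post invoke) is Bayes' theorem written in odds form:

\[
\frac{P(T \mid E)}{P(\neg T \mid E)} \;=\; \frac{P(E \mid T)}{P(E \mid \neg T)} \times \frac{P(T)}{P(\neg T)}
\]

A theory T that predicts the evidence E sharply, so that P(E given T) is much larger than P(E given not-T), gains credibility with every confirmed prediction. A 'theory' that is compatible with any conceivable outcome has a likelihood ratio of about one and gains nothing, a point worth keeping in mind when reading the demon dialogue below.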

A scientist who said things like "My new theory explains X but makes no new predictions whatsoever" would be ignored or face ridicule because such theories are easy to manufacture and of no practical use for science. And yet this is precisely the kind of thing that IDC proponents are saying. To see why this cannot be taken seriously, here is something abridged from the book Physics for the Inquiring Mind by Eric Rogers (pp. 343-345), written way back in 1960. In it Rogers looks at competing claims for why an object set in motion on a surface eventually comes to rest:

The Demon Theory of Friction

How do you know that it is friction that brings a rolling ball to a stop and not demons? Suppose you answer this, while a neighbor, Faustus, argues for demons. The discussion might run thus:

You: I don't believe in demons.
Faustus: I do.
You: Anyway, I don't see how demons can make friction.
Faustus: They just stand in front of things and push to stop them from moving.
You: I can't see any demons even on the roughest table.
Faustus: They are too small, also transparent.
You: But there is more friction on rough surfaces.
Faustus: More demons.
You: Oil helps.
Faustus: Oil drowns demons.
You: If I polish the table, there is less friction and the ball rolls further.
Faustus: You are wiping the demons off; there are fewer to push.
You: A heavier ball experiences more friction.
Faustus: More demons push it; and it crushes their bones more.
You: If I put a rough brick on the table I can push against friction with more and more force, up to a limit, and the block stays still, with friction just balancing my push.
Faustus: Of course, the demons push just hard enough to stop you moving the brick; but there is a limit to their strength beyond which they collapse.
You: But when I push hard enough and get the brick moving there is friction that drags the brick as it moves along.
Faustus: Yes, once they have collapsed the demons are crushed by the brick. It is their crackling bones that oppose the sliding.
You: I cannot feel them.
Faustus: Rub your finger along the table.
You: Friction follows definite laws. For example, experiment shows that a brick sliding along a table is dragged by friction with a force independent of velocity.
Faustus: Of course, the same number of demons to crush however fast you run over them.
You: If I slide a brick along a table again and again, the friction is the same each time. Demons would be crushed on the first trip.
Faustus: Yes, but they multiply incredibly fast.
You: There are other laws of friction: for example, the drag is proportional to the pressure holding the surfaces together.
Faustus: The demons live in the pores of the surface: more pressure makes more of them rush out and be crushed. Demons act in just the right way to push and drag with the forces you find in your experiments.

By this time Faustus' game is clear. Whatever properties you ascribe to friction he will claim, in some form, for demons. At first his demons appear arbitrary and unreliable; but when you produce regular laws of friction he produces a regular sociology of demons. At that point there is a deadlock, with demons and friction serving as alternative names for sets of properties - and each debater is back to his first remark.

Faustus's arguments are just like those of the IDC advocates, and they show why such arguments are consistently rejected by the scientific community. Scientists ask for more than just explanations from their theories. They also need mechanisms that make predictions. They know that this is the only way to avoid being drowned in an ocean of 'explanations' that are of no practical use whatsoever.
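For reference, the laws of friction that 'You' appeals to in the dialogue are the standard empirical relations found in any introductory physics text (Rogers does not write them out as equations in the passage above, so this summary is mine):

\[
F_{\text{kinetic}} = \mu_k N, \qquad F_{\text{static}} \le \mu_s N
\]

Here N is the normal force pressing the surfaces together, and the coefficients of kinetic and static friction are, to a good approximation, independent of the sliding speed and of the apparent area of contact. Each relation is a quantitative prediction that anyone can test with a spring scale and a brick; the demon theory can only mimic such results after the fact.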

You can't really argue with people like Faustus who are willing to create ad hoc models that have no predictive power. Such explanations as he gives have no value to the practicing scientist. At some point, in order to save your time and your sanity you have to simply walk away and ignore them. This explains why so many scientists refuse to get involved in the IDC battles.

But when you walk away from this kind of fruitless pseudo-debate, you do allow the other side to charge that you are afraid to debate them, at which point, they may jump up and down and shout "See they cannot refute us. We win! We win!", however illogical the charge.

It reminds me of the duel scene in Monty Python and the Holy Grail in which King Arthur chops off the arms and legs of the Black Knight, leaving just his torso and attached head on the ground, totally vanquished. The Black Knight refuses however to concede defeat and offers a compromise: "Oh? All right, we'll call it a draw." When Arthur and his assistant walk away from this offer, the Black Knight starts taunting him saying "Oh. Oh, I see. Running away, eh? You yellow bastards! Come back here and take what's coming to you. I'll bite your legs off!"

You can see the scene from the film here:

The IDC people are the Black Knights of the science-religion debate. Despite their arguments suffering one devastating refutation after another, they think they are invincible because god is on their side, will not concede that they have lost the battle, and refuse to go away. All that they have left is bluster.

April 15, 2008

Why intelligent design creationism is not science

(I will be away on travel this week so will be reposting an old series, edited and updated, that discusses the nature of science and the difference between science and religion. New posts start again on Monday, April 21, 2008.)

In a previous posting, I pointed out that if one studies the history of science, all the theories that have been considered to be science are both (1) naturalistic and (2) predictive. Thus these two things constitute necessary conditions for a theory to be considered science.

This is an important fact to realize when so-called intelligent design creationism (IDC) advocates argue that theirs is a 'scientific' theory. If so, the first hurdle IDC must surmount is to meet both those necessary criteria, if it is to be even eligible to be considered science. It has to be emphasized that meeting those conditions is not sufficient for something to be considered science, but the question of sufficiency does not even arise in this case because IDC does not meet either of the two necessary conditions.

I issued this challenge to the IDC proponents when I debated them in Kansas in 2002. I pointed out that nowhere did they provide any kind of mechanism that enabled them to predict anything that anyone could go out and look for. And they still haven't. At its essence, IDC strategy is to (1) point to a few things that they claim evolutionary theory cannot explain; (2) assert that such phenomena have too low a probability to be explained by any naturalistic theory; and (3) draw the conclusion that those phenomena must have been caused by an 'unspecified designer' (with a nudge, nudge, wink, wink to the faithful that this is really god) whose workings are beyond the realm of the natural world explored by science.

Thus they postulate a non-natural cause for those phenomena and cannot predict anything that anyone could go and look for. (This is not surprising. The designer is, for all intents and purposes, a synonym for god, and it would run against our traditional concept of god to think that his/her actions should be as predictable as those of blocks sliding down inclined planes.) When I asked one of the IDC stalwarts (Jonathan Wells) during my visit to Hillsdale College for an IDC prediction, the best he could come up with was that there would be more unexplained phenomena in the future, or words to that effect.

But that is hardly what is meant by a scientific prediction. I can make that same kind of vague prediction about any theory, even a commonly accepted scientific one, since no theory ever explains everything. A truly scientific prediction takes the more concrete form: "The theory Z encompassing this range of phenomena predicts that if conditions X are met, then we should see result Y."

IDC advocates know that their model comes nowhere close to meeting this basic condition of science. So they have adopted the strategy of (1) challenging the naturalism and predictivity conditions, arguing that these are not necessary conditions for science and that they have been adopted specifically and unfairly to exclude IDC from science; and (2) trying to create a new definition of science so that IDC can be included. This takes the form of arguing that a scientific theory is simply one that 'explains' phenomena.

(There are, of course, variations and expansions on these arguments by the various members of the IDC camp but I have tried to reduce it to its skeletal elements. These variations that IDC proponents adopt are designed to blur the issues but are easy to refute. See this cartoon by Tom Tomorrow (thanks to Daniel for the link) and this funny post by Canadian Cynic about the possible consequences of using IDC-type reasoning in other areas of life.)

The rejection by IDC advocates of naturalism and predictivity as necessary conditions for science goes against the history of science. Recall, for example, that in the struggle between the Platonic and Copernican models of the universe, both sides of this debate involved religious believers. But when they tried to explain the motions of the planets, both sides used naturalistic theories. To explain the retrograde motion of Mercury and other seemingly aberrant behavior, they invoked epicycles and the like. They struggled hard to find models that would enable them to predict future motion. They did not invoke god by saying things like "God must be moving the planets backwards on occasion." Or "This seemingly anomalous motion of Mercury is due to god." Such an explanation would not have been of any use to them because allowing god into the picture would preclude the making of predictions.

In fact, the telling piece of evidence that ended the dominance of the geocentric model was that the Rudolphine Tables using Kepler's elliptical orbits and a heliocentric model were far superior to any alternative in predicting planetary motion.

While the underlying beliefs that drove people of that time to support the Ptolemaic or Copernican model may well have been influenced by their religious outlook, those earlier religious scientists did not invoke god in a piecemeal way, as an explanation for this or that isolated unexplained phenomenon, as is currently done by IDC advocates. Instead they were more concerned with whether the whole structure of the scientific theory was consistent with their understanding of the working of god. In other words, they were debating whether a geocentric model was compatible with their ideas of god's role in the world. They seemed to feel that the detailed motions of specific planets, however problematic, were too trivial an occasion to invoke god as an explanation, although they would probably not have excluded the possibility that god was capable of routinely adjusting the motion of planets.

It may also well be true that some scientists of that time thought that god might be responsible for such things but such speculations were not part of the scientific debate. For example, Newton himself is supposed to have believed that the stability of the solar system (which was an unexplained problem in his day and remained unsolved for about 200 years) was due to god periodically intervening to restore the initial conditions. But these ideas were never part of the scientific consensus. And we can see why. If scientists had said that the stability was due to god and closed down that avenue of research, then scientists would never have solved this important problem by naturalistic means and thus advanced the cause of science. This is why scientists, as a community, never accept non-natural explanations for any phenomena, even though individual scientists may entertain such ideas.

So the attempts by IDC advocates to redefine science to leave out methodological naturalism and predictivity fly completely in the face of the history of science. But worse than that, such a move would result in undermining the very methods that have made science so successful.

In the next posting, I will discuss why just looking for 'good' explanations of scientific phenomena (the definition of science advocated by the IDC people) is not, by itself, a useful exercise for science.

April 14, 2008

What is science?

(I will be away on travel this week so will be reposting an old series, edited and updated, that discusses the nature of science and the difference between science and religion. New posts start again on Monday, April 21, 2008.)

Because of my science training and my interest in its history and philosophy I am sometimes called upon to answer the question "what is science?" Most people think that the answer should be fairly straightforward. After all science is such an important and integral part of our lives that everyone feels that they already know what it is and think that the problem of defining science is purely one of finding the right combination of words that captures their intuitive sense.

But as I said in an earlier posting, strictly defining something means having demarcation criteria for it, which involves developing a set of necessary and sufficient conditions, and this is extremely hard to do even for seemingly simple things like (say) defining what a dog is. So it should not be surprising that it may be even harder to do for an abstract idea like science.

But just as a small child is able, based on its experience with pets, to distinguish between a dog and a cat without any need for formal demarcation criteria, so can scientists intuitively sense what is science and what is not science, based on the practice of their profession, without any need of a formal definition. So scientists do not, in the normal course of their work, pay much attention to whether they have a formal definition of science. If forced to define science (say for the purpose of writing textbooks) they tend to make up some kind of definition that sort of fits with their experience, but such ad-hoc formulations lack the kind of formal rigor that is strictly required of a philosophically sound demarcation criterion.

The absence of an agreed-upon formal definition of science has not hindered science from progressing rapidly and efficiently. Science marches on, blithely unconcerned about its lack of self-definition. People start worrying about definitions of science mainly in the context of political battles, such as those involving so-called intelligent design creationism (or IDC), because advocates of IDC have been using this lack of a formal definition to try to define science in a self-serving way so that their pet idea can be included as science, and thus taught in schools as part of the science curriculum and as an alternative to evolution.

Having a clear-cut demarcation criterion that defines science and is accepted by all would settle this question of whether IDC is science once and for all. But finding a satisfactory demarcation criterion for science has proven to be remarkably difficult.

To find criteria that distinguish one class of ideas from another, we do what we usually do in such cases: we first identify all the unambiguous members of each class and see if we can extract the properties common to each class.

In the case of science, we look at all the knowledge that is commonly accepted as science by everyone, and see if we can identify what is common among these areas. For example, I think everyone would agree that the subjects that come under the headings of astronomy, geology, physics, chemistry, and biology, and which are studied by university departments in reputable universities, all come under the heading of science. So any definition of science that excluded any of these areas would be clearly inadequate, just as any definition of 'dog' that excluded a commonly accepted breed would be dismissed as inadequate.

This kind of exercise is exactly what we do when trying to define other things, like art (say). Any definition of art that excluded paintings hanging in reputable museums would be considered an inadequate definition.

Similarly, there is a general consensus that astrology, fortune-telling, and the like are not science. Any definition of science that resulted in those topics being considered science would be considered inadequate.

When we look at the history of the topics studied by people in those named disciplines that are commonly accepted as science, the first thing that we notice is that for a theory to be considered scientific it does not have to be true. Newtonian physics is commonly accepted to be scientific, although it is not considered to be universally true anymore. The phlogiston theory of combustion is considered to be scientific though it has long since been overthrown by the oxygen theory. And so on. In fact, since all knowledge is considered to be fallible and liable to change, truth is, in some sense, irrelevant to the question of whether something is scientific or not, because absolute truth cannot be established.

(A caveat: Not all scientists will agree with me on this last point. Some scientists feel that once a theory is shown to be incorrect, it ceases to be part of science, although it remains a part of science history. Some physicists also feel that many of the current theories of (say) sub-atomic particles are unlikely to be ever overthrown and are thus true in some absolute sense. I am not convinced of this. The history of science teaches us that even theories that were considered rock-solid and lasted millennia (such as the geocentric universe) eventually were overthrown.)

But there is a clear pattern that emerges about scientific theories. All the theories that are or have been considered to be science are (1) naturalistic and (2) predictive.

By naturalistic I mean methodological naturalism and not philosophical naturalism. The latter, I argued in an earlier posting where these terms were defined, is irrelevant to science.

By predictive, I mean that all theories that are considered part of science have some explicit mechanism or structure that enables their users to make predictions, that is, to say what one should see if one did some experiment or looked in some place under certain conditions.

Note that these two conditions are just necessary conditions and by themselves are not sufficient. (See this earlier posting for what those conditions mean.) They can only classify theories into "may be science" (if a theory meets both conditions) or "not science" (if it fails one or both conditions). As such, these two conditions by themselves do not make up a satisfactory demarcation criterion. For example, the theory that if a football quarterback throws a lot of interceptions his team is likely to lose meets both the naturalistic and predictive conditions, but such theories are not usually considered part of science.

But even though we do not have rigorous demarcation criteria for science, the existence of just necessary conditions still has important implications, which I shall explore in later postings.

January 23, 2008

Our inner fish and other evolution fun facts

Even though I am not a biologist, I find evolution to be an endlessly fascinating subject, constantly throwing up intriguing new facts. Here are some recent items that caught my eye.

Stephen Colbert has a fascinating interview with evolutionary biologist Neil Shubin, discoverer of the fish-land animal transitional fossil Tiktaalik, about how much of our human biology came from fish. In his 2008 book Your Inner Fish: A Journey Into the 3.5 Billion-Year History of the Human Body, Shubin points out that although superficially we may look very different, many of our human features can be found to have analogous forms in fish and thus probably existed from the time that we shared common fish-like ancestors with them. (Incidentally, Shubin was one of the expert witnesses in the Dover intelligent design trial, in which he discussed the theory of evolution and the role that Tiktaalik played in clarifying the link between fish and land animals.)

For me, one of the most surprising things in learning about evolution was that whales, dolphins, and porpoises evolved from land mammals that returned to the sea from which their own ancestors had emerged. In fact, hippos are the animals most closely related to modern day whales.

Researchers have now discovered in the Kashmir region the fossils of a land-based ancestor to whales, dolphins, and porpoises. The fox-sized Indohyus, as it has been called, lived 48 million years ago and is an even closer relative of the whales than hippos, and sheds more light on how whales came to be.

"The new model is that initially they were small deer-like animals that took to the water to avoid predators," Professor Thewissen told BBC News. "Then they started living in water, and then they switched their diet to become carnivores."

And then there was the New Scientist report last week of the discovery of a two-million-year-old Uruguayan fossil of a rodent (Josephoartigasia monesi) that weighed about a thousand kilograms, making it the largest rodent known, about the size of a large bull. Of course, this species of giant rodent is extinct. The largest rodents now are the capybaras, also found in South America, which clock in at a mere 50 kilos.

In reading the report, I discovered something else that I had not known, that North and South America had once split apart, and that this may explain how the giant rodent came into being. Later the huge landmasses joined again, which may have contributed to its extinction.

South America saw a huge explosion in the diversity of rodents after the continent split from North America and became an island some 65 million years ago. Dinosaurs had just been wiped out and many animal groups were filling the void they left behind.

Without competition from other mammals which were diversifying on the other side of the water in North America, rodents of all sizes emerged in South America.
. . .
Around the time that the recently discovered J. monesi was alive, the two Americas were joined once more.

Sánchez speculates that the connecting land bridge may have helped bring about the demise of the giant rodents. Animals, among them the sabre-toothed cat, crossed the bridge in both directions bringing diseases, and competition for food and territory.

It is likely that changes in the climate will have also rendered the rodents' home less hospitable. J. monesi was found in what is now an arid region, but was then lush and forested.

"Our work suggests that 4 million years ago in South America, 'mice' that were larger than bulls lived with terror birds, sabre-toothed cats, ground sloths, and giant armoured mammals," say the Uruguayan researchers.

Of course, these explanations for the rise and fall of the giant rat are speculative and need to be corroborated with further research.

This is what I love about science: the constant discovery of exciting new findings, the challenge of fitting them into a theoretical framework while maintaining consistency with other scientific theories. All these things stimulate new research and ideas.

POST SCRIPT: Scott Ritter and Edward Peck

Scott Ritter is the US Marine who served as a UN weapons inspector searching Iraq for WMDs prior to the invasion. He concluded that Iraq did not have any and repeatedly said so. For being correct, he was vilified by those anxious to go to war and almost completely banished from the media, while those who were wrong on everything are still there, now pushing for war with Iran.

Scott Ritter and Edward Peck (former chief of mission for the US in Iraq) will speak tomorrow (Thursday) at 7:30 pm at Trinity Cathedral in Cleveland. They have returned from a fact-finding mission in Iran.

Suggested donation: $10 general, $5 students. Trinity Cathedral is at 2230 Euclid Ave., across from CSU. Free parking is available in the Trinity Cathedral lot: entrance on Prospect Ave at E. 22nd.

January 11, 2008

What is science?

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Because of my interest in the history and philosophy of science I am sometimes called upon to answer the question "what is science?" Most people think that the answer should be fairly straightforward. This is because science is such an integral part of our lives that everyone feels that they intuitively know what it is and think that the problem of defining science is purely one of finding the right combination of words that captures their intuitive sense.

But as I said in my previous posting, strictly defining things means having demarcation criteria, which involves developing a set of necessary and sufficient conditions, and this is extremely hard to do even for seemingly simple things like (say) defining what a dog is. So it should not be surprising that it may be harder to do for an abstract idea like science.

But just as a small child is able, based on its experience with pets, to distinguish between a dog and a cat without any need for formal demarcation criteria, so can scientists intuitively sense what is science and what is not science, based on the practice of their profession, without any need for a formal definition. So scientists do not, in the normal course of their work, pay much attention to whether they have a formal definition of science or not. If forced to define science (say for the purpose of writing textbooks) they tend to make up some kind of definition that sort of fits with their experience, but such ad-hoc formulations lack the kind of formal rigor that is strictly required of a philosophically sound demarcation criterion.

The absence of an agreed-upon formal definition of science has not hindered science from progressing rapidly and efficiently. Science marches on, blithely unconcerned about its lack of self-definition. People start worrying about definitions of science mainly in the context of political battles, such as those involving so-called intelligent design creationism (or IDC), because advocates of IDC have been using this lack of a formal definition to try to define science in such a way that their pet idea be included as science, and thus taught in schools as part of the science curriculum and as an alternative to evolution.

Having a clear-cut demarcation criterion that defines science and is accepted by all would settle this question once and for all. But finding this demarcation criterion for science has proven to be remarkably difficult.

To set about trying to find such criteria, we do what we usually do in all such cases: we look at all the knowledge that is commonly accepted as science by everyone, and see if we can find similarities among these areas. For example, I think everyone would agree that the subjects that come under the headings of astronomy, geology, physics, chemistry, and biology, and which are studied by university departments in reputable universities, all come under the heading of science. So any definition of science that excluded any of these areas would be clearly inadequate, just as any definition of 'dog' that excluded a commonly accepted breed would be dismissed as inadequate.

This is the kind of thing we do when trying to define other things, like art (say). Any definition of art that excluded (say) paintings hanging in reputable museums would be considered an inadequate definition.

When we look back at the history of the topics studied by people in those named disciplines and which are commonly accepted as science, two characteristics stand out. The first thing that we realize is that for a theory to be considered scientific it does not have to be true. Newtonian physics is commonly accepted to be scientific, although it is not considered to be universally true anymore. The phlogiston theory of combustion is considered to be scientific though it has long since been overthrown by the oxygen theory. And so on. In fact, since all knowledge is considered to be fallible and liable to change, truth is, in some sense, irrelevant to the question of whether something is scientific or not, because absolute truth cannot be established.

(A caveat: Not all scientists will agree with me on this last point. Some scientists feel that once a theory is shown to be incorrect, it ceases to be part of science, although it remains a part of science history. Some physicists also feel that many of the current theories of (say) sub-atomic particles are unlikely to be ever overthrown and are thus true in some absolute sense. I am not convinced of this. The history of science teaches us that even theories that were considered rock-solid and lasted millennia (such as the geocentric universe) eventually were overthrown.)

But there is a clear pattern that emerges about scientific theories. All the theories that are considered to be science are (1) naturalistic and (2) predictive.

By naturalistic I mean methodological naturalism and not philosophical naturalism. The latter, I argued in an earlier posting where these terms were defined, is irrelevant to science.

By predictive, I mean that all theories that are considered part of science have some explicit mechanism or structure that enables their users to make predictions, that is, to say what one should see if one did some experiment or looked in some place under certain conditions.

Note that these two conditions are just necessary conditions and by themselves are not sufficient. (See the previous posting for what those conditions mean.) They can only classify things into "may be science" (if something meets both conditions) or "not science" (if something fails one or both conditions). As such, these two conditions together do not make up a satisfactory demarcation criterion. For example, the theory that if a football quarterback throws a lot of interceptions his team is likely to lose meets both the naturalistic and predictive conditions, but it is not considered part of science.

But even though we do not have a rigorous demarcation criterion for science, the existence of just necessary conditions still has interesting implications.

January 10, 2008

Necessary and sufficient conditions

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

The problem of finding definitions for things that clearly specify whether an object belongs in that category or not has long been recognized to be a knotty philosophical problem. Ideally what we would need for a good definition is to have both necessary and sufficient conditions, but it is not easy to do so.

A necessary condition is one that must be met if the object is to be considered even eligible for inclusion in the category. If an object meets this condition, then it is possible that it belongs in the category, but not certain. If it does not meet the condition, then we can definitely say that it does not belong. So necessary conditions for something can only classify objects into "maybe belongs" or "definitely does not belong."

For example, let us try to define a dog. We might say that a necessary condition for some object to be considered as a possible dog is that it be a mammal. So if we know that something is a mammal, it might be a dog or it might be another kind of mammal, say a cat. But if something is not a mammal, then we know for sure it is not a dog.

A sufficient condition, on the other hand, acts differently. If an object meets the sufficient condition, then it definitely belongs. If it does not meet the sufficient condition, then it may or may not belong. So the sufficient condition can be used to classify things into "definitely belongs" or "maybe belongs."

So for the dog case, if an animal has papers certified by the American Kennel Club, then we can definitely say it is a dog. But if something does not have such papers it may still be a dog (say a mixed breed) or it may not be a dog (it may be a table).

A satisfactory demarcation criterion would have both necessary and sufficient conditions because only then can we say of any given object that it either definitely belongs or definitely does not belong. Usually these criteria take the form of a set of individually necessary conditions that, taken together, are sufficient; that is, each condition by itself is not sufficient, but if all of them are met, they are jointly sufficient.
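To make the asymmetry concrete, here is a minimal sketch in Python. The attribute names and the rule treating kennel-club papers as a sufficient condition are purely illustrative assumptions of mine, not a serious attempt at defining a dog:

    def maybe_a_dog(animal):
        # Necessary condition: every dog is a mammal.
        # Failing it rules the animal out; passing it only keeps it in the running.
        return animal.get("is_mammal", False)

    def definitely_a_dog(animal):
        # Sufficient condition (illustrative): certified kennel-club papers settle the matter.
        # Lacking papers proves nothing either way (think of a mixed breed).
        return animal.get("has_kennel_club_papers", False)

    def classify(animal):
        if definitely_a_dog(animal):
            return "definitely a dog"
        if not maybe_a_dog(animal):
            return "definitely not a dog"
        return "maybe a dog"

    print(classify({"is_mammal": True, "has_kennel_club_papers": True}))  # definitely a dog
    print(classify({"is_mammal": True}))                                  # maybe a dog (could be a cat)
    print(classify({"is_mammal": False}))                                 # definitely not a dog

Notice that only the combination of a sufficient test and a necessary test lets the classifier return a definite "yes" as well as a definite "no"; with necessary conditions alone, the best it could ever say is "maybe."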

It is not easy to find such conditions, even for such a seemingly simple category as dogs, and that is the problem. So for the dog, we might try to define it by saying that it is a mammal, with four legs, barks, etc. But people who are determined to challenge the criteria could find problems. (What exactly defines a mammal? What is the difference between an arm and a leg? What constitutes a bark? Etc. We can end up in an infinite regress of definitions.)

This is why philosophers like to say that we make such identifications ("this is a dog, that is a cat") based on an intuitive grasp of the idea of "similarity classes," things that share similarities that may not be rigidly definable. So even a little child can arrive at a pretty good idea of what a dog is without formulating a strict definition, by encountering several dogs and being able to distinguish what separates dog-like qualities from non-dog-like qualities. It is not completely foolproof. Once in a while we may come across a strange looking animal, some exotic breed that baffles us. But most times it is clear. We almost never mistake a cat for a dog, even though they share many characteristics, such as being small four-legged mammals with tails that are domestic pets.

Anyway, back to science: a satisfactory demarcation would require that we be able to find both necessary and sufficient criteria that can be used to define science, and use those conditions to separate ideas into science and non-science. Do such criteria exist? To answer that question we need to look at the history of science and see what common features are shared by those bodies of knowledge we confidently call science.

This will be discussed in the next posting.

December 26, 2007

Should scientists try to accommodate religion?

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Within the scientific community, there are two groups, those who are religious and who hold to the minimal scientific requirement of methodological naturalism, and those who go beyond that and are also philosophical naturalists, and thus atheists/agnostics or more generally "shafars". (For definitions of the two kinds of naturalism, see here).

As I have said earlier, as far as the scientific community goes, no one really cares whether their colleagues are religious or not when it comes to evaluating their science. But clearly this question matters when science spills into the political-religious arena, as is the case with the teaching of so-called intelligent design creationism (IDC).

Some well-known religious scientists are the biologists Kenneth Miller, Francis Collins, and Francisco Ayala. Since they are also opponents of IDC, they are frequently brought forward to counter IDC arguments, as they embody counterevidence to the charge by IDC advocates that supporters of evolution are necessarily atheists.

Scientists who are also philosophical naturalists have generally not been prominent in the IDC debate, or have had their atheistic/agnostic views downplayed. This may be because of the political-religious climate in the US that has led to a strategy of not alienating those religious people who also oppose IDC. As Sam Harris, author of The End of Faith: Religion, Terror, and the Future of Reason, says: "Because it is taboo to criticize a person's religious beliefs, political debate over questions of public policy (stem-cell research, the ethics of assisted suicide and euthanasia, obscenity and free speech, gay marriage, etc.) generally gets framed in terms appropriate to a theocracy."

Harris argues that this is not a good strategy. "While understandable, I believe that such scruples are now misplaced. The Trojan Horse has passed the innermost gates of the city, and scary religious imbeciles are now spilling out." As I said in the previous post, a general awareness that this is what is happening is sinking in. He goes on:

"The issue is not, as ID advocates allege, whether science can "rule out" the existence of the biblical God. There are an infinite number of ludicrous ideas that science could not "rule out," but which no sensible person would entertain. The issue is whether there is any good reason to believe the sorts of things that religious dogmatists believe - that God exists and takes an interest in the affairs of human beings; that the soul enters the zygote at the moment of conception (and, therefore, that blastocysts are the moral equivalents of persons); etc. There simply is no good reason to believe such things, and scientists should stop hiding their light under a bushel and make this emphatically obvious to everyone."

Harris' views have received enthusiastic support from Richard Dawkins, a prominent neo-Darwinian and atheist who has long criticized what he sees as the attempts by the late Stephen Jay Gould and others to accommodate religious sensibilities and downplay the irrationality of religious beliefs for fear of causing offense and creating an anti-science backlash. He thinks that tiptoeing around religious beliefs simply strengthens the hand of those who wish to undermine science.

As I said earlier, in pursuing scientific questions scientists do not care about the religious views of scientists. But when confronting the challenge of IDC and its young Earth adherents, should scientists who are philosophical naturalists stay out of the picture and leave it to only the religious methodological naturalists to combat IDC ideas, since the IDC people love to portray all scientists as atheists? Or should philosophical naturalists not feel hesitant to also challenge IDC, but from an atheistic position, and thus risk confusing the political struggle?

My personal view is that atheists should be fully involved and not keep quiet because of short term political needs.

POST SCRIPT: Another villager against Huckabee

Another Republican Villager (a former aide to Bush who calls himself a political conservative and evangelical Christian) suddenly discovers, in the light of Huckabee's ascendancy, the virtues of the separation of church and state. He goes to comical lengths to explain why Bush's playing of the Jesus card is good while Huckabee's more forthright religiosity is bad.

Whiskey Fire shares my amusement at these contortions.

August 30, 2007

Charles Darwin in his own words

I have written a lot about the theory of evolution and in the process have quoted short excerpts from various authors, Charles Darwin included. (Please see here for previous posts in this series.)

But in going back and reading the first edition of On the Origin of Species (1859), I am struck by how prescient Darwin was in anticipating the objections that would be raised against his theory and why. He could well have been talking about the situation today, except that then the people who were skeptical and whom he was trying to persuade were his scientific colleagues. Nowadays scientists are almost all converts to natural selection (as he predicted might happen) and it is religious lay people who make the same objections he addressed long ago.

To get the full flavor of Darwin's thinking and his style of writing, here is a somewhat long passage from his conclusions, where he summarizes his case (p. 480-484). The sections in boldface are my own emphasis. (Darwin's complete works are now available online.)

I have now recapitulated the chief facts and considerations which have thoroughly convinced me that species have changed, and are still slowly changing by the preservation and accumulation of successive slight favourable variations. Why, it may be asked, have all the most eminent living naturalists and geologists rejected this view of the mutability of species? It cannot be asserted that organic beings in a state of nature are subject to no variation; it cannot be proved that the amount of variation in the course of long ages is a limited quantity; no clear distinction has been, or can be, drawn between species and well-marked varieties. It cannot be maintained that species when intercrossed are invariably sterile, and varieties invariably fertile; or that sterility is a special endowment and sign of creation. The belief that species were immutable productions was almost unavoidable as long as the history of the world was thought to be of short duration; and now that we have acquired some idea of the lapse of time, we are too apt to assume, without proof, that the geological record is so perfect that it would have afforded us plain evidence of the mutation of species, if they had undergone mutation.

But the chief cause of our natural unwillingness to admit that one species has given birth to other and distinct species, is that we are always slow in admitting any great change of which we do not see the intermediate steps. The difficulty is the same as that felt by so many geologists, when Lyell first insisted that long lines of inland cliffs had been formed, and great valleys excavated, by the slow action of the coast-waves. The mind cannot possibly grasp the full meaning of the term of a hundred million years; it cannot add up and perceive the full effects of many slight variations, accumulated during an almost infinite number of generations.

Although I am fully convinced of the truth of the views given in this volume under the form of an abstract, I by no means expect to convince experienced naturalists whose minds are stocked with a multitude of facts all viewed, during a long course of years, from a point of view directly opposite to mine. It is so easy to hide our ignorance under such expressions as the "plan of creation," "unity of design," &c., and to think that we give an explanation when we only restate a fact. Any one whose disposition leads him to attach more weight to unexplained difficulties than to the explanation of a certain number of facts will certainly reject my theory. A few naturalists, endowed with much flexibility of mind, and who have already begun to doubt on the immutability of species, may be influenced by this volume; but I look with confidence to the future, to young and rising naturalists, who will be able to view both sides of the question with impartiality. Whoever is led to believe that species are mutable will do good service by conscientiously expressing his conviction; for only thus can the load of prejudice by which this subject is overwhelmed be removed.

Several eminent naturalists have of late published their belief that a multitude of reputed species in each genus are not real species; but that other species are real, that is, have been independently created. This seems to me a strange conclusion to arrive at. They admit that a multitude of forms, which till lately they themselves thought were special creations, and which are still thus looked at by the majority of naturalists, and which consequently have every external characteristic feature of true species,—they admit that these have been produced by variation, but they refuse to extend the same view to other and very slightly different forms. Nevertheless they do not pretend that they can define, or even conjecture, which are the created forms of life, and which are those produced by secondary laws. They admit variation as a vera causa in one case, they arbitrarily reject it in another, without assigning any distinction in the two cases. The day will come when this will be given as a curious illustration of the blindness of preconceived opinion. These authors seem no more startled at a miraculous act of creation than at an ordinary birth. But do they really believe that at innumerable periods in the earth's history certain elemental atoms have been commanded suddenly to flash into living tissues? Do they believe that at each supposed act of creation one individual or many were produced? Were all the infinitely numerous kinds of animals and plants created as eggs or seed, or as full grown? and in the case of mammals, were they created bearing the false marks of nourishment from the mother's womb? Although naturalists very properly demand a full explanation of every difficulty from those who believe in the mutability of species, on their own side they ignore the whole subject of the first appearance of species in what they consider reverent silence.

It may be asked how far I extend the doctrine of the modification of species. The question is difficult to answer, because the more distinct the forms are which we may consider, by so much the arguments fall away in force. But some arguments of the greatest weight extend very far. . . I believe that animals have descended from at most only four or five progenitors, and plants from an equal or lesser number.

Analogy would lead me one step further, namely, to the belief that all animals and plants have descended from some one prototype. But analogy may be a deceitful guide. Nevertheless all living things have much in common, in their chemical composition, their germinal vesicles, their cellular structure, and their laws of growth and reproduction. We see this even in so trifling a circumstance as that the same poison often similarly affects plants and animals; or that the poison secreted by the gall-fly produces monstrous growths on the wild rose or oak-tree. Therefore I should infer from analogy that probably all the organic beings which have ever lived on this earth have descended from some one primordial form, into which life was first breathed.

And then the very last, almost poetic, words in the book (p. 490):

There is grandeur in this view of life, with its several powers, having been originally breathed into a few forms or into one; and that, whilst this planet has gone cycling on according to the fixed law of gravity, from so simple a beginning endless forms most beautiful and most wonderful have been, and are being, evolved.

Darwin's achievements are truly magnificent, putting him in the same class as Einstein and Newton, among the greatest scientists of all time.

POST SCRIPT: The Larry Craig incident

Senator Larry Craig (R-Idaho) has taken some strong "family values" and anti-gay stands in the past, despite long standing rumors that he himself was gay. The recent news report that he had pleaded guilty to "lewd" conduct in a public restroom has caused speculation that his career is now over.

It is despicable to harass gays with anti-gay rhetoric and legislation, and it is even worse if those doing so are secretly gay themselves. But Talking Points Memo expresses well my unease with what happened to Craig in this most recent episode. It is not clear from published reports that he did anything that really warranted his arrest; rather, as Josh Marshall says, he was essentially caught in a Catch-22 created by his own risky behavior.

Glenn Greenwald documents the brazen contradictions that right-wingers indulge in when responding to the recent Craig revelation, the reports that surfaced back in 2006 that he was gay, and the recent case of Senator David Vitter (R-Louisiana), another "family values" champion who was found to be a customer of prostitutes.

August 13, 2007

Can we ever be certain about scientific theories?

(I am taking some time off from new blog posts. Until I begin posting again, which should not be more than a couple of weeks, I will repost some very early ones, updated if necessary. Today's one is from February 17, 2005.)

A commenter on a previous posting raised an interesting perspective that requires a fresh posting, because it reflects a commonly held view about how the validity of scientific theories gets established.

The commenter says:

A scientist cannot be certain about a theory until that theory has truly been tested, and thus far, I am unaware of our having observed the evolution of one species from another species. Perhaps, in time, we will observe this, at which point the theory will have been verified. But until then, Evolution is merely a theory and a model.

While we may have the opportunity to test Evolution as time passes, it is very highly doubtful that we will ever be able to test any of the various theories for the origins of the Universe.

I would like to address just two points: What does it mean to "test" a theory? And can scientists ever "verify" a theory and "be certain" about it?

Verificationism as a concept to validate scientific theories has been tried and found to be wanting. The problem is that any non-trivial theory generates an infinite number of predictions. All the predictions cannot be exhaustively verified. Only a sample of the possible predictions can be tested and there is no universal yardstick that can be used to measure when a theory has been verified. It is a matter of consensus judgment on the part of scientists as to when a theory becomes an accepted one, and this is done on a case-by-case basis by the practitioners in that field or sub-field.

This means, however, that people who are opposed to a theory can always point to at least one particular result that has not been directly observed and claim that the theory has not been 'verified' or 'proven.' This is the strategy adopted by ID supporters to attack evolutionary theory. But using this kind of reasoning will result in every single theory in science being denied scientific status.

Theories do get tested. Testing a theory has been a cornerstone of science practice ever since Galileo but it means different things depending on whether you are talking about an experimental science like electrochemistry and condensed matter physics, or a historical science like cosmology, evolution, geology, and astronomy.

Any scientific theory is always more than an explanation of prior events. It also must necessarily predict new observations and it is these predictions that are used to test theories. In the case of experimental sciences, laboratory experiments can be performed under controlled conditions in order to generate new data that can be compared with predictions or used to infer new theories.

In the case of historical sciences, however, observations are used to unearth data that are pre-existing but as yet unknown. Hence the 'predictions' may be more appropriately called 'retrodictions' (or sometimes 'postdictions'), in that they predict that you will find things that already exist. For example, in cosmology the retrodictions were the existence of a cosmic microwave background radiation of a certain temperature, the relative abundances of light nuclei, and so forth. The discovery of the planet Neptune was considered a successful 'prediction' of Newtonian theory, although Neptune had presumably always been there.

The testing of a historical science is analogous to the investigation of a crime, where the detective says things like "If the criminal went through the woods, then we should be able to see footprints." This kind of evidence is also historical, but it is just as powerful as evidence from predictions about future events, so historical sciences are not necessarily at a lower level of credibility than experimental sciences.

Theories in cosmology, astronomy, geology, and evolution are all tested in this way. As Ernst Mayr (who died a few days ago at the age of 100) said in What Evolution Is (2001): "Evolution as a whole, and the explanation of particular evolutionary events, must be inferred from observations. Such inferences must be tested again and again against new observations, and the original inference is either falsified or considerably strengthened when confirmed by all of these tests. However, most inferences made by evolutionists have by now been tested successfully so often that they are accepted as certainties." (emphasis added).

In saying that most inferences are 'accepted as certainties', Mayr is exaggerating a little. Ever since the turn of the 20th century, it has been accepted that scientific knowledge is fallible and that absolute certainty cannot be achieved. But scientists do achieve a remarkable consensus on deciding at any given time what theoretical frameworks they have confidence in and will be used to guide future research. Such frameworks have been given the name 'paradigms' by Thomas Kuhn in The Structure of Scientific Revolutions (1970).

When scientists say they 'believe' in evolution (or the Big Bang), the word is being used in quite a different way from that used in religion. It is used as shorthand to say that they have confidence that the underlying mechanism of the theory has been well tested by seeing where its predictions lead. It is definitely not "merely a theory and a model" if by the word 'merely' the commenter implies a theory that is unsupported or untested.

So yes, evolution, like all the other major scientific paradigms, both historical and experimental, has been well tested.

POST SCRIPT: Dick Cheney in 1994

It turns out that many of the arguments made by those opposed to the 2003 invasion of Iraq were anticipated by (of all people) Dick Cheney in 1994. Who knew?

Thanks to This Modern World

August 09, 2007

Petitions and politics in science

In a recent discussion on a listserv for physics teachers, someone strongly recommended the book The Politically Incorrect Guide to Science by Tom Bethell, saying that it exposed how mainstream science was suppressing some ideas for non-science reasons, in particular how the great weaknesses of evolutionary theory were being hidden.

I had not read this book myself but these kinds of arguments are familiar to me and Bethell had written an article describing his own book. It struck me as extraordinarily shallow, rehashing arguments that have long been discredited, and invoking misleading (and old) chestnuts about evolution occurring only by chance, and missing transitional forms, etc. In fact, he seemed to have drawn his arguments against evolution from the playbook of the intelligent design creationists, in particular Jonathan Wells' book Icons of Evolution.

He even made the argument that the only thing that has been seen is 'microevolution' (small changes within species) and not macroevolution (change from one species to another). But the distinction drawn between micro- and macroevolution is untenable, since it has long been realized that the variation found among varieties within a species overlaps substantially with the variation found between species, making such distinctions difficult to draw. Darwin himself pointed this out in his On the Origin of Species (chapter II), where he emphasized how difficult it was for even experts to classify whether animals were varieties within a single species or different species.

It is amazing that in this day and age people like Bethell still bring up Haeckel's embryos. Modern biologists don't take Haeckel's misleading sketches seriously anymore, since his theory was discredited more than a hundred years ago, and only intelligent design creationists keep bringing them up to argue that scientists falsify things in order to buttress the case for evolution. In the documentary A Flock of Dodos, IDC advocate John Calvert talks about how biology textbooks use Haeckel's figures to mislead children, but when asked to show this, he thumbs through some textbooks and cannot find any examples. He had simply accepted this folklore uncritically. What use Haeckel's drawings have now is purely pedagogical.

The Haeckel case is analogous to someone finding that the Bohr model of the atom is still being taught in middle school science textbooks, "discovering" that Bohr's model violates Maxwell's laws of electromagnetism, and thus concluding that quantum mechanics is wrong and that children are being misled into accepting it. Quantum mechanics has come a long way since the Bohr atom and does not depend on it, just as evolutionary biology does not depend on Haeckel's embryos. To keep bringing it up is a sign of desperation.

Bethell's argument that, because no one has seen half-bats, step-by-step evolution could not have occurred reminds me of those people who say that it is absurd that an electron can go through two slits or that twins age differently based on their speeds. After all, has anyone actually SEEN an electron go through two slits? Has anyone actually SEEN twins age differently? If we haven't seen such things directly, they must not occur, right? I have described earlier how incomplete the fossil record is, because fossilization is extremely unlikely, and how arguments that depend on the existence of gaps in the fossil record can never be satisfied because new gaps can always be created.

As I have said in this series on evolution, to really appreciate the theory one has to get beyond the simple minded rhetoric of the kind that Bethell indulges in and look at the underlying details and the mathematics. The question of whether evolution is a "fact" is a red herring. "Theory" and "fact" are fluid terms in science. What is true is that the fully developed theory of evolution, known as the neo-Darwinian synthesis, is the most productive and useful theory in biology today, and forms the basis of almost all research in that field.

People who dislike the theory of evolution often point to the petition that the Discovery Institute (DI), the main driver of Intelligent Design Creationism (IDC), put out signed by 700 people that says: "We are skeptical of claims for the ability of random mutation and natural selection to account for the complexity of life. Careful examination of the evidence for Darwinian theory should be encouraged." Such people cite this as evidence that the theory is weak and then ask: "Why are so many scientists jumping off the evolution band wagon?"

But scientists are taught to be skeptical and to examine carefully the evidence for any theory. And if a theory (like evolution) challenges their religious beliefs, they are likely to be even more skeptical of it. That is a natural human tendency. It does not take any effort for a mathematician or physicist or philosopher to say she is skeptical of evolution, just as it does not cost any biologist anything to say that he is skeptical of the big-bang. After all, in each case, they are not personally working with that theory and are unlikely to know anything about it in any detail and thus can let other factors have a greater influence. By the time 400 people had signed on, about 80% of the signatories were not even biologists. (The story of one religious scientist who signed on to the Discovery Institute statement and only later realized what was going on can be read here.)

Another problem with the petition wording is that although Darwin proposed the mutation and natural selection mechanism, developments since then have added other mechanisms such as gene flow and genetic drift, so even a biologist who sees no problem with evolution would agree that mutation and natural selection alone are not sufficient.

I personally am skeptical of ANY theory in ANY field as being the last word (or the 'truth') on the subject because the history of science teaches us that scientific theories have always been provisional. So for me the DI statement itself is nothing more than a platitude. The fundamental issue is whether the biological community feels that evolution is in a crisis, and as far as I am aware, the biological science community does not think so and continues to use that theory as the foundation for their work. So these kinds of statements are just meaningless. When biologists start using alternative theories to generate predictions and start getting positive results, then we can take those other theories seriously.

It is important to realize that despite so many years of pushing intelligent design creationism, the people at the Discovery Institute have not been able to generate even one prediction, let alone do any experiments to investigate their theory. What they are doing is not science, it is lobbying and public relations.

Statements like the ones put out by the Discovery Institute on evolution are, however, useful as indicators of what people desire or yearn for.

For example, I am skeptical of the idea of dark matter as the explanation for the anomalous velocity distribution of stars on the arms of spiral galaxies. Why? Mostly for aesthetic reasons. It seems a bit contrived to me and the idea of huge amounts of matter surrounding us that we cannot detect reminds me uncomfortably of the arguments for the ether before Einstein's theory showed that the ether was a redundant concept.
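For readers who want to see why unseen matter gets invoked at all, here is the standard back-of-the-envelope argument, in a deliberately simplified Newtonian form that ignores the detailed mass distribution of the galactic disk. For a star of mass m in a circular orbit of radius r around a galaxy whose mass interior to r is M(r), equating gravity to the centripetal force gives

GM(r)m/r² = mv²/r, so that v(r) = √(GM(r)/r)

If essentially all the luminous mass lay well inside r, M(r) would be roughly constant and the orbital speed should fall off as 1/√r. Instead, the measured rotation curves of spiral galaxies stay roughly flat out to large r, which on this reasoning requires M(r) to keep growing roughly in proportion to r, and that extra, unseen mass is what the dark matter hypothesis supplies.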

I am hoping that a nicer theory than dark matter comes along and I know I am not alone in feeling this way and that other card-carrying physicists share my view. So if someone handed me a petition saying that I was skeptical of the theory of dark matter and would like the evidence for it to be examined carefully, that statement's content would not be objectionable to me. I would totally agree.

But I would not sign because it is pointless. I have not done any real work to support my misgivings. I have not developed an alternative theory, generated hypotheses, made predictions, or explained any existing data. Physicists who actually work on the spiral galaxy problem (even if they were completely outnumbered by the people who sign a petition dismissing dark matter theory) would be perfectly justified in ignoring me and any other physicists who sign such a petition in the absence of any substantive counter-theory.

What the DI petition on evolution tells us is that there are about 700 people who wish and hope that a theory more congenial to them than evolution comes along. That's fair enough but hardly major news. They have every right to feel that way and to say so. But it is by no means a measure of the merits of the theory, however many people sign on to it, and it is dishonest of the Discovery Institute to make such a claim.

Bethell's thesis that the scientific community is conspiring to suppress the truth about the weaknesses of evolution is silly. Given that the US has high levels of religiosity and public skepticism about evolution and widespread unease that evolution is undermining religious beliefs, any scientist who found good evidence for special creation would be deluged with funding from both government and private sources and receive high visibility and acclaim. Furthermore, such a discovery would open up vast new areas of research. In such a climate, why would any scientist not publish findings that provided evidence for special creation?

People like Bethell are trying to achieve by public relations what they cannot do using science. They are not the first to try to do this and will not be the last. But they will fail, just like their predecessors.

POST SCRIPT: The insanity of the employer-based health care system

This question posed at a Democratic presidential candidates forum illustrates perfectly why we need a single-payer, universal health care system.

June 08, 2007

Highway merging and the theory of evolution

Some time ago, I wrote about the best way for traffic to merge on a highway, say when a lane is closed up ahead. There are those drivers who begin to merge as soon as the signs warning of impending closure appear, thus making their lanes clear. Others take advantage of this lane opening up to drive fast right up to the merge point and then try to squeeze into the other lane.

I said that although people who followed the latter strategy were looked upon disapprovingly as queue jumpers, it seemed to me that the most efficient way to optimize traffic flow was to follow the lead of these seemingly anti-social people and stay in the closed lane until the last moment, since that had the effect of minimizing the length of the restricted road. To merge earlier meant that one had effectively made the restricted portion longer.

Some commenters (Gregory Szorc, another Greg, and Jeremy Smith) disagreed with me, saying that what is important is not the length of the restricted road section but the ability of traffic to maintain speed. After all, a single lane of cars can travel quite smoothly at 60 mph for quite a distance, even if there is a lot of traffic. They said that the best thing to do is to merge into the other lane whenever you can do so without significantly losing speed. Clearly this means merging as soon as possible, when traffic is still light, rather than following my suggestion of waiting until the latest moment when traffic is heavier and merging has to be done at a low speed.

Over the last month I have been doing a lot of highway driving and have been observing this again, and I now realize that I was wrong and the commenters were right. When traffic is light, people can merge at any point and not back up traffic because the speed at which they merge is close to the normal speed. So it seems that the key feature is the ability to maintain speed and to merge when you can do so, which means when the traffic flow is light, which is usually well before the actual lane closing. In fact, I think that highway workers should post signs many, many miles ahead of the restriction and recommend that people merge as soon as possible.

But highway signs alone are not going to be enough to have the desired effect. What is needed is widespread public awareness of the benefits of merging well before you actually have to.

Of course, there will always be people who 'cheat' and try to go as far as possible along the closed lane and thus end up slowing traffic at the merge point and destroying the benefits for all. What can be done about this?

Interestingly, this phenomenon parallels the problem of explaining altruistic behavior using evolution by natural selection. It is easy to argue that a group benefits if all its members practice some particular trait, say by sharing food equally all the time so that everyone survives in both good times and bad. But the catch is that evolution by natural selection works on the basis of what is good for a single organism, not for groups, because it is the organism that has genes and propagates them. And that means that a cheater (i.e., someone who, when he has plenty, hides some of his food without being caught) benefits more than the others and is more likely to survive. If this tendency to cheat is an inherited trait, then over time cheaters will come to dominate the population. Evolutionary biologists have developed theories to explain how altruistic behavior can evolve in the face of this seeming advantage for cheating.

In the case of highway merging, if everyone, without exception, follows the early-merging rule, then long bottlenecks could be a thing of the past, unless traffic is so heavy that merging at normal speed is simply impossible. But the occasional cheater gets the short-term benefit of a long stretch of open road, while the people behind him suffer the consequences of his slowing down traffic at the merge point. He gets the benefit of others merging early while others bear the cost of his cheating, making cheating an advantageous option for that single organism.
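To make the logic concrete, here is a minimal toy simulation. It is a sketch of my own rather than anything from traffic engineering: the function name, the delay penalty, and the traffic-density figure are all arbitrary assumptions. It simply assumes that an early merger never loses speed, while a late merger who squeezes in when traffic at the merge point happens to be heavy imposes a fixed delay on the cars behind him.

    import random

    def total_delay(n_cars=10_000, frac_late_mergers=0.0, p_heavy_at_merge=0.6, seed=0):
        """Toy model (all numbers are arbitrary illustrative assumptions): a car that
        merges early does so while traffic is light and loses no speed; a car that
        stays in the closing lane and squeezes in at the merge point imposes a fixed
        delay on the queue behind it whenever traffic there happens to be heavy."""
        rng = random.Random(seed)
        delay = 0.0
        for _ in range(n_cars):
            merges_late = rng.random() < frac_late_mergers
            heavy = rng.random() < p_heavy_at_merge   # is traffic heavy at the merge point?
            if merges_late and heavy:
                delay += 5.0                          # low-speed squeeze-in slows everyone behind
        return delay

    for frac in (0.0, 0.05, 0.25, 1.0):
        print(f"fraction merging late = {frac:.2f} -> total delay = {total_delay(frac_late_mergers=frac):.0f} units")

With no late mergers the total delay is zero; as the fraction of 'cheaters' grows, so does the delay, and in this toy model the delay falls on the queue behind each cheater rather than on the cheater himself, which is exactly the free-rider structure described above.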

Of course, I am not suggesting that selfish and inconsiderate highway driving habits are inherited traits that will spread in the population by being passed down to the inconsiderate driver's children via his or her genes. But they could be like a 'meme', a mental virus that, like a gene, is a replicator that seeks to propagate and increase its incidence in the population, which in this case consists of the minds of people. This meme would encourage people to benefit themselves in the short-term at the expense of others, even though in the long term they too lose when someone else practicing the same behavior slows down traffic ahead of them.

May 22, 2007

Asking the wrong questions about science history

In his influential book The Structure of Scientific Revolutions, Thomas Kuhn points out that the kinds of questions we often ask about the history of science, and that we think are simple and have been adequately answered (such as "Who discovered oxygen, and when?" or "Who discovered X-rays, and when?"), turn out on close examination to be extremely difficult, if not impossible, to answer.

It is not that there are no answers given in authoritative sources. It is that when we actually examine the historical record, the situation turns out to be very murky, giving rise to the strong suspicion that such questions are the wrong ones to ask about the scientific enterprise. The simple answers given to such questions represent a rewriting of history that provides readers with a tidy narrative, but at the expense of a distorted sense of how science is done, as if scientific discoveries were clear and decisive events. I remember being very impressed by Kuhn's examples in support of his thesis when I first read his book, and subsequent readings of science history have convinced me that he is right.

For example, the latest issue of the American Physical Society's newsletter, APS News (vol. 16, no. 5, May 2007, p. 2), has an account of the discovery of the neutron. (The article is here but the current issue is password protected and non-APS members will have to wait a month before it is archived and people are given open access.) The title says "May 1932: Chadwick reports the discovery of the neutron" and recounts the familiar (to physicists anyway) story of how James Chadwick, 75 years ago this month, made the famous discovery for which he received the Nobel Prize in 1935.

As the article proceeds to describe the history of the process, it becomes clear that its own story contradicts the impression given in the title.

As early as the 1920s, people had suspected that there was something in the atom's nucleus other than protons. Some thought these additional particles were made up of an electrically neutral combination of the already known proton and electron, but no one could confirm this. Experiments went on trying to isolate and identify the particle, and around 1930 two scientists, Walther Bothe and Herbert Becker, found radiation coming from a beryllium target that had been bombarded with alpha particles. They thought that this radiation consisted of high-energy photons. Other experiments done by Frédéric and Irène Joliot-Curie also found similar radiation, which they too attributed to high-energy photons.

Chadwick thought that this explanation didn't quite fit and did his own experiments and concluded that the radiation was caused by a new neutral particle that was slightly heavier than a proton. He called it the neutron. He published a paper in February 1932 where he suggested this possibility and then in May 1932 submitted another paper in which he was more definite. It is this paper that gives him the claim to be the discoverer.

But as with all major scientific discoveries, acceptance of the new idea was not immediate within the community, and it took until around 1934 for a consensus to emerge that the neutron was indeed a new fundamental particle.

So who "discovered" the neutron and when? Was it the people who concluded much earlier than 1932 that there was something else in the nucleus other than protons? They were right after all. Was it Bothe or Becker, or the Juliot-Curies who first succeeded in isolating this particle by knocking neutrons out of materials? They had, after all, "seen" isolated neutrons even if they had not identified it as such. Or do we give the honor to Chadwick for first providing a plausible claim that it was a neutron?

As to when the neutron was discovered, that is also hard to say. Was it when its existence was first suspected in the early 1920s? Or when it was first isolated experimentally around 1930? If we say that since the title of discoverer was awarded to Chadwick, the date of discovery has to be assigned to something he specifically did, then when exactly did he realize that he had discovered the neutron? In his first preliminary paper in February 1932? Or in his more definite paper in May? Clearly he must have known what he knew before he submitted (or wrote) the papers.

All we know for sure is that sometime between 1930 and 1934, the neutron was "discovered" and that certain scientists played key roles in that process. For historical conciseness, we give the honor to Chadwick and fix the date as May 1932, and the judgment is not an unreasonable one, as long as we insist on demanding that such events have a definite date and author. But it is good to be reminded that all such assignments of time, place, and people for scientific discoveries mask a much more complex process, in which "discoveries" unfold over extended periods of time, involve large numbers of people, and increase understanding only incrementally. There is often no clear before-after split.

The detailed stories are almost always more fascinating than the truncated histories we are taught.

May 21, 2007

The nature of consciousness

In the model of Cartesian dualism, we think of the mind as a non-material entity that somehow interacts with the material brain/body. Descartes thought that the locus of interaction was the pineal gland in the brain, but that specific idea has long since been discarded.

But that still leaves the more fundamental idea, now referred to as Cartesian dualism, which holds that I have a mind representing the essential 'me', a mind that uses my material body to receive experiences via my senses, stores them in my memory, and orders actions that get executed by my body. This idea that there is an inner me is very powerful because it corresponds so intuitively with our everyday experience and with the awareness that we have of our own bodies and the way we interact with our environment. Even the way we use language is intricately bound up with the idea that there exists some essence of ourselves, as can be seen by the way the words 'we' and 'our' were used in this and the previous sentences. The power of this intuitive idea of something or someone inside us controlling things has given rise to phrases like 'the ghost in the machine' and the 'homunculus' (from the Latin for 'little man') to describe the phenomenon.

For religious people, the mind is further mixed up with ideas of the soul and thus gains additional properties. The soul is considered to be non-material and can exist independently of the body, allowing for the possibility of an afterlife even after the body has ceased to exist. This soul model causes some problems that resist easy answers. For example, life begins with the creation of a single fertilized egg. This single fertilized cell (called a zygote) then starts to multiply to 2, 4, 8, 16, 32, . . . cells and so on. All these cells are material things. At what stage along this progression did a non-material entity like the soul appear and attach itself to the collection of cells?

I think it is safe to say that almost all cognitive scientists reject the idea of a non-material mind, some kind of homunculus inside the brain somewhere that 'runs' us. This immediately rules out the religious idea of a non-material soul, at least in any traditional sense in which the word is used.

But even though the existence of a non-material mind or soul has been ruled out, the Cartesian dualistic model is still a seductive idea that can tempt even those who reject any religious ideas and accept a framework in which the material body (and brain) is all there is. The reason it is so seductive is that even if we discard the mind/body distinction as being based on a nonmaterial/material splitting, the idea of a central processing agent still seems intuitively obvious.

Consider a situation where I am responding to something in my environment. We know that we experience the external world through our five senses (sight, sound, smell, touch, taste) and that these senses are triggered by material objects coming into contact with the appropriate sense organs (eyes, ears, nose, skin, tongue) and excite the nerve endings located in those organs. These excitations are then transmitted along the nervous system to that part of our brains called the sensory cortex after which they. . .what?

At this point, things get a bit murky. Clearly these signals enter and proceed through our brain and excite the neural networks so that our brain becomes 'aware' of the phenomena we experienced, but the problematic issue is what exactly constitutes 'awareness.'

Suppose for the moment we stop trying to understand the incoming process and switch to the outgoing process. It seems like we have the ability to make conscious and unconscious decisions (pick up a cup or shake our head) and then the brain's neural networks send these signals to the part of the brain known as the motor cortex which transmits them to the appropriate part of the nervous system that sends the signal to the body part that executes the action by contracting muscles.

It seems reasonable to assume that, in between the end of the incoming pathway and the start of the outgoing pathway that I have described, there is some central part of the brain, a sort of command unit, that acts as a kind of clearing house where the incoming signals get registered and processed, stored in memory for later recall, older memories and responses get activated, theories are created, plans are made, and finally decisions for action are initiated.

As a metaphor for this command unit, we can imagine a highly sophisticated kind of home theater inside our brain: a screen displays what we see, speakers provide the sound, other devices supply the sensations of smell, touch, and taste, and banks of powerful computers store and retrieve memories and transmit orders for action. 'Conscious events' are those that get projected onto this screen along with the accessory phenomena.

Daniel Dennett, in his book Consciousness Explained (1991), calls this model the Cartesian Theater and warns against falling prey to its seductive plausibility. Accepting it, he points out, means that we are implicitly accepting the idea of a homunculus, or ghost in the machine, who occupies this theater in the brain and who is the inner person, the 'real me'; what that inner person experiences is sometimes referred to as the 'mind's eye.' One problem is that this approach leads to an infinite regress: to explain how the Cartesian Theater itself works, we would need another, smaller theater (and another homunculus) inside the homunculus's head, and so on without end.

But if this simple and attractive model of consciousness is not true, then what is? This is where things get a little (actually a whole lot) complicated. It is clear that it is easier to describe what cognitive scientists think consciousness is not than what they think it is.

More to come. . .

May 18, 2007

Does science destroy life's mysteries?

One of the reasons that elite science and elite religion are now coming into conflict is that science is now addressing questions that once were considered purely philosophical. By 'purely philosophical' I mean questions that are serious and deep but for which answers are sought in terms of logic and reason and thought experiments, with the only data used being those that lie easily at hand or appeals to common everyday experience.

The difference with science is that it does not stop there but instead uses those things merely as starting points for more esoteric investigations. It takes those initial ideas and converts them into research programs in which the consequences of the ideas are deduced for well-defined situations that can be examined experimentally and in which tentative hypotheses can be tested.

Daniel Dennett in his book Consciousness Explained (1991) talks (p. 21) about how science tackles what he calls 'mysteries':

A mystery is a phenomenon that people don't know how to think about – yet. There have been other great mysteries: the mystery of the origin of the universe, the mystery of life and reproduction, the mystery of the design to be found in nature, the mysteries of time, space, and gravity. These were not just areas of scientific ignorance but of utter bafflement and wonder. We do not yet have the final answers to any of the questions of cosmology and particle physics, molecular genetics and evolutionary theory, but we do know how to think about them. The mysteries haven't vanished, but they have been tamed. They no longer overwhelm our efforts to think about the phenomena, because now we know how to tell the misbegotten questions from the tight questions, and even if we turn out to be dead wrong about some of the currently accepted answers, we know how to go about looking for better answers.

That passage, I think, captures well what happens when something enters the world of science. The mystery gets tamed and becomes a problem to be solved.

The charge that people sometimes make against science is that it seems to take away all the awe and mystery of life's wonders by 'explaining' them. I have never quite understood that criticism. If anything, my sense of awe is enhanced by having a better understanding of phenomena. For example, I have always enjoyed seeing rainbows. Has my enjoyment become any less now that I happen to know how multiple scattering of light in individual droplets of water produces the effect?

As another example, I recently listened to a magnificent concert of the Cleveland Orchestra playing Tchaikovsky's Piano Concerto #1. It was a truly moving experience. Was my sense of awe at the brilliance of the composition and its execution diminished by my knowledge that the orchestra players were using their instruments to cause the air around them to vibrate, that those vibrations then entered my ear and got converted to nerve signals that entered my brain, which then effectively Fourier-transformed the signals to reconstruct the rich orchestral 'sounds' that triggered the chemical reactions responsible for my sense of emotional satisfaction? I don't think so. I rather like the fact that I can enjoy the experience on so many levels, from the purely experiential to the emotional and the cerebral. In fact, for me the truly awe-inspiring thing is that we have reached such depths of understanding of something that would have seemed so mysterious just a few hundred years ago.

The taming of mysteries and converting them into planned research programs of investigation is now rapidly progressing in the areas of cognition and consciousness. The reason that this causes conflict is because such close examination can result in the philosophical justifications for religion being undermined.

For example, the existence of god is predicated on a belief in a Cartesian dualism. God is 'out there' somewhere separate from my body while 'I' am here encapsulated by my body, and there is some gateway that enables that boundary to be crossed so that 'I' can sense god. For many religious people, this contact between the 'I' and god is a deep mystery.

In some sense, Descartes started taming this mystery by postulating that the contact gateway lay in the pineal gland in the brain but he could not explain how the interaction between the non-material god and the material brain occurred. Of course, no one takes the special role of the pineal gland seriously anymore. But the basic Cartesian dualism problem remains for both religious and non-religious people, in the form of understanding the mind-brain split. What is the 'I' of the mind that makes decisions and initiates actions and seems to control my life? Does it exist as a non-material entity apart from the material brain? If so how does it interact with it, since the brain, being the place where our sensory system stores its information, is the source of our experiences and the generator of our actions?

Religious people extend this idea further and tend to think of the mind as somehow synonymous with the 'soul' and as a non-material entity that is separate from the body though occupying a space somewhere in the brain, or at least the body. It is the mind/soul that is the 'I' that interacts with a non-material god. So the mind/soul is the 'real' me that passes on to the next life after death and the body is just the temporary vehicle that 'I' use to interact with the material world.

Religious people tend to leave things there and suggest that the nature of the mind/soul and how it interacts with both the material world (including the body that encapsulates it) and god is a mystery, maybe even the most fundamental mystery of all, never to be understood. And for a long time, even scientists would have conceded that we had no idea how to even begin to address these questions.

But no longer. The cognitive scientists have tamed even this mystery and converted it into a problem. This does not mean that the problem of understanding the mind and consciousness has been solved. Far from it. But it does mean that scientists are now able to pose questions about the brain and consciousness in very concrete ways and suggest experiments to further advance knowledge. Although they do not have answers yet, one should be prepared for major advances in knowledge in this area.

And as these results start to come in, the prospects for maintaining beliefs in god and religion are not good. Because if history is any guide, the transition is always one way, from mystery to problem, and not the other way around. And once scientists see something as a problem to be solved, they tend to be tenacious in developing better and better theories and tools for solving it until only some details remain obscure. And the way the community of scientists build this knowledge structure is truly awe-inspiring.

So the answer to this post's title is yes, science does destroy the mysteries but it increases the awe.

More to come. . .

May 16, 2007

Philosophy and science

An interesting example of the different ways that scientists and 'pure' philosophers view things arose in an exchange I had in the comments of a previous post.

Commenter Kenneth brought up an interesting argument that I had not heard before for the existence of the afterlife, an argument that he said had originally been proposed by the philosopher Spinoza (1632-1677). Basically the argument boiled down to the assumption that each one of us is simply a collection of atoms arranged in a particular way. When a person (A) dies, those atoms are dispersed and join the universe of atoms that percolate through space and time. But there is always the possibility that, purely by chance as a result of random motion, a set of atoms will arrange themselves in exactly the same arrangement that made up A when A was still alive, and thus A will have been 'reborn.' Kenneth argues that the existence of life after death has thus been established, at least in principle.

The nature of the argument can perhaps be understood better with a simpler example: thoroughly mix ink and water in a glass and then leave it to sit undisturbed. We would think that this mixing is irreversible and that separation back into water and ink would not be possible except through extraordinary efforts by external agents. But in fact, if you simply wait long enough, there is a very remote possibility that the random motion of the individual ink and water molecules will result in a momentary spontaneous separation of the mixture into two regions, one of pure water and the other of pure ink (whatever ink molecules are).
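To get a feel for just how remote 'very remote' is, here is a rough back-of-the-envelope sketch of my own (the molecule counts are illustrative assumptions, not measurements): if the glass contains N independently wandering ink molecules, the chance that all of them happen to occupy, say, the top half of the glass at the same instant is about (1/2)^N.

    import math

    def separation_probability_exponent(n_molecules):
        """The chance that n independently wandering molecules all sit in one chosen
        half of the glass at the same instant is (1/2)**n, i.e. 10**(-n*log10(2)).
        Returning just the exponent avoids floating-point underflow for large n."""
        return n_molecules * math.log10(2)

    for n in (10, 100, 10**20):   # 10**20 is a hypothetical, round-number stand-in for a real drop of ink
        print(f"n = {n:g}: probability ~ 10^-{separation_probability_exponent(n):.3g}")

Even for a mere hundred molecules the probability is already around one in 10^30; for anything like a real drop of ink the exponent is astronomically larger.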

Since all that this argument requires is the ability to wait a very long time for these unlikely events to occur, Kenneth has satisfied himself, from a philosophical point of view, that Spinoza's argument is valid, and that once we concede the possibility that someone's atoms can be reconstituted in their original arrangement, the existence of life after death has been established, at least in principle.

But science does not limit itself to these 'in principle' arguments. Such arguments are just the first steps. Science is always looking at the detailed consequences of such ideas in order to translate them into research programs. And this is where Spinoza's argument for the possibility of an afterlife breaks down.

For one thing, the human body is not just an arrangement of atoms, like the molecules in a mixture of ink and water, or the oxygen and nitrogen molecules in a container of air. The atoms in the human body are bound together in complex organic molecules, which are in turn held together by other forces to form cells and tissues and so on. It is not enough to just bring the atoms together; you also have to drive the chemical reactions that bind them into these molecules, and that requires energy from outside, applied in a very directed way.

It is like frying an egg in a pan. Just breaking an egg into a skillet and leaving it there will not result in a fried egg, however long you wait, unless there is a source of energy to drive the reaction forward. A fried egg is not just a rearrangement of the atoms in a raw egg. It is one in which new compounds have been created and the creation of these compounds is a non-random process.

In addition, the probability of all the atoms that make up your body randomly arriving at the same locations that they occupied when you were alive is vanishingly small. This is not a source of concern to Kenneth because all he needs is for this probability not to be zero in order to satisfy his 'in principle' condition. But there is an inverse relationship between the probability of an event and the time you would likely have to wait for it to occur: roughly speaking, the expected number of tries is one divided by the probability of success on each try. For example, if you repeatedly throw a die, you would have to wait longer to get a six (about six throws on average) than to get any even number (about two throws), because the probability of the former (1/6) is smaller than that of the latter (1/2).
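A quick simulation bears out this one-divided-by-the-probability rule; this is just an illustrative sketch of my own, and the function name is made up for the purpose.

    import random

    def average_wait(success, trials=100_000, seed=1):
        """Estimate the average number of die throws needed until success(roll) is true."""
        rng = random.Random(seed)
        total = 0
        for _ in range(trials):
            rolls = 0
            while True:
                rolls += 1
                if success(rng.randint(1, 6)):
                    break
            total += rolls
        return total / trials

    print(f"average throws to get a six:        {average_wait(lambda r: r == 6):.2f}   (theory: 1/(1/6) = 6)")
    print(f"average throws to get an even face: {average_wait(lambda r: r % 2 == 0):.2f}   (theory: 1/(1/2) = 2)")

The same reasoning, applied to a probability unimaginably smaller than 1/6, is what makes the expected waiting time for a whole body to reassemble itself so absurdly long.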

In the case of the body's atoms coming together again, the probability is so small that the expected time for it to occur would be incredibly long. Again, it would not matter if this were a philosopher's 'in principle' argument. But those arguments tacitly assume that nothing else is changing in the environment and that we have an infinite amount of time in the world to wait for things to occur.

But in reality events are never in isolation and science is always concerned about the interconnectedness of things. And this is where the 'in principle' argument breaks down. We know that the lifetime of the Sun is about ten billion years and that it will then become a huge 'red giant' that will grow enormously and even envelop the Earth. And later still, all the energy producing nuclear reactions in the stars will end, resulting in the heat death of the universe. So there will not be any surplus energy around, even in principle, to drive the chemical reactions to reconstitute the body's molecules, even if they did manage to arrive randomly in exactly the right positions.

I think that this is where scientific research and philosophical speculation diverge. A scientist is not interested in just 'in principle' arguments for the afterlife of the kind that Kenneth says Spinoza makes. To become interesting to scientists, Kenneth will have to provide at least numerical estimates of the probability of the body's atoms reconstituting themselves, and then use that probability to estimate the expected time for such an event to occur.

If that time is longer than the expected time to the heat death of the universe, then the question becomes moot. If it is shorter, then the scientist will ask whether there is enough free energy available at that time to drive the reaction forward, and what the probability is that this energy will spontaneously be directed at the atoms in just the right amounts and directions to recreate the human body.

All these considerations, when brought together, suggest that Spinoza's argument fails and that life after death as proposed by him is not going to ever happen.

That is the kind of difference between the approaches of pure philosophy and science.

May 15, 2007

Alternative realities

One of the things that I have noticed in recent years is the proliferation of what I call 'alternative realities'.

In classical learning theory, it is believed that when someone confronts evidence that runs counter to that person's prior knowledge, a state of cognitive dissonance occurs in the mind of the learner which only goes away when the learner's knowledge structures have been adjusted to accommodate the new information.

This model of learning underlies what are known as 'inquiry' methods of teaching science, in which the teacher, knowing what her students are likely to erroneously believe about some phenomenon (such as electricity), deliberately sets up experiments whose results will directly confront their misconceptions, forcing the students into the difficult process of re-evaluating what they already believe. By repeatedly going through this process at different levels of sophistication and in different contexts, the hope is that the student develops an experiential understanding of the 'true' theory that the teacher is trying to teach.

One attractive feature of this mode of science instruction is that it models and parallels the scientific process, where the predictions of theories or paradigms are repeatedly confronted with actual data. Seemingly discrepant data creates a kind of 'cognitive dissonance' in the scientific community as a whole, which is usually resolved in one of several ways: by the data being shown to be incorrect or irreproducible, by the theory being modified and extended to incorporate the data, or (more rarely) by the overthrow of the existing paradigm and its replacement by a new one for which the discrepant data is no longer a problem. This process of resolution can take quite a long time (in some famous cases over a hundred years) and during that time the unresolved discrepant data occupies a kind of limbo. Its existence is recognized and acknowledged but other work proceeds unaffected.

What does not happen is the peremptory rejection of data for no reason other than that it disagrees with the existing theory, together with the construction of an alternative theory simply for the sake of excluding the troublesome data.

But what is happening in some areas now is the adoption of precisely that last option. Evidence and data are being rejected if they contradict existing beliefs. And in order to prevent that rejection from causing any cognitive dissonance, alternative realities are being constructed that seem to describe a parallel universe where reality does not intrude.

In politics, for example, the idea that you can control the nature of reality rather than respond to it was expressed in the famous article published by Ron Suskind in which he said how startled he was when a high Bush administration official told him in 2002 that: "guys like me were 'in what we call the reality-based community,' which he defined as people who 'believe that solutions emerge from your judicious study of discernible reality.' I nodded and murmured something about enlightenment principles and empiricism. He cut me off. 'That's not the way the world really works anymore,' he continued. 'We're an empire now, and when we act, we create our own reality.'" This kind of administration hubris over its ability to control or create reality explains a lot about how the debacle in Iraq occurred.

But this idea that one can either ignore reality or even create your own alternate one is becoming even more widespread. For example, practically everybody has by now heard of Wikipedia, the online open-source encyclopedia that has rapidly become a valuable resource for people to get information on a wide range of things. People have criticized it for the anonymity of the writers and the fact that some of the articles may be less than accurate and that it can sometimes be vulnerable (at least briefly) to the pranks of mischievous elements. All these shortcomings are being dealt with by the site's creators and despite them, Wikipedia has achieved an enviable level of usage.

One criticism that I had not heard was that Wikipedia had an anti-Christian and anti-American agenda. But apparently this is believed by some quarters and they have constructed a conservative alternative called Conservapedia. It says of its goals: "Conservapedia is a much-needed alternative to Wikipedia, which is increasingly anti-Christian and anti-American. . . . Conservapedia is an online resource and meeting place where we favor Christianity and America."

When I first heard of this, I thought it was an Onion-like spoof but this is not the case. The site is truly something to behold and can be a source of endless amusement for those in the reality-based world. For example, it says that "nothing useful has even been built on the theory of relativity" and that "This theory rejects Isaac Newton's God-given theory of gravitation and replaces it with a concept that there is a continuum of space and time, and that large masses (like the sun) bend space in a manner similar to how a finger can depress an area of a balloon."

It praises the 1925 Scopes "Monkey" trial for saving the state of Tennessee from 75 years of teaching of the "oppressive evolution theory."

Or about kangaroos: "Like all modern animals, modern kangaroos originated in the Middle East and are the descendants of the two founding members of the modern kangaroo baramin that were taken aboard Noah's Ark prior to the Great Flood." For more hilarious Conservapedia nuttiness that "shows" that dinosaurs lived at the same time as humans and how they could have fitted into the Ark, see here.

If you want to keep living in an alternative reality, then another source is QubeTV which bills itself as the "conservative version of YouTube." Again, I had not been aware that YouTube had been the spearhead of a secret liberal agenda, but this is apparently what some people believe.

Or there is Chatting with Charley, Charley being someone who tries to cherry-pick bits of science to support his contention that the Earth is less than 10,000 years old.

And there is the rise of creationist 'museums' (like the one in Petersburg, KY organized by the group Answers in Genesis) that seek to convince visitors that the information given in regular museums is wrong because it does not conform to what is found in the Bible.

So what is behind the rise of alternative sites like these? I think the problem is that the religious fundamentalists who want young people to continue believing in ideas like a young Earth and Noah's ark are worried that exposure to popular sites like YouTube and Wikipedia will create cognitive dissonance when those young people realize that most people don't believe any of the stuff that they believe. The story of the ark and Noah's great flood is an event of major importance to creationists and forms the basis for their entire 'science'. If young people find no references to it at all when they look things up in Wikipedia, one can see why they might start asking why, and some may begin to question their beliefs.

So the creators of these sites are trying to create a whole 'alternative reality' that true believers need never leave and thus never have to confront reality. What is interesting about these kinds of religious ventures is that they take almost all of science for granted and then find one seemingly discrepant event (which can usually be explained but they ignore this) and then build an elaborate alternate reality on this slender reed.

It will be an interesting exercise to see how far they can take this. As science and other forms of knowledge expand, the alternate worlds will have to get more and more elaborate and contrived to counter the information generated by them. This has to be an unstable situation.

After all, as Stephen Colbert said, reality has a well-known liberal bias.

May 14, 2007

The science-religion debate

The ABC News 'Face Off', the 'great' debate between religion and atheism, was broadcast on Nightline last week. You can see the video of the program here. (You may be able to find the video of the full debate here.)

The side arguing for God's existence was evangelist Ray "Banana Man" Comfort and his trusty sidekick Boy Wonder Kirk Cameron. The side arguing against was Brian "Sapient" (not his real last name) and Kelly, the creators of the Blasphemy Challenge and the people behind the Rational Response Squad.

The debate was initiated by Comfort who had contacted ABC News and requested it, saying that he could prove god's existence. He set the bar for himself quite high. He promised ABC News that he would "prove God's existence, absolutely, scientifically, without mentioning the Bible or faith" and added that "I am amazed at how many people think that God's existence is a matter of faith. It's not, and I will prove it at the debate - once and for all. This is not a joke. I will present undeniable scientific proof that God exists."

The video of the program shows that the 'debate' was at a disappointingly low level, although to be fair the debate lasted for about 90 minutes and only edited portions were shown. From the outset, Comfort broke his promise, invoking both the Bible and faith. But even when it came to the 'science' part of his argument, he resorted once again to the tired Paley's watch/Mount Rushmore arguments.

The shorter version of this old argument is this: "We can immediately tell when something is designed. If something is designed, it must have a designer. Nature looks designed to us and therefore must have been designed. That designer can only be god."

The operational and philosophical weaknesses of this argument have been exposed by many people, including me, so anyone who advances it cannot really be taken seriously unless they address those challenges. As far as I can see, Comfort did not do this. Although Comfort had previously alleged that the banana was the "atheist's nightmare" (because it fits so perfectly in the human hand and mouth, the banana and the human hand and mouth had to have been designed for each other), he did not bring bananas along as props. Perhaps he had been warned that his video making that claim had been the source of widespread merriment.

Kirk Cameron's role seemed to be to undermine evolutionary theory, but the clips of him doing so showed an embarrassing ignorance and shallowness. He invoked the old argument about the paucity of transitional forms, but brought it up in a form that would have made even those sympathetic to his point of view wince. He seemed to have the bizarre notion that evolution by natural selection predicts the existence of every possible intermediate state between all existing life forms. He showed artist's sketches of things that he called a "croc-o-duck" (a duck with the head of a crocodile) and a "bull frog" (an animal that was half-bull and half-frog) and argued that the fact that we do not see such things means that evolution is wrong. Really. It was painful to watch him make a fool of himself on national TV.

Cameron seems to be suffering from an extreme form of a common misunderstanding about transitional forms. The fact that humans and other existing animals share common ancestors does not imply that there should be forms that are transitional between them as they exist now. What evolutionary theory states is that if you take any existing organism and follow its ancestors back in time, you will see a gradual change in the way the organisms look. So when we talk about transitional forms, we first have to fix the two times that set the boundaries. If we take one boundary as the present time and the other as (say) two billion years ago, when the first eukaryotic cells appeared, then there are a large number of transitional forms between those two endpoints. Richard Dawkins's book The Ancestor's Tale gives an excellent account of the type and sequence of the transitional forms that have been found. Of course, these ancestral forms have themselves continued to evolve, along with their many descendant lineages, so we would not expect to see them now in the same form they had when they were our ancestors. They can only be found in that form as fossils.

DNA sequencing shows the connections between species as well and provides further evidence of the way species branched off at various points in time. So when evolutionary biologists speak of 'transitional forms', they are referring to finding fossils of those ancestors that preceded various branch points. The recent discovery of Tiktaalik, the 375-million-year-old fossil with the characteristics one would expect of a form transitional between fish and the first four-limbed land animals, is one such example. So is Archaeopteryx, transitional between dinosaurs and birds.

The 'missing link' argument against evolution, although lacking content, is one that will never die. One reason is the existence of people like Cameron who use it incorrectly. Another is that it is infinitely adaptable. For example, suppose you have a species now and a species that existed (say) two billion years ago and demand proof of the existence of a missing link. Suppose a fossil is found that is one billion years old that fits the bill. Will this satisfy those who demand proof of the missing link? No, because opponents of evolution can now shift their argument and demand proofs of the existence of two 'missing' links, one between the fossils of two and one billion years ago, and the other between one billion years ago and the present. In fact, the more transitional fossils that are found, the more 'missing links' that can be postulated!

This is what has happened with past discoveries of fossils. The fossil record of evolution has been getting steadily greater but the calls for 'proof' of the existence of missing links have not diminished.

POST SCRIPT: Antiwar.com fundraising drive

The website Antiwar.com is having a fundraiser. If you can, please support it. It is an invaluable source of news and commentary that is far broader and deeper than you can find almost anywhere else.

May 04, 2007

The new atheism-6: The biological origins of religion and morality

(See part 1, part 2, part 3, part 4, and part 5.)

You would think that natural selection would work against religion because those individuals who spent their time in prayer and other rituals, and used precious energy and resources in building temples and offering sacrifices, would be at a survival disadvantage when compared to those who used their time more productively. In the previous post, I outlined the basic framework of natural selection and summarized the arguments of those who explain the survival value of religion by saying that religious ideas are passed on and evolve as a byproduct of the survival advantage that accrues from young children being predisposed to believe their parents and other adult authority figures.

But while that may explain how religions propagate once they come into being, it is harder to understand how religious ideas arose in the first place. If the outbreak of religion were an occasional event occurring here or there at random, then we could just dismiss it as an anomaly, like the way that random genetic mutations cause rare diseases. But religion is not like that. As David P. Barash says in The Chronicle of Higher Education (Volume 53, Issue 33, Page B6, April 20, 2007): "On the one hand, religious belief of one sort or another seems ubiquitous, suggesting that it might well have emerged, somehow, from universal human nature, the common evolutionary background shared by all humans. On the other hand, it often appears that religious practice is fitness-reducing rather than enhancing — and, if so, that genetically mediated tendencies toward religion should have been selected against."

Barash summarizes the various suggestions that have been put forth to overcome this problem of how religion could have originated.

Other, related hypotheses of religion include the anthropologist Pascal Boyer's grandly titled Religion Explained, which argues that natural selection would have favored a mechanism for detecting "agency" in nature, enabling its possessor to predict who is about to do what (and, often, to whom). Since false positives would be much less fitness-reducing than false negatives (i.e., better to attribute malign intent to a tornado and take cover than to assume it is benign and suffer as a result), selection would promote hypersensitivity, or "overdetection," essentially a hair-trigger system whereby motive is attributed not only to other people and mastodons, but also to trees, hurricanes, or the sun. Add, next, the benefit of "decoupling" such predictions from the actual presence of the being in question ("What might my rival be planning right now?"), and the stage is set for attributing causation to "agents" whose agency might well be entirely imagined.

Boyer's work, in turn, converges on that of Stewart Guthrie, whose 1993 book, Faces in the Clouds, made a powerful case for the potency of anthropomorphism, the human tendency to see human (or humanlike) images in natural phenomena. This inclination has morphed into a more specific, named phenomenon: pareidolia, the perception of patterns where none exist (some recent, "real" examples: Jesus' face in a tortilla, the Virgin Mary's outline in a semimelted hunk of chocolate, Mother Teresa's profile in a cinnamon bun).

The same kinds of ideas are invoked to explain the origins of morality but here the work has advanced a lot more. The idea that morality comes only from religion has no validity, given that natural selection provides alternative explanations. As Barash says: "Taken together or in various combinations, kin selection, reciprocal altruism, group selection, third-party effects, and courtship possibilities, as well as simple susceptibility to social and cultural indoctrination, provide biologists with more than enough for the conclusion: God is no longer needed to explain "Moral Law.""

This is not to say that the question of the biological origins of morality has been completely solved.

In Darwin's Cathedral, David Sloan Wilson explored the possibility that religious belief is advantageous for its practitioners because it contributes to solidarity — including but not limited to moral codes — that benefits the group and wouldn't otherwise be within reach. That notion, appealing as it might be, is actually a logical and mathematical stretch for most biologists, relying as it does upon group selection. The problem is that even if groups displaying a particular trait do better than groups lacking it, selection acting within such groups should favor individuals who "cheat." Mathematical models have shown that group selection can work in theory, but only if the differential survival of religious groups more than compensates for any disadvantage suffered by individuals within each group. It is at least possible that human beings meet this requirement, especially when it comes to religion, since within-group self-policing could maintain religiosity; it certainly did during the Inquisition.

So where do things stand? While there have been major advances in understanding the biological basis (in natural selection) of the propagation and evolution of religious ideas, and of the origins of morality, a lot more work still needs to be done, especially on the question of how religion originated. As Barash says:

We must conclude, sadly, that a convincing evolutionary explanation for the origin of religion has yet to be formulated. In any event, such an account, were it to arise, would doubtless be unconvincing to believers because, whatever it postulated, it would not conclude that religious belief arose because (1) it simply represents an accurate perception of God, comparable to identifying food, a predator, or a prospective mate; or (2) it was installed in the human mind and/or genome by God, presumably for his glory and our counterevidentiary enlightenment.

But the goal can never be to change the minds of people about the lack of necessity of god by direct arguments. That rarely succeeds for reasons to be discussed in a future posting. In fact, although I have written many posts on why belief in god is irrational, I basically agree with Charles Darwin's approach when he said "It appears to me (whether rightly or wrongly) that direct arguments against Christianity and theism produce hardly any effect on the public; and freedom of thought is best promoted by the gradual illumination of men's minds which follows from the advance of science."

The reasons for my posts are not to persuade the determined believers to change their minds but to add to the universe of ideas, so that people who are not particularly committed to religion will find that their musings are not the dangerous thoughts of an apostate that will be punished by an angry god, but the perfectly rational doubts that arise in the minds of anyone who values the role of evidence and the pursuit of scientific inquiry.

What is exciting about the recent developments is that questions of religion and morality are now being investigated using scientific tools and methods, and those are bound to result in greater detailed understanding of those phenomena.

More to come. . .

POST SCRIPT: This should be fun

Apparently ABC News has decided to stage a science-religion debate. Who suggested this idea and offered to represent religion? None other than Ray "Banana Man" Comfort and his sidekick, Boy Wonder Kirk Cameron.

Apparently Comfort requested the debate in order to counter The Blasphemy Challenge. Comfort says: "I am amazed at how many people think that God's existence is a matter of faith. It's not, and I will prove it at the debate - once and for all. This is not a joke. I will present undeniable scientific proof that God exists."

Right. Frankly, if I was a religious person, I would be really worried about letting Comfort be my standard bearer. But who knows, maybe he has found a proof more powerful than the banana. (Scroll down to see the video if you don't know what I'm talking about.) Perhaps he has managed to find god's designing hand in the avocado also. Maybe he will bring along Peanut Butter Man to clinch the case.

The debate will occur on May 5, 2007 and apparently will be streamed live on the ABC website and later be shown on Nightline.

Of course, what Comfort and people like him really yearn for is media exposure and he probably doesn't care if people hoot with laughter at his "proofs" of god.

May 03, 2007

The new atheism-5: The scientific approach to philosophical questions

(See part 1, part 2, part 3, and part 4.)

The approach of the biological sciences to the questions of the origins of religious belief and morality is not to ask what proximate causes led to belief in god and the afterlife (the answers to which may be to satisfy curiosity and provide comfort) but to ask what evolutionary advantage accrues to individuals who hold such beliefs, because natural selection works on individual organisms, not groups.

To better understand how evolutionary biology addresses these questions, it is useful to review the basic tenets of evolution by natural selection. Following Philip Kitcher's The Advancement of Science (p. 19), Darwin's four fundamental evidentiary claims can be stated as follows:

1. The Principle of Variation: At any stage in the history of a species, there will be variation among the members of the species: different organisms belonging to the species will have different properties.

In other words, children are never identical with their parents. Within each species there is considerable diversity in properties and in support of this position Darwin took great pains to point out how hard it was to distinguish between different varieties within the same species, and between species.

2. The Principle of the Struggle for Existence: At any stage in the history of a species, more organisms are born than can survive to reproduce.

If there is an abundance of food and other resources, the population of any species would multiply exponentially. The fact that it doesn't is due to limitations in these necessary elements and this is what results in only some surviving and their populations reaching more or less stable values.

3. The Principle of Variation in Fitness: At any stage in the history of a species, some of the variation among members of the species is variation with respect to properties that affect the ability to survive and reproduce; some organisms have characteristics that better dispose them to survive and reproduce.

The members of a species that are more likely to survive and pass on their properties to the next generation are those that have properties that give them some survival advantage in the environment in which they find themselves. It is important to note that only some of the properties need to be advantageous for the organism to have preferential survival. Other properties may also flourish not because they have a similar advantage but because they are somehow linked to the advantageous properties and are thus carried along. Thus some properties may simply be byproducts of selection for other properties.

4. The Strong Principle of Inheritance: Heritability is the norm; most properties of an organism are inherited by its descendants.

Most properties that we have (five fingers, four limbs, heart, etc.) are inherited from our ancestors.

From these four principles, we infer the crucial fifth:

5. The Principle of Natural Selection: Typically, the history of a species will show the modification of that species in the direction of those characteristics which better dispose their bearers to survive and reproduce; properties which dispose their bearers to survive and reproduce are likely to become more prevalent in successive generations of the species.

So natural selection will favor those organisms that, by chance mutation in their genes, have properties that give them better chances for survival, and thus these characteristics will appear in the next generation in greater abundance.
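As a purely illustrative sketch of how the five principles work together, consider the following toy simulation (my own construction; the population size, starting frequency, and 10% fitness advantage are arbitrary assumptions). A heritable trait that slightly improves the chance of surviving to reproduce steadily increases its frequency over the generations.

    import random

    def simulate_selection(generations=40, pop_size=1000, initial_freq=0.05,
                           advantage=1.1, seed=2):
        """Toy illustration of the five principles: variation (two trait variants),
        struggle for existence (only pop_size offspring survive each generation),
        variation in fitness (one variant reproduces slightly more often),
        inheritance (offspring copy a parent's trait), hence natural selection."""
        rng = random.Random(seed)
        population = ['advantaged' if rng.random() < initial_freq else 'plain'
                      for _ in range(pop_size)]
        for g in range(generations + 1):
            freq = population.count('advantaged') / pop_size
            if g % 10 == 0:
                print(f"generation {g:3d}: advantaged-trait frequency = {freq:.2f}")
            weights = [advantage if trait == 'advantaged' else 1.0 for trait in population]
            population = rng.choices(population, weights=weights, k=pop_size)

    simulate_selection()

Nothing in the code 'aims' at anything; the trait spreads simply because its bearers leave slightly more offspring each generation.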

This is the powerful theory that Darwin and Wallace proposed and which forms the basis of all modern biology. Note that it does not deal with how life originated in the first place and Darwin was frank about this limitation and offered just the broadest and mildest speculation about that big question. There is no question that when dealing with the issue of life itself, the problem of how life evolved and diversified has received better answers than the question of how life first originated.

Pretty much the same situation applies to religious beliefs (and the evolution of language also, but that is a topic for another day). Once religious ideas came into being, it is not hard to see how they could have continued and produced the present diversity using the above principles.

It is obvious that when it comes to religion, the strong principle of inheritance applies. The best predictor of what a person's religious beliefs are is the religious belief of the parents. Most children believe the same religious ideas as their parents except for slight variations. Most young children have very little idea that other religions even exist and don't even think of their own beliefs as 'beliefs' because they have been taught them as facts and believe them because their parents told them. (Interestingly, it is found that the eldest child is likely to be more faithful in adhering to the parents' beliefs than subsequent children.)

When the theory of natural selection is applied to religious beliefs, the argument goes in the direction of religion being propagated as an accidental byproduct of selection for something else. It has been argued that there is a definite survival advantage to a genetic predisposition for children to believe their parents and other authority figures rather than to disbelieve them, and that this quality will therefore be preferentially selected. In other words, natural selection does not select for religious beliefs per se; rather, religious beliefs are propagated as a byproduct of selection for trusting one's parents.

To see why believing what one's parents tell you is beneficial, note that unlike many animals, young children are not at all capable of surviving in the wild on their own. They need parents to protect them. A child who listens to her parents (don't touch the fire, don't walk over the edge of the cliff, etc.) is more likely to survive than a child who ignores the authorities around her. So it is not hard to see how natural selection would favor a propensity to believe authority figures, and how human children have thus evolved with a predisposition to believe them.

But as Richard Dawkins points out in The God Delusion (p. 176) the catch is that the child is not able to discriminate between useful and useless bits of advice. "The child cannot know that 'Don't paddle in the crocodile-infested Limpopo' is good advice but 'You must sacrifice a goat at the time of the full moon, otherwise the rains will fail' is at best a waste of time and goats. Both admonitions sound equally trustworthy. Both come from a respected source and are delivered with a solemn earnestness that commands respect and demands obedience."

So while there is survival value in the child inheriting a genetic predisposition to believe what her parents tell her, a byproduct is that the child inherits the religious beliefs of the parents as well, with slight variations. Once religious ideas gained currency in the early days of human evolution, they started propagating and diversifying like any other organism in the tree of life, becoming distinct entities that share a common root. Over time, just as individual biological variations became separated and formed into distinct species, so did religious beliefs. Eventually, with the process often assisted by some charismatic religious leader, these religious variations became codified into the distinct religious doctrines we see around us.
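The propagation-with-variation picture sketched above can be illustrated with a deliberately crude toy model of my own (representing a 'belief' as a single number and picking an arbitrary mutation rate are assumptions made purely for illustration): two isolated groups start with identical beliefs, children copy a parent from their own group with occasional small variations, and with no contact between the groups their beliefs slowly drift apart.

    import random

    def simulate_divergence(generations=500, group_size=200, mutation_rate=0.05, seed=3):
        """Crude illustration: two isolated groups start with identical 'beliefs'
        (represented here as a single number).  Children copy a parent from their
        own group, with occasional small variations.  With no contact between the
        groups, their average beliefs slowly drift apart."""
        rng = random.Random(seed)
        groups = [[0.0] * group_size, [0.0] * group_size]
        for g in range(1, generations + 1):
            for i, beliefs in enumerate(groups):
                parents = [rng.choice(beliefs) for _ in range(group_size)]    # copying
                groups[i] = [b + rng.gauss(0, 0.5) if rng.random() < mutation_rate else b
                             for b in parents]                                # occasional variation
            if g % 100 == 0:
                means = [sum(beliefs) / group_size for beliefs in groups]
                print(f"generation {g:4d}: group means = {means[0]:+.2f}, {means[1]:+.2f}")

    simulate_divergence()

The point is only that copying plus occasional variation plus isolation is enough to turn one shared belief into divergent ones; no claim is being made that real doctrinal history is this simple.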

Another suggestion is that religious ideas, once they come into being, are 'memes' (ideas) that are analogous to genes but act like the mental counterparts of viruses, in that they act to propagate themselves and not for the benefit of the organism they inhabit. Dawkins describes the possible existence of 'memeplexes', a collection of memes that form the environment of ideas in which other memes have to compete for survival. He suggests that existing memeplexes might favor the survival of the following memes (p. 199):

• You will survive your own death
• If you die, you will go to an especially wonderful part of paradise where you will enjoy seventy two virgins (spare a thought for the unfortunate virgins)
• Heretics, blasphemers and apostates should be killed (or otherwise punished, for example by ostracism from their families)
• Belief in God is a supreme virtue. If you find your belief wavering, work hard at restoring it, and beg God to help your unbelief. (In my discussion of Pascal's Wager I mentioned the odd assumption that the one thing God really wants of us is belief. At the time I treated it as an oddity. Now we have an explanation for it.)
• Faith (without evidence) is a virtue. The more your beliefs defy the evidence, the more virtuous you are. Virtuoso believers who can manage to believe something really weird, unsupported and insupportable, in the teeth of evidence and reason, are especially rewarded.
• Everybody, even those who do not hold religious beliefs, must respect them with a higher level of automatic and unquestioned respect than that accorded to other kinds of belief. . .
• There are some weird things (such as the Trinity, transubstantiation, incarnation) that we are not meant to understand. Don't even try to understand one of these, for the attempt to understand might destroy it. Learn how to gain fulfillment in calling it a mystery.
• Beautiful music, art, and scriptures are themselves self-replicating tokens of religious ideas.

I am not too familiar with the whole meme framework but I mention it here for the benefit of those who may know more about it.

I think that, just as in the case of life, there is a plausible biological explanation for how religious ideas propagate and diversify once they come into existence. The more difficult challenges are explaining what caused religious ideas to come into being in the first place, and similarly, what are the biological origins of morality.

More to come. . .

POST SCRIPT: Amazing pool shots

I have played pool only a few times in my life, enough to give me an appreciation of how skilful this player is. It is said that skill at pool is a sign of a misspent youth. By that rule, this pool player must have completely wasted his life.



April 27, 2007

The new atheism-2: Breaking down the wall

In the post-Galileo world, elite religion and elite science have tended to get along pretty well. Opposing the heliocentric model of the solar system has been roundly criticized as a stupid thing for the Catholic church to have done and, since then, elite science and elite religion seem to have found a modus vivendi that enables them to avoid conflicts.

A large number of people, scientists and non-scientists alike, have managed to believe in a deity while at the same time being more-or-less active members of churches, temples, and mosques. They have managed to do this by viewing the creation narratives in their respective religious texts as figurative and metaphorical, and not as records of actual historical events. Such people also tend to believe that the world is split up into two realms, a belief which is captured in a statement issued in 1981 by the council of the prestigious National Academy of Sciences which says "[R]eligion and science are separate and mutually exclusive realms of human thought whose presentation in the same context leads to misunderstanding of both scientific theory and religious belief."

Most of the people who subscribe to this kind of statement see no conflict between scientific and religious belief structures because each one deals with one of two distinct worlds that do not overlap. So scientists are supposed to deal with the physical world while religion deals with the spiritual world. Such people tend to view the periodic legal and political skirmishes between the creationist and scientific camps as the work of overzealous extremists, both religious and atheist, who are attempting to mix together things that should properly stay separate. They feel that their own point of view is very reasonable and find it hard to understand why everyone does not accept it.

Stephen Jay Gould, who was himself not religious, was a key advocate of this model of peaceful coexistence between the two worlds (or, as he called them, 'magisteria') of science and religion, even going to the extent of writing a book, Rocks of Ages, advocating it. He gave this model the somewhat pretentious name of Non-Overlapping MAgisteria, or NOMA.

What this model successfully did was to allow elite religion and elite science to work together against those Christianists who sought to base public policy on religious beliefs. Thus in the periodic skirmishes over teaching intelligent design, prayer in schools, and other church-state separation issues, scientists and elite religionists tended to be on the same side, jointly opposing the attempts of people who sought to replace secular society with one based on a fundamentalist Christian foundation.

But this peaceful coexistence model has some fatal flaws (that I have discussed before) and can only be sustained by people strictly compartmentalizing their beliefs to avoid having to come to grips with the problems. Others are aware of the model's lack of viability but have sought to downplay the problems in order to preserve the political alliance between the elite science and elite religion camps. But this is where things are changing.

The initial challenges to this peaceful co-existence model came from intelligent design creationism theorists like Berkeley emeritus law professor Phillip Johnson, who sought to drive a wedge between elite science and elite religion by arguing that one could not simultaneously be a methodological naturalist and a believer in god, since the former excluded the latter. His aim was to force elite religionists to make a choice: are you with god or with atheistic science?

In doing so, he was conflating the two different concepts of methodological and philosophical naturalism to serve his rhetorical purposes. As I have written before, one is not forced to be a philosophical naturalist (which essentially means atheist) in order to be a scientist, but there is little doubt that elite scientists are overwhelmingly atheist or agnostic.

But more recently, the attack on the peaceful coexistence model has come from a visible and vocal group of atheists who have also argued that this 'two worlds' model that allows elite religion to coexist with elite science is essentially a sham, and that intellectual honesty demands that this be pointed out. This new rise in vocal atheism can be seen everywhere in a flurry of books and films and blogs. There has been a rise in organizations seeking to bring the views of atheists to the public's attention and a new lobbying group has been created called the Secular Coalition for America (SCA) that includes atheists, agnostics, freethinkers, and humanists, and seeks to increase the visibility of non-theistic viewpoints in the United States.

As intelligent design creationism seems to be a spent force these days, receiving one setback after another since the Dover verdict, and reduced to a traveling road show that exhorts the true believers, this new attitude by atheists challenging the two-worlds model comes too late to help the cause of Johnson and his allies to advance the teaching of intelligent design creationism in schools by creating a split between elite science and elite religion. But this new outspokenness amongst atheists has caused some ripples in the fabric of elite opinion, and is sometimes referred to as the 'new atheism'.

Some key voices in this new attitude are Richard Dawkins (The God Delusion), Sam Harris (Letter to a Christian Nation and The End of Faith), Daniel Dennett (Darwin's Dangerous Idea, Consciousness Explained and Breaking the Spell), Victor Stenger (God: The Failed Hypothesis) and Brian Flemming (creator of the film The God Who Wasn't There).

The soothing view of advocates of peaceful coexistence that religion is a neutral ideology that some followers take in an evil direction while others take in a good one is being challenged. The new tack taken by the new atheists is that even though individual religious people are often very good, that is largely irrelevant. The problem with religion is that, at the very least, believing in a god requires one to suspend rational and critical thinking, and that is never a good thing. As Voltaire said: "If we believe absurdities, we shall commit atrocities."

Thus they have taken on the task of highlighting the fact that belief in a god has no credible objective evidence to support it and thus should not be believed by any person who supports reason and science. As Dawkins, one of the most forceful and vociferous among them, says: "I am attacking God, all gods, anything and everything supernatural, wherever and whenever they have been or will be invented."

It is this new front between elite science and elite religion in the science-religion wars that has caused some turbulence.

More to come. . .

POST SCRIPT: Cricket World Cup final

The final of the World Cup is being played between Australia and Sri Lanka on Saturday, April 28, 2007. The game starts at 9:30 am (US Eastern time) and will probably last around six hours, barring a complete rout by one side.

I have been told that people can see a live telecast of it in DeGrace 312 (Biology building). If you want to see what cricket is like as played by two good teams, you should drop by. There is a charge which I think is $10.00 but am not sure since I just heard about it.

In the semi-finals, Sri Lanka beat New Zealand and Australia beat South Africa. South Africa came into the tournament as the favorites but gave several lackluster performances and barely made it into the final four. Australia have been the dominant team, crushing their opponents, and are undefeated, so they are now the heavy favorites for the title. Sri Lanka has been playing well too, but they will have to be absolutely at the top of their game to defeat the powerful Aussies.

It should be a good game.

April 26, 2007

The new atheism-1: The times they are a-changing

The year 2006 may have seen the beginning of a new chapter in the relationship between religious people and atheists. As I emphasized in my 2000 book Quest for Truth: Scientific Progress and Religious Beliefs (from which I am excerpting certain passages here), the relationship between science and religion is very complex because the words 'science' and 'religion' are both umbrella terms that encompass a wide range of ideas and attitudes.

The changing relationships become easier to understand if we follow theologian Langdon Gilkey and divide each group into two: elite religion and popular religion, and elite science and popular 'science'.

Elite religion is that which is believed by theologians and the more sophisticated members of mainstream religions. This group seeks to accommodate the knowledge created by science. It sees science and religion as describing two complementary areas of knowledge and tends to take scientific advances in its stride. Such people are comfortable with demythologizing the Bible and other religious texts and reinterpreting their content in terms of recent developments in science. This group tends to have little difficulty seeing almost all the Biblical stories, such as those of Noah and Moses (and especially the miraculous events), as metaphors rather than historical accounts. They believe in a god who can and does act in the world, but how that happens is left unspecified and it is left vague whether such interventions violate established scientific laws. Their religious beliefs are elastic enough that such people can absorb almost any scientific advance. That still leaves some problematic miracles at the heart of each religion (the resurrection of Jesus being one for Christians) that they are reluctant to demythologize, but in such cases refuge is taken in saying that science cannot disprove that it happened and so it could be true.

Popular religion, on the other hand, takes almost all its authority from religious texts and insists that all scientific knowledge must be interpreted to be consistent with these texts, since the latter are supposedly infallible. Fundamentalist religions of all stripes fall into this category. In the case of Christians, this group is likely to insist on the historicity of Noah, Moses, Jesus and all the other stories for which there is little or no corroborating historical evidence. For popular religionists, it is essential that the Bible and Koran and other religious texts be treated as scientifically and historically unimpeachable.

Elite science is that produced by the scientific establishment in universities and other research centers and published in scientific journals. Such science follows a strict methodological naturalistic philosophy, which argues that when investigating any phenomenon, we postulate as explanations only natural causes based on physical laws that lead to reproducible results. Elite science does not allow for the intervention of agents that can act arbitrarily in violation of natural laws as the explanation for any phenomenon.

Popular 'science' does not limit itself to methodological naturalism but allows for the action of supernatural forces. Such people find no difficulty believing in superstitions, horoscopes, astrology, telekinesis, witchcraft, and so on, and have no trouble believing that there could be some substance to the claims of astrologers, parapsychologists, fortune tellers, spoon benders, mind readers, faith healers, and the like. The idea of widespread existence of supernatural forces of all sorts does not strike such people as implausible. (The late Kurt Vonnegut, Jr. once said, "Those who believe in telekinetics, raise my hand.")

I hate to assign the label 'science' to such blatantly unscientific beliefs but feel obliged to follow Gilkey's terminology consistently, and it does provide a kind of symmetry. But I will try to remember to put it in ironic quotes to remind us that these beliefs are not really science in any sense of the word that a scientist would accept.

So what is the status of the relationship between the four groups?

Popular 'science' and popular religion have never had any real problems with each other methodologically. After all, they both are willing to accept the intervention of supernatural agents in everyday lives, in violation of the laws of science. For example, creationists mix their popular religion about god specially creating species with ideas about a 6,000 year-old Earth, which they try and justify using popular 'science', which essentially means rejecting much of accepted science and creating ad hoc theories and fitting evidence to reinforce beliefs that are based on religious texts. What differences there are between popular 'science' and popular religion lie along moral dimensions. Fundamentalist Christians might dislike and oppose witchcraft, but that is because they think the latter is 'evil', the product of a 'bad' supernatural agent, not because they think that the idea of witchcraft itself is preposterous.

Elite religion has had an uneasy relationship with popular 'science'. Elite religion is embarrassed by the notion that god, which for them is a sophisticated concept, would be compatible with other supernatural agents that go running around interfering with the laws of science on a daily basis. But they cannot come down too hard on popular 'science' because the only way to do so consistently would be to unequivocally rule out the action of all supernatural agents, which would put them out of business too. Once you have accepted the existence of at least one supernatural agent, you have pretty much lost any credibility to oppose any others. So this prevents elite religion from expressing a full-throated denunciation of popular 'science'.

Elite and popular religions tend to get along better. Most large religious denominations encompass both kinds of believers and try not to antagonize any segment. So, for example, even though clergy are likely to know that very little of what is contained in the Bible and other religious texts is historically true (See here and the links therein), they are likely to not emphasize that fact to their congregations. While most people start out as children as popular religionists, if they begin to develop doubts about the historicity of the great flood and the like and ask questions, their priests and parents are likely to concede privately that it is acceptable to not believe in the literal truth of the events portrayed in the religious texts, because they are metaphors of a higher and deeper truth. Thus people who begin to question are slowly edged along the road to elite religion.

Elite science has been in conflict with popular 'science' and popular religion for some time now, and this situation is likely to continue since the principle of methodological naturalism is a non-negotiable divide: one either accepts it or rejects it as a working hypothesis. Elite science rejects astrology and the like as frauds perpetrated on the gullible, since its characteristic methodological naturalism does not allow the intervention of supernatural agents. Thus believers in popular 'science' and popular religion are hostile to elite science because the latter does not allow for supernatural agents as explanations for anything.

All these relationships have been fairly stable for the last few centuries. It is the final remaining relationship, between elite science and elite religion, that is currently undergoing serious upheaval and has sparked the intense science-religion debates we are now experiencing, and it will form the subject of future postings.

POST SCRIPT: New secular student group at Case

A group of students have taken the initiative to create a Case chapter of the Campus Freethought Alliance. The organizer is a student named Batool who can be reached at bxa21(at)case.edu if you would like more information about the group. I have been asked to serve as the group's advisor and have accepted.

The CFA's mission can be found on its website.

The Campus Freethought Alliance (CFA) is an international not-for-profit umbrella organization uniting freethinking, skeptic, secularist, nontheist, and humanist students and student organizations. Its purposes are:

-To encourage freedom from superstition, irrationalism, and dogma.
-To further the acceptance and application of science, reason, and critical thinking in all areas of human endeavor.
-To challenge misrepresentations of non-religious convictions and lifestyles.
-To create a campus community for freethinkers and skeptics.
-To cultivate in ourselves — and others — a sense of responsibility to, and compassion for, humanity.
-To counter all forms of religious political extremism.
-To defend religious freedom and the separation of church and state.
-To defend individual freedoms and civil liberties for all persons, regardless of race, sex, gender, class, creed, ethnicity, sexual orientation, and disability.
-To unite freethinkers, skeptics, and humanists and consolidate campus resources to these ends.

April 18, 2007

False symmetry

In recent posts, I have been pointing out that while it is impossible to disprove god's existence, that does not mean that it is rational to believe in god. The reason for those posts was to address a false symmetry that is sometimes posed between atheism and religious belief. That symmetry takes roughly the following form:

1. It cannot be proved that god does not exist
2. Therefore not believing in god's existence is as much an act of faith as believing in it.

Some extend this line of reasoning even further, to argue that therefore atheism is also a religion and that thus keeping prayer and religious education out of schools is equivalent to promoting one particular 'religion' (atheism), and thus violates the establishment clause of the First Amendment.

This is a false symmetry. While atheists would accept the first statement, they would reject the second. The crucial difference is the role that evidence plays in shaping beliefs.

I said that because of the impossibility of proving a negative, the current state of absence of evidence for god and the afterlife was all the proof we were ever going to get. If people think that a more convincing proof is required for disbelief in god, then I am curious to learn what form it would take. So far, nothing has been offered, as far as I know.

Atheists take the following position:

1. We believe in those things that have sufficient and convincing evidentiary support.
2. We disbelieve those things for which there is insufficient evidentiary support.
3. The more evidence there is in favor of a belief, the more we are likely to believe and vice versa.

The crucial difference can be seen in response to my question as to what evidence it would take to make them disbelieve in god and the afterlife. The commenters in this blog (who are all people who have obviously given this question considerable thought) agreed that there was no conceivable evidence that would make them give up their beliefs. And yet, they do not believe in Santa Claus and the Tooth Fairy and the Easter Bunny, which have no evidentiary support either. So religious belief is decoupled from evidence. In fact, belief in god in the absence of evidence is taken as a virtue, a sign of the depth of one's faith.

On the other hand, atheists take a position that is consistent with a scientific outlook. They believe in those things for which there is persuasive, objective, corroborative, and cumulative evidence, even if it cannot be proved beyond any doubt. They can also always conceive of some evidence that would persuade them to give up their most cherished theories. For example, if human fossils that are two billion years old were ever found, that would seriously undermine the theory of evolution by natural selection.

Similarly, atheists can conceive of all manner of things that would require them to accept the existence of god. For example, suppose god were to suddenly appear on all TV stations, announcing his/her existence, the way that V appeared in the excellent film V for Vendetta. Of course, that by itself would not be convincing, since people nowadays know how much can be faked with technology. Some people are convinced that the Moon landings and the 9/11 attacks were hoaxes.

So to be really convincing, god would have to announce in that broadcast that he/she would stop the Earth's rotation for 24 hours, starting at some specified time. Such an act would violate the laws of conservation of energy and angular momentum, which are foundations of physics. If that happened, I don't see how anyone could doubt god's existence.

Of course, god would have to take some precautions. Simply stopping the Earth's rotation would, according to the laws of physics, at the very least unleash huge tsunamis and earthquakes that would wreak destruction on a massive scale. But since an omnipotent, omnipresent, omniscient god can keep track of and do everything at once, I am sure that these negative consequences of stopping the Earth can be avoided. And this is not asking for too much evidence since the Bible says that god has done this in the past (Joshua 10:12-13). To be accurate, the Bible says that god stopped the Sun, not the Earth's rotation, but we can grant some license for pre-Copernican thinking.
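
To put a rough number on what violating those conservation laws would involve, here is a back-of-the-envelope sketch of my own (only an illustration, using standard textbook values for the Earth's mass, radius, and moment of inertia) of the angular momentum and rotational kinetic energy that would have to vanish for 24 hours and then reappear.

```python
# A rough estimate (my own illustration, standard textbook values) of what
# stopping the Earth's rotation for a day would entail physically.
import math

M = 5.97e24            # mass of the Earth, kg
R = 6.37e6             # mean radius of the Earth, m
I = 0.33 * M * R**2    # moment of inertia; 0.33 rather than 2/5 because of the dense core
omega = 2 * math.pi / 86164   # angular speed for one sidereal day, rad/s

L = I * omega          # angular momentum that has to go somewhere and then come back
E = 0.5 * I * omega**2 # rotational kinetic energy that has to be absorbed and restored

print(f"Angular momentum: {L:.2e} kg m^2/s")
print(f"Rotational kinetic energy: {E:.2e} J")
# E works out to roughly 2e29 joules -- about the Sun's entire output for some
# ten minutes -- so the tsunamis-and-earthquakes caveat is, if anything, an
# understatement.
```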

I am not saying that this is the only proof of god's existence that would be acceptable to atheists. One can suggest any number of similar tests. But it does indicate the kind of evidence that would be required to be convincing.

So that is where things stand. Atheists, like scientists, can always articulate what evidence (or lack of it) makes them believe some things and disbelieve others. They can also specify what kind of evidence would make them call into question what they currently believe and convert them to belief about things they are currently skeptical of.

But religious believers have no choice but to say that there are some beliefs that they will never give up on, whatever the evidence. It is important to realize that there is nothing inherently wrong with taking this position. Kathy in her comments to previous posts quite rightly points out that faith is irrational and that logic and evidence have nothing to do with it. I agree with her.

What I am saying is that the atheist's lack of belief in god and the afterlife are, like a scientist's, based on logic and the absence of evidence while religious beliefs have to part company with evidence at some point. And this is where the symmetry breaks down.

POST SCRIPT: The secret doubts of believers

In a previous post, I suggested that it was strange that religious believers in their daily lives did not act in ways that were consistent with an all-knowing, all-powerful god and suggested that perhaps people were more atheistic than they were willing to let on. Of course, there is hardly any new idea under the sun. It turns out that long ago philosopher David Hume suspected the same thing, as he wrote in his The Natural History of Religion chapter XII (1757):

We may observe, that, notwithstanding the dogmatical, imperious style of all superstition, the conviction of the religionists, in all ages, is more affected than real, and scarcely ever approaches, in any degree, to that solid belief and persuasion, which governs us in the common affairs of life. Men dare not avow, even to their own hearts, the doubts which they entertain on such subjects: They make a merit of implicit faith; and disguise to themselves their real infidelity, by the strongest asseverations and most positive bigotry. But nature is too hard for all their endeavours, and suffers not the obscure, glimmering light, afforded in those shadowy regions, to equal the strong impressions, made by common sense and by experience. The usual course of men's conduct belies their words, and shows, that their assent in these matters is some unaccountable operation of the mind between disbelief and conviction, but approaching much nearer to the former than to the latter.

March 13, 2007

The undogmatic dogmatism of scientists

In a recent online discussion about whether intelligent design creationism should be taught as part of science, one of the participants took exception to a statement by someone else that the theory of evolution is so well established that it was of no use to allow for the inclusion of intelligent design creationism. The challenger asked, quite reasonably: "On what things is there no room for debate? Of what things are we so certain that we're willing to close the door to possibilities? If academics allow themselves to appear dogmatic about their theories, we legitimize dogmatism. We should be careful that scientists themselves do not become the new proselytizers to claim they hold absolute truth."

This puzzlement is not uncommon and not unjustified. Seen from the outside, we scientists must seem as if we either cannot make up our minds as to what we know for certain and what we are unsure of, or are cynically shifting our position for polemical advantage, sometimes arguing that evolution is a fact beyond dispute (in order to exclude intelligent design creationism as a viable competitor) while also asserting that intelligent design creationism is not scientific because it is not falsifiable. On the surface, those two positions seem inconsistent, applying different criteria to the two theories.

It is true that scientists assert that "evolution is a fact," just as they assert that "gravity is a fact." They also acknowledge the "theory" of evolution and the "theory" of gravity. And they also assert that ALL knowledge is provisional and subject to change.

How can all these things be simultaneously true? How can something be at the same time a fact and a theory, certain and yet subject to change? These are deep questions and ones that can lead to heated discussions since they affect deeply held core beliefs about science and religion.

These also happen to be questions that form the core of the seminar course I teach to sophomores. We discuss all kinds of things in my course including science and religion, intelligent design etc. and it is remarkable that in the four years that I have taught it, there have been absolutely no blowups or confrontations or unpleasantness, although colleagues have told me that these very same questions have caused problems in their classes. The relative harmony of my class exists despite the fact that I know that many of my students are quite religious, from a variety of traditions, and they know that I am an atheist. These personal beliefs are not things that we keep secret because they shed important perspectives on the discussions.

Perhaps the reason for the lack of friction is that my course starts with looking closely at what science's knowledge structure is. We read Pierre Duhem, Karl Popper, Thomas Kuhn, Imre Lakatos, Larry Laudan and other historians and philosophers of science and see how it is that science, unlike other areas of knowledge, progresses rapidly because of the commitment of its practitioners to a paradigm in which the framework for posing and solving problems is well defined. The paradigm consists of a scientific consensus about which theory (or set of closely related theories) should be used for analyzing a problem, rules for determining what kinds of research problems are appropriate, the kinds of evidence, arguments, and reasoning that are valid, and the conditions that solutions to these research problems must satisfy if they are to be deemed satisfactory. That complex paradigmatic framework is sometimes loosely and collectively referred to as a "theory" and students quickly realize that the popular meaning of the word "theory" as some sort of simple hypothesis or guess does not apply in the scientific realm.

As long as that paradigmatic framework (or "theory") is fruitful and brings forth new problems and successes, it remains inviolate from challenges, and practitioners strenuously resist attempts at overthrowing it. The "theory" is thus treated and defended as if it were a "fact" and it is this that is perceived by some outside of science as dogmatism and an unwillingness to change.

But as Kuhn so persuasively argues, it is this very commitment to a paradigm that is the reason for science's amazing success, because the scientist working on a problem defined within a paradigm can be assured a priori that it is legitimate and important, and that only skill and ingenuity stand between her and the solution. Solving such problems within a paradigm is a sign of superior skill and brings rewards to the scientist who achieves it. Such conditions ensure that scientists will persevere in the face of challenges and adversity, and it is this kind of dogged determination that has resulted in the scientific breakthroughs from which we now benefit.

Kuhn likens this commitment of scientists to a paradigm to that of an industrialist to the manufacturing process that exists to make a particular product. As long as the product is made well, the manufacturer is not going to retool the factory because of the enormous effort and costs involved. Similarly, learning how to successfully exploit a scientific paradigm involves a long period of scientific apprenticeship in a discipline and scientists are unlikely to replace a working paradigm with another one without a very good reason. Learning to work well within a new paradigm is as costly as retooling a factory, and one does not do so cavalierly but only if one is forced into it. The dogmatism of science is thus pragmatic and not ideological.

But we do know that scientific revolutions, both major and minor, occur periodically. Very few of our current paradigms have a long history. So how and why do scientific paradigms change? They occur when the dominant paradigm shows signs of losing its fruitfulness, when it fails to generate interesting new problems or runs out of gas in providing solutions. It is almost never the case that one (or even a few) unsolved problems result in its overthrow because all scientific paradigms at all times have had many unsolved problems. A few counterexamples by themselves are never sufficient to overthrow a paradigm, though they can be a contributing factor. This is the fundamental error that advocates of intelligent design creationism (IDC) make when they argue that just because evolution by natural selection has not as yet explained some phenomena, Darwin's theory must be rejected.

To be taken seriously, a new paradigm must also promise to be more fruitful than its predecessor, open up new areas of research, and promise new and interesting problems for scientists to work on. It does that by postulating naturalistic mechanisms that make predictions that can be tested. If it can do so and the predictions turn out to be successful, the commitment to the existing paradigm can be undermined, and the process begins by which the paradigm may be eventually overthrown. IDC has never come even close to meeting this requirement.

Some people have challenged the idea that scientific theories must, as a necessary condition, be naturalistic and predictive, arguing that insisting they be so is to impose dogmatic methodological rules. But the requirements that scientific theories be naturalistic and predictive are not ad hoc rules imposed from outside. They follow as a consequence of needing the paradigm to be able to generate new research programs. How could it be otherwise?

This is why IDC, by pointing to a few supposedly unsolved problems in evolutionary theory, has not been able to convince the biology community of the need to change the way it looks at things. Intelligent design creationism does not provide mechanisms, does not make predictions, and has not been able to produce new research.

When we discuss things in the light of the history of science, the students in my class understand why science does things the way it does, why it determinedly holds on to some theories while being willing to abandon others, and that this process has nothing to do with dogma in the traditional religious sense. Religious dogma consists of a commitment to an unchanging core set of beliefs. Scientific "dogma" (i.e. strong commitment to a paradigm and resistance to change) is always provisional and can under the right conditions be replaced by an equally strong commitment to a new "dogma."

Almost all my students are religious in various ways, and while some find the idea of IDC appealing, they seem to have little difficulty understanding that its inability to enter the world of science is not a question of it being right or wrong, but is because of the nature of science and the nature of IDC. IDC simply does not fit into the kind of framework required to be a fruitful scientific theory.

February 28, 2007

The Failure of Intelligent Design Creationism

On Monday I attended the talk given by intelligent design creationism (IDC) advocate Michael Behe (author of Darwin's Black Box) at Strosacker. The program consisted of a talk for about an hour by Behe followed by a 20-minute response by Professor Hillel Chiel of the Biology Department at Case.

As regular readers of this blog know, I am quite familiar with the IDC program, having read Behe's book and other IDC literature, written about the topic extensively, and debated Behe and other IDC advocates in 2002 in Kansas and again in Michigan. So I was curious to see what new developments had occurred since my last encounter with him.

Michael Behe gives good talks and the full auditorium had an enjoyable evening. He has an engaging manner, good sense of humor, and presents his ideas in a clear way. But I already knew that having heard his talks before. What disappointed me was that there was absolutely nothing new in his talk, which was entirely a rehash of the same things he was saying five years ago. The examples he gave in support of intelligent design were the same as in his book that was published in 1996. The only new things since that book were his rebuttals of some criticisms of his book, but even those were things that he said in his 2002 talks. I recognized all the quotes and examples.

Behe presented the familiar IDC line of argument:

1. We immediately know designed systems when we see them. (The Mount Rushmore example, a standby of IDC advocates, was once again evoked. See here and here for my earlier postings about this.)
2. There is a clear appearance of design in many biological systems.
3. Some of these systems are "irreducibly complex," in that if you take away any single component, the system fails to function. (He brought out the familiar mousetrap analogy and the flagellum and blood-clotting examples.)
4. Evolution by natural selection, with its gradual approach to change, cannot explain these phenomena, and evolution advocates resort to implausible and hand-waving explanations.
5. Hence the existence of such systems implies a designer.

In his brief response, Chiel addressed all these arguments. He said that the reason IDC is not science is that it does not provide any hypothesis to be tested and thus does not provide the basis for any research program. (The very fact that IDC has not produced anything new for over a decade is evidence of that.) Evolution by natural selection, on the other hand, is the basis of research in almost all of biology. He gave the example of his own research, and also of how bacteria actually generate more random mutations, so that there is a greater chance of producing a drug-resistant strain that will survive due to natural selection. Scientists try to prevent these mutations from occurring as part of their struggle to prevent resistant strains from emerging. Thus Darwin's theory provides the basis of such scientific work.
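
The logic of that last point can be captured in one line of arithmetic. The sketch below is my own illustration (the numbers are invented, and none of it is from Chiel's talk): with a per-division mutation rate mu and N cell divisions, the chance that at least one resistant mutant appears is roughly 1 - exp(-N*mu), which rises sharply as the mutation rate goes up.

```python
# A toy illustration (not from the talk): probability that at least one
# resistance mutation appears in a bacterial population, for different
# per-division mutation rates. All numbers are made up for illustration.
import math

def p_resistance(population, mutation_rate):
    """Poisson approximation: P(at least one resistant mutant arises)."""
    expected_mutants = population * mutation_rate
    return 1 - math.exp(-expected_mutants)

N = 1e8  # cell divisions in a hypothetical infection
for mu in (1e-10, 1e-9, 1e-8):
    print(f"mutation rate {mu:.0e}: P(resistance) = {p_resistance(N, mu):.1%}")
# The probability climbs from about 1% to roughly 63% as the mutation rate
# rises a hundredfold, which is why ramping up mutation helps the bacteria
# and why researchers try to suppress it.
```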

Chiel also made a very important point about the whole irreducible complexity argument. Behe's "irreducibly complex" systems are those that have many interlocking parts so that taking any one component away destroys the functionality of the system. Since it is unlikely that all the parts could have evolved separately and then come together in one fell swoop to create the functioning system, Behe infers that they must have been designed in some way.

Chiel pointed out the flaw in this argument. How a system gets built cannot be inferred from what happens if you take away something from the system after it is built. It is quite possible for a complex system to be built gradually, piece by piece, such that when you take something away from the final object, it fails completely. To use an example of my own, it is like a house of cards. You build it up carefully one card at a time. But once built, take away almost any card and the whole system collapses. This is because in the process of constructing complex things, some parts initially play the role of scaffolding or some other auxiliary purpose. But with a change in functionality in the final system, a part that was initially an option can become essential.

For another example, take cars (this is also my example, not Chiel's). They have evolved gradually into the complex machines we now have. Currently, GPS guidance systems in cars are auxiliary devices that are sometimes installed as a convenience but are not essential. If you have one in your car, you can remove it and the car is still functional. But in the future we could have a transport system where cars do not need drivers but run under their own remote-controlled navigation and steering systems. Suddenly the GPS device is no longer an option but becomes crucial to the functioning of the car. Chiel said that complex biological systems are like that, co-opting things as needed to perform desirable but optional functions which can later become essential components.

Kenneth Miller's review of Behe's book provides a detailed example of how systems that satisfy Behe's description of being "irreducibly complex" actually evolved.

The three smallest bones in the human body, the malleus, incus, and stapes, carry sound vibrations across the middle ear, from the membrane-like tympanum (the eardrum) to the oval window. This five component system fits Behe's test of irreducible complexity perfectly - if any one of its parts are taken away or modified, hearing would be lost. This is the kind of system that evolution supposedly cannot produce. Unfortunately for "intelligent design," the fossil record elegantly and precisely documents exactly how this system formed. During the evolution of mammals, bones that originally formed the rear portion of the reptilian lower jaw were gradually pushed backwards and reduced in size until they migrated into the middle ear, forming the bony connections that carry vibrations into the inner ears of present-day mammals. A system of perfectly-formed, interlocking components, specified by multiple genes, was gradually refashioned and adapted for another purpose altogether - something that this book claims to be impossible. As the well-informed reader may know, creationist critics of this interpretation of fossils in the reptile to mammal transition once charged that this could not have taken place. What would happen, they joked, to the unfortunate reptile while he was waiting for two of his jaw bones to migrate into the middle ear? The poor creature could neither hear nor eat! As students of evolution may know, A. W. Crompton of Harvard University brought this laughter to a deafening halt when he unearthed a fossil with a double articulation of the jaw joint - an adaptation that would allow the animal to both eat and hear during the transition, enabling natural selection to favor each of the intermediate stages.

Chiel also debunked the notion that there is a "controversy" over Darwin's theory and that therefore the controversy should be taught. He said that there was no controversy among scientists and that therefore neither IDC nor "the controversy" belonged in any science curriculum. However, he said that IDC should be taught as part of a humanities or social sciences curriculum.

He pointed out that scientists practiced methodological naturalism as a necessary element of their work but that did not entail philosophical naturalism (which is atheism). (See here for an earlier posting on this.) He pointed out that if in the future Darwinian evolution turns out to be an inadequate theory, there was still no requirement to adopt IDC because there would be other alternative naturalistic theories.

In his talk he also made the point that IDC is not only not science, it is also bad theology, because linking one's religious belief to one scientific theory is dangerous. He posed the hypothetical question of what would have happened to someone whose religious belief was based on Newtonian physics (or on its flaws). When relativity and quantum mechanics came along, their faith would have been seriously undermined.

What was interesting is that Hillel Chiel, in addition to being a first-rate scientist, is a very observant Orthodox Jew who is extremely knowledgeable about the Bible and its commentaries. I have known him for many years and he and I are in almost perfect agreement on almost everything about the nature of science. This illustrates my point that amongst scientists, their position on religious beliefs (or philosophical naturalism) is totally irrelevant. All that is required of a scientist is a commitment to methodological naturalism in their work. Some scientists, like Chiel, choose to reject philosophical naturalism and are devoutly religious, while others (like me) choose to accept it and become atheists. But those choices have no effect on the scientific work of either group. Chiel is far more religiously observant than most scientists I know, including (I suspect) Behe. And yet I think Chiel and I have far more in common than Behe and I do, because we both share a commitment to methodological naturalism in science, which Behe does not.

The problem with IDC is that it is a sterile theory, producing no mechanisms or predictions or research programs. I suspect that most of the people who were in Strosacker Auditorium on Monday probably agree with Behe that god somehow acts in the world in some mysterious way that they do not know. Where Behe gets into trouble is in trying to assert that this belief has a scientific basis. That claim is simply not credible.

February 19, 2007

The odd response to global warming warnings

The recent release of the latest IPCC report on global warming gives a comprehensive review of the current state of knowledge and represents an overwhelming scientific consensus on the nature of the problem confronting us.

The report's conclusions paint a gloomy picture:

The report states in unequivocal terms that the climate is warming globally and that since the middle of the 20th century, human industrial activity – the burning of fossil fuels and, to a lesser extent, land-use changes – is warming's main driver. Since the last report in 2001, confidence in that statement has risen from "likely" (greater than a 66 percent chance) to "very likely" (greater than 90 percent).

• Temperatures are "likely" to rise 2 degrees to 4.5 degrees Celsius by the end of the century, if CO2 concentrations reach twice their preindustrial level. Within that range, the most likely result is 3 degrees C (5.4 degrees Fahrenheit). That additional warmth will distribute itself unevenly, with the highest increases in the Arctic and progressively smaller increases farther south.

• Sea levels could rise by century's end from 28 to 58 centimeters (11 to 23 inches) above 1999 levels globally. That's a narrower range than the IPCC offered in 2001, when it projected a range of 9 to 88 centimeters. Even if CO2 concentrations could be stabilized at twice preindustrial levels by 2100, thermal expansion of the oceans alone could raise sea levels an additional 1 to 3 feet by 2300. But recent research also suggests that the Greenland ice sheet is losing mass faster than expected, leaving open the possibility that sea-level increases will be higher if the melting trend continues to accelerate. If Greenland's ice cap continues to lose mass over the next 1,000 years, the entire ice cap would vanish, raising sea levels by some 23 feet.

What is interesting is the response of the global warming deniers. The Guardian newspaper reports that the so-called 'think tank' the American Enterprise Institute is actually trying to bribe scientists to dispute the report. Funded with $1.6 million from Exxon-Mobil, the AEI is offering scientists $10,000 each "for articles that emphasise the shortcomings of a report from the UN's Intergovernmental Panel on Climate Change (IPCC)." They are also willing to pay for travel and other perks. (Stephen Colbert comments on the bribes.)

Ben Stewart of Greenpeace is quoted as saying: "The AEI is more than just a thinktank, it functions as the Bush administration's intellectual Cosa Nostra. They are White House surrogates in the last throes of their campaign of climate change denial. They lost on the science; they lost on the moral case for action. All they've got left is a suitcase full of cash."

That sounds like an accurate description to me.

The Guardian report also says that an Exxon-funded organization in Canada will launch a review that will challenge the IPCC report. One of the people involved is David Bellamy. Some of you may recall an earlier posting of mine that discussed how his sloppy work was exposed by George Monbiot.

There is one thing about the global warming debate that puzzles me, and that is the vehemence of some ordinary people's opposition to the idea. I can understand why the big emissions-producing industries and their allies in the Bush administration are fighting the idea that global warming is occurring. They do not want to take any action that might cut into their profits.

But why are some ordinary people so emphatically opposed to this finding of the scientific community? It is not like evolution or stem-cell research where science is treading on religious toes. As far as I can tell, there are no Biblical issues here, no eleventh commandment to the flock to, yea verily, go out and emit CO2 in abundance until the glaciers melteth into the seas.

I am not talking about people who are simply skeptical about the scientific case being made that global warming is a real threat and that it is largely caused by human activity. That kind of skepticism is understandable but does not usually create the level of passion that is characteristic of the global warming deniers.

On global warming you find what seem to be ordinary people going out of their way to ridicule the emerging scientific consensus. This is surprising because most ordinary people do not go to great lengths to ridicule areas in which there is a scientific consensus. You do not find passionate opposition to, say, the scientific community's suggestions on reducing trans fats or its warnings about the dangers of smoking.

It is almost as if the members of the public who are skeptics think that the scientific community is trying to pull a fast one on them. But why would they think this? There is no advantage to scientists in global warming. Scientists get no benefit from warning about the danger. At most they can be accused of being over-cautious.

So why this unusual level of hostility to the idea that global warming might be real? Is this coming from people who are angry with scientists about other things that do offend their religious sensibilities and are now out to attack anything that scientists say that might affect their lives? Or are these people part of an "astroturf" (i.e. fake grass roots) movement funded by the oil industry and polluting companies? Or are these people who, for ideological reasons, will side with Bush and big corporations come what may, whatever the issue? Or is there some other reason that I am missing?

These are not rhetorical questions. I am genuinely puzzled as to why this is so. Any suggestions?

POST SCRIPT: Talk by Israeli academic and peace activist

Jeff Halper, an emeritus professor of anthropology at Ben Gurion University and an Israeli peace activist, will be talking today at Case. The talk is free and open to the public.

When: 4:30pm, Monday, February 19, 2007
Where: Clark 309

I have written before about Professor Halper's last visit to Case in May 2005 and how his talk was a revelation to me about what was happening in the occupied territories.

The flyer for his visit this time says:

Dr. Jeff Halper, the Coordinating Director of the Israeli Committee Against House Demolitions was a 2006 nominee for the Nobel Peace Prize. He is an Israeli-American peace activist, professor of anthropology, distinguished author and internationally acclaimed speaker. The 3rd edition of his popular book, "Obstacles to Peace: A Reframing of the Palestinian-Israeli Conflict" was released in 2005. Halper has forged a new mode of Israeli peace activity based on nonviolent direct action and civil disobedience to the Israeli Occupation. Through its resistance to the demolition of Palestinian homes and other manifestations of the Occupation, including the rebuilding of demolished homes as acts of political solidarity, ICAHD has developed a relationship of trust and close cooperation with Palestinian organizations. Believing that civil society and governmental forces must be mobilized if a just peace is to emerge in Israel/Palestine, Jeff also directs ICAHD’s extensive program of international advocacy. His popular book Obstacles to Peace is to be followed by a forthcoming work: An Israeli in Palestine: Reframing the Israel-Palestine Conflict (Pluto Press).

January 02, 2007

The joy of free thinking

(Due to the holidays, I will be taking a break from blogging. New posts will begin on Wednesday, January 3, 2007.)

Scarcely a week passes without some interesting new scientific discovery about the nature of life. You open the newspaper and read of observations of light emitted by distant stars from the very edges of the known universe, light that must have been emitted almost at the very beginning, over ten billion years ago. Such research puts us in touch with our own cosmic beginnings. See this video for Hubble Space Telescope images of the deep field, which shows galaxies whose light has taken nearly 13 billion years to reach us. It is at once humbling to realize that we are but a speck in the vast regions of space, occupying a mere flicker of time, and exhilarating that despite these limitations of space and time we have been able, thanks to science, to learn so much about the universe we live in.

Just recently there was the discovery of the fossils of a possible new Hobbit-like people who lived on a remote island in the Indonesian archipelago about 18,000 years ago. Then there was the discovery in China of an almost perfectly preserved bowl of noodles that is about 4,000 years old. Discoveries like these shed light on how evolution works and how human society evolved.

Similarly, the discoveries that come from studies of DNA tell us a lot about where humans probably originated, how we are all related to one another, and how, despite our common origins, the species spread over the Earth and diversified. The fact (according to the September 21, 2005 issue of The Washington Post) that we share over 90 percent of our DNA with chimpanzees lends further strong support (not that it needed it) to the evolutionary idea that chimpanzees and humans share a common ancestry.

I enjoy reading things like this because it reminds me that we are all linked together by one great biological evolutionary tree, with the various animal species being our cousins, and even things like worms and bacteria being somehow related to us, however distantly. Some people may find the idea of being related to a monkey repulsive but I think it is fascinating. The ability of science to investigate, to find new relationships, to explore and conjecture and come up with answers to old questions as well as create new questions to investigate is one of its greatest qualities.

And for me, personally, being an atheist makes that joy completely unalloyed. Shafars (i.e., secularists, humanists, atheists, freethinkers, agnostics, and rationalists), as well as religious people who interpret their religious texts metaphorically rather than literally, have no concerns when headlines announcing a new scientific discovery appear in the news. They do not have to worry whether any new fact will contradict a deeply held religious belief. They do not have to worry about whether they need to reconcile the new information with an unchanging religious text.

On the other hand, the same news items that give us fascinating glimpses of scientific discoveries undoubtedly create fresh headaches for those whose religious beliefs are based on literal readings of religious texts, because each new discovery has to be explained away if it disagrees with some dogma. There are people who devote their entire lives to this kind of apologetics, to ensure that their religious beliefs are made compatible with science. The website Answers in Genesis, for example, is devoted to making Young-Earth creationism (YEC) credible. So it goes to great lengths to show that the earth is less than 10,000 years old, that all the animals could have fitted into Noah's Ark, and that dinosaurs lived at the same time as humans.

One has to admire the tenacity of such people, their willingness to devote enormous amounts of time, sometimes their whole lives, to find support for a belief structure that is continuously under siege from new scientific discoveries. It must feel like trying to hold back the tide. (See this site, which tries to fit the astrophysical data from light emitted by stars that are billions of light years away into a 10,000-year-old universe model.)

Of course, scientific discoveries come too thick and fast for even the most determined literal apologists to keep up. So they tend to focus only on explaining away a few questions, the kinds of questions that the lay public is likely to be concerned about, such as whether dinosaurs existed concurrently with humans, the ages of the universe and the Earth, whether the size of the Ark was sufficient to accommodate all the species, how Noah coped with the logistical problems of feeding all the animals and disposing of the waste, how Adam and Eve's children could multiply without there already being other people around or indulging in incest, and so on.

But the rest of us don't have to worry about any of that stuff and so can enjoy new scientific discoveries without any cares, and follow them wherever they lead. It is nice to know that one can throw wide open the windows of knowledge and let anything blow in, clearing out the cobwebs of old ideas and freshening up the recesses of the mind.

It is a wonderful and exhilarating feeling.

December 29, 2006

Can the curriculum at Hogwarts be called science?

(Due to the holidays, I will be taking a break from blogging. Instead, I will be re-posting some of my more light-hearted essays, this week dealing with the Harry Potter books. New posts will begin on Wednesday, January 3, 2007.

I have somehow completed another full year of blogging. Over the year I have made about 250 posts, written over three hundred thousand words, and had a total of about 750,000 hits. In the process of researching for the posts, I have learned a lot.

I would like to thank all the people who visited, read, and commented. It has been a real pleasure and I wish all of you the very best for 2007.)

Science fiction writer Arthur C. Clarke makes the point that any sufficiently advanced technology will seem like magic to the naïve observer. This seems to be a good observation to apply to the magic that is practiced at Hogwarts. What seems to exist there is a world with highly advanced "technology", operating under strict rules that the inhabitants know how to manipulate. The more mature wizards seem to easily produce consistent results with their spells while the novices mess around until they get it right. This is not very different from what we do in the Muggle world, except that we are manipulating computers and cars that are controlled by knobs and dials and switches and keyboards, while the wizards use wands and spells. It is not a mystery to other wizards how specific results are obtained; what is required to achieve those results is skill and practice.

What is intriguing is that while the experienced wizards and witches know how to manipulate the wands and words and potions to achieve results that seem magical to us Muggles, they do not really understand the rules themselves. They don't even seem to be interested in understanding how their magic works. The classes at Hogwarts seem to be almost exclusively hands-on and practical, using trial and error methods, with no theory of magic. Hogwarts is more like a trade school, where they teach a craft. It is like a school of carpentry or pharmacy or boat making where you learn that "if you do this, then that will happen" without actually learning the underlying principles.

The world of Hogwarts is closer to the medieval world, where there were highly skilled craftsmen who were able to build cathedrals and ships without understanding the underlying science. Introducing modern knowledge and sensibilities into an earlier time period is a staple of fantasy and science fiction, and writers like Rowling, and Mark Twain with A Connecticut Yankee in King Arthur's Court, do it well.

An interesting question to speculate on is whether the magic the students learn at Hogwarts castle would be classified as science today. If we go back to Aristotle, we find that when he tried to distinguish science from other forms of knowledge, he classified knowledge into 'know how' (the ability to consistently achieve certain results) and 'know why' (the underlying reasons and principles for the achievement). It is only the latter kind of knowledge that he counted as science. The 'know how' knowledge is what we would now call technology. For example, a boat maker can make excellent ships (the 'know how') without knowing anything about density or the role that the relative density of materials plays in sinking and floating (the 'know why').

Trying to make the world of Hogwarts consistent with modern science would have been difficult. Rowling manages to finesse this question by making life in Hogwarts similar to life in the middle ages, with no electricity, computers, television, and other modern gadgets. Students at Hogwarts don't use cell phones and instant messaging. In one book, this anachronism is dealt with by having Hermione say, without further explanation, that electric devices don't work inside Hogwarts. By artfully placing the reader back in a time when it was easier to envisage magic (in the form of highly advanced technology) being taken for granted, and when the tools of modern scientific investigation were unavailable, Rowling manages to avoid the kinds of awkward scientific questions that would ruin the effect.

Thus Rowling manages to avoid the science dilemma altogether by creating in Hogwarts what seems to be a purely 'know how' world. This enables her to let magic be the technology that drives the stories forward.

POST SCRIPT: John Edwards declares his candidacy

I tend to be a bit cynical about politicians from mainstream parties because both parties are pro-war and pro-business but John Edwards, who announced his candidacy for the Democratic nomination in 2008, seems like a cut above the rest. In his announcement he said some encouraging things.

He pledged to "reduce the U.S. troop presence in Iraq, combat poverty and global warming" and "he favored rolling back some of the tax cuts provided to wealthy Americans under President Bush as well as enacting new taxes on the profits of oil companies." He also wants to guarantee universal health care for everyone.

He said that his 2002 vote to endorse the invasion of Iraq was a mistake and that "We need to reject this McCain doctrine of surging troops and escalating the war in Iraq. . .We need to make clear we're going to leave and we need to start leaving Iraq."

The issues he highlighted include "restoring the nation's moral leadership around the globe, beginning in Iraq with a drawdown of troops; strengthening the middle class and "ending the shame of poverty"; guaranteeing health care for every American; fighting global warming; and ending what he called America's addiction to oil."

That's not a bad platform on which to run. Here is his campaign website and below is a preview of his announcement.

If he gets the nomination and persuades Russ Feingold to be his running mate, that would be a ticket with real promise.

December 28, 2006

The problem with parallel worlds

(Due to the holidays, I will be taking a break from blogging. Instead, I will be re-posting some of my more light-hearted essays, this week dealing with the Harry Potter books. New posts will begin on Wednesday, January 3, 2007.)

Fantasy writers like J. K. Rowling who want to interweave the magical with the ordinary face some serious challenges. As long as you stay purely within the world of magic at Hogwarts, you can create a self-contained world obeying its own rules. But there is clearly some added drama that accrues when you can contrast that world with the world we live in, because that helps readers to identify more with the characters. Having wizards live among Muggles opens up plenty of opportunities for both comedy and dramatic situations. It also enables us to imagine ourselves in the story, to think that there might be a parallel world that we get glimpses of but do not recognize because we do not know what to look for. Maybe our neighbors are witches and we don't know it.

The situation faced by authors like Rowling in coming up with a realistic scenario that convincingly weaves the magic and ordinary worlds is not unlike the problem facing religious people who believe in a parallel world occupied by god, heaven, angels, etc. For this parallel religious world to have any tangible consequences for people in the normal world, the two worlds must overlap at least at a few points. But how can you make the intersections consistent? How can god, who presumably exists in the parallel universe, intervene in the natural world and yet remain undetected? In a previous posting, I discussed the difficult questions that need to be addressed in making these connections fit into a coherent worldview.

In Rowling's world, one connecting point between the magical and normal worlds is the pub The Leaky Cauldron whose front door opens onto the normal world and whose back has a gate that opens onto Diagon Alley, a parallel magical world. Another connecting point is at Kings Cross railway station where the brick wall between platforms nine and ten is a secret doorway onto platform 9 ¾, where the students catch the train to Hogwarts. A third is the house at 12 Grimmauld Place, and so on.

But this plot device of having gateways connecting the two worlds, while amusing, creates problems if you try to analyze it too closely. (This is the curse of many, many years of scientific training, coupled with a determinedly rationalistic worldview. It makes me want to closely analyze everything, even fiction, for internal logical consistency.)

For example, although platform 9 ¾ is hidden from the Muggles in some kind of parallel world, the train to Hogwarts somehow seems to get back into the real world on its way to Hogwarts because it travels through the English countryside. I initially thought that this countryside might also be in the parallel world, except that in one book Ron and Harry catch up with the train in their flying car, and they started off in the normal world. In another book we are told that Hogwarts is also in the Muggle world but that it is charmed so that Muggles only see what looks like a ruined castle. We also see owls carrying mail between Hogwarts and the normal world. So clearly there must be many boundaries between the magic and Muggle worlds. What happens when people and owls cross these other boundaries?

When I read the books, such questions are for me just idle curiosity. I like to see how the author deals with these questions but the lack of logical consistency does not really bother me or take anything away from my enjoyment of the books. Rowling is not sloppy. She respects her readers' intelligence, and she gives the reader enough of a rationale for believing in her two-worlds model that we can be taken along for the ride. The logical inconsistencies she glosses over are, I think, unavoidable consequences of trying to create this kind of parallel universe model, not unlike those encountered by science fiction writers striving for plausibility. To her credit, she is skilful enough to provide enough plausibility so that the reader is not troubled (or even notices) unless he or she (like me) is actually looking for problems.

But the problems Rowling faces in constructing a two-worlds model that is logically consistent are similar to those faced by people who want to believe in a spiritual world that exists in parallel with the physical world. Since Rowling is writing a work of fiction and nothing of importance rides on whether we accept the inconsistencies or not, we can just close our eyes to these minor flaws and enjoy the books.

But the same cannot be said for the similar problems that confront the two-worlds models that underlie most religious beliefs that have a god, because we are now not dealing with fiction but presumably real life. And being able to construct a two-worlds model (with gateways between the spiritual and physical worlds) that is logically consistent is important because it may determine whether people believe or disbelieve in a god.

It was my own inability to do this that finally convinced me to become an atheist.

POST SCRIPT: Going to church

Homer Simpson makes the case for not doing so.

October 24, 2006

Emotional reactions to Darwin

There is no doubt that Darwin's ideas about evolution by natural selection carry a huge emotional impact. For many people the idea that "we are descended from apes" is too awful to contemplate and is sufficient reason alone to dismiss any claim that natural selection holds the key to understanding how we came about. (Of course, we are not descended from apes. The more accurate statement is that apes and humans share common ancestors, making them our cousins, but even this refinement does not take away the stigma that supposedly comes with being biologically related to animals such people consider inferior.)

This unease about being biologically linked to other species is widespread and transcends any particular religious tradition. In Sri Lankan rural areas, one would frequently see monkeys on trees by the side of the road. As children when we were passing them, almost invariably someone would point them out and say things like "Your relatives have come to see you." Similarly, if one said that one was going to visit the zoo, this would also result in the question as to whether one was going to visit one's relatives. This kind of humor among children was commonplace, and reflected a reflexive instinct that humans were superior to all other animal forms, and reinforced the belief that some sort of special creative process must have been at work to produce us.

But if the thought of being related to apes gives some people the creeps, imagine how much worse it must be for them to realize that as we go farther back in evolutionary time, we are cousins to all sorts of life forms that might make people even more squeamish.

Reading Richard Dawkins' book The Ancestor's Tale (2004), I found that I myself was not immune from that kind of emotional reaction, even though I have no problems intellectually with accepting natural selection and all its consequences.

For example, I had little difficulty emotionally accepting that the apes and monkeys are my cousins, partly because, I suppose, that idea has been around for a long time and I have simply got used to it. Also a common ancestor to the humans and apes would not look very different from us now and is easier to envisage. But as the evolutionary clock went back in time, and I started imagining what my deep ancestors looked like, I had a variety of reactions.

The idea that I had common ancestors with dogs and cats and horses (those evolutionary branches separated from the human branch about 85 million years ago (Mya)) did not cause me any problems. I kind of liked the idea that my dog Baxter and I can trace our separate lineages back to a time when we both had a common ancestor. It is clear that our common ancestor would not look much like present-day humans or dogs, but I cannot imagine what it might have looked like apart from having some of the common characteristics shared by dogs and humans, like being four-limbed, warm-blooded vertebrates.

More annoying was the realization that the branch that led to the rodents and rabbits (rats, squirrels, and their kin) only separated from the human branch at 75 Mya, meaning that those animals that we consider vermin and would not think of having in our houses actually have a closer relationship to humans (since our common ancestors lived more recently) than those whom we love and welcome into our homes as pets, like dogs and cats.

Somehow, the emotional reaction of finding oneself having common ancestors with dignified and majestic animals like whales (85 Mya) and elephants (105 Mya) is positive, while being linked to things like snakes (at 310 Mya) feels kind of icky.

A hard bridge to cross (again I mean emotionally) was accepting that frogs and toads and salamanders shared a common ancestor with me at about 340 Mya, perhaps because I share the common perception that these animals are slimy.

Going back further, I had little negative emotional reaction to realizing that I had a common ancestor with sharks at 460 Mya, but the thought that flatworms and I were related at 630 Mya was harder to take. I suppose that this is because sharks are usually perceived as admirable and graceful (if dangerous) animals while I have never liked worms, seeing them as somehow disgusting. Perhaps I will now have warmer feelings towards them, seeing that we are relatives.

Once I got over the emotional hurdle of being able to accept the fact that worms and I have common ancestors, the rest was pretty easy to accept, perhaps because the earlier life forms that our common ancestors took had to be so different that I could not really relate to them (let alone imagine them) in any way. Thus it was a breeze to accept that I am related (however distantly) with sponges, bread moulds, amoeba, and bacteria.

It was amusing to keep monitoring my emotional reactions as I read about the backward evolutionary journey. But like most difficult journeys, taking that first step is the hardest. And now I have a better understanding of why many religious people simply cannot take that first step and acknowledge that chimpanzees are our cousins, in fact the closest cousins we have in the animal kingdom, with our common ancestor living just 6 million years ago. Because once you accept that, then you have embarked on a journey whose inevitable end is that you end up as one with a bacterium. It is hard to think of yourself as being created in god's image after that.

Thus I am somewhat sympathetic to those people who find Darwin's ideas hard to stomach and desperately seek to find a more palatable alternative. However, I think their task will prove hopeless, since the basic tenets of evolution are here to stay and so we may as well get used to it.

POST SCRIPT: Ken Miller talk

Ken Miller, biologist at Brown University, expert witness at the Dover, PA "Panda Trial," and author of the book Finding Darwin's God will explain why every college student must vote.


"Trick My Vote: Science, Intellectual 
Courage, and the Battle for America's Soul"
Ford Auditorium, Allen Memorial Medical Library
11000 Euclid Avenue, Cleveland
11:30 a.m. -1:00 p.m.

The talk is free and open to the public.

Call (216) 368-8961 for more information.

I have heard Miller before and he is a very good public speaker. He is a practicing Christian (Catholic), a staunch defender of the theory of evolution by natural selection, and an opponent of efforts to include intelligent design creationism in science curricula.

October 13, 2006

Looking for deep ancestors

Richard Dawkins in his book The Ancestor's Tale (2004) tells a fascinating story. He models his book on a journey that, rather than moving through space to a particular destination, is moving in the temporal dimension, going steadily back in time. He calls it a "pilgrimage to the dawn of evolution." He starts with present day humans and follows them back into history. One reason he gives for going back in time instead of starting at the beginning and going forwards as is more commonly done is to avoid a common trap of perception. When you tell the story forwards, it is hard to avoid giving the impression that life evolved purposefully, that human beings were somehow destined to be. This is counter to evolutionary theory that says that evolution is not directed towards any goal. It tells us how the present emerged from the past. It does not tell us how the future will emerge from the present.

Dawkins points out that another advantage of telling the story backwards is that you can choose any of the current species and go back in time and tell pretty much the same story.

As I have mentioned earlier, we quickly (in just 2,000 years) reach the time when the most recent common ancestor lived and soon after that (about 5,000 years ago) reach a point when all our ancestors were identical.

But this convergence of ancestry is not just for humans, it is for all species. If we go far enough back in time, even my dog Baxter and I share the same ancestor, which I find a very appealing notion.

Anyway, here is a concise summary of the landmarks on this pilgrimage back in time, along with some other milestones.

The agricultural revolution began about 10,000 years ago, and the beginnings of language go back about 12,000 years. About 160,000 years ago saw the beginning of what we would consider modern humans, and beyond that we start reaching the precursors to modern humans, a famous milestone being the fossil Lucy, dated to 3.2 Mya (million years ago).

As we go further back in time in this pilgrimage, other species start 'joining us' in our journey. What this means is that we reach times at which an earlier species existed which then split into two branches and diverged evolutionarily to what we see now. So if we go back further in time, we should cease to view the pilgrims on the journey as a combined group of humans and other species but instead see the travelers as that earlier common ancestor species. He calls these common ancestors 'concestors'. (Concestor 0 in Dawkins' scheme is the most recent common ancestor of all humans (or MRCA) that I have discussed earlier and who lived just a few thousand years ago.)

Going back in time, at 6 Mya we meet concestor 1, when we join up with the ancestors of chimpanzees. As we go even further back, we (and when I say 'we', I remind you that we should not think of 'us' as humans at this point but as the common ancestor species of humans and chimpanzees) join up successively with gorillas, orang utans, gibbons, and finally, at about 40 Mya, monkeys. Remember that the 'pilgrims' look different as we pass each concestor point.

Concestor 8 occurs at about 63 Mya when we join up with mammals like lemurs and lorises. (Just prior to this, around 65 Mya, was when all the dinosaurs went extinct.) As you can imagine, concestor 8 would not look much like present-day humans at all.

About 75 Mya, we join up with the rodents and rabbits (concestor 10), at 85 Mya with cats and dogs (concestor 11), at 105 Mya with elephants and manatees (concestor 13), and at 310 Mya with snakes and chickens (concestor 16).

At 340 Mya, we make a big transition when we join up with the ancestors of amphibians, such as frogs and salamanders (concestor 17). This point marks the first time that animals moved out of the water.

Around 440 Mya we join up with various kinds of fish (concestor 20), and around 630 Mya with flatworms (concestor 27).

After various other species' ancestors join ours, the next big rendezvous occurs at about 1,100 Mya when we join up with the ancestors of fungi, such as bread molds and truffles (concestor 34).

Some time earlier than that (passing the connection with amoebas at concestor 35), but before 1,300 Mya (it is hard to pin down the date), comes the next major transition, when we join up with green plants and algae. This common ancestor is concestor 36.

At about 2,000 Mya we arrive at concestor 38 where every species is now represented by a eukaryotic (nucleated) cell.

At about 3,500 Mya we meet up with our earliest ancestors, the eubacteria (concestor 39), the original form of life.
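For convenience, here are the rendezvous points I have quoted above gathered into one little Python list (only the ones with explicit dates and concestor numbers; the intermediate joinings with gorillas, orang utans, gibbons, and monkeys by about 40 Mya are left out):

# Rendezvous points from the pilgrimage described above.
# Mya = millions of years ago, using the rough figures quoted in the text.
concestors = [
    (1,  6,    "chimpanzees"),
    (8,  63,   "lemurs and lorises"),
    (10, 75,   "rodents and rabbits"),
    (11, 85,   "cats and dogs"),
    (13, 105,  "elephants and manatees"),
    (16, 310,  "snakes and chickens"),
    (17, 340,  "amphibians (frogs, salamanders)"),
    (20, 440,  "fish"),
    (27, 630,  "flatworms"),
    (34, 1100, "fungi (bread molds, truffles)"),
    (36, 1300, "green plants and algae (date hard to pin down)"),
    (38, 2000, "all species represented by eukaryotic cells"),
    (39, 3500, "eubacteria, the original form of life"),
]

for number, mya, who in concestors:
    print(f"concestor {number:2d} (about {mya:,} Mya): {who}")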

Dawkins' reverse story can be seen visually, told in a beer commercial in 50 seconds flat to the pounding beat of Sammy Davis Jr. singing The Rhythm of Life. (A minor quibble: There is one way in which this fun visual representation is not accurate. It shows three humans going back in evolution until we join up with ancestors of the present-day amphibians (concestor 17) in identical parallel paths. This is ruled out by the reductio ad absurdum argument written about earlier, where it was established that all present day humans must have had a single common ancestor in any earlier species.)

I must say that this book was an exhilarating journey. To see the whole of the evolution of life going backwards and merging together was a nice new way of seeing the process. Those of you who are interested in a non-specialist account of the grand sweep of evolution will find Dawkins' book a great resource.

POST SCRIPT: The Boxer

A live performance of Simon and Garfunkel singing one of my all-time favorite songs The Boxer

October 12, 2006

My ancestor Pharaoh Narmer again

I began this series of posts saying that I had discovered that there was an 80% chance that I was descended from Narmer, the first pharaoh of a united Egypt. As subsequent posts have indicated, I arrived at this not by any detailed investigative work in tracking my lineage, but by depending upon the analysis of Douglas L. T. Rohde, Steve Olson, and Joseph T. Chang published in the journal Nature.

After reading that paper, I became curious about who lived around the time of the identical ancestors and looked around to see if there was a named individual. I knew that writing was invented around 5,000 years ago, so the time of the IA (identical ancestors) coincided roughly with the time that written records were starting to be kept. So there was a chance that there was a reliable contemporaneous written record of some person from the time of the IA. The chances were also great that the person whose life was recorded was likely to have been a big shot, a king or some such, whom people considered important enough to write about, on tombs and so forth.

I started investigating about who was the earliest named person we knew for sure existed. This ruled out characters from religious books like the Bible because those were written much more recently (around 900 BCE and later) and depended too much on legends and oral traditions that made them unreliable as history.

Marc Abramiuk of the Anthropology department at Case Western Reserve University suggested Narmer as a likely candidate for the honor of being the earliest known and named human being, and since he fitted into the IA period, I claimed that there is an 80% chance that he is my direct ancestor. (If any of you know of other named people who are candidates for the earliest known and recorded human being, please let me know. This is one genealogy search we can all contribute to, since every person we find from that time is likely to be the ancestor of all of us.)

Of course, there is no distinction in the claim that Narmer is my ancestor, since if that is the case, then he is also the ancestor of every other person currently alive. But that's fine by me. I don't want or need exclusive rights to him since having a famous ancestor confers no credit on me. Thinking that we are special simply because we belong to some particular group or are related to some particular individual is a symptom of tribal thinking.

Since I started on this study, I have become curious about the people who lived long ago and a bit surprised at how soon the trail goes cold. The origins of written language pretty much set the upper bound for reliable knowledge. If you think about it, given the vast age of the Earth and of the human beings who inhabit it, it is humbling to think of how little direct information we have about our origins in terms of actual historical figures and recorded history, and how amazing it is that we have been able to figure out so much about the deep past using the tools of research and analysis.

This is the power of science, that we can use it to painstakingly reconstruct so much of our distant past by building carefully on what we know from the fields of archaeology, anthropology, biology, chemistry, physics, and mathematics. That interconnected web of knowledge serves as a filter that allows a lot of the guesswork and speculation and myths about our past to drain away, and leaves behind precious nuggets of hard knowledge.

In the next post in this evolution series, I will look at what we find when we go even further back in time.

October 11, 2006

Sexual selection

In a previous post, I discussed the fact that although all of us have the identical set of ancestors who lived just 5,000 years ago, this does not mean that we all have the same genes. We differ because, if most of the mating occurs within a group, certain features can become emphasized within that group. In extreme cases, this initially isolated mating pattern can result in a new species being formed that can no longer mate with other groups that it once could have.

I had always thought that the two organisms belonged to different species if they were biologically different enough that they either could not produce offspring or, as in the case of mules produced by horses and donkeys, the offspring were infertile and thus not able to reproduce.

But I learned from Richard Dawkins' book The Ancestor's Tale (2004) that two things can be considered different species even if they are perfectly capable of producing fertile offspring. All that is required for them to be considered to be different species is that they are not found to mate in the wild for whatever reason.

Normally, this happens when there is some kind of barrier that separates two groups of the same species so that they cannot mate. "No longer able to interbreed, the two populations drift apart, or are pushed apart by natural selection in different evolutionary directions" (p. 339) and thus over time evolve into different species. But the separation can also occur due to sexual selection.

He gives a fascinating example of this on page 339. He describes experiments done with two species of cichlid fish. The two species live together in Lake Victoria in Africa and are very similar, except that one has a reddish color and the other bluish. Under normal conditions, females choose males of the same color. In other words, there was no hybridization between the two colors in the wild, thus meeting the requirements for being considered different species. But when experimenters lit the fish in artificial monochromatic light so that they all looked dirty brown, the females no longer discriminated among the males and mated equally with both kinds of males and the offspring of these hybrids were fully fertile.

He also describes ring speciation using the example of the herring gull and lesser black-backed gull (p. 302). In Britain, these two kinds of birds don't hybridize even though they meet and even breed alongside one another in mixed colonies. Thus they are considered different species.

But he goes on to say:

If you follow the population of herring gulls westward to North America, then on around the world across Siberia and back to Europe again, you notice a curious fact. The 'herring gulls', as you move around the pole, gradually become less and less like herring gulls and more and more like lesser black-backed gulls, until it turns out that our Western European lesser black-backed gulls actually are the other end of a ring-shaped continuum which started with herring gulls. At every stage around the ring, the birds are sufficiently similar to their immediate neighbors in the ring to interbreed with them. Until, that is, the ends of the continuum are reached, and the ring bites itself in the tail. The herring gull and the lesser black-backed gull in Europe never interbreed, although they are linked by a continuous series of interbreeding colleagues all the way around the other side of the world.

Dawkins gives a similar example of this kind of ring speciation with salamanders in the Central Valley of California.

Why is this interesting? Because it addresses a point that sometimes comes up with skeptics of evolution. They try to argue that there is a contradiction in the idea that we evolved from an ancestor species so different from us that we could not interbreed with it. Surely, the argument goes, doesn't speciation imply that if species A evolves into species B, there must be a time when the child is of species B while the parent is of species A? And isn't that a ridiculous notion?

The herring gulls and the salamanders are counterexamples in space (which we can directly see now) that correspond to the counterargument in time (which we can only infer). What it says is that as descendants are produced, they form a continuum in time. Each generation, while differing slightly, can interbreed with the previous generation, but over a long enough period of time, the two end points of the time continuum need not be able to interbreed.

Thus it is possible for an organism to be intermediate between two species.

Coming back to the question of why we look so different if we all shared common ancestors so recently, it is likely that the kind of selectivity practiced by the cichlid fish has resulted in certain features being shared by groups that interbreed within a restricted domain bounded by distance and geography and culture, although the process has not become so extreme that we have formed into distinct species.

I apologize for boring those readers who have had a much more extensive biology education than I have, because all these things that I have been writing about recently on evolution must be well known to them. But I find all this perfectly fascinating and novel.

POST SCRIPT: Amy Goodman in Cleveland

Award-winning journalist Amy Goodman, host of the daily, grassroots, global, radio/TV news hour Democracy Now!, is on a national speaking tour to mark DN!'s 10th anniversary and launch her second book with journalist David Goodman, Static: Government Liars, Media Cheerleaders, and the People Who Fight Back.

WHEN: Saturday, October 14th, 7:00-8:30 PM
WHERE: Student Center,
John Carroll University,
20700 N. Park Blvd (University Heights), Cleveland, OH
DESCRIPTION: Amy Goodman speaks at a free event at the Student Center. Book signing to follow. Members of Iraq Veterans Against the War will give a brief presentation before the talk, as part of their collaboration with the Uprise Tour.
TICKETS: Free
MORE INFO: See here for directions.

October 10, 2006

Why we look different despite having identical ancestors

In the previous post in this series, I reported on a paper by Douglas L. T. Rohde, Steve Olson, and Joseph T. Chang and published in the journal Nature that said that if we go back about 5,000 years, the ancestors of everyone on Earth today are exactly the same. This date is called the IA point, where IA stands for 'identical ancestors'.

One question that will immediately arise in people's minds is that if all our identical ancestors lived so recently, how is it that we look so different? If you take four people from China, Sri Lanka, Sweden, and Malawi, they are usually fairly easily distinguishable based on physical appearance alone, using features such as skin color, hair, facial features, etc. How could this happen if they all had identical ancestors as recently as 5,000 years ago?

The answer lies in the fact that while it is true that we all share the same ancestors, it does not mean that we all received the same genetic information from that common ancestral pool.

It is true that each of us gets exactly half our genes from our fathers and half from our mothers. But when we pass on our genes to our children, while each child gets exactly half from each parent, that does not imply that they get exactly one quarter from each grandparent. What is true is that on average each child gets one quarter of the genes from each grandparent.

The reason for this is because when a sperm or egg is formed, the genetic information (say in the egg formed in the mother) that goes into it undergoes a process of recombination in which the genes the mother obtained from her parents get mixed up before the transfer into the egg. It is thus theoretically possible, though unlikely, that a child will have zero genetic information from one of her four grandparents.

Furthermore, as we go down to the next generation, the average genetic information received by a child is now just one-eighth from any given great-grandparent. After many generations, even the average contribution of someone to each descendant approaches zero and it is not hard to imagine that some ancestors will have descendants who inherited none of their genetic information. In fact, as Rohde, Olson, and Chang say, "because DNA is inherited in relatively large segments from ancestors, an individual will receive little or no actual genetic inheritance from the vast majority of the ancestors living at the IA point."

Furthermore, "In generations sufficiently far removed from the present, some ancestors appear much more often than do others on any current individual’s family tree, and can therefore be expected to contribute proportionately more to his or her genetic inheritance. For example, a present-day Norwegian generally owes the majority of his or her ancestry to people living in northern Europe at the IA point, and a very small portion to people living throughout the rest of the world."

So even though we all have the same set of ancestors, the amount of genetic information received from any one ancestor will vary wildly from person to person.
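To get a feel for how quickly an individual ancestor's contribution washes out, here is a little toy simulation in Python that I put together myself; it is not anything from the paper. It treats the genome as a fixed number of independent segments, lets each segment follow a randomly chosen parental line at every generation going back, and counts how many of the 2^k ancestors at depth k contributed at least one segment. Real chromosomes are inherited in linked blocks, so the numbers are only illustrative, but the pattern is the point: a dozen or so generations back, most of your ancestors have contributed nothing at all.

import random

def contributing_ancestors(generations, segments=400):
    # Toy model: each of `segments` independent pieces of the genome is
    # traced back, picking the maternal or paternal line at random at
    # every step.  (400 is just a convenient round number, not a real
    # genomic figure.)  Returns how many of the 2**generations ancestors
    # at that depth passed down at least one piece.
    contributors = set()
    for _ in range(segments):
        lineage = 0
        for _ in range(generations):
            lineage = (lineage << 1) | random.randint(0, 1)
        contributors.add(lineage)
    return len(contributors), 2 ** generations

for g in (3, 6, 10, 15, 20):
    got, total = contributing_ancestors(g)
    print(f"{g:2d} generations back: {got:,} of {total:,} ancestors contributed DNA")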

As long as populations remained largely isolated, they could thus evolve different physical characteristics, although even a tiny amount of migration between populations is enough to create the early common dates of the MRCA (most recent common ancestor) and IA.

There are some factors that could shift those dates back further.

If a group of humans were completely isolated, then no mixing could occur between that group and others, and the MRCA would have to have lived before the start of the isolation. A more recent MRCA would not arise until the groups were once again well integrated. In the case of Tasmania, which may have been completely isolated from mainland Australia between the flooding of the Bass Strait, 9,000–12,000 years ago, and the European colonization of the island, starting in 1803, the IA date for all living humans must fall before the start of isolation. However, the MRCA date would be unaffected, because today there are no remaining native Tasmanians without some European or mainland Australian ancestry.

No large group is known to have maintained complete reproductive isolation for extended periods.

It seems to me that these results arguing for the fact that our most recent common ancestor lived about 2,000 years ago and that we all have the same common ancestors who lived just 5,000 years ago are pretty robust.

This has profound implications for origins myths and tribalism. Some people like to have a sense of racial pride by thinking that they represent 'pure' races. This research argues that this view is rubbish. None of us are 'pure'. We are all cousins, and fairly close ones at that.

October 02, 2006

Realistic calculation of the date of our most recent common ancestor

In the previous posting, I discussed the calculation of Joseph T. Chang in which he showed that the most recent common ancestor (MRCA) of all the people living today lived around 1100 CE, while around 400 CE everyone who lived then was either the ancestor of all of us or none of us. The date when this occurs is called the IA (identical ancestor) date.

Chang got these results assuming that the population is constant over time at some value N, that the generations (with each generation lasting 30 years) are discrete and non-overlapping (i.e. mating took place only between males and females of the same generation), and that mating was random (i.e., there was equal probability of any one male in a generation to breed with any female of that same generation.)

What happens to these dates if you relax these unrealistic assumptions? One practical difficulty of going to more realistic models is that exact mathematical calculations become impossible and one has to resort to computer simulations. This was done by Douglas L. T. Rohde, Steve Olson, and Joseph T. Chang and their results were published in the journal Nature (vol. 431, September 30, 2004, pages 562-566).

As a first improvement, they divided the world into ten population centers (or 'nodes'): one each in North America, South America, Greenland, Australia, the Pacific Islands, and the Indonesian archipelago, and two nodes each in Africa and Asia. Within each subpopulation, they assumed random mating, but allowed neighboring populations to exchange just one pair of migrants per generation. Their computer models found that the best way to accommodate varying populations was to take a fixed value N equal to the population at the time of the MRCA. They assumed N to be 250 million, which was approximately the global population in the year 1 CE.

Using this more realistic model, and a generation span of 30 years, they obtained an MRCA date of about 300 BCE and an IA date of about 3,000 BCE, both still surprisingly recent.
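For readers who like to tinker, here is a drastically simplified Python sketch of this kind of simulation. It is my own illustration, not the authors' code: a few randomly mating subpopulations arranged in a ring, with roughly one migrant mating per node per generation, traced backwards until an MRCA and then an IA point appear. Even this toy version shows the qualitative result, namely that a trickle of migration is enough to keep the MRCA and IA dates surprisingly recent.

import random

def simulate(nodes=4, node_size=300, migrant_rate=1.0, seed=2, max_gen=5000):
    # Toy backwards-in-time model: `nodes` subpopulations in a ring, random
    # mating within a node, and on average `migrant_rate` children per node
    # per generation whose parents come from a neighboring node.  All the
    # parameter values are my own choices, picked only for illustration.
    random.seed(seed)
    n = nodes * node_size
    everyone = (1 << n) - 1
    members = [list(range(k * node_size, (k + 1) * node_size)) for k in range(nodes)]
    # desc[i] = bitmask of present-day individuals descended from person i
    desc = [1 << i for i in range(n)]
    mrca = None
    for gen in range(1, max_gen + 1):
        new_desc = [0] * n
        for person in range(n):
            home = person // node_size
            if random.random() < migrant_rate / node_size:
                home = (home + random.choice((-1, 1))) % nodes
            for _ in range(2):                       # pick two parents
                parent = random.choice(members[home])
                new_desc[parent] |= desc[person]
        desc = new_desc
        if mrca is None and any(d == everyone for d in desc):
            mrca = gen
        if all(d == 0 or d == everyone for d in desc):
            return mrca, gen                         # MRCA and IA generations
    return mrca, None

mrca, ia = simulate()
print("MRCA about", mrca, "generations back; IA about", ia, "generations back")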

They then constructed an even more sophisticated and realistic model. They broke up the inhabited area into three levels of substructure: continents, countries, and towns. (These were not real places, of course, just models, but they used our knowledge of geography and migrations routes that existed before 1,500 CE to create their models.)

The model allowed for each person to have a single opportunity to migrate from his or her town of birth. Within a country, they could migrate to any other town. If the migrants went to another country, the probability of that occurring decreased with the distance to the new country. To go to another continent required them to go through certain ports, and so on. The model also incorporated our knowledge of the size of ports and when they opened up.

Generations could also overlap in this model and the birth rate of each continent was adjusted to match historical estimates.

After making all these sophisticated adjustments to make their model more realistic, they arrived at what they felt was a reasonable estimate for the MRCA and IA dates. It turns out that the MRCA lived around 55 CE and the IA date is about 2,000 BCE. They also found that our most recent common ancestor probably lived in eastern Asia, not Africa as had been commonly supposed.

So despite going to considerable lengths to simulate a realistic pattern of population growth, mating, and migration, the dates arrived at for the MRCA and the IA are still surprisingly recent.

(Even when the authors of the paper made their parameters very conservative, the date for the MRCA was pushed back only as far as 1,415 BCE and the IA date to 5,353 BCE.)

A little reflection should persuade anyone that this result that our most recent common ancestor lived as late as 55 CE and in just 2,000 BCE we had identical ancestors has profound implications for the way we view ourselves and our relationship with others. The authors capture the wonder of it all when they end their paper with the following comment:

[O]ur findings suggest a remarkable proposition: no matter the languages we speak or the colour of our skin, we share ancestors who planted rice on the banks of the Yangtze, who first domesticated horses on the steppes of the Ukraine, who hunted giant sloths in the forests of North and South America, and who laboured to build the Great Pyramid of Khufu.

I find this amazing and remarkably encouraging. It should be more widely known. If more people realized how close we are to each other, perhaps we would stop killing one another and treat each other like the fairly close relatives we truly are.

September 29, 2006

The most recent common ancestor of all humans living today

In order to find the date of the most recent common ancestor (MRCA) of all the people living today, Chang started out by constructing a simple mathematical model of population mixing. (See here for some background to this post.)

He assumed that the population is constant over time at some value N. He assumed that the generations are discrete and non-overlapping (i.e. mating took place only between males and females of the same generation). He also assumed that mating was random; in other words, that any male in a generation was equally likely to breed with any female of that same generation.

Of course, none of these assumptions is realistic. The size of a population changes with time for a variety of reasons. People also do not mate at random, being more likely to choose from those nearby, and from people within their same groupings whether those be economic, social, cultural, class, religion, etc. And cross-generational matings are not uncommon.

But for the purposes of mathematical simplicity, and to get a rough idea of the timescales involved, Chang's simple model is worth looking at because it enables him to do a rigorous mathematical calculation for the date of the MRCA. What Chang found, to everyone's surprise, was that the date of existence of the MRCA of all the humans living today was very recent. He found that the number of generations that one has to go back to get an MRCA is log2(N), the logarithm to base 2 of the population size N. He further found that even though this was a statistical calculation, the result was very sharply peaked about this value, meaning that it was highly unlikely that the MRCA date would differ by even 1% from it.

If you take a population N of size one million, the number of generations you have to go back is only 20 to get to our MRCA. If you take a population of one billion, our MRCA existed about 30 generations ago, or around 1100 CE (for an average generation span of 30 years).
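The arithmetic is easy to check with a couple of lines of Python (I am taking roughly the year 2005 as 'the present', which shifts the answer by only a handful of years):

import math

# Chang's result: the MRCA lies about log2(N) generations back.
for n in (10**6, 10**9):
    print(f"N = {n:,}: about {round(math.log2(n))} generations")

# For N = one billion and 30-year generations:
print("MRCA year:", round(2005 - 30 * math.log2(10**9)), "CE")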

So in Chang's model, our MRCA lived far more recently than anyone had imagined, and far more recently than Mitochondrial Eve (~140,000 years ago) or Homo erectus (~250,000 to one million years ago). It is kind of fascinating to think that every one of us living today shares at least one ancestor who was living in the Middle Ages. I have been wondering who that person was, and where he or she lived, and what he or she was like.

But that was not the only surprising thing that Chang found. Once you get an MRCA, then that person's parents are also common ancestors of all of us, as are his/her grandparents and great-grandparents, and so on. In fact, just as the number of our ancestors increases rapidly as we go back generations, so does the number of our common ancestors once we go further back than our MRCA.

Chang found that if you go far enough back, you reach a point when every single person living at that time is either the ancestor of all of us or none of us (i.e., that person's line went extinct). In other words, there is no one who lived at that time who is the ancestor of just some of us. It is an all-or-nothing situation with an 80% chance of the former and 20% chance of the latter. To be perfectly clear about this (because it is an important point), at one particular time in the past, 20% of the people who lived at that time have no descendants alive today. Each one of the remaining 80% of the people has the entire world's population today as descendants.

So all of us have the identical entire set of ancestors who lived at that time. Chang calls that time the IA (standing for 'identical ancestors') time.

Using the same assumptions as before, Chang calculated the number of generations needed to reach the IA date as 1.77 log2(N), which for a billion people amounts to about 53 generations ago. This works out to 675 CE for a generation span of 25 years and 410 CE for 30 years.
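The same quick check works for the IA date (again with roughly 2005 as the present; the results land within a few years of the figures above):

import math

generations = 1.77 * math.log2(10**9)            # about 53 generations
print("generations to the IA point:", round(generations))
print("IA year (25-year generations):", round(2005 - 25 * generations), "CE")
print("IA year (30-year generations):", round(2005 - 30 * generations), "CE")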

It seems amazing (to me at least) that all of us living right now have identical ancestors who lived so recently, roughly around the period when the Prophet Muhammad lived (570-632 CE). In fact Mark Humphrys, a professor of computer science at Dublin City University in Ireland, using a different technique, estimates that "Muhammad, the founder of Islam, appears on the family tree of every person in the Western world." (Thanks to commenter Steve Lubot for this link.) But it is important to realize that there is nothing special about Muhammad or about the Western world.

So taking Chang's results at face value, all the people who fight over religion today are highly likely to be descendants of each and every religious leader who lived from the time of the Prophet Muhammad and earlier. So in a very real sense, they are killing their own cousins.

Of course, Chang's results were based on a highly simplified mathematical model. In the next posting in this series, we'll see what happens when we create more realistic scenarios of population changes and mating patterns.

POST SCRIPT: Clouds

Flying to Los Angeles last week, I saw some beautiful cloud formations from above. But none of them matched the beauty of those shown here.

September 28, 2006

Some surprising facts about ancestors

In 1999, Joseph T. Chang published a very interesting paper in the journal Advances in Applied Probability (vol. 31, pages 1002-1026) titled Recent Common Ancestors of all Present-Day Individuals. To understand the paper, it helps to reflect a little on the mathematics of genealogy.

One rock-solid fact of ancestry is that every person has two, and only two, biological parents. They in turn each have two parents so going back two generations gives a person four ancestors. If you go back three generations, you have eight ancestors and so on. Each generation that you go back doubles the number of ancestors in the previous generation.

We all know that this kind of geometric progression results in one reaching very large numbers very soon and by thirty generations, the number of ancestors one has acquired has ballooned to over one billion. In forty generations, we have over one trillion ancestors.

Conservatively allowing for each generation to span 30 years (which is a little large), going back thirty generations takes us back to about 1100 CE where the population was only about 300 million, and forty generations takes us back to 800 CE where the population was less than 200 million. (If we take each generation as averaging 25 years, 30 generations takes us back to 1250 CE when the population was 350 million and in forty generations we reach 1000 CE where the population was 200 million.)
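The doubling is easy to see in a couple of lines of Python; the point is that the number of ancestral 'slots' overtakes any historical world population within about thirty generations:

# Number of ancestral slots g generations back is 2**g.
# (Slots, not necessarily distinct people, which is the whole point.)
for g in (10, 20, 30, 40):
    print(f"{g} generations back: {2**g:,} ancestral slots")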

Having more ancestors than the total population leads to the clear conclusion (which is not that surprising once one thinks about it) that all our ancestors cannot have been distinct individuals but must have been shared. In other words, my great-great-great-grandfather on my father's side had to be the same person as my great-great-grandfather on my mother's side, or something like that.

But the interesting point is that each one of us has over a trillion ancestors in just forty generations, which must mean that you, the reader, and I must have some shared ancestors, unless the huge population of your ancestors were entirely isolated from the huge population of my ancestors, with no mixing at all between them. Given the large numbers of ancestors involved, this kind of isolation seems highly unlikely unless there was some major geographical barrier separating the populations. We know that this is not the case, since by 1000 CE, people were able to travel pretty much all over the inhabited world, and all you need is just one person from my group of ancestors mating with one person from your group of ancestors to break the isolation, because then the ancestors of that pair are shared by both of us.

So if you and I (as just two people) share common ancestors, then we can see that if we go back far enough in time, all of us living on the world today should share at least some common ancestors. (See this post for a more rigorous argument for this.) One question that Chang was investigating was that of finding out, from among all the common ancestors, when the most recent common ancestor (MRCA) of all the people living in the world today lived.

The concept of the MRCA is interesting. My siblings and I share all our ancestors, so the MRCA is not meaningful. The MRCA of my cousins and me (say) is the one set of grandparents that we have in common. As my relatives get more distant, the MRCA goes back in time, but it is not hard to see that an MRCA must exist for those who are commonly referred to as 'blood' relatives.

As another example, for those who take the Bible literally, definite common ancestors would be Noah and his wife. Since everyone except the two of them and their sons and their sons' wives were killed by god in the flood, all the current inhabitants of the world should have Noah and his wife as common ancestors. But they may not be the MRCA because their sons' descendants may also have intermarried, creating a more recent MRCA.

For those of us who accept evolution, it is not hard to get our minds around the concept of all of us having an MRCA, and the fact that we must have a shared ancestor in an earlier species has a pretty rigorous proof and is fairly easily accepted. What people thought was that this person probably existed around the time of our ancestor Homo erectus, perhaps a million years ago.

But when analysis was done on mitochondrial DNA, and its mutation rate was used to triangulate back to the time when all the current mitochondrial DNA converged on a single individual, people were surprised that the calculations revealed that the MRCA deduced from this analysis (nicknamed Mitochondrial Eve) lived much more recently, only about 140,000 years ago, probably in Africa. All present-day mitochondrial DNA is descended from this single individual. A similar analysis can be done for the Y chromosome to trace back to 'Y-chromosome Adam', and that person lived about 60,000 years ago (Richard Dawkins, The Ancestor's Tale (2004), pages 52-55).

But as Dawkins cautions (page 54):

[I]t is important to understand that Eve and Adam are only two out of a multitude of MRCAs that we could reach if we traced our way back through different lines. They are the special-case common ancestors that we reach if we travel up the family tree from mother to mother to mother, or father to father to father respectively. But there are many, many other ways of going up the family tree: mother to father to father to mother, mother to mother to father to father, and so forth. Each of these pathways will have a different MRCA.

Our normal concept of genealogy traces back through both sexes, and thus the web of ancestral pathways becomes increasingly tangled and complex as you go back in time. As a result there is a greater chance of my ancestral pathways intersecting with the ancestral pathways of other people. It is thus reasonable to suppose that if we look at all these pathways, we will find a more recent MRCA than Mitochondrial Eve or Y-chromosome Adam. But this kind of calculation using mutation rates is not easy to do for anything other than uniparentally inherited DNA like mitochondrial DNA or the Y chromosome.

In order to try and fix the date of existence of the MRCA of everyone living today using the lines through both sexes, Chang used the tools of mathematics and statistics rather than genealogical charts or DNA mutations. And he found something very surprising, to be discussed in the next posting.

POST SCRIPT: If you live in fear, the terrorists have won

Tom Tomorrow points out the absurdity of people terrorizing themselves.

September 27, 2006

My ancestor Narmer, the first Pharaoh of Egypt

While doing some research on my ancestors last month, I made the surprising discovery that I am a direct descendant of Narmer, who was the first Pharaoh of Egypt and lived around 3,100 BCE. Narmer (thought by some to be the same person as Menes) was not your run-of-the-mill pharaoh. He is a bona fide Pharaoh Hall of Famer, credited with unifying the land that became Egypt and founding the very first dynasty. Of course, given the poor nature of record keeping back in those days one can never be absolutely certain of such things, but I am 80 percent certain that he is my direct ancestor.

How do I know this? I did not do an actual genealogy chart of my ancestors. It is a curious thing but people in Sri Lanka are nowhere near as enthusiastic about tracing their ancestral roots as the people in the US. I know who my grandparents are and I know some of their siblings but that is about it. I think that may be true for most Sri Lankans. I do not recall ever having discussions with anyone in Sri Lanka where people talked about ancestors farther back than three or four generations. It was not a topic of much interest.

Contrast this with America where people are fascinated with their ancestry and go to great lengths to trace back as far as they can, even hundreds of years. It is not unusual to have a conversation in America and for people to spontaneously raise the topic of where their ancestors came from and how far back they have tracked them. And people here are very excited when they find someone in their past who is famous (or even infamous) or had a role in some major historical event or is even just mentioned in some historical document.

Since thinking about my ancestors last month, I have been pondering why there is such a marked difference in interest in the two countries and have come up with some hypotheses, although I have no idea if these explanations are valid.

One possible explanation is that tracing one's ancestors in Sri Lanka is likely to be a fairly boring exercise with little expectation of anything exciting turning up. After all, it is a small island nation that has a recorded history of about 3,000 years. I know the village where my paternal grandfather, for example, was born and raised. If you trace back farther you will likely arrive at another person in that same village or a neighboring village. If you go back yet further, it will probably be another person in that same village or region, and so on for generation after generation. The likelihood of finding something really surprising or interesting is small. Pretty boring stuff, hardly worth putting a lot of effort into.

In the US, it is quite different. As one goes back in time, one will fairly soon reach ancestors who came from another continent or came over with the early settlers or were members of a Native American tribe. All of these are sufficiently novel and interesting that they may make the hard work of tracing one's roots worthwhile.

Another factor is the quality of the record keeping as you go back in time. The structure of American and European societies was such that maintaining records was desirable. The fairly early adoption of a mercantilist society, capitalism, and private property ownership meant that you had to know who owned what and, most importantly, who inherited the property when someone died. This required that careful records of births and deaths be kept. Record keeping was also facilitated by the churches. Since churches were institutions that also performed civil functions and married people, baptized their children, and buried them when they died, church records are rich sources of genealogical information.

Countries like Sri Lanka remained feudal until later and in many such societies land was either owned by the local feudal lord or held in common by the villagers, so questions of property inheritance were not major issues. Furthermore, Buddhist and Hindu religions (which are the main religions in Sri Lanka) are much less hierarchical in organizational structure than Christianity, and I believe their clergy do not have the same dual civil/religious role that Christian clergy have when it comes to marriages. So Buddhist and Hindu temples are not repositories of marriage, birth, and death records the way that Christian churches are.

A comprehensive mercantilist and capitalist economy came much later to Sri Lanka than to (say) Europe, so one is likely to run up against a genealogical blank wall much sooner there, making the search for one's ancestors a much more frustrating task. Couple that with the country's long history and relatively little migration, and it is easy to see why tracking one's ancestors is not a particularly popular endeavor.

Even with good record keeping, tracing one's ancestors is a time-consuming task, requiring that one spend enormous amounts of time and effort in libraries and other archival institutions, poring over old records, and following many false trails.

In tracing my own ancestors, I did not do any of that laborious detective work. So how is it that by merely sitting lazily at my desk in the US in front of a computer, I could state that I am 80% confident that I, a person of Sri Lankan origin, am in a direct line from the very first pharaoh of Egypt?

That's the story for the next posting.

POST SCRIPT: Russell's teapot cartoon

Here is another cartoon from the creator of the blog Russell's Teapot. His cartoons are also a weekly feature on MachinesLikeUs.


September 26, 2006

Our common ancestors

Darwin's theory of natural selection implies that we are all descended from common ancestors. Most people who have doubts about the theory tend to think that this is a proposition that we can either choose to accept or deny. After all, no one was around to see it, were they?

But Richard Dawkins' excellent book The Ancestor's Tale (2004) gives a surprisingly rigorous argument (on page 39) that back in the distant past we must all have had common ancestors. He is such a good writer, both stylish and concise, that paraphrasing him would be a waste of time, so I will give you an extended quote:

If we go sufficiently far back, everybody's ancestors are shared. All your ancestors are mine, whoever you are, and all mine are yours. Not just approximately, but literally. This is one of those truths that turns out, on reflection, to need no new evidence. We prove it by pure reason, using the mathematician's trick of reductio ad absurdum. Take our imaginary time machine absurdly far back, say 100 million years, to an age when our ancestors resembled shrews or possums. Somewhere in the world at that ancient date, at least one of my personal ancestors must have been living, or I wouldn't be here. Let us call this particular little mammal Henry (it happens to be a family name). We seek to prove that if Henry is my ancestor he must be yours too. Imagine, for a moment, the contrary: I am descended from Henry and you are not. For this to be so, your lineage and mine would have to have marched, side by side yet never touching, through 100 million years of evolution to the present, never interbreeding yet ending up at the same evolutionary destination – so alike that your relatives are still capable of interbreeding with mine. This reductio is clearly absurd. If Henry is my ancestor, he must be yours too. If not mine, he cannot be yours.

Without specifying how ancient is 'sufficiently', we have just proved that a sufficiently ancient individual with any human descendants at all must be an ancestor of the entire human race. Long-distance ancestry, of a particular group of descendants such as the human species, is an all-or-nothing affair. Moreover, it is perfectly possible that Henry is my ancestor (and necessarily yours, given that you are human enough to be reading this book) while his brother Eric is the ancestor of, say, all the surviving aardvarks. Not only is it possible. It is a remarkable fact that there must be a moment in history when there were two animals in the same species, one of whom became the ancestor of all humans and no aardvarks, while the other became the ancestor of all aardvarks and no humans. They may well have met, and may even have been brothers. You can cross out aardvark and substitute any other modern species you like, and the statement must still be true. Think it through, and you will find that it follows from the fact that all species are cousins of one another. Bear in mind when you do so that the 'ancestor of all aardvarks' will also be the ancestor of lots of very different things beside aardvarks[.]

There is one aspect of this argument that is crucial: the shared common ancestor Henry that Dawkins is talking about has to have lived at a time when he was of a different species from us, since the reductio argument depends crucially on the unlikelihood of evolution following separate but parallel tracks to arrive at the same species end point. Since all humans are descendants of this single animal Henry, we conclude that all the early humans who have any descendants today must be ancestors of all of us. So when Dawkins talks of us all sharing the same ancestors at some point, he means human ancestors, since all humans evolved from Henry's line.

Of course, as time progresses, the human species descended from Henry produced more descendants, who then produced yet more descendants, and so on, and there must come a time when the lines diverged so that not everyone living at later times is an ancestor of all of us, but only some. That transition time is called the identical ancestors (IA) time: earlier than that, every human was either an ancestor of all of us or of none of us (i.e., their line went extinct). After the IA time, a person living then may be an ancestor of only some of the people alive today.

It is not hard to see that, as time progresses even further, there will come a last individual who is an ancestor of every person living today, referred to as the most recent common ancestor or MRCA. No one born after the MRCA is a common ancestor of everyone living today.
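To make these two definitions concrete, here is a small toy simulation of my own (a deliberately crude sketch, not anything taken from Dawkins or from the research literature). It assumes a constant population of n people in which each person picks two parents uniformly at random from the previous generation, ignoring geography, gender, and everything else that matters in real human history. Walking backwards in time, it records which present-day people each past individual is an ancestor of, and reports how many generations back it first finds an MRCA and the IA point.

import random

def mrca_and_ia_generations(n=500, seed=1):
    """Toy random-mating model: a constant population of n people,
    each of whom picks two parents uniformly at random from the
    previous generation (the two draws may even coincide; that is
    a deliberate simplification).  Returns the number of generations
    back at which an MRCA first appears and at which the identical
    ancestors (IA) condition is first met."""
    random.seed(seed)
    everyone = (1 << n) - 1                  # bitmask covering all n present-day people
    current = [1 << i for i in range(n)]     # today: person i is an ancestor of {i}
    generation = 0
    mrca_gen = None
    while True:
        generation += 1
        parents = [0] * n
        for descendants in current:          # hand each person's descendant set to two random parents
            parents[random.randrange(n)] |= descendants
            parents[random.randrange(n)] |= descendants
        current = parents
        if mrca_gen is None and any(d == everyone for d in current):
            mrca_gen = generation            # someone here is an ancestor of everybody alive today
        if all(d == 0 or d == everyone for d in current):
            return mrca_gen, generation      # IA point: each person is an ancestor of all of us or of none

print(mrca_and_ia_generations())

The point of the sketch is only to show that both quantities are well defined and can be computed once you specify a model of who mates with whom; what the numbers turn out to be for real human populations is the subject of other posts in this series.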

I don't know about you, but to me there is something extraordinarily beautiful about this idea that at one point in time we all shared a single common ancestor, and that some time further back, everyone then living who left any descendants at all was an ancestor of all of us. It seems to be such a decisive argument against tribalism. It is hard to maintain the idea that some groups of people are 'special' in some way when we not only all descended from a single animal Henry, but at a later time also shared the same set of human ancestors. Not only that, but we are also cousins of all the species that currently exist.

No wonder some religious extremists are afraid to have their children learn this theory. It is so captivating one can see how it would fascinate and draw in anybody who begins to think seriously about it.

Having established that we have both an MRCA and a time when all our human ancestors were identical (the IA time), the question arises of when these dates occurred.

And therein lies another surprise, to be discussed in an upcoming post in this series.

POST SCRIPT: We're number 1?

Comedian Lewis Black tries to help Americans to see themselves as others see them.

September 25, 2006

Evolution and atheism

It is commonly charged by some religious people that acceptance of the theory of evolution by natural selection implies acceptance of atheism. Co-discovered by Charles Darwin and Alfred Wallace and brought to widespread public attention with the publication of Darwin's The Origin of Species in 1859, the theory immediately met with opposition in Europe, primarily from the clergy, with the conflict showcased by the famous debate between Bishop Wilberforce and Thomas Huxley in 1860.

Edward J. Larson in his book Summer for the Gods says that opposition to Darwin's ideas arose much more slowly in the US, not reaching high levels until 1920 or so. But, as we are all aware, the controversy has proved much more durable here, evolution remaining a controversial topic long after the rest of the world has accepted it. As James Watson (co-discoverer of the structure of DNA) says, "Today, the theory of evolution is an accepted fact for everyone but a fundamentalist minority, whose objections are based not on reasoning but on doctrinaire adherence to religious principles." (Thanks to MachinesLikeUs for the quote.) The radical clerics of US Christianity and the Intelligent Design Creationist (IDC) forces have been trying to discredit the theory of evolution by arguing that accepting it leaves no room for belief in a creator.

Underlying this opposition seems to be distaste for the idea that humans are not special creations, distinct from other animal forms. I occasionally get comments on my postings that ask me with incredulity how I could possibly believe that I am "descended from monkeys." I have written before about this popular misconception of evolution. The theory does not assert that we are descended from monkeys, only that we share the same ancestors. In other words, we are cousins of monkeys. I think that the people who oppose evolution find the idea of any kind of biological relationship with other animals so repulsive that they cannot get past that and see what evolution actually asserts.

Of course, this feeling of incompatibility between Christianity and evolution is not empirically confirmed because many Christians have no personal difficulties reconciling belief in god with acceptance of natural selection.

But recently I have been reading more about evolution and I am beginning to think that the radical clerics are right in a sense. A deep understanding of evolution may lead people away from god and religion, but not for the reasons that are commonly stated. The reasons I postulate have nothing to do with our relationship with monkeys or any other animals or whether god intervenes in the process of evolution, but with the underlying worldview and philosophy of natural selection.

All this may be quite familiar to others who are better educated in biology than me. But it is all new to me because my own education in Sri Lanka was quite narrowly focused, so that my last biology class was in eighth grade. And even there I can't remember doing anything interesting or even learning about evolution in any great detail. I remember breaking apart and studying the parts of flowers (I recall words like 'stamen' and 'pistil' coming into the discussion). I remember learning about the various ways by which pollination occurred and the various kinds of root systems plants had. I also remember the obscure fact that there were two kinds of cells called 'xylem' and 'phloem' though I cannot for the life of me remember why they were important or what they did.

The final straw that made me ditch biology was when we did a dissection of a rat to see its insides. The combination of the smell of formaldehyde and seeing an animal cut open and pinned made me gag, and I realized that I did not want to learn any more biology. And I didn't, until very recently.

But now I have been reading a lot about evolution (currently Richard Dawkins' excellent book The Ancestor's Tale (2004)) and am deeply impressed with the beauty and grandeur of the theory. I regret that I did not learn about it earlier but, looking on the bright side, perhaps it is only now that I am ready to appreciate the deep, and even surprising, truths that it reveals about our relationships to all the other living things.

And the truths that the theory of evolution reveals (to me at least) are that the divisions we use (religion, language, race) to separate ourselves into tribes are even less justifiable than I had earlier thought. There is a lot of surprising knowledge that flows from the idea of evolution that I think is not known to many otherwise well-educated people.

The reason that this knowledge is dangerous for religion is that all religions depend for their justification on making the assertion, at some point, that they are somehow superior to other religions. Some people are subtle about it and keep this belief quiet, while others aggressively proclaim it to the world, causing friction. But it is always there. Once someone accepts that the differences between religions are negligible, it becomes easier to accept that all religions are false, and therefore that god does not exist.

When we plumb the depths of evolutionary theory, it quickly becomes clear that the last two thousand or so years of history (which is the time when the current major religions came into being) are so insignificant that it is preposterous to think that god hung around for so long before putting his stamp on events.

It is this feature of evolution, rather than a frontal assault on the role of god, that I believe subtly undermines belief in god. The next series of posts will expand on these less discussed aspects of evolution.

POST SCRIPT: Darwin the man

Robert Krulwich is an NPR reporter who does excellent stories on science. On Morning Edition on Wednesday, September 20, 2006 he had a delightful piece (you can read about it and listen to the nine-minute audio clip here) about how Darwin went about doing experiments to test various problematic aspects of his theory, such as how plants could have traveled across oceans to populate distant continents.

The substance of the report was an interview with David Quammen, the author of what seems like a fascinating new book The Reluctant Mr. Darwin. They talk about "what happens when a meticulous, shy, socially conservative man comes up with a revolutionary, new, dangerous idea. Darwin gets so nervous thinking what he's thinking, yet he is so sure that it's a promising idea. He can't let it out but he can't let it go. Instead, he spends years, decades even, checking and double checking his evidence."

They describe how "Charles Darwin and his butler dropped asparagus into a tub and how Darwin and his oldest son studied dead pigeons floating upside down in a bowl to test ideas about evolution."

The anecdotes about Darwin the man seem to indicate that he was a great father and an all round decent human being, who treated even the insects he studied with care and concern.

One fascinating anecdote was that one of Darwin's correspondents, who sent him a specimen of a beetle with a tiny clam attached to its leg that shed light on how clams may have 'flown' large distances, was the grandfather of Francis Crick, co-discoverer with James Watson of the structure of DNA, the mechanism that finally explained how Darwin's theory worked.

It seems like a fascinating book and the NPR interview was excellent.

August 31, 2006

Keeping creationism out of Ohio's science classes

Recall that the pro-IDC (intelligent design creationism) forces in Kansas received a setback in their Republican primary elections earlier this month. Now there is a chance to repeat that in Ohio.

I wrote earlier about a challenge being mounted to the attempt by Deborah Owens-Fink (one of the most pro-IDC activists in Ohio) to be re-elected to the Ohio Board of Education from Ohio District Seven. It seems as if the pro-science forces have managed to recruit a good candidate to run against her. He is Tom Sawyer, who is a former US congressman. I received the message below from Patricia Princehouse who has been tireless in her attempts at keeping religious ideas out of the science curriculum.

The worst creationist activist on Ohio's Board of Education is up for re-election (Deborah Owens Fink).

But now she has competition! And with your help, we can win!

We have recruited former congressman Tom Sawyer to run against her. His website is here.

Contributions are urgently needed for Congressman Sawyer's campaign.

(Credit cards accepted here or send check to address below.)

Fink has pledged to raise lots of money & we have no doubt that creationists across the country will pour tens of thousands of dollars into her campaign. We may not be able to match them, but Sawyer is an experienced politician who can make wise use of what he gets. We need to see he gets as much as possible.

HOW MUCH SHOULD I GIVE?

1) Remember that almost every Ohioan that pays Ohio income tax can take as a TAX CREDIT (not just a deduction) up to $50 ($100 for married couples filing jointly) in donations to Board of Ed candidates. So, please try to give at least the free $50 that you can get back on your taxes.

2) How much would you give if you could erase the past 4 years of damage to Ohio's public schools? $100? $1000? $5000? Please seriously consider giving more than you've ever given before. You stand poised to prevent worse damage over the next 4 years...

Fink is circulating a fund-raising letter in which she thumbs her nose at science & refers to America's National Academy of Sciences as a "group of so-called scientists."

We can protect Ohio from another 4 years of retrograde motion and put someone on the Board who can move Ohio forward toward solving real problems like school funding, literacy, and the achievement gap.

But your help is urgently needed...

www.votetomsawyer.com

I WANT TO DO MORE:

Great! Please spread the word about the web site --in & out of state! (Remember, what happens in Ohio gets exported around the country, so defeating creationism in Ohio benefits the entire country) You can do even more as a volunteer (at home, on the phone, or on the street, even 1 hour of your time can make a difference, especially as we get closer to the election) To volunteer, email Steve Weeks at eul1993@hotmail.com

For info on what Fink has done to science education in Ohio, see here.
For more info on Sawyer, see here.
For more info on other races in Ohio see the HOPE website.
For more info on races nationwide, see here.

To mail donations: Send a check made out to: Vote Tom Sawyer

and mail to:
Martin Spector, Treasurer
4040 Embassy Pkwy, Suite 500, Akron, OH 44333

I was not aware of this provision in Ohio's tax code that effectively gives you a full refund for up to $50 in contributions to campaigns like this. I have not been able to check this information myself to see what restrictions, if any, apply and whether it applies only to school board elections or to other elections as well.

For more information on other School Board elections where the pro-science HOPE (Help Ohio Public Education) organization is supporting candidates, see their website.

It would be nice if Ohio voters followed the lead of Kansas voters and also rejected IDC-promoting candidates.

POST SCRIPT: Saying what needs to be said

Keith Olbermann on MSNBC's Countdown delivers a blistering commentary on Donald Rumsfeld and the rest of the Bush Administration. You can see it here.

August 24, 2006

The language of science

Good scientists write carefully but not defensively. By carefully, I mean that they strive to be clear and to not over-reach, i.e., reach conclusions beyond what is warranted by the evidence. But they are not overly concerned with whether their words will be taken out of context and misused or subject to other forms of manipulation. It is an unwritten rule of scientific discourse that you do not score cheap debating points. Scientists are expected to respect those who oppose them and deal with the substance of their arguments and not indulge in superficial word games.

This is why a scientist like Niels Bohr, who was notoriously obscure in his speech and writing, could still become a giant in the field. Scientists like Einstein, who thought Bohr quite wrong about quantum mechanics, recognized the value of his insights and took the trouble to pierce through the verbal fog, clarify Bohr's ideas, and make him understandable to others.

But scoring points using debating tricks such as selective quotation and word play is the norm in the political arena. Hence political speech requires that people learn to speak defensively, so that an unfortunate choice of words cannot be used to imply that they said something they did not intend.

As long as these two worlds of science and politics remain separate, there is no problem. But it is becoming increasingly difficult to maintain that line. Scientists who intentionally enter the political arena, or who inadvertently do so by getting involved in questions that have political implications (say global warming or intelligent design), often find themselves blindsided because they have not learned to use the kinds of defensive circumlocutions that politicians use.

For example, scientists will often use anthropomorphic language when describing phenomena. They will say things like "the electron wants to go here" or "this organism is designed to survive in this ecosystem." Scientists do not actually mean that there is some consciousness behind these things. But this breezy language livens up the subject and it serves as convenient shorthand for the more correct but convoluted consciousness-free language. Fellow scientists understand this custom.

But those who wish to pursue a broader agenda often use this casual language to imply things that the authors never intended. For example, intelligent design creationists (IDC) carefully scour the scientific literature for the word "design" and pounce on it to imply that the scientist writers are implicitly acknowledging that the world is designed. They try to imply that many members of the scientific community secretly believe that the world is intentionally designed but hide it because of their secular political agenda, and that their language often inadvertently reveals their true beliefs.

For example, in a science article on butterflies, physicist Pete Vukusic is quoted as saying: "It's amazing that butterflies have evolved such sophisticated design features which can so exquisitely manipulate light and colour. Nature's design and engineering is truly inspirational."

This was seized on by IDC advocate William Dembski on his website, where he highlights the phrase "Nature's design and engineering is truly inspirational" as if Vukusic were implying that butterflies were the work of a designer. This is just nonsense born of desperation.

The more politically savvy scientists, veterans of these wars, have learned to play this game. For example, I am currently reading an excellent book called The Ancestor's Tale by Richard Dawkins (more on this fascinating book in later postings) and in it, whenever there is a chance that what he says may be misconstrued as implying intent in nature, he repeatedly warns intelligent design creationists not to take those sentences out of context and imply that they mean something other than what he intends. He sometimes takes the same ideas and writes them defensively, to show how to translate between popular and very precise scientific writing.

But others have to learn the hard way. Consider, for example, the experience of Peter Doran, a professor of earth and environmental sciences at the University of Illinois at Chicago. In 2002, he and his colleagues published a paper in Nature that "found that from 1996 to 2000, one small, ice-free area of the Antarctic mainland had actually cooled. Our report also analyzed temperatures for the mainland in such a way as to remove the influence of the peninsula warming and found that, from 1966 to 2000, more of the continent had cooled than had warmed. Our summary statement pointed out how the cooling trend posed challenges to models of Antarctic climate and ecosystem change."

That paper was immediately seized upon by opponents of global warming to argue that the Earth was actually cooling, even though Doran tried to explain that his paper said no such thing.

Doran said that this legend has only grown in the four years since, despite his efforts to kill it. He says "Our results have been misused as “evidence” against global warming by Michael Crichton in his novel “State of Fear” and by Ann Coulter in her latest book, “Godless: The Church of Liberalism.” Search my name on the Web, and you will find pages of links to everything from climate discussion groups to Senate policy committee documents — all citing my 2002 study as reason to doubt that the earth is warming. One recent Web column even put words in my mouth. I have never said that “the unexpected colder climate in Antarctica may possibly be signaling a lessening of the current global warming cycle.” I have never thought such a thing either."

He ends with this plea. "I would like to remove my name from the list of scientists who dispute global warming. I know my coauthors would as well."

It would be too bad if scientists, like politicians, had to also begin to carefully parse words so as to avoid even the remotest possibility of being misconstrued. It would be sad if they had to pepper their writings with the kinds of disclaimers one sees on medications ("This statement should not be taken to imply that we are supporting the following positions: . . ."). Scientific writing already suffers from various maladies: an overdose of the passive voice, jargon, and formulaic style are among the sins that immediately come to mind. To add defensiveness to the list would make scientific writing even more difficult to read.

August 21, 2006

Taking steps to avoid global warming

One of the curious features of the debate over global warming is the question of what, if anything, we should do about it. I can actually understand the position of those who are skeptical about whether things like the Kyoto treaty will solve the problem. I can understand those who worry that government regulations might not work.

What puzzles me are those people who somehow see the actions taken to reduce the production of greenhouse gases as some sort of affront that has to be opposed.

In actuality, what we are being asked to do, as individuals, can hardly be considered a major sacrifice. We are not being asked to live in caves and eat our food raw. All that is asked of us is that we tone down our lifestyles by just a little bit. Driving more energy efficient cars is not a hardship. Why do people feel that driving a gas guzzler is somehow a right that they should enjoy? Requiring better energy efficiency standards from the manufacturers of cars and other goods may result in a slight increase in prices for them. But why is that seen as a violation of free enterprise when we have all kinds of other regulations in place already that also result in higher prices? Turning the thermostats slightly down in winter and up in summer, or using fans more than air-conditioners, does not really affect our lives in a major way.

Reducing the amount of packaging that is used, or getting in the habit of recycling items, may result in slight inconveniences but these are hardly major issues. Maybe because I grew up in a third world country, the idea of reusing things comes more naturally to me. In Sri Lanka, people took their own shopping bags with them to the stores. The small shops down the street would wrap their items for their customers in old newspapers. They bought the newspapers from people like us. Every week or so, a man would come down our street to buy our old newspapers and bottles and then resell them to the shops for reuse.

(A memory from my childhood. The newspapers were bought and sold by the pound. My grandmother suspected that the scales used by the recycling merchant for weighing were rigged, so she developed an independent measure of the weight: she figured out exactly how many sheets of newspaper made up a pound. As a little boy, it was my job to carefully count out the pages and create one-pound stacks of them.)

Everything was used many times before it was thrown away. Something had to be broken or torn beyond repair before it was discarded. In my recent trips, though, I noticed that Sri Lanka has become "modern" now. The bigger stores and supermarkets have everything highly packaged, and they put items in plastic shopping bags to take home, just like here.

I noticed that by living in America these many years, I had slowly abandoned many of the reuse-and-recycle habits that once came instinctively to me, but I am trying to get back to them now that I believe that global warming is a threat and resources are limited. For example, I noticed that in New Zealand a lot of people took their own cloth shopping bags to the supermarkets to bring their groceries home in. Since my return, I have also adopted this practice. It is one of those things that are easy to do. I also tell cashiers at bookstores and elsewhere not to put my purchases in paper or plastic bags unless I have a lot to carry and it becomes really necessary. I heard that in Ireland they charge 25 cents for each flimsy plastic bag in order to discourage people from unthinkingly taking them.

Some of the wasteful things we do are simply lifestyle choices that consume energy and resources, provide little or no benefits to us, but are harmful to the environment. For example, take the bottled water craze. Larry Lack in his article Bottled Water Madness points out the huge negative impact this particular industry has had on the environment and people's health for no discernible benefit.

Unless you live somewhere where the water actually tastes bad or is known to be impure (and there are just a few places like these in America) or tap water is not easily accessible, there is no real reason to buy bottled water. Municipal tap water is monitored for quality and safety more often and with higher standards than bottled water, so it is actually better for you. In addition, the amount of plastic used in packaging bottled water is enormous and it fills up landfills even faster. And drinking tap water will save you money.

Giving up bottled water is hardly a hardship. It actually makes your life easier. Drinking tap water is so much easier than going to the store, buying cases of water, storing it, getting rid of empty bottles, etc. that I am truly puzzled by bottled water's commercial success, and impressed at the advertising industry's ability to persuade people to buy it in such large quantities.

While each conservation measure that we adopt helps, we need large numbers of people doing these things in order to have an impact, and this is where the problems arise. It is not clear that purely voluntary actions are sufficient. As we saw with the demise of Easter Island, entire communities can stand by while their environment is destroyed. Are people willing to allow, let alone demand, that governments legislate more actions that conserve energy and resources? Are we willing to simply buy less stuff?

The industries that produce the greenhouse gases know that most people, being reasonable, are not going to balk at taking these very minor steps (they cannot even be called sacrifices) to conserve resources and reduce emissions if the risk of not doing so is to destroy the environment. So the debate has been framed as one of rights. How dare people be told what car they should drive! How dare they be asked to turn off the lights when they leave the room! How dare they be asked to save energy by adjusting the thermostats! It is each person's right to be able to do whatever they can afford!

It seems strange to me that a public that is so unconcerned about violations of its privacy, civil rights, and age-old constitutional and legal protections can get so riled up about what are basically minor consumer issues.

I can understand why people get fired up about evolution. It does, after all, go against many people's deeply held religious beliefs. But the vehemence with which some people oppose any measures to reduce greenhouse gases is truly puzzling to me. Even if scientists turn out to be wrong in their consensus view that global warming is occurring, all that would mean is that our greenhouse gas reduction strategies were unnecessary. But why is that such an awful fate to contemplate, so much so that some people are willing to fight it with such vehemence? I just don't get it.

POST SCRIPT: I'm back!

I had a terrific drive across the country to California last week with my daughter, taking her (and her car) to start graduate school there. I enjoyed it so much that I am wondering when I can do it again, taking a different route. To create another excuse, I am already urging my younger daughter to think about also going to graduate school on the west coast.

I'll write more about the trip later.

August 18, 2006

Should secularists fight for 100% separation of church and state?

(This week I will be on my long-anticipated drive across the country to San Francisco. During that time, I am reposting some of the very early items from this blog.

Thanks to all those who gave me suggestions on what to see on the way. I now realize that I must have been crazy to think that I could see more than a tiny fraction of the natural sights of this vast and beautiful country, and will have to do many more trips.

I will start posting new items on Monday, August 21, 2006.)

Like most atheists, I really have no concern about what other people believe. If you do not believe in a god or in heaven and hell in any form, then the question of what other people believe about god is of as little concern to you as questions about which sports teams they root for or what cars they drive.

If you are a follower of a theistic religion, however, you cannot help but feel part of a struggle against evil, and often that evil is personified as Satan, with non-believers or believers of other faiths seen as followers of that evil. Organized religions also need members to survive, to keep the institution going. So for members of organized religions, there is often a mandate to try to get other people to also believe, and thus we have revivals and evangelical outreach efforts and proselytizing.

But atheists have no organization to support and keep alive with membership dues. We have no special book or building or tradition to uphold and maintain. You will never find atheists going from door to door spreading the lack of the Word.

This raises an interesting question. Should atheists be concerned about religious symbolism in the public sphere such as placing nativity scenes on government property at Christmas or placing tablets of the Ten Commandments in courthouses, both of which have been the subjects of heated legal struggles involving interpretations of the First Amendment to the constitution? If those symbols mean nothing to us, why should we care where they appear?

In a purely intellectual sense, the answer is that atheists (and other secularists) should not care. Since for the atheist the nativity scene has as little meaning as any other barnyard scene, and the Ten Commandments have as much moral force as (say) any of Dave Letterman's top ten lists, why should these things bother us? Perhaps we should just let these things go and avoid all the nasty legal fights.

Some people have advocated just this approach. Rather than fighting for 100% separation of church and state, they suggest that we should compromise on some matters. That way we can avoid the divisiveness of legal battles and also prevent the portrayal of atheists as mean-spirited people who are trying to obstruct other people from showing their devotion to their religion. If we had (say) 90% separation of church and state, wouldn't that be worth it in order to stop the acrimony? Bloggers Matthew Yglesias and Kevin Drum present arguments in favor of this view, and it does have a certain appeal, especially for people who prefer to avoid confrontations and have a live-and-let-live philosophy.

But this approach rests on a critical assumption that has not been tested and is very likely to be false. This assumption is that the religious community that is pushing for the inclusion of religious symbolism in the public sphere has a limited set of goals (like the items given above) and that they will stop pushing once they have achieved them. This may also be the assumption of those members of non-Christian religions in the US who wish to have cordial relations with Christians and thus end up siding with them on the religious symbolism question.

But there is good reason to believe that the people who are pushing most hard for the inclusion of religious symbolism actually want a lot more than a few tokens of Christian presence in the public sphere. They actually want a country that is run on "Christian" principles (for the reason for the quote marks, see here.) For them, a breach in the establishment clause of the first amendment for seemingly harmless symbolism is just the overture to a movement to eventually have their version of religion completely integrated with public and civic life. (This is similar to the "wedge strategy" using so-called intelligent design (ID). ID advocates see the inclusion of ID (with its lack of an explicit mention of god) in the science curriculum as the first stage in replacing evolution altogether and bringing god back into the schools.)

Digby, the author of the blog Hullabaloo argues that although he also does not really care about the ten commandments and so on, he thinks that the compromise strategy is a bad idea. He gives excellent counter-arguments and also provides some good links on this topic. Check out both sides. Although temperamentally my sympathies are with Yglesias and Drum, I think Digby wins the debate.

So the idea of peaceful coexistence on the religious symbolism issue, much as it appeals to people who don't enjoy the acrimony that comes with conflicts over principle, may be simply unworkable in practice.

August 07, 2006

Global warming-9: The demise of Easter Island

Easter Island tends to grip people's imagination. But the thing that people remember most about it (perhaps the only thing) is the giant stone statues of faces that exist on the island.

Jared Diamond tells the sad story of this island as a warning to us all in a chapter of his book Collapse: How societies choose to fail or succeed, but an earlier essay by him can be seen here. (Thanks to MachinesLikeUs.com for the link.)

The reason that Easter Island, more than any of the other examples given by Diamond, strikes me as being relevant to global warming is because the island, being remote from the rest of the world, comes closest to the Earth in being an almost isolated system.

Easter Island, with an area of only 64 square miles, is the world's most isolated scrap of habitable land. It lies in the Pacific Ocean more than 2,000 miles west of the nearest continent (South America), 1,400 miles from even the nearest habitable island (Pitcairn).

All the other examples of collapse cited by Diamond were linked more closely to the rest of the world, and so it is possible to speculate that outside forces contributed to their decay and demise. But the Easter Islanders seem to have done it all by themselves. Diamond poses the question of how and why "In just a few centuries, the people of Easter Island wiped out their forest, drove their plants and animals to extinction, and saw their complex society spiral into chaos and cannibalism."

As this extended excerpt from Diamond points out, initially Easter Island had a lot going for it.

Its subtropical location and latitude - at 27 degrees south, it is approximately as far below the equator as Houston is north of it - help give it a rather mild climate, while its volcanic origins make its soil fertile. In theory, this combination of blessings should have made Easter a miniature paradise, remote from problems that beset the rest of the world.
. . .
The earliest radiocarbon dates associated with human activities are around A.D. 400 to 700, in reasonable agreement with the approximate settlement date of 400 estimated by linguists. The period of statue construction peaked around 1200 to 1500, with few if any statues erected thereafter. Densities of archeological sites suggest a large population; an estimate of 7,000 people is widely quoted by archeologists, but other estimates range up to 20,000, which does not seem implausible for an island of Easter's area and fertility.
. . .
For at least 30,000 years before human arrival and during the early years of Polynesian settlement, Easter was not a wasteland at all. Instead, a subtropical forest of trees and woody bushes towered over a ground layer of shrubs, herbs, ferns, and grasses. . . . The most common tree in the forest was a species of palm now absent on Easter but formerly so abundant that the bottom strata of the sediment column were packed with its pollen. The Easter Island palm was closely related to the still-surviving Chilean wine palm, which grows up to 82 feet tall and 6 feet in diameter. The tall, unbranched trunks of the Easter Island palm would have been ideal for transporting and erecting statues and constructing large canoes. The palm would also have been a valuable food source, since its Chilean relative yields edible nuts as well as sap from which Chileans make sugar, syrup, honey, and wine.
. . .
Among the prodigious numbers of seabirds that bred on Easter were albatross, boobies, frigate birds, fulmars, petrels, prions, shearwaters, storm petrels, terns, and tropic birds. With at least 25 nesting species, Easter was the richest seabird breeding site in Polynesia and probably in the whole Pacific.
. . .
Such evidence lets us imagine the island onto which Easter's first Polynesian colonists stepped ashore some 1,600 years ago, after a long canoe voyage from eastern Polynesia.
. . .
The first Polynesian colonists found themselves on an island with fertile soil, abundant food, bountiful building materials, ample lebensraum, and all the prerequisites for comfortable living. They prospered and multiplied.

But the inhabitants then set about creating a lifestyle that slowly but surely destroyed the very environment around them.

Eventually Easter's growing population was cutting the forest more rapidly than the forest was regenerating. The people used the land for gardens and the wood for fuel, canoes, and houses - and, of course, for lugging statues. As forest disappeared, the islanders ran out of timber and rope to transport and erect their statues. Life became more uncomfortable - springs and streams dried up, and wood was no longer available for fires.
. . .
By the time the Dutch explorer Jacob Roggeveen arrived there on Easter day in 1722 (thus giving the island its modern name) his first impression was not of a paradise but of a wasteland. What he saw was grassland without any trees or bushes over ten feet in height.

This is the description of the island given by Roggeveen:

"We originally, from a further distance, have considered the said Easter Island as sandy; the reason for that is this, that we counted as sand the withered grass, hay, or other scorched and burnt vegetation, because its wasted appearance could give no other impression than of a singular poverty and barrenness."

When scientists catalogued life on the island, they found the range of flora and fauna a shadow of its former rich variety and abundance.

Modern botanists have identified only 47 species of higher plants native to Easter, most of them grasses, sedges, and ferns. The list includes just two species of small trees and two of woody shrubs. With such flora, the islanders Roggeveen encountered had no source of real firewood to warm themselves during Easter's cool, wet, windy winters. Their native animals included nothing larger than insects, not even a single species of native bat, land bird, land snail, or lizard. For domestic animals, they had only chickens.

In another extended excerpt, Jared Diamond poses the key questions, provides the answers, and lays out their chilling significance.

As we try to imagine the decline of Easter's civilization, we ask ourselves, "Why didn't they look around, realize what they were doing, and stop before it was too late? What were they thinking when they cut down the last palm tree?"

I suspect, though, that the disaster happened not with a bang but with a whimper. After all, there are those hundreds of abandoned statues to consider. The forest the islanders depended on for rollers and rope didn't simply disappear one day - it vanished slowly, over decades. Perhaps war interrupted the moving teams; perhaps by the time the carvers had finished their work, the last rope snapped. In the meantime, any islander who tried to warn about the dangers of progressive deforestation would have been overridden by vested interests of carvers, bureaucrats, and chiefs, whose jobs depended on continued deforestation. Our Pacific Northwest loggers are only the latest in a long line of loggers to cry, "Jobs over trees!" The changes in forest cover from year to year would have been hard to detect: yes, this year we cleared those woods over there, but trees are starting to grow back again on this abandoned garden site here. Only older people, recollecting their childhoods decades earlier, could have recognized a difference. Their children could no more have comprehended their parents' tales than my eight-year-old sons today can comprehend my wife's and my tales of what Los Angeles was like 30 years ago.

Gradually trees became fewer, smaller, and less important. By the time the last fruit-bearing adult palm tree was cut, palms had long since ceased to be of economic significance. That left only smaller and smaller palm saplings to clear each year, along with other bushes and treelets. No one would have noticed the felling of the last small palm.

By now the meaning of Easter Island for us should be chillingly obvious. Easter Island is Earth writ small. Today, again, a rising population confronts shrinking resources. We too have no emigration valve, because all human societies are linked by international transport, and we can no more escape into space than the Easter Islanders could flee into the ocean. If we continue to follow our present course, we shall have exhausted the world's major fisheries, tropical rain forests, fossil fuels, and much of our soil by the time my sons reach my current age. (my italics)

Every day newspapers report details of famished countries - Afghanistan, Liberia, Rwanda, Sierra Leone, Somalia, the former Yugoslavia, Zaire - where soldiers have appropriated the wealth or where central government is yielding to local gangs of thugs. With the risk of nuclear war receding, the threat of our ending with a bang no longer has a chance of galvanizing us to halt our course. Our risk now is of winding down, slowly, in a whimper. Corrective action is blocked by vested interests, by well-intentioned political and business leaders, and by their electorates, all of whom are perfectly correct in not noticing big changes from year to year. Instead, each year there are just somewhat more people, and somewhat fewer resources, on Earth.

It would be easy to close our eyes or to give up in despair. If mere thousands of Easter Islanders with only stone tools and their own muscle power sufficed to destroy their society, how can billions of people with metal tools and machine power fail to do worse? But there is one crucial difference. The Easter Islanders had no books and no histories of other doomed societies. Unlike the Easter Islanders, we have histories of the past - information that can save us. My main hope for my sons' generation is that we may now choose to learn from the fates of societies like Easter's.

It was this story that alarmed me personally and made me realize that we cannot assume that collective self-interest alone will result in environmental problems being recognized and addressed. We need to take global warming seriously, even if we are not 100% certain that it is on an irreversible course. Unlike the people of Easter Island, we have knowledge of the past. We have the ability, via science, to understand the environmental problems facing us. We have the technology to solve the problems.

The only remaining unanswered question is whether we have the will to take the requisite steps. Or, like the Easter Islanders, whether we will drive ourselves, literally and metaphorically, into near oblivion.

August 04, 2006

Global warming-8: The danger of complacency

The documentary An Inconvenient Truth provides a good introduction to the problem of global warming. The film has three interwoven threads: (1) a documentary showing a slide-show talk that former Vice-President Al Gore gives around the world on the facts of global warming, mixed with film footage of the impact of warming on the environment; (2) the story of Gore's own interest in this topic; and (3) some self-promotion by Gore.

While I could have done without the last and was not particularly interested in the second, the first part was done very well. It captured most of the state of the science accurately and presented it in a visually captivating way. The film is sobering and well worth seeing to get an introduction to the science behind the problem and a sense of the gravity of the situation we are facing.

The August 2006 issue of The Progressive magazine has an interview with scientist James Hansen of NASA's Goddard Institute for Space Studies, a leading expert on global warming. He has been referred to as the Paul Revere of global warming because he sounded the alarm very early, and he was the scientist whom the Bush administration tried to gag.

Hansen has been relentless in trying to get people to care about what is happening to the planet. In 1981, Hansen and his colleagues were the first to introduce the term "global warming" in the scientific context in an article in the journal Science. Hansen's lab monitors 10,000 temperature gauges around the world to get the average temperature and this value has been steadily rising. He says that further warming of more than one degree Celsius "will make the Earth warmer than it has been in a million years."
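To give a rough idea of what "getting the average temperature" from a network of gauges involves, here is a deliberately crude sketch. The station records below are invented purely for illustration, and the method shown (anomalies relative to a couple of reference years, combined with a simple cosine-of-latitude weighting) is only the general idea behind such analyses, not the actual procedure used by Hansen's lab.

import math

# Hypothetical records: (latitude, {year: mean temperature in degrees C}).
# Real analyses use thousands of stations and careful quality control;
# this is only a crude illustration of the averaging idea.
stations = [
    ( 60.0, {1951: -2.1, 1980: -1.8, 2005: -0.9}),
    ( 10.0, {1951: 26.3, 1980: 26.5, 2005: 26.9}),
    (-35.0, {1951: 14.8, 1980: 15.0, 2005: 15.4}),
]

BASE_YEARS = (1951, 1980)   # two reference years standing in for a base period

def global_anomaly(year):
    """Latitude-weighted mean of station anomalies for one year."""
    num = den = 0.0
    for lat, record in stations:
        if year not in record:
            continue
        baseline = sum(record[y] for y in BASE_YEARS) / len(BASE_YEARS)
        weight = math.cos(math.radians(lat))   # crude stand-in for area weighting
        num += weight * (record[year] - baseline)
        den += weight
    return num / den

print(f"2005 anomaly (toy data): {global_anomaly(2005):+.2f} C")

Working with anomalies rather than raw temperatures is what lets records from very different climates (a polar station and a tropical one) be combined into a single meaningful global number.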

He says that we have a decade, maybe two, to do something before we reach the tipping point where irreversible changes set in and we are consigned to a world vastly different from the one we are used to. He says that we have to stabilize carbon dioxide emissions within the next decade and cannot wait for new technologies like capturing emissions from burning coal. He says that we have to focus on energy efficiency and renewable sources and move away from carbon burning.

I had already been convinced for a few years that global warming was real and serious. But although I was concerned about the problem, I was not alarmed. I felt that since it was a serious problem, and one that affected everyone, political leaders would have no choice but to eventually address it. Although individual political leaders like George W. Bush or the Australian Prime Minister John Howard might choose to ignore the scientific consensus and do nothing that might harm the financial interests of their political supporters, I felt that eventually public alarm about their deteriorating environment would be so great that pressure would be brought on political leaders, whoever they were and whatever their own inclinations, that they would have no choice but to take appropriate action.

I was basically putting my faith in people taking action when a serious threat to their own lives was created. Simple self-preservation, and the desire to leave the world a better place for one's children and grandchildren and generations to come, were such strong emotions that I was sure they would reflexively kick in when people realized that the planet was being threatened, and they would do whatever it takes to address the problem.

I now realize that I was far too naïve. I think that my complacent attitude (which I suspect is not uncommon) is totally mistaken.

Like many things, what caused me to change was something that was seemingly tangential. I attended Jared Diamond's excellent talk at Case last year where he spoke about the ideas in his book Collapse: How societies choose to fail or succeed. He spoke of past civilizations (some great ones) that allowed their societies to be destroyed by not taking actions to halt the processes that destroyed them. He gave examples from Montana, Pitcairn Island, the Anasazi, the Mayans, Norse Greenland, Rwanda, and Haiti.

The entire populations of these past and present communities seemingly did nothing as they destroyed their own environments. Even when the signs of decay and impending catastrophe had reached levels that seem, at least to us now, staggeringly obvious, they still did nothing, continuing to pursue short-term benefits at the expense of long-term protection of the very environment that sustained them and had allowed them to prosper.

Looking at them now, we wonder how the people and leaders of those societies could have been so blind to the fate that was so slowly but surely engulfing them, and how stupid they were not to see the warning signs. Listening to his talk, I realized that I should not be so sanguine that people are any wiser now, or assume that they will recognize and address problems that directly affect them. Self-delusion seems to be a hazard that afflicts entire societies.

Of all the examples that Diamond gave, the one that was most poignant and gripped my imagination was the case of Easter Island. For most people, the big mystery and romance associated with the island lies with its famous statues. Diamond describes them:

Easter Island's most famous feature is its huge stone statues, more than 200 of which once stood on massive stone platforms lining the coast. At least 700 more, in all stages of completion, were abandoned in quarries or on ancient roads between the quarries and the coast, as if the carvers and moving crews had thrown down their tools and walked off the job. Most of the erected statues were carved in a single quarry and then somehow transported as far as six miles - despite heights as great as 33 feet and weights up to 82 tons. The abandoned statues, meanwhile, were as much as 65 feet tall and weighed up to 270 tons. The stone platforms were equally gigantic: up to 500 feet long and 10 feet high, with facing slabs weighing up to 10 tons.

The number, size, and quality of the statues seemed to indicate that there existed, at least at one time, a fairly large population that had the tools and resources and ingenuity to create them. And yet, travelers who arrived at the Island in the 18th century found quite a different situation, and that created a puzzle.

[T]he islanders had no wheels, no draft animals, and no source of power except their own muscles. How did they transport the giant statues for miles, even before erecting them? To deepen the mystery, the statues were still standing in 1770, but by 1864 all of them had been pulled down, by the islanders themselves. Why then did they carve them in the first place? And why did they stop?

The puzzle of the statues is just one of the many that involve the island. Unraveling them has resulted in a chilling story of how an isolated community managed to destroy its own environment.

Next: The demise of Easter Island

POST SCRIPT: The Simpsons

For all of us fans of the show, here is a live action version of the opening sequence. (Thanks to the editor of MachinesLikeUS.com for the link.)

Which just proves my contention that we all benefit from the fact that there are a huge number of talented people out there in internetland with way too much time on their hands.

August 02, 2006

Global warming-7: The current status of the scientific consensus

So what is the scientific consensus about the answers to the key questions concerning global warming?

The British magazine New Scientist gives a review of the state of affairs concerning climate change, along with a handy summary sheet of the main points, and the Intergovernmental Panel on Climate Change (IPCC) report (thanks to Brian Gray of the Kelvin Smith Library who runs the blog e3 Information Overload for the link) provides more detailed information. Here are some tentative answers to the five key questions I raised in a previous post.

1. Is warming occurring? In other words, are average temperatures rising with time?

Here we have to distinguish between the more recent period (starting in 1861) when we have direct measurements of temperature and the prior periods, for which we have to infer temperatures from proxy measures such as tree rings or bubbles trapped in ice cores that date back 750,000 years.

For the recent past, the IPCC report says that "The global average surface temperature has increased by 0.6 ± 0.2°C since the late 19th century".

For the period prior to that, the report says "It is likely that the rate and duration of the warming of the 20th century is larger than any other time during the last 1,000 years. The 1990s are likely to have been the warmest decade of the millennium in the Northern Hemisphere, and 1998 is likely to have been the warmest year."

2. If so, is it part of normal cyclical warming/cooling trends that have occurred over geologic time or is the current warming going outside those traditional limits?

Some skeptics have pointed to relatively warm periods associated with the 11th to 14th centuries, and relatively cool periods associated with the 15th to 19th centuries in the Northern Hemisphere, as evidence that the kinds of warm temperatures we have witnessed recently are part of global cyclical patterns. However, the IPCC report says that "evidence does not support these “Medieval Warm Period” and “Little Ice Age” periods, respectively, as being globally synchronous." In other words, these were likely regional phenomena.

If we go back even further the report says that "It is likely that large rapid decadal temperature changes occurred during the last glacial and its deglaciation (between about 100,000 and 10,000 years ago), particularly in high latitudes of the Northern Hemisphere. In a few places during the deglaciation, local increases in temperature of 5 to 10°C are likely to have occurred over periods as short as a few decades. During the last 10,000 years, there is emerging evidence of significant rapid regional temperature changes, which are part of the natural variability of climate."

So while rapid localized changes in temperature have occurred, there is little evidence that these were global in scope.

But there are also suggestions that temperature swings in the past may have been greater than originally thought.

3. Are the consequences of global warming such that we can perhaps live with them (slightly milder winters and warmer summers) or are they going to be catastrophic (causing massive flooding of coastal areas due to rising ocean levels, severe droughts, blistering heat waves, total melting of the polar regions, widespread environmental and ecological damage)?

The answers to these important questions, of course, depend on projections for the future, which in turn depend on what actions are taken. The IPCC report outlines possible scenarios here. But some changes, such as the reduction of the polar ice caps and of snow cover generally, are already visible.

One of the most dramatic consequences of snow and glacier melting is a rise in sea levels. It is estimated that a 30 cm (one foot) rise in sea levels results in shorelines receding by 30 meters. Some recent studies suggest that the IPCC report's estimates of the possible rise in sea levels were low, and more recent estimates are that sea levels could rise by six feet, which would result in massive flooding of highly populated areas the world over. Again, there is limited data, so these are still rough estimates. But to my mind, the state of the large ice and snow areas (the polar caps, Greenland, glaciers, and mountain tops) is something that we should watch carefully, and the signs there are not good.
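To get a rough sense of what a six-foot rise would mean, here is a minimal back-of-the-envelope sketch that simply scales the one-foot rule of thumb quoted above linearly (a big assumption, since real shorelines vary enormously in slope):

# Quoted rule of thumb: a 30 cm (one foot) rise in sea level -> roughly 30 m of shoreline recession.
RECESSION_METERS_PER_CM_OF_RISE = 30.0 / 30.0

rise_cm = 6 * 30.48  # a six-foot rise, expressed in centimeters
print(f"A {rise_cm:.0f} cm rise implies roughly {rise_cm * RECESSION_METERS_PER_CM_OF_RISE:.0f} m of shoreline recession")

On that crude scaling, shorelines would recede by something like 180 meters, which is why even a few feet of sea level rise matters so much for low-lying coastal areas.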

4. How reliable are the theories and computer models that are being used to study this question?

The IPCC report points out that "The basic understanding of the energy balance of the Earth system means that quite simple models can provide a broad quantitative estimate of some globally averaged variables." But only numerical models can provide the kinds of detailed quantitative projections into the future that we need in order to make informed decisions. "The complexity of the processes in the climate system prevents the use of extrapolation of past trends or statistical and other purely empirical techniques for projections." In other words, just having data about the past is insufficient to project to the future. We also need computer models based on the science and mathematics of climate change. "Climate models can be used to simulate the climate responses to different input scenarios of future forcing agents. . .Similarly, projection of the fate of emitted CO2. . .and other greenhouse gases requires an understanding of the biogeochemical processes involved and incorporating these into a numerical carbon cycle model." (For details on how the computer models used to predict future trends in climate work, see here.)

The IPCC report concludes that "In general, [the computer models] provide credible simulations of climate, at least down to sub-continental scales and over temporal scales from seasonal to decadal. Coupled models, as a class, are considered to be suitable tools to provide useful projections of future climates."

5. What are the causes of global warming? Is human activity responsible and can the process be reversed?

Several of the greenhouse gases that influence global temperatures, referred to as "climate forcing agents" (carbon dioxide, methane, nitrous oxide) have recently shown dramatic increases in concentrations in the atmosphere. This graph is perhaps the one that alarms me the most.

[Figure: graph showing the sharp recent rise in atmospheric concentrations of the main greenhouse gases (carbon dioxide, methane, nitrous oxide).]

These sharp increases in greenhouse gas concentrations are clearly correlated with the rapid increases in industrialization and energy consumption over the last two centuries. It seems to me that while individual changes in behavior (such as using less stuff and reusing and recycling more) are important, they must be accompanied by concerted international governmental action to reverse the trends.

We have a precedent for this kind of concerted international action to solve an important environmental problem. Recall the concern that the ozone layer was being damaged by the extensive use of chlorofluorocarbons (CFCs). International action led to a complete worldwide ban on their use. Now there is some good news.

While ozone degradation continues despite global bans on ozone-depleting pollutants imposed more than a decade ago, the rate has slowed markedly enough in one layer of the atmosphere that scientists believe ozone could start to be replenished there within several years.

"There is compelling evidence that we are seeing the very first stages of ozone recovery in the upper atmosphere," said Michael Newchurch, an atmospheric chemist with the National Space Science and Technology Center at the University of Alabama in Huntsville.

Evidence suggests that international efforts to reduce chlorofluorocarbon (CFC) pollution are working.

Of course, greenhouse gases are produced by a much more extensive and powerful group of industries than the ones that produced ozone-depleting chemicals, and reducing them requires greater changes in our own lifestyles. So achieving international cooperation on this will not be easy, as the difficulties in implementing the Kyoto treaty suggest. That treaty committed industrialized nations to reducing their emissions of greenhouse gases within the next decade to a level about 5% below their 1990 levels. Although the US produces about 36% of the world's output of greenhouse gases (the largest single producer), George W. Bush said in 2001 that the US would not ratify the treaty.

Next: The danger of complacency

POST SCRIPT: And sure enough, right on cue. . .

Just last week, I said that the lack of public understanding that climate questions such as global warming deal only with averages over long times and large areas inevitably leads people to draw the wrong conclusions from short-term fluctuations.

Sure enough, yesterday's Plain Dealer has the following letter to the editor:

We constantly are subjected to news about the coming devastating effects of global warming, which includes the recent story on how it is going to dramatically change Lake Erie and its shoreline. So it's a bit perplexing to me to see in my most recent FirstEnergy electric bill that during my past 30-day billing cycle, the average temperature in Cleveland was 69 degrees, versus 72 degrees last year. Now, if we are to believe the global-warming doomsayers, a three-degree swing in temperature is cataclysmic. So when will The Plain Dealer begin printing articles about how Cleveland is at risk of entering an ice age if we don't change our behavior?

Why does the Plain Dealer even print such nonsense? Either they know it is flat out wrong, which means they are deliberately propagating erroneous information, or even the editors don't know the basics about climate. I don't know which is more disturbing.

August 01, 2006

Global warming-6: The public and the paradigm

In the previous post, I discussed how after a paradigm is adopted, scientists tend to communicate only with each other. They are now freed from the need to explain and justify the basic premises of the field to a lay public, and no longer have to make a political case to justify what they are doing. This results in them developing a more technical, insider language and jargon that is opaque to nonscientists, and the technical paper addressed to similarly trained scientists and published in specialized journals becomes the chief means of communication.

But while this rapidly speeds up the pace of scientific progress, the general public gets left behind, unable to comprehend the language of the scientists. This can result in a disconnect between what scientists are doing and what the public knows and understands about the topics they are investigating. Communicating with the general public and explaining the science in lay terms is now delegated to a new class of people, the popularizers of science, who are either journalists or scientists (like Carl Sagan) who have chosen to play that role. In scientific quarters, such people are in danger of not being considered 'real' scientists, the sole yardstick for identifying the latter being the publication of technical papers in technical journals.

But these popularizers play a valuable role as translators, by taking the papers that are written in esoteric and mathematical language and published in technical journals, and making at least the results intelligible to lay people, even if the complex science and mathematics that lead to those results remain incomprehensible.

Eventually, the general public becomes used to the ideas underlying scientific paradigms and goes along with them. For example, no nonscientist today really questions the scientific paradigm that the Earth revolves around the Sun, even though their senses argue the opposite. People have just accepted that piece of scientific knowledge as a fact. Similarly, no one contests the paradigm that there exist positive and negative electric charges and that electric current consists of the flow of these charges, even though they cannot see it and really have no reason to believe it. People also do not question the fact that continents move, even though that idea is really, on the surface, quite preposterous and it is quite amazing that people nowadays accept it without question.

This just shows that eventually people will believe anything if they are told it over and over again by authority figures. In this case, they have been told something by scientists, who have based their assertions on data and evidence. But data and evidence are not necessary to achieve these ends. Religions get the same result simply by repeatedly telling people myths that have no basis.

But it does take some time for the general public to come to terms with the scientific consensus and during that transition there can be tensions, especially if the scientific paradigm goes counter to strong beliefs based on non-scientific sources. For example, the initial reaction to Darwinian ideas was negative as the mutability of species is not something readily seen in everyday life, and the idea that humans and chimpanzees share a common ancestor is anathema to those who see human beings as special creations of god. In the rest of the world, the scientific paradigm in biology that is called the neo-Darwinian synthesis was eventually largely accepted, but this is not the case in the US where a particular variant of Christianity-based thinking challenges the very premise of that paradigm.

The global warming paradigm is in its infancy, barely a decade old, and one should not be surprised that it encounters considerable resistance. Just a couple of decades ago, global warming was only slightly better than a conjecture. The coalescing of scientists around the consensus view has occurred only very recently, so one should not be surprised that the general public is still lagging behind. This lag time had little consequence when it came to ideas such as planetary motion or evolution or continental drift, since nothing could be done about those phenomena and there were no adverse consequences associated with whether the public accepted them or not. But getting the public on board quickly on the global warming issue is important because it is only action by them that can solve the problem. Scientists can study the problem and suggest how it can be fixed, but it is only mass action that can produce changes.

The global warming paradigm is being resisted by some not because of strong pre-existing beliefs (who really knew or cared about the average temperature of the planet before this became a topic of conversation?) but because it goes counter to the economic interests of some powerful groups, notably the energy, automobile, and other greenhouse gas producing industries. They are well aware of the power of public opinion on this issue, and they have tried to argue that there is a scientific controversy in order to forestall any government action that might have a negative impact on their financial interests.

We have seen these kinds of attempts before to create in the public's mind the idea that scientists have strong disagreements on an issue and that therefore no action should be taken until further studies are done to 'resolve' the outstanding questions. This strategy is similar to what the tobacco industry tried to do with the health hazards of smoking. There too, the paradigm that smoking is responsible for a whole variety of health problems took some time to be accepted, and it took repeated litigation and legal losses by the tobacco industry to show the fraudulence of their claims that there was a scientific controversy about whether smoking caused cancer and other diseases. Their attempts to deny that scientific consensus eventually failed, and hardly anyone now questions that smoking causes cancer, emphysema, and a host of other diseases.

We have also seen such an attempt at creating a fictitious scientific controversy in the case of evolution. This attempt has been more successful, partly because the fundamentalist religious mindset in much of America makes people predisposed to wanting to believe that evolution is not a fact.

In both the smoking and evolution cases, the courts have played a major role in the discussions. The attempts by industry to challenge the scientific consensus on global warming may not end up in the courts, because the impact is not on individuals or in the short term but on the long-term health of the planet as a whole. So it is not clear who has the legal standing to sue governments and industries to do something about the problem.

Hence the debate is going to have to be fought in the public and political arena, and that is why it is so important that the general public understand the science behind it.

Next: The current status of scientific knowledge on global warming.

POST SCRIPT: Ohio Board of Education, district seven

Many members of Ohio's state Board of Education are elected. District Seven (comprising Summit, Portage, Ashtabula, and Trumbull Counties) is currently represented by Deborah Owens Fink, one of the most ardent advocates of inserting intelligent design creationism into Ohio's science standards and curriculum. She is being challenged by Dave Kovacs, who opposes her on this issue.

I have been asked to help publicize Kovacs' challenge. I don't know anything about him other than what is on his campaign website so this is not an endorsement. All I know, from my past experience with Ohio's science standards advisory board, is that Owens Fink has been a very negative influence on the Board.

Those who live in that region and care about this issue might want to look more closely into this contest.

July 31, 2006

Global warming-5: The emergence of a paradigm

The need to take global warming seriously is not slam-dunk obvious to most people. In my own case, over time I have slowly become convinced that there was an emerging consensus among scientists studying the issue that planetary warming was a serious matter. Like most people, I do not have the time or the expertise to have studied the question in detail, but I have enough respect for the scientific process and the way that scientists make collective judgments as a community that when I see a scientific consensus emerging on anything, I tend to take it seriously. In fact, the global warming issue is a great example of seeing, before our very eyes, a transition in science from a pre-paradigmatic state to a paradigmatic state.

In his book The Structure of Scientific Revolutions, Thomas Kuhn argued that during the early, pre-paradigmatic days of any scientific field, one has different schools of thought and different theories underlying them. These schools exist and function almost independently of one another. They investigate different problems, operate under different rules, and have different criteria for evaluating their successes and failures. Each develops along its own path and has its own adherents and practitioners. But at some stage, for a variety of reasons, the community of scientists coalesces around one school of thought, which becomes the dominant paradigm in that field, and all scientists start working within the framework of that paradigm.

This transition occurs at different times for different sciences. For optics, Newton's corpuscular theory was the first paradigm. For electricity, it was Franklin's theory. For geology, it was Lyell's work. In biology, the Darwinian theory was the first paradigm in evolution. It should be noted that the adoption of a paradigm does not mean that the paradigms are true or that the problems in that field were solved once and for all. Newton's optics paradigm and Franklin's electricity paradigm were completely overthrown later, and the advent of molecular genetics resulted in the early Darwinian theory being modified to what is now called the neo-Darwinian synthesis. But the adoption of a paradigm significantly alters the way that the scientific community does its work.

Once a scientific community adopts a paradigm, the way its members work changes. Before the adoption of a paradigm, each school of thought challenges the basic premises of the others, examines different problems, uses different tools and methods, and uses different criteria for evaluation of problems. Once a paradigm is adopted however, there are no more controversies over such basics. The scientific community now tends to agree on what problems are worth focusing on, they tend to use the same terminology and tools, and they share a common understanding of what constitutes an acceptable solution to a problem. Scientists who do not adapt to the dominant paradigm in their field become marginalized and eventually disappear.

The conversion of the scientific community to a new paradigm is usually a long drawn out process with many scientists resisting the change and some never breaking free of the grip of the old paradigm. Historian of Science Naomi Oreskes gives an example:

In the 1920s, the distinguished Cambridge geophysicist Harold Jeffreys rejected the idea of continental drift on the grounds of physical impossibility. In the 1950s, geologists and geophysicists began to accumulate overwhelming evidence of the reality of continental motion, even though the physics of it was poorly understood. By the late 1960s, the theory of plate tectonics was on the road to near-universal acceptance.

Yet Jeffreys, by then Sir Harold, stubbornly refused to accept the new evidence, repeating his old arguments about the impossibility of the thing. He was a great man, but he had become a scientific mule. For a while, journals continued to publish Jeffreys' arguments, but after a while he had nothing new to say. He died denying plate tectonics. The scientific debate was over.

So it is with climate change today. As American geologist Harry Hess said in the 1960s about plate tectonics, one can quibble about the details, but the overall picture is clear.

It should be emphasized that adoption of a paradigm does not mean that scientists think everything has been solved and that there are no more open questions. What it does mean, among other things, is that the methods used to investigate those questions are usually settled. For example, in evolution and geology, establishing the ages of rocks, fossils, and other objects is an important problem. Dating those items relies, among other methods, on radioactivity. This approach assumes that radioactive elements decay according to certain laws, that the decay parameters have not changed with time, and that the laws of physics and chemistry we now work with have been the same for all time and all over the universe. This common agreement on the basic framework enables geologists and evolutionists to speak a common language and arrive at results that they can agree on and build upon.
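As a rough illustration of how an age follows from the decay law (a minimal sketch with a made-up measurement, not any particular laboratory's procedure): the fraction of a parent isotope surviving after a time t is exp(-λt), where the decay constant λ is fixed by the isotope's half-life, and inverting this gives the age.

import math

# Potassium-40, commonly used for dating old rocks, has a half-life of about 1.25 billion years.
HALF_LIFE_K40_YEARS = 1.25e9
DECAY_CONSTANT = math.log(2) / HALF_LIFE_K40_YEARS  # lambda in N(t) = N0 * exp(-lambda * t)

def age_from_fraction_remaining(fraction_remaining):
    # Invert the decay law: t = -ln(N/N0) / lambda.
    # The whole method rests on the assumption that lambda has not changed over time.
    return -math.log(fraction_remaining) / DECAY_CONSTANT

# Hypothetical measurement: 80% of the original potassium-40 is still present in the sample.
print(f"Inferred age: {age_from_fraction_remaining(0.80):.2e} years")

The comment in the middle is the point at issue: the calculation is only as good as the assumption that the decay constant has stayed fixed, which is precisely the assumption some creationists reject.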

Some creationists, in order to preserve their notion of the universe being 10,000 years old or less, have either rejected radioactive dating entirely or jettisoned parts of it, such as that the radioactive decay constants have stayed the same over time. In doing so, they have stepped outside the framework of the paradigm and this is partly why they are not considered scientists. Kuhn's book discusses many other cases of this sort.

Kuhn argues that once a science has created its first paradigm, it never goes back to a pre-paradigm state where there is no single paradigm to guide research. Once a paradigm has been established, future changes are thenceforth only from an old paradigm to a new one.

A key marker that a science has left a pre-paradigmatic state and entered a paradigmatic state can be seen in the way that scientists communicate with each other and with the general public. In the pre-paradigmatic stage, the book is the primary form of publication, and these books are aimed at the general public as well as other scientists, with an eye to gaining more support among both groups. As a result, the books are not too technical and there is an ongoing dialogue between scientists and the public.

But after a paradigm is adopted, scientists are freed from the need to explain and justify the basic premises of the field to a lay public, and no longer have to make a political case to justify what they are doing. They now tend to communicate only with each other. This results in them developing a more technical insider language and jargon that is opaque to nonscientists, and the chief means of communication becomes the technical paper addressed to similarly trained scientists and published in specialized journals. They start addressing their arguments only to those who work within their own narrow field of specialization. As a result of this increased efficiency in communication, science then tends to start making very rapid progress, and the rules by which scientific theories get modified and changed become different. It now becomes much harder to overthrow an established paradigm, although it can and does still happen.

But one consequence of this change in communication patterns is that, as in the global warming case, a disconnect can emerge between the consensus beliefs of scientists and the general public, and how to combat this is an interesting question.

Next: What happens to the public after a science becomes paradigmatic.

POST SCRIPT: Request for information

During the week of August 14, I will be driving with my daughter to San Francisco. Driving across the US is something I have always wanted to do to get a chance to personally experience the vastness of this country and some of its natural beauty.

We will be stopping near Denver to visit some friends on the way. I was wondering if people had any recommendations about the sights we should see between Denver and San Francisco. Here are some constraints:

1. I would like to see natural beauty as opposed to human creations. So suggestions about which national parks are worth a visit and what specific things should be seen in those parks would be most welcome.

2. We don't have much time and I cannot hike, so the sights should be such that they are accessible using an ordinary car (not an SUV or other type of off-road vehicle).

July 28, 2006

Global warming-4: Is there a scientific consensus on global warming?

Is there a scientific consensus on global warming? Naomi Oreskes of the Department of History and Science Studies Program at the University of California, San Diego, thinks so. She published a study in the journal Science (December 3, 2004, volume 306, p. 1686) which argued that the scientific community had arrived at a consensus position on "anthropogenic climate change," i.e., that global warming was occurring, and that "Human activities . . . are modifying the concentration of atmospheric constituents . . . that absorb or scatter radiant energy. . . . [M]ost of the observed warming over the last 50 years is likely to have been due to the increase in greenhouse gas concentrations."

Her study looked at the scientific databases of the Institute for Scientific Information (ISI) and searched on the keywords "climate change." She then examined the abstracts of the 928 papers that were returned and classified them under six categories: explicit endorsement of the consensus position, evaluation of impacts, mitigation proposals, methods, paleoclimate analysis, and rejection of the consensus position.

Her results were that "75% fell into the first three categories, either explicitly or implicitly accepting the consensus view; 25% dealt with methods or paleoclimate, taking no position on current anthropogenic climate change. Remarkably, none of the papers disagreed with the consensus position." (my italics)

She is careful to point out that some of the authors of the minority 25% may not have agreed with the consensus view, but none of those papers explicitly took such a stand. She also pointed out that scientific bodies such as the Intergovernmental Panel on Climate Change (IPCC, created in 1988 by the World Meteorological Organization and the United Nations Environment Programme), the National Academy of Sciences (2001), the American Meteorological Society (2003), the American Geophysical Union (2003), and the American Association for the Advancement of Science (AAAS) had all issued statements endorsing the consensus viewpoint.

This does not necessarily mean that there is complete unanimity among scientists about all aspects of this issue. Richard Lindzen, an MIT professor of meteorology and a member of the NAS panel on climate change that issued the report cited by Oreskes, argued in a Wall Street Journal op-ed on June 11, 2001, that as far as he was concerned, all he was agreeing with was "(1) that global mean temperature is about 0.5 degrees Celsius higher than it was a century ago; (2) that atmospheric levels of carbon dioxide have risen over the past two centuries; and (3) that carbon dioxide is a greenhouse gas whose increase is likely to warm the earth (one of many, the most important being water vapor and clouds)." But he went on to say that "we are not in a position to confidently attribute past climate change to carbon dioxide or to forecast what the climate will be in the future," and he argued that the case for reducing the level of carbon dioxide emissions, as called for by the Kyoto treaty in 1997, was not compelling. He also argued that the warming we currently observe may be part of the normal cyclical variations of the Earth, and that other greenhouse gases (such as water vapor and methane) may be more important players in producing warming than carbon dioxide.

Lindzen repeated many of the same arguments in a critical review of the documentary An Inconvenient Truth that appeared on the op-ed page of the Wall Street Journal on June 27, 2006, where he also explicitly challenged Oreskes' 2004 study.

In a previous post on belief preservation I wrote about the fact that there are many strategies that can be adopted to preserve one's existing beliefs. The latest issue of Physics and Society, published by the American Physical Society (vol. 35, no.3, July 2006) illustrates this. It has a letter (p. 25) by a global warming skeptic who also argues for the "natural cycles" theory and also adds that the Earth is so big that human activity is unlikely to have an impact on it. Looking on the bright side, the author argues that some parts of the Earth are too cold now anyway, and that even if global warming should occur, we might be better off figuring out better crops that can be grown in warmer conditions, and taking steps to protect ourselves from the flooding that would ensue from the rise of ocean levels.

This raising of alternative speculative ideas against a scientific consensus is not uncommon and can confuse non-scientists into asking "Well, is there a scientific consensus or not?" This sense of confusion is encouraged by those industries (such as automobile and energy) that are the chief producers of carbon dioxide and that oppose actions that would require them to reduce emissions. Such people know that if there is a sense of controversy over an issue, and especially if that issue has economic costs associated with it, the natural impulse of the general public is to wait until the dust settles and a clear policy emerges. So kicking up dust is a good strategy if you want nothing to be done. This is not unlike what was done by the tobacco industry concerning the adverse health effects of smoking (an effort which ultimately failed) and by intelligent design creationists concerning evolution (which is ongoing). These people take advantage of the media's propensity to do "on the one hand, on the other hand" type stories, balancing the quotes of scientists warning of the dangers of warming with those of skeptics. This results in a much wider divergence in media coverage of the global warming issue than exists in the scientific community.

All these interests have used such strategies to dispute the conclusion that there is a scientific consensus that anthropogenic global warming is occurring. Oreskes addresses these arguments head-on in a recent Los Angeles Times op-ed on July 24, 2006:

[S]ome climate-change deniers insist that the observed changes might be natural, perhaps caused by variations in solar irradiance or other forces we don't yet understand. Perhaps there are other explanations for the receding glaciers. But "perhaps" is not evidence.

The greatest scientist of all time, Isaac Newton, warned against this tendency more than three centuries ago. Writing in "Principia Mathematica" in 1687, he noted that once scientists had successfully drawn conclusions by "general induction from phenomena," then those conclusions had to be held as "accurately or very nearly true notwithstanding any contrary hypothesis that may be imagined. . . "

Climate-change deniers can imagine all the hypotheses they like, but it will not change the facts nor "the general induction from the phenomena."

None of this is to say that there are no uncertainties left - there are always uncertainties in any live science. Agreeing about the reality and causes of current global warming is not the same as agreeing about what will happen in the future. There is continuing debate in the scientific community over the likely rate of future change: not "whether" but "how much" and "how soon." And this is precisely why we need to act today: because the longer we wait, the worse the problem will become, and the harder it will be to solve.

The fact that you never run out of alternative hypotheses and explanations for anything is an important point to realize. Philosopher of science Pierre Duhem addressed this way back in 1906 in his book The Aim and Structure of Physical Theory when he pointed out that you can never arrive at a correct theory by a process of eliminating all the possible alternatives because "the physicist is never sure that he has exhausted all the imaginable assumptions."

It is easy to come up with alternative explanations for any phenomenon. That is why evidence plays such an important role in evaluating theories and scientists use published research in peer-reviewed journals as indicators of whether an idea has any merit or not. And Oreskes' 2004 (peer reviewed) study in Science, showing that in the technical (peer-reviewed) journals a scientific consensus exists on anthropogenic climate change, has to be taken seriously. As she says in that paper:

The scientific consensus might, of course, be wrong. If the history of science teaches anything, it is humility, and no one can be faulted for failing to act on what is not known. But our grandchildren will surely blame us if they find that we understood the reality of anthropogenic climate change and failed to do anything about it.

Sensible words. But if you prefer, you can always listen to George Bush's ideas about global warming, courtesy of Will Ferrell.

July 27, 2006

Global warming-3: The science behind global warming

To understand the science behind global warming, it may be helpful to look at a simplified version of it.

Consider two objects, one that is luminous (i.e., an object that we can see without the aid of a light source) and another that is not luminous. Examples of luminous objects are the Sun (which generates energy due to nuclear reactions within it and sends a lot of that energy out as light) or a light bulb (that converts electrical energy into light energy). Examples of non-luminous objects are the Earth or a person in a room. The energy radiated by the luminous source spreads out in all directions and some of it will fall on the non-luminous object.

What is important to understand is that even a seemingly non-luminous object radiates energy into space. In fact, every object radiates energy, so in a sense every object is 'luminous', but we usually reserve that term for objects that emit visible light. Not all radiated energy is visible. A human being radiates energy at a rate of about 500 watts, the equivalent of five 100-watt bulbs, but we do not "see" the radiation emitted by people because it lies outside the visible range.

The rate at which an object radiates energy depends to a large extent on its temperature (it actually goes as the fourth power of the temperature) and on the nature of its surface (such as color, texture, and material). So just as the Sun radiates energy into space, so does the Earth, except that the Sun's radiation is much greater since it is at a much higher temperature.

The important thing about global warming is understanding what happens when the energy radiated by a luminous source (say the Sun) falls upon a non-luminous object (say the Earth). Part of it is immediately reflected back into space and does not affect the temperature of the Earth. The rest is absorbed by the Earth and, in the absence of anything else happening, will tend to cause the Earth's temperature to rise. The relative amounts of the Sun's energy that are absorbed and reflected by the Earth depend on the nature of the Earth's surface. (As an example, a person in a room absorbs energy from the surroundings at a rate of about 400 watts, so adding a person to a room is the net heat equivalent of turning on a 100-watt bulb.)

But as the temperature of the object rises due to it absorbing energy, the amount it radiates out again also increases, and at some point the object reaches equilibrium, which occurs when the energy absorbed by it from outside equals the energy it radiates away. Once an object reaches this state of thermal equilibrium, its temperature stays steady.

If for some reason we alter the ratio of energy absorbed by the Earth to energy reflected, then the state of equilibrium is disturbed and the Earth's temperature will shift to a new equilibrium value. If relatively more energy gets absorbed, the equilibrium temperature will rise until the energy radiated again becomes equal to the energy absorbed. Conversely, if relatively more energy gets reflected, the equilibrium temperature will drop, i.e., the Earth will cool. Those warning of global warming argue that human activity is causing the former situation, because we are changing the nature of the Earth's surface and, especially, its atmosphere.
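To make the idea of a shifting equilibrium concrete, here is a minimal sketch of the standard textbook energy-balance toy model (it is not a climate model, and it deliberately ignores the atmosphere): setting the sunlight absorbed by the Earth equal to the energy it radiates fixes an equilibrium temperature, and changing the fraction of sunlight reflected (the albedo) shifts that temperature.

# Toy energy balance: absorbed sunlight = radiated energy (Stefan-Boltzmann law).
SOLAR_CONSTANT = 1361.0      # watts per square meter arriving from the Sun
STEFAN_BOLTZMANN = 5.67e-8   # watts per square meter per kelvin to the fourth power

def equilibrium_temperature(albedo):
    # Average the incoming sunlight over the whole sphere (hence the factor of 4)
    # and solve sigma * T^4 = absorbed power for T.
    absorbed = SOLAR_CONSTANT * (1.0 - albedo) / 4.0
    return (absorbed / STEFAN_BOLTZMANN) ** 0.25

for albedo in (0.30, 0.28):  # a smaller albedo means less sunlight reflected (less snow cover, say)
    print(f"albedo {albedo:.2f}: equilibrium temperature {equilibrium_temperature(albedo):.1f} K")

With the usual albedo of about 0.3 this gives roughly 255 K, well below the actual average surface temperature of about 288 K; the difference is the warming supplied by the greenhouse gases discussed below.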

To understand what is happening at the Earth's surface and atmosphere, we need to understand something about the energy radiated by the Sun. This comes largely in the form of "electromagnetic energy." This is an umbrella term that encompasses X-rays, ultraviolet, light waves, infrared, microwaves, radio waves, etc. All these types of radiation are identical except for one single factor, which is called the wavelength of the radiation. The items in the list differ only in their wavelengths, with X-rays having the smallest wavelength and radio waves having the longest. (Similarly, all colors of visible light are also identical except for the wavelength, which increases as you go from blue to green to yellow to red.)

When this broad range of electromagnetic radiation from the Sun hits the Earth's atmosphere, almost all of it, except the visible light portion, gets absorbed by the atoms and molecules in the atmosphere and does not reach us on the ground. Of the portion that does reach the ground, some of it gets directly reflected unchanged and escapes back into space. The remainder gets absorbed by the ground. It is the energy that is absorbed by the ground that is the source of concern.

Recall that the Earth, like any object, also radiates energy away. But since the temperature of the Earth is different from the temperature of the Sun, the distribution of the wavelengths in the energy radiated by the Earth is different from the distribution that we receive from the Sun (although the total energy involved is the same in both cases for an object in equilibrium). This affects how much is absorbed by the atmosphere as it passes through it. Some of the Earth's radiation will get absorbed by the gases in the atmosphere (i.e., is trapped), while the rest passes through and goes off into space.

This is a crucial point. If the gases in the atmosphere change significantly, then the relative amounts of the Earth's radiated energy that escape into space and that are trapped by the atmosphere also change. The so-called 'greenhouse gases' (carbon dioxide, water vapor, methane, nitrous oxide, and others) are those that are very good at absorbing energy at the wavelengths radiated by the Earth, preventing it from escaping into space.

Global warming scientists argue that human activity is increasing the concentration of greenhouse gases (especially carbon dioxide) in the atmosphere. Hence more of the energy radiated by the Earth is being absorbed and less is escaping into space. Note that the incoming visible light from the Sun is not affected much, since it lies at wavelengths that the greenhouse gases do not absorb as strongly. As a result of this increased absorption of the outbound radiation, the equilibrium temperature of the Earth will rise.
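The wavelength mismatch between incoming and outgoing radiation is easy to quantify with Wien's displacement law, which says that the peak wavelength of an object's radiation is inversely proportional to its temperature. A minimal sketch (the temperatures are just round representative values):

# Wien's displacement law: peak wavelength = b / T.
WIEN_CONSTANT = 2.898e-3  # meter-kelvin

def peak_wavelength_micrometers(temperature_kelvin):
    return WIEN_CONSTANT / temperature_kelvin * 1e6

print(f"Sun   (about 5800 K): peak near {peak_wavelength_micrometers(5800):.2f} micrometers (visible light)")
print(f"Earth (about 288 K):  peak near {peak_wavelength_micrometers(288):.1f} micrometers (infrared)")

Greenhouse gases absorb strongly in the infrared but are largely transparent to visible light, which is why they trap the outgoing radiation without blocking much of the incoming sunlight.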

At this point, there are various scenarios that can unfold. One is that we arrive at a new, higher, but stable equilibrium temperature. If the change in equilibrium temperature is small, the consequences might not be too disastrous, although there will be some adverse effects, such as temperature-sensitive organisms (corals, for example) being destroyed or some species going extinct if they cannot evolve mechanisms to cope. If the change is large, then there could be massive floods and droughts and other catastrophes.

The worst case scenario is a kind of runaway effect, where a rise in temperature results in effects that cause an even more rapid rise in temperature and so on, in a series of cascading effects.

Some argue that we are already seeing signs of runaway effects, and point to the melting of the polar ice caps and the general decrease in glaciers and snow coverage worldwide. Snow is white and thus reflects back into space, unchanged, almost all the sunlight that hits it at the Earth's surface. When this snow melts and becomes water, not only is the amount of reflected energy decreased, but the water absorbs light energy. Hence the loss of snow cover (apart from its adverse environmental and ecological consequences) has a major effect on the reflection/absorption balance of the Earth, shifting it towards greater absorption. So more energy is absorbed by the Earth, resulting in even greater warming, which leads to further snow loss, and so on.

Another possible runaway factor is the amount of green cover. On balance, plants, because of photosynthesis, are net absorbers of carbon dioxide and emitters of oxygen, so they reduce one of the greenhouse gases. If global warming results in less green cover of the Earth (say, because of prolonged droughts), more greenhouse gases remain in the atmosphere, causing yet more warming and more droughts. Human activity such as deforestation can accelerate this process.

Those are the basic elements of the science underlying global warming and the factors that go into building the models that try to predict long term climate change.

Next: The emerging scientific consensus over global warming.

POST SCRIPT: Colbert takes media apart again

As you may recall, the mainstream media did not take kindly to Stephen Colbert's demolishing them at the White House Correspondents Association Dinner. Now he takes them apart again.

July 26, 2006

Global warming-2: Understanding the problem

Understanding global climate concerns is not easy, because it is a complex issue: it involves many factors, is based on data that span millennia and are not easy to extract, requires sophisticated theories and computer modeling, and depends on long chains of inferential reasoning to arrive at conclusions. Compared to it, evolution, that other anathema of Bush and his anti-science Christian base, is a model of clarity.

At least with evolution, the progression shows a clear pattern, with life evolving from simple single cell organisms to the wide array of complex multi-cell systems we see today. If we started discovering anomalous organisms that seem to violate that temporal ordering, that would require a major restructuring of evolutionary theory.

With global warming, on the other hand, there isn't such a steady progression. It is not as if global warming implies that the temperature at each and every location on the Earth rises steadily with time. If it did, then people might be more easily convinced. But that is not how it works. Instead, the relevant data always deal with averages that are calculated (1) over very long time scales (involving tens and hundreds and thousands and even millions of years) and (2) over the whole planet or at least large areas of it.

It is quite possible to have wide fluctuations over shorter time periods and in localized areas that go counter to the long-term trend. Unfortunately, this means that there are plenty of opportunities for those who either do not understand that only averages are relevant, or who are deliberately trying to mislead others, to seize upon these fluctuations to argue that global warming is either not occurring or is not a serious problem. I can confidently predict that if, for example, the next winter is colder than average in Cleveland, there will be many snickering comments to the effect that this 'proves' that global warming is a myth. Similarly, the current heat waves in France and California cannot, by themselves, be used to argue in favor of global warming either. Scientists' conclusions will be unaffected, since they know that data from a single year or location have only a tiny effect on the averages.
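A minimal sketch with entirely made-up numbers (not real temperature data) shows why a single cold year proves nothing: even with a steady underlying warming trend, year-to-year noise produces plenty of years that are colder than the one before, while the trend shows up much more clearly in the decade averages.

import random

random.seed(1)

# Invented series: a steady warming trend of 0.02 degrees per year
# plus year-to-year noise of a few tenths of a degree.
years = list(range(1960, 2010))
temps = [15.0 + 0.02 * (y - 1960) + random.gauss(0, 0.3) for y in years]

colder_than_previous = sum(1 for earlier, later in zip(temps, temps[1:]) if later < earlier)
print(f"Years colder than the year before: {colder_than_previous} out of {len(temps) - 1}")

# The underlying trend is far easier to see in the decade averages.
for start in range(1960, 2010, 10):
    decade = temps[start - 1960 : start - 1960 + 10]
    print(f"{start}s average: {sum(decade) / len(decade):.2f} degrees")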

These are the questions that need to be considered when we evaluate whether global warming is serious or not.

1. Is warming occurring? In other words, are average temperatures rising with time?

2. If so, is it part of normal cyclical warming/cooling trends that have occurred over geologic time or is the current warming going outside those traditional limits?

3. Are the consequences of global warming such that we can perhaps live with them (slightly milder winters and warmer summers) or are they going to be catastrophic (causing massive flooding of coastal areas due to rising ocean levels, severe droughts, blistering heat waves, total melting of the polar regions, widespread environmental and ecological damage)?

4. How reliable are the theories and computer models that are being used study this question?

5. What are the causes of global warming? Is human activity responsible and can the process be reversed?

My own ideas on this issue have changed over time. I started out somewhat neutral, not sure whether warming was occurring or not. Like most people, I didn't really understand questions about climate and tended to make the mistake of equating climate with weather. My understanding of weather was strongly influenced by the one feature of weather that we all grow up with, namely its variability and unpredictability. This tends to create a strongly ingrained belief that we cannot really predict weather, and I am sure this spills over into thinking that climate is also highly variable, so that we need not worry too much about warming since it might just as easily reverse itself.

But the key difference between weather and climate is that while weather systems are chaotic, climate change is not, at least as far as I am aware. In everyday language, chaos means just mess and disorder and confusion. But chaos, in science, is a technical term with a precise meaning. A chaotic system is one that progresses according to particular kinds of mathematical equations, usually coupled non-linear ones, such that the end state of the system is highly sensitive to initial conditions.

With non-chaotic systems, like a thrown ball, a small change in the initial conditions results in small changes in the final state. If I throw the ball slightly faster or at a slightly different angle, the end point of its trajectory will be only slightly different as well. This is what enables us to have expert athletes in any sport involving thrown or struck balls, because based on previous attempts, the professionals know how to make slight adjustments to hit a desired target. The reason that they can do so is because the ball's trajectory obeys non-chaotic dynamical equations.

But with a chaotic system, that is no longer true. A change in the initial conditions, however small, can result in the end state being wildly different, with the divergence increasing with time. But in order to predict the future of any system, we need to specify its current conditions, and since we can never know the initial conditions with perfect accuracy, reliable long-term predictions are impossible. An analogy for a chaotic system might be river rapids. If you place a leaf at one point in the rapids, it might end up at some point further down the river; but place it again even a tiny distance from its original starting point and it can end up somewhere completely different, even if the river flow itself is unchanged.

For example, suppose the mathematical quantity pi enters into a calculation. The value of pi = 3.1415927. . . is a sequence of digits that goes on forever, but in performing actual calculations we cannot punch an infinite sequence of digits into our computers and need to truncate it. For most problems (which are non-chaotic) we can treat pi as being equal to 3.14 or 22/7 or even just 3 and get fairly good results, adjusting the precision of the input to match the required precision of the output. But if pi were part of a chaotic system of equations, then using 3.1415927 rather than the rounded value 3.141593 could give wildly different results. This is why this kind of chaos is better described as "extreme sensitivity to initial conditions."
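To see what this extreme sensitivity looks like in practice, here is a minimal sketch using the logistic map, a standard toy example of a chaotic system (it is not a weather model): two starting values that agree to five decimal places soon end up nowhere near each other.

def logistic_map(x, steps, r=4.0):
    # Repeatedly apply x -> r * x * (1 - x), a textbook chaotic map when r = 4.
    for _ in range(steps):
        x = r * x * (1.0 - x)
    return x

a, b = 0.200000, 0.200001  # initial conditions differing only in the sixth decimal place
for steps in (5, 20, 40):
    print(f"after {steps:2d} steps: {logistic_map(a, steps):.6f} vs {logistic_map(b, steps):.6f}")

After a handful of steps the two trajectories are still close; a few dozen steps later they bear no relation to each other, which is essentially why weather forecasts lose their value after several days.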

Weather is thought to obey a chaotic system of equations. This is why, despite "Doppler radar" and other innovations that can give quite accurate measures of the state of weather-related parameters at any given time, weather forecasts become notoriously unreliable after three or four days, or even fewer. There is a reason that your local TV newscasts do not go beyond five-day weather forecasts. They are at the limits of predictability and already pushing their luck.

But the equations that drive climate calculations are not believed to be chaotic. Hence, given a model, one can hope to make reasonable predictions about global temperatures in the next century with some confidence in their reliability, even though one does not know if it is going to rain next week.

(In the terminology of chaos theory, sometimes climate is referred to as a "strange attractor" of the weather system, or a "boundary value problem," whereas weather is an "initial value problem." Basically, weather and climate are thought to evolve according to different kinds of mathematics.)

It is important to realize that the predictability of the results is possible only once a particular model of climate change has been chosen. One could get different results by choosing a different model altogether, although the range of possible models is strongly limited because they have to conform to the fundamental laws of science and be compatible with what we know about the behavior of related systems. The difference with weather is that with weather one can get very different results while using the same model, simply because of our inability to specify exactly the initial values of the problem.

Next: The emerging scientific consensus over global warming.

July 25, 2006

Global warming

It is undoubtedly true that, while the increasing level of warfare in the Middle East is the immediate issue of concern, the question of global warming is the preeminent long-term issue facing the planet today. It represents one of the rare situations in which the health of the entire planet is at stake. The only other thing with similarly global consequences is an all-out nuclear war between major nuclear powers, since that too could unleash an atmospheric catastrophe that could destroy the planet.

But while we can avoid a nuclear winter by simply doing nothing, i.e. not using the weapons, global warming is an issue where doing nothing is the problem. A strong case has been made that if we continue on the present course, the planet is going to suffer irrevocable harm, changing its climate and weather patterns in ways that will dramatically affect our lives, if not actually destroy them.

One would think that global warming is one scientific question where politics would play a minor role, and where the debate would be based purely on scientific evidence and judgments. Unlike issues like stem cell research and cloning, where the scientific questions have to contend with religion-based arguments, as near as I can tell the Bible, Koran, and other religious texts are pretty much agnostic (so to speak) on the issue of whether global warming is something that god has strong views on. While god has a lot to say about things like the proper ways to sacrifice animals or how sinners should be put to death, he seems not to be concerned about the weather, except for using it as a tactical weapon, like unleashing the occasional deluge to drown everyone but Noah and his family or creating a storm to chastise his prophet Jonah.

Hence it is surprising that some people (including the Bush administration) perceive the case being made that global warming is a serious problem as some kind of 'liberal' plot, tarring the proponents of the idea that global warming is real and serious as political enemies, seeking to somehow destroy truth, justice, and the American way. Glenn Greenwald argues that this is the standard mode of operation of the Bush administration, saying "What excites, enlivens, and drives Bush followers is the identification of the Enemy followed by swarming, rabid attacks on it."

Once that bugle call of politics sounded, Bush devotees dutifully fell into line. They know the script and exactly what they must do and have rallied to the cause, trying to discredit the scientific case and the scientists behind it, arguing that the whole global warming thing is a fabricated crisis, with nothing more to be worried about than if we were encountering just a warm summer's day. Senator James Inhofe (R-OK) says "With all of the hysteria, all of the fear, all of the phony science, could it be that man-made global warming is the greatest hoax ever perpetrated on the American people? It sure sounds like it." And this man is the Chair of the Senate's Committee on Environment and Public Works.

The administration and its supporters have gone to surprisingly extreme lengths to suppress alarms about climate change, such as changing the wording of reports by government scientists in order to play down the threat of global warming, and muzzling government climate experts to prevent information from getting to the public.

Take another example, in which the administration has sought to divert government scientists' focus from global warming:

From 2002 until this year, NASA's mission statement, prominently featured in its budget and planning documents, read: "To understand and protect our home planet; to explore the universe and search for life; to inspire the next generation of explorers. . .as only NASA can."

In early February, the statement was quietly altered, with the phrase "to understand and protect our home planet" deleted. In this year's budget and planning documents, the agency's mission is "to pioneer the future in space exploration, scientific discovery and aeronautics research."

David E. Steitz, a spokesman for the National Aeronautics and Space Administration, said the aim was to square the statement with President Bush's goal of pursuing human spaceflight to the Moon and Mars.

But the change comes as an unwelcome surprise to many NASA scientists, who say the "understand and protect" phrase was not merely window dressing but actively influenced the shaping and execution of research priorities. Without it, these scientists say, there will be far less incentive to pursue projects to improve understanding of terrestrial problems like climate change caused by greenhouse gas emissions.

"We refer to the mission statement in all our research proposals that go out for peer review, whenever we have strategy meetings," said Philip B. Russell, a 25-year NASA veteran who is an atmospheric chemist at the Ames Research Center in Moffett Field, Calif. "As civil servants, we're paid to carry out NASA's mission. When there was that very easy-to-understand statement that our job is to protect the planet, that made it much easier to justify this kind of work."

Several NASA researchers said they were upset that the change was made at NASA headquarters without consulting the agency's 19,000 employees or informing them ahead of time.
. . .
The "understand and protect" phrase was cited repeatedly by James E. Hansen, a climate scientist at NASA who said publicly last winter that he was being threatened by political appointees for speaking out about the dangers posed by greenhouse gas emissions.

The attempts to downplay the extent of the problem, divert attention away from actions to study and remedy it, and distort the science behind the global warming issue have been helped by the fact that although the consensus conclusions of the scientific community are pretty straightforward (that global warming is occurring, that it is largely caused by human activity, and that we need to take steps to reverse it or face disastrous consequences), the actual science behind it is complicated. This enables those who wish to blur the issue to find ways to cast doubt on that scientific consensus.

Next: Understanding the problem

July 14, 2006

The origin of life

Darwin's theory of evolution by natural selection deals with the question of how life evolves and does not directly address the question of the origin of life itself. The fields of cosmology, physics, and chemistry have provided models of how the universe evolved and created the solar system, among other things. But those theories do not explain how organic molecules, the basic building blocks of life, came about.

An article by Gareth Cook in the August 14, 2005 issue of the Boston Globe examined this question in the light of an initiative (known as the "Origins of Life in the Universe Initiative") by then Harvard president Lawrence Summers to invest millions to investigate this important question, partly in an effort to have Harvard catch up with the leaders in this field at the University of Arizona, the California Institute of Technology, and the Scripps Research Institute in La Jolla, Calif.

Cook says that the questions to be addressed are: "How can life arise from nonlife? How easy is it for this to happen? And does the universe teem with life, or is Earth a solitary island?"

Scientists generally work on the assumption that the laws of physics and chemistry that we work with on Earth apply everywhere in the universe. But those laws need not produce the same environment on every planet, and since it is the environment that determines the nature of the life forms that come into being, the laws of biology could be, and in fact would likely be, quite different from planet to planet, depending on the environment that existed at the time living organisms arose there.

Of course, we have no evidence right now that life forms exist on other planets. But "biologists have been finding that life can survive in much more hostile environments than thought possible -- such as microbes that live deep in rock or in searingly acidic water -- meaning that planets with more extreme environments might support life", lending support to the idea that life is likely to be found elsewhere.

Hence an important related question is how different environments came into existence on different planets, and how the nature of life is related to the environment that produced it.

Cook's article summarizes some theories for the origins of life. The first is the famous 1953 Miller-Urey experiment: "A flask, containing elements of the early Earth's atmosphere, was jolted with electricity, like bolts of lightning. This simple setup created a wealth of organic molecules, but since [then], the prevailing view of the makeup of the early atmosphere has changed, and the experiment doesn't work well with the new recipe."

Others have suggested that "organic molecules could have been carried to Earth in the icy core of comets" (which presupposes the existence of life elsewhere and does not really answer the question of how life began, only how it began on Earth), or that "life began near the intense heat of deep sea vents, an environment that drives unusual reactions."

Yet other possibilities exist, such as the idea of chemist Scot Martin, who

"believes that ultraviolet light from the sun, shining down on tiny mineral crystals floating near the surface of the early ocean, may have generated organic compounds.

In his flask, he has shown that molecules of bicarbonate, common in the early ocean, attach themselves to a mineral called sphalerite. When the ultraviolet light hits the sphalerite, it sets off a chain of events that makes the bicarbonate more reactive, and that leads to a wide range of organic compounds in Martin's flask."

I find reports of this kind of research exciting because of the deep questions they address. It is undoubtedly challenging work, and finding answers will require intense effort by many dedicated scientists over many years. What will keep them going, apart from funding, is the belief that scientists have in methodological naturalism, the idea that the only thing that stands between them and answers to these important questions is their ingenuity.

As David R. Liu, a professor of chemistry and chemical biology at Harvard, says: "We start with a mutual acknowledgment of the profound complexity of living systems," and continues, "my expectation is that we will be able to reduce this to a very simple series of logical events that could have taken place with no divine intervention."

Of course not everyone is happy with that last thought. Those who seek to preserve a role for god are hoping that this effort fails, as their claim for the inexplicability of the origin of life is almost their last refuge, perhaps behind only consciousness and the mind. Cook reports:

Michael Behe, a biologist at Lehigh University in Pennsylvania and one of the leading proponents of intelligent design, said he was glad that Harvard was going to try to address the issue.

"If, as I suspect will happen," Behe said, "they fail to find a plausible answer without invoking intelligence, then maybe science will be less hostile to folks who see intelligent direction in the history of life," he said.

To my mind, this sentiment captures perfectly the anti-science view of the intelligent design creationism (IDC) people, and shows very clearly why IDC should never be part of science. When Behe says he "suspects" that answers won't be found, he really means "hopes," since he has no basis for his suspicions except his faith that god created life. The IDC people actually want to see science fail to answer an important question in order to preserve their religious beliefs.

People with such attitudes can never do really good science because they will willingly and happily give up at the first sign of difficulty and let god do the explanatory heavy lifting. To do science at the frontiers requires one to be willing to work very hard, overcoming setback after setback, spurred on by the belief that an answer exists and is discoverable. The IDC people, always eager to pull god out of their hip pockets to answer tough questions, just do not have what it takes.

We can let Richard Dawkins have the last word on this (thanks to MachinesLikeUs.com):

You see, if you say something positive like the whole of life – all living things – is descended from a single common ancestor which lived about 4,000 million years ago and that we are all cousins, well that is an exceedingly important and true thing to say and that is what I want to say. Somebody who is religious sees that as threatening and so I am represented as attacking religion, and I am forced into responding to their reaction. But you do not have to see my main purpose as attacking religion. Certainly I see the scientific view of the world as incompatible with religion, but that is not what is interesting about it. It is also incompatible with magic, but that also is not worth stressing. What is interesting about the scientific world view is that it is true, inspiring, remarkable and that it unites a whole lot of phenomena under a single heading. And that is what is so exciting for me.

POST SCRIPT: No Joementum!

The indispensable Stephen Colbert looks at democracy and the Democratic senatorial primary in Connecticut.

June 22, 2006

What the neuroscience community thinks about the mind/brain relationship

The idea that the mind is purely a product of the material in the brain has profound consequences for religious beliefs, which depend on the idea of the mind as an independent controlling force. The very concept of 'faith' implies an act of free will. So the person who believes in a god is pretty much forced to reject the idea that the mind is purely a creation of the brain. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Pope John Paul II struck a similar theme in a 1996 address focusing on science, in which he said theories of evolution that "consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person."

As I wrote yesterday, the flagging intelligent design creationism (IDC) movement seems to be hoping for some fresh energy to emerge from the work of psychiatric researcher Dr. Schwartz. Or at the very least they may be hoping that they can persuade the public that the mind does exist independently of the brain. But they are going to have a hard time getting traction for this idea within the neurobiology community. There seems to be a greater degree of unanimity among them about the material basis of the mind than there is among biologists about the sufficiency of natural selection.

Stephen F. Heinemann, president of the Society for Neuroscience and a professor in the molecular-neurobiology lab at the Salk Institute for Biological Studies, in La Jolla, Calif., echoed many scientists' reactions when he said in an e-mail message, "I think the concept of the mind outside the brain is absurd."

But the ability of the neurobiology community to do their work unfettered by religious scrutiny may be coming to an end as increasing numbers of people become aware of the consequences of accepting the idea that the mind is purely a product of the brain. People might reject this idea (and be attracted to the work of Dr. Schwartz), not because they have examined and rejected the scientific evidence in support of it, but because it threatens their religious views. As I discussed in an earlier posting, people who want to preserve a belief system will accept almost any evidence, however slender or dubious, if it seems to provide them with an option of retaining it. As the article says:

Though Dr. Schwartz's theory has not won over many scientists, some neurobiologists worry that this kind of argument might resonate with the general public, for whom the concept of a soul, free will, and God seems to require something beyond the physical brain. "The truly radical and still maturing view in the neuroscience community that the mind is entirely the product of the brain presents the ultimate challenge to nearly all religions," wrote Kenneth S. Kosik, a professor of neuroscience research at the University of California at Santa Barbara, in a letter to the journal Nature in January.
. . .
Dr. Kosik argues that the topic of the mind has the potential to cause much more conflict between scientists and the general public than does the issue of evolution. Many people of faith can easily accept the tenets of Darwinian evolution, but it is much harder for them to swallow the assumption of a mind that arises solely from the brain, he says. That issue he calls a "potential eruption."

When researchers study the nature of consciousness, they find nothing that persuades them that the mind is anything but a product of the brain.

The reigning paradigm among researchers reduces every mental experience to the level of cross talk between neurons in our brains. From the perspective of mainstream science, the electrical and chemical communication among nerve cells gives rise to every thought, whether we are savoring a cup of coffee or contemplating the ineffable.
. . .
Mr. [Christof] Koch [a professor of cognitive and behavioral biology at the California Institute of Technology] collaborated for nearly two decades with the late Francis Crick, the co-discoverer of DNA's structure, to produce a framework for understanding consciousness. The key, he says, is to look for the neural correlates of consciousness - the specific patterns of brain activity that correspond to particular conscious perceptions. Like Crick, Mr. Koch follows a strictly materialist paradigm that nerve interactions are responsible for mental states. In other words, he says, "no matter, never mind."

Crick summed up the materialist theory in The Astonishing Hypothesis: The Scientific Search for the Soul (Scribner, 1994). He described that hypothesis as the idea that "your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules."

What many people may find 'astonishing' about Crick's hypothesis is that among neurobiologists it is anything but astonishing. It is simply taken for granted as the way things are. Is it surprising that religious believers find such a conclusion unsettling?

Next: What does "free will" mean at a microscopic level?

POST SCRIPT: Why invading Iraq was morally and legally wrong

Jacob G. Hornberger, founder and president of The Future of Freedom Foundation, has written a powerful essay that lays out very clearly why the US invasion and occupation of Iraq is morally and legally indefensible, and why it has inevitably led to the atrocities that we are seeing there now, where reports are increasingly emerging of civilians being killed by US forces. Hornberger writes "I do know one thing: killing Iraqi children and other such “collateral damage” has long been acceptable and even “worth it” to U.S. officials as part of their long-time foreign policy toward Iraq."

The article is well worth reading.

June 21, 2006

IDC gets on board the brain train

An article titled Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) examined what neuroscientists are discovering about religion and the brain. It is a curious article. The author (Richard Monastersky) seems to be trying very hard to find evidence in support of the idea that brain research is pointing to the independent existence of a soul/mind, but it is clear on reading it that he comes up short and that there is no such evidence, only the hopes of a very small minority of scientists.

He reports that what neuroscientists have been doing is studying what happens in the brain when religious people pray or meditate or think about god or have other similar experiences.

At the University of Pennsylvania, Andrew B. Newberg is trying to get at the heart - and mind - of spiritual experiences. Dr. Newberg, an assistant professor of radiology, has been putting nuns and Buddhist meditators into a scanning machine to measure how their brains function during spiritual experiences.

Many traditional forms of brain imaging require a subject to lay down in a claustrophobia-inducing tube inside an extremely loud scanner, a situation not conducive to meditation or prayer, says Dr. Newberg. So he used a method called single-photon-emission computed tomography, or Spect, which can measure how a brain acted prior to the scanning procedure. A radioactive tracer is injected into the subjects while they are meditating or praying, and the active regions of the brain absorb that tracer. Then the subjects enter the scanner, which detects where the tracer has settled.

His studies, although preliminary, suggest that separate areas of the brain became engaged during different forms of religious experience. But both the nuns and the meditators showed heightened activity in their frontal lobes, which are associated in other studies with focused attention.

The experiments cannot determine whether the subjects were actually in the presence of God, says Dr. Newberg. But they do reveal that religious experiences have a reality to the subjects. "There is a biological correlate to them, so there is something that is physiologically happening" in the brain, he says.

The finding that certain parts of the brain get activated during 'spiritual experiences' is not surprising. Neither is the fact that those experiences have a 'reality to the subjects.' All acts of consciousness, even total hallucinations, are believed to originate in the brain and leave a corresponding presence there, and why the researcher ever expected this to demonstrate evidence for god is not made clear in the article.

It is clear that intelligent design creationism (IDC) advocates are concerned about the implications of brain studies for religious beliefs. It seems plausible that as we learn more and more about how the brain works and about consciousness in general, the idea of a mind independent of the brain becomes harder to sustain. Hence IDC advocates are promoting meetings that highlight the work of those few researchers who think they see a role for god within the brain. But these meetings are being held in secret.

Organizers of the conference, called "Research and Progress on Intelligent Design," had hoped to keep its existence out of public view. The university held a well-advertised public debate about ID that same week, but Michael N. Keas, a professor of history and the philosophy of science at Biola who coordinated the private meeting, would not confirm that it was happening when contacted by a reporter, nor would he discuss who was attending.

But one of the people doing this work is not shy about talking about his research.

When the leaders of the intelligent-design movement gathered for a secret conference this month in California, most of the talks focused on their standard concerns: biochemistry, evolution, and the origin of the universe. But they also heard from an ally in the neurosciences, who sees his own field as fertile ground for the future of ID.

Jeffrey M. Schwartz, a research professor of psychiatry at the University of California at Los Angeles, presented a paper titled "Intelligence Is an Irreducible Aspect of Nature" at the conference, held at Biola University, which describes itself as "a global center for Christian thought." Dr. Schwartz argued that his studies of the mind provide support for the idea that consciousness exists in nature, separate from human brains.

Michael Behe, the author of Darwin's Black Box, which suggested five 'irreducibly complex' systems on which the IDC people have long hung their hopes for evidence of god, may be losing his status as the IDC movement's scientific standard bearer. His book came out in 1996 and nothing new has been produced since then. It is clear that you cannot dine forever on such meager fare, especially since evolutionary biologists keep churning out new results all the time. The need for a new poster child is evident, and it seems as if the IDC movement has found one in psychiatrist Schwartz.

Leaders of the intelligent-design movement, though, see clear potential for Dr. Schwartz's message to resonate with the public.

"When I read Jeff's work, I got in touch with him and encouraged him to become part of this ID community," says William A. Dembski, who next month will become a research professor in philosophy at the Southwestern Baptist Theological Seminary, in Texas. "I regard him as a soul mate," says Mr. Dembski.

This may be a sign that the real science-religion battle is shifting away from biological evolution to brain research. This new battle will not be as high profile as the evolution one simply because brain studies are not part of the school curriculum and thus not subject to the policies of local school boards. So the evolution battle will likely continue to dominate the news headlines for some time.

Tomorrow we will see what neurobiologists think of this attempt to find god in their area of study. If the IDC advocates thought that the biologists were a tough foe to convince, they are going to find that the brain research community is even more resistant to their overtures.

POST SCRIPT: War profiteers

One of the underreported stories of the Iraq invasion is the enormous amount of money that is being made by some people because of it. Coming in fall 2006 is a new documentary by Robert Greenwald titled Iraq for Sale: The War Profiteers.

Greenwald's marketing strategy for his documentaries has been to bypass the main distribution networks and put his documentaries out straight to video for a low price. He did this with his earlier productions Outfoxed: Rupert Murdoch's war on journalism (a look at the bias of Fox news), Uncovered: The war on Iraq (which exposed the fraudulent case made for the Iraq invasion), and Walmart: The high cost of low prices.

Look out for the release of Iraq for Sale. You can see the preview here.

June 20, 2006

Religion's last stand-2: The role of Descartes

In the previous posting, I discussed two competing models of the mind/brain relationship.

It seems to me that the first model, where the physical brain is all there is and the mind is simply the creation of the brain, is the most persuasive one, since it is the simplest and accepting it involves no further complications. In this model, our bodies are purely material things, with the brain's workings enabling us to think, speak, reason, act, and so forth. The idea of 'free will' is an illusion, arising because the brain is an enormously complicated system whose processes and end results cannot be predicted. (A good analogy would be classically chaotic systems like the weather. Because of the non-linearity of the equations governing the weather, we cannot predict it long-term even though the system is deterministic and materialistic.)
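To make the chaos analogy concrete, here is a minimal sketch in Python (my own illustration, not drawn from any of the sources discussed in these posts) using the logistic map, one of the simplest deterministic systems known to behave chaotically. Every step is computed by the same fixed rule, yet two starting values that differ by only one part in 200 million soon follow completely different trajectories:

    # A minimal sketch of deterministic chaos using the logistic map x -> r*x*(1-x).
    # Two nearly identical starting values obey the same exact rule, yet their
    # trajectories diverge completely within a few dozen steps.

    def logistic_trajectory(x0, r=4.0, steps=50):
        xs = [x0]
        for _ in range(steps):
            xs.append(r * xs[-1] * (1.0 - xs[-1]))
        return xs

    a = logistic_trajectory(0.200000000)   # one initial condition
    b = logistic_trajectory(0.200000001)   # differs by one part in 200 million

    for n in (0, 10, 20, 30, 40, 50):
        print(n, round(a[n], 6), round(b[n], 6), round(abs(a[n] - b[n]), 6))

The point is only that a fully deterministic rule can still make long-range prediction hopeless once tiny uncertainties get amplified, which is all the analogy with the brain requires.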

The second model, that of an independently existing non-material mind/soul, separate from the brain and directing the brain, immediately raises all kinds of problems, which have long been recognized. The scientist-philosopher Rene Descartes (1596-1650) of "I think, therefore I am" fame was perhaps the first person to formulate this mind-body dualism (or at least he is the person most closely associated with the idea) and it is clear that he felt that it was necessary to adopt this second model if one was to retain a belief in god.

But he realized immediately that it raises the problem of how the non-material mind/soul can interact with the material brain/body to get it to do things. Princess Elizabeth of Bohemia, with whom Descartes had an extended correspondence, was unable to understand Descartes' explanation of this interaction and kept prodding him on this very question. Descartes had no adequate answer for her, even though both clearly wanted to believe in the existence of god and the soul. In the introduction to his translation of Descartes' Meditations and other Metaphysical Writings (which contains extended segments of the Elizabeth-Descartes correspondence), Desmond Clarke writes:

After repeated attempts to answer the question, how is it possible for something which is not physical to interact with something else which, by definition, is physical?, Descartes concedes that he cannot explain how it is possible.

But he tried, using the best scientific knowledge available to him at the time. He argued that the soul's interaction with the body took place in the pineal gland.

As is well known, Descartes chose the pineal gland because it appeared to him to be the only organ in the brain that was not bilaterally duplicated and because he believed, erroneously, that it was uniquely human. . . By localizing the soul's contact with body in the pineal gland, Descartes had raised the question of the relationship of mind to the brain and nervous system. Yet at the same time, by drawing a radical ontological distinction between body as extended and mind as pure thought, Descartes, in search of certitude, had paradoxically created intellectual chaos.

Although Descartes failed in his efforts to convincingly demonstrate the independent existence of the soul, research into the relationship of religious beliefs to the central nervous system of the brain has continued.

Descartes is an interesting character. Much of his scientific work, and even his temperament, seems to indicate a materialistic outlook. But at the same time, he took great pains to try and find proofs of god's existence. One gets the sense that he was a person trying to convince himself of something he did not quite believe in, and that, had he lived in a different time, he might have rejected god with some relief. The article on Descartes in Encyclopaedia Britannica Online, 13 June 2006 says:

Even during Descartes's lifetime there were questions about whether he was a Catholic apologist, primarily concerned with supporting Christian doctrine, or an atheist, concerned only with protecting himself with pious sentiments while establishing a deterministic, mechanistic, and materialistic physics.

The article points to a likely reason for the ambiguity of his views: there was, at that time, considerable fear of the power of the Catholic Church, and this may have guided the way he presented his work.

In 1633, just as he was about to publish The World (1664), Descartes learned that the Italian astronomer Galileo Galilei (1564–1642) had been condemned in Rome for publishing the view that the Earth revolves around the Sun. Because this Copernican position is central to his cosmology and physics, Descartes suppressed The World, hoping that eventually the church would retract its condemnation. Although Descartes feared the church, he also hoped that his physics would one day replace that of Aristotle in church doctrine and be taught in Catholic schools.

Descartes definitely comes across as somewhat less than pious, and non-traditional in his religious beliefs.

Descartes himself said that good sense is destroyed when one thinks too much of God. He once told a German protégée, Anna Maria van Schurman (1607–78), who was known as a painter and a poet, that she was wasting her intellect studying Hebrew and theology. He also was perfectly aware of - though he tried to conceal - the atheistic potential of his materialist physics and physiology. Descartes seemed indifferent to the emotional depths of religion. Whereas Pascal trembled when he looked into the infinite universe and perceived the puniness and misery of man, Descartes exulted in the power of human reason to understand the cosmos and to promote happiness, and he rejected the view that human beings are essentially miserable and sinful. He held that it is impertinent to pray to God to change things. Instead, when we cannot change the world, we must change ourselves.

Clearly he was not orthodox in his thinking. Although he tried to believe in god, it was his insistence on applying the same materialistic principles that he used in his scientific work to the question of how the mind interacts with the brain that has the potential to create the big problem for religion.

To sum up Descartes' argument: following sound scientific (methodologically naturalistic) principles, he felt that if the mind interacted with the brain, then there had to be (1) some mechanism by which the non-material mind could influence the material brain, and (2) some place where this interaction took place. Although he could not satisfactorily answer the first question, he at least postulated a location for the interaction, the pineal gland. We know now that this is wrong, but the questions he raised are still valid and interesting ones that go to the heart of religion.

Next: What current researchers are finding about the brain and religion.

POST SCRIPT: Documentary on Rajini Rajasingham-Thiranagama

I have written before about the murder of my friend Rajini Rajasingham-Thiranagama, who had been an active and outspoken campaigner for human rights in Sri Lanka. I have learned that a documentary about her life called No More Tears Sister is the opening program in the 2006 PBS series P.O.V.

In the Cleveland area, the program is being shown on Friday, June 30, 2006 at 10:00pm on WVIZ 25. Airing dates vary by location, with some PBS stations showing it as early as June 27. The link above gives program listings for other cities. The synopsis on the website says:

If love is the first inspiration of a social revolutionary, as has sometimes been said, no one better exemplified that idea than Dr. Rajani Thiranagama. Love for her people and her newly independent nation, and empathy for the oppressed of Sri Lanka - including women and the poor - led her to risk her middle-class life to join the struggle for equality and justice for all. Love led her to marry across ethnic and class lines. In the face of a brutal government crackdown on her Tamil people, love led her to help the guerrilla Tamil Tigers, the only force seemingly able to defend the people. When she realized the Tigers were more a murderous gang than a revolutionary force, love led her to break with them, publicly and dangerously. Love then led her from a fulfilling professional life in exile back to her hometown of Jaffna and to civil war, during which her human-rights advocacy made her a target for everyone with a gun. She was killed on September 21, 1989 at the age of 35.

As beautifully portrayed in Canadian filmmaker Helene Klodawsky's "No More Tears Sister," kicking off the 19th season of public television's P.O.V. series, Rajani Thiranagama's life is emblematic of generations of postcolonial leftist revolutionaries whose hopes for a future that combined national sovereignty with progressive ideas of equality and justice have been dashed by civil war - often between religious and ethnic groups, and often between repressive governments and criminal rebel gangs. Speaking out for the first time in the 15 years since Rajani Thiranagama's assassination, those who knew her best talk about the person she was and the sequence of events that led to her murder. Especially moving are the memories of Rajani's older sister, Nirmala Rajasingam, with whom she shared a happy childhood, a political awakening and a lifelong dedication to fighting injustice; and her husband, Dayapala Thiranagama, who was everything a middle-class Tamil family might reject - a Sinhalese radical student from an impoverished rural background. Also included are the recollections of Rajani's younger sisters, Vasuki and Sumathy; her parents; her daughters, Narmada and Sharika; and fellow human-rights activists who came out of hiding to tell her story. The film rounds out its portrayal with rare archival footage, personal photographs and re-enactments in which Rajani is portrayed by daughter Sharika Thiranagama. The film is narrated by Michael Ondaatje, esteemed author of The English Patient and Anil's Ghost.

I knew Rajini well. We were active members of the Student Christian Movement in Sri Lanka when we were both undergraduates at the University of Colombo. It does not surprise me in the least that she threw herself with passion into the struggle for justice. She was brave and spoke the truth, even when it was unpalatable to those in power and with guns, and backed up her words with actions, thus putting her life on the line for her beliefs. Such people are rare. I am proud to have known her.

June 19, 2006

Religion's last stand: The brain

As almost everyone is aware, the science-religion wars have focused largely on the opposition of some Christian groups to the teaching of evolution. The religious objection to Darwin's theory of natural selection has been that if the universe and the diversity of life that we see around us could have come about without the guidance of a conscious intelligence like god (even one operating under the pseudonym of 'intelligent designer'), then what need would we have for believing in a god?

But while evolution has been the main focus of attention, I see that as more of a preliminary skirmish before the real final battleground for religion, which involves the brain.

The crucial question for the sustaining of religious beliefs is the relationship of the mind to the brain. Is the mind purely a creature of the brain, and our thoughts and decisions merely the result of the neurons firing in our neuronal networks? If so, the mind is essentially a material thing. We may have ideas and thoughts and a sense of consciousness and free will that seem to be nonmaterial, but that is an illusion. All these things are purely the products of interactions of matter in our brains. In this model, the mind is entirely the product of the physical brain. This premise underlies the articles selected for the website MachinesLikeUs.com.

Or is the mind a separate (and non-material) entity that exists independently of the brain and is indeed superior to it, since it is the agent that can cause the neurons in our brain to fire in certain ways and thus enable the brain to think and feel and make decisions? In this model, the 'mind' is who 'I' really am, and the material body 'I' possess is merely the vehicle through which 'I' am manifested. In this model, the mind is synonymous with the soul.

If we are to preserve the need for god, then it seems that one must adopt the second model, that human beings (alone among animals, at the very least) are not merely machines operating according to physical laws. We need to possess minds that enable us to think and make decisions and tell our bodies how to act. Most importantly, our minds are supposed to have the capacity for free will. After all, what would be the value of an act of 'faith' if the mind were purely driven by mechanical forces in the brain?

It should be immediately obvious why the nature of the mind is a far more disturbing question for religion than evolution is or ever will be. With evolution, the question centers on whether the mechanism of natural selection (and its corollary principles) is sufficient to explain the diversity of life and changes over time. As such, the debate boils down to weighing the evidence for and against and determining which is more plausible.

But plausibility lies in the eye of the beholder and we have seen in a previous posting how the desire to preserve beliefs one holds dear leads people to adopt intellectual strategies that enable them to do so.

Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46) says that the strategies adopted are: "1. We seek evidence that supports what we believe and do not seek and avoid or ignore evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender."

In the discussions about evolution, people who wish to preserve a role for god have plenty of viable options at their disposal. They can point to features that seem to have a low probability of occurring without the intervention of an external, willful, and intelligent guidance (aka god). These are the so-called 'irreducibly complex' systems touted by intelligent design creationism (IDC) advocates. Or they can point to the seeming absence of transitional fossils between species. Or they can point to seemingly miraculous events or spiritual experiences in their lives.

Scientists argue that none of these arguments are valid, that plausible naturalistic explanations exist for all these things, and that the overwhelming evidence supports evolution by natural selection as sufficient to explain things, without any need for any supernatural being.

But in one sense, that argument misses the point. As long as the debate is centered on weighing the merits of competing evidence and arriving at a judgment, van Gelder's point is that it does not matter if the balance of evidence tilts overwhelmingly to one side. People who strongly want to believe in something will take the existence of even the slenderest evidence as sufficient for them. And it seems likely that the evolution debate, seeing as it involves complex systems and long and subtle chains of inferential arguments, will always provide some room to enable believers to retain their beliefs.

But the mind/brain debate is far more dangerous for religion because it involves the weighing of the plausibility of competing concepts, not of evidence. The fundamental question is quite simple and easily understood: Is the brain all there is and the mind subordinate to it, a product of its workings? Or is the mind an independently existing entity with the brain subordinate to it?

This is not a question that scientific data and evidence has much hope of answering in the near future. Eliminating the mind as an independently existing entity has all the problems associated with proving a negative, and is similar to trying to prove that god does not exist.

But since the mind, unlike god, is identified with each individual and is not necessarily directly linked to god, discussing its nature carries less religious baggage, and it can be examined more clinically.

Next: Descartes gets the ball rolling on the mind and the brain.

POST SCRIPT: Choosing god

I came across this story (thanks to onegoodmove) that illustrates the point I was trying to make about the way people choose what kind of god to believe in. I have no idea if the events actually occurred, though, or if the story has been embellished to make the point.

The subject was philosophy. Nietzsche, a philosopher well known for his dislike of Christianity and famous for his statement that 'god is dead', was the topic. Professor Hagen was lecturing and outside a thunderstorm was raging. It was a good one. Flashes of lightning were followed closely by ominous claps of thunder. Every time the professor would describe one of Nietzsche's anti-Christian views the thunder seemingly echoed his remarks.

At the high point of the lecture a bolt of lightning struck the ground near the classroom, followed by a deafening clap of thunder. The professor, unperturbed, walked to the window, opened it, and started jabbing at the sky with his umbrella. He yelled, "You senile son of a bitch, your aim is getting worse!"

Suffice it to say that some students were offended by his irreverent remark and brought it to the attention of the Department Head. The Department Head in turn took it to the Dean of Humanities who called the professor in for a meeting. The Dean reminded the professor that the students pay a lot of tuition and that he shouldn't unnecessarily insult their beliefs.

"Oh," says the professor, "and what beliefs are those?"

"Well, you know" the Dean says, "most students attending this University are Christians. We can't have you blaspheming during class."

"Surely" says the professor, "the merciful God of Christianity wouldn't throw lightning bolts. It's Zeus who throws lightning bolts."

Later the Dean spoke with the Department Head, and said, "The next time you have a problem with that professor, you handle it, and let him make an ass out of you instead."

June 16, 2006

The desire for belief preservation

In the previous post we saw how human beings are believed to not be natural critical thinkers, preferring instead to believe in the first plausible explanation for anything that comes along, not seeing these initial explanations as merely hypotheses to be evaluated against competing hypotheses.

But one might think that when we are exposed to alternative hypotheses, we might then shift gears into a critical mode. But Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46) argues that what foils this is the human desire for belief preservation.

He quotes the seventeenth-century philosopher Francis Bacon, who said:

The mind of man is far from the nature of a clear and equal glass, wherein the beams of things should reflect according to their true incidence; nay, it is rather like an enchanted glass, full of superstition and imposture, if it be not delivered and reduced.

In other words, van Gelder says, "the mind has intrinsic tendencies toward illusion, distortion, and error." These arise from a combination of being hard-wired in our brains (because of evolution), natural growth of our brains as we grow up in the Earth's environment, and the influence of our societies and cultures. "Yet, whatever their origin, they are universal and ineradicable features of our cognitive machinery, usually operating quite invisibly to corrupt our thinking and contaminate our beliefs."

All these things lead us to have cognitive biases and blind spots that prevent us from seeing things more clearly, and one of the major blind spots is that of belief preservation. van Gelder says that "At root, belief preservation is the tendency to make evidence subservient to belief, rather than the other way around. Put another way, it is the tendency to use evidence to preserve our opinions rather than guide them."

van Gelder says that when we strongly believe some thing or desire it to be true, we tend to do three things: "1. We seek evidence that supports what we believe and do not seek and avoid or ignore evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender."

This would explain why (as vividly demonstrated in the popular video A Private Universe) people hold on to their erroneous explanations about the phases of the moon even after they have been formally instructed in school about the correct explanation.

This would also explain the question that started these musings: Why for so long had I not applied the same kinds of questioning to my religious beliefs concerning god, heaven, etc. that I routinely applied to other areas of my life? The answer is that since I grew up in a religious environment and accepted the existence of god as plausible, I did not seek other explanations. Any evidence in favor of belief (the sense of emotional upliftment that sometimes occurs during religious services or private prayer, or some event that could be interpreted to indicate god's action in my life or in the world, or scientific evidence that supported a statement in the Bible) was seized on, while counter-evidence (such as massive death and destruction caused by human or natural events, personal misfortunes or tragedies, or scientific discoveries that contradicted Biblical texts) was either ignored or explained away. It was only after I had abandoned my belief in god's existence that I was able to ask the kinds of questions that I had hitherto avoided.

Did I give up my belief because I could not satisfactorily answer the difficult questions concerning god? Or did I start asking those questions only after I had given up belief in god? In some sense this is a chicken-and-egg problem. Looking back, it is hard to say. Probably it was a little of both. Once I started taking some doubts seriously and started questioning, this probably led to more doubts, more questions, until finally the religious edifice that I had hitherto believed in just collapsed.

In the series of posts dealing with the burden of proof concerning the existence of god, I suggested that if we use the common yardsticks of law or science, then that would require that the burden of proof lies with the person postulating the existence of any entity (whether it be god or a neutrino or whatever), and that in the absence of positive evidence in favor of existence, the default assumption is to assume the non-existence of the entity.

In a comment to one of those postings, Paul Jarc suggested that the burden of proof actually lay with the person trying to convince the other person to change his views. It may be that we are both right. What I was describing was the way that I thought things should be, while Paul was describing the way things are in actual life, due to the tendency of human beings to believe the first thing that sounds right and makes intuitive sense, coupled with the desire to preserve strong beliefs once formed.

van Gelder ends his article with some good advice:

Belief preservation strikes right at the heart of our general processes of rational deliberation. The ideal critical thinker is aware of the phenomenon, actively monitors her thinking to detect its pernicious influence, and deploys compensatory strategies.

Thus, the ideal critical thinker
• puts extra effort into searching for and attending to evidence that contradicts what she currently believes;
• when “weighing up” the arguments for and against, gives some “extra credit” for those arguments that go against her position; and
• cultivates a willingness to change her mind when the evidence starts mounting against her.

Activities like these do not come easily. Indeed, following these strategies often feels quite perverse. However, they are there for self-protection; they can help you protect your own beliefs against your tendency to self-deception, a bias that is your automatic inheritance as a human being. As Richard Feynman said, “The first principle is that you must not fool yourself - and you are the easiest person to fool.”

The practice of science requires us to routinely think this way. But it is not easy to do, and even scientists find it hard to give up their cherished theories in the face of contrary evidence. Because scientific practice requires this kind of thinking, this may also be why science is perceived as 'hard' by the general public - not because of its technical difficulties, but because you are constantly being asked to give up beliefs that seem so naturally true and intuitively obvious.

POST SCRIPT: The people who pay the cost of war

I have nothing to add to this powerful short video, set to the tune of Johnny Cash singing Hurt. Just watch. (Thanks to Jesus' General.)

June 15, 2006

Why religious (and other) ideas are so persistent

When people are asked to explain the phases of the moon, the response given most frequently is that they are caused by the shadow of the Earth falling on the moon. These people are not aware that this explanation holds true only for the rare cases of eclipses, and not for the everyday phases.

When the people making these responses are asked to consider the alternative (and correct) model, in which the phases are caused by one part of the moon being in the shadow thrown by the other part (which can easily be seen by holding up any object to the light and noting that parts of it are in its own shadow), they quickly recognize that this self-shadow model is more plausible than the Earth-shadow model.
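To make the geometry of the self-shadow model concrete, here is a small sketch in Python (my own illustration of the standard geometry, not something taken from van Gelder or A Private Universe). Half the moon is always sunlit; the fraction of the moon's face that appears lit from Earth depends only on the Sun-moon-Earth angle, and the Earth's shadow never enters the calculation:

    # The self-shadow model in one formula: half the moon is always sunlit,
    # and the fraction of the moon's face that appears lit from Earth is
    # (1 + cos(phase_angle)) / 2, where the phase angle is the Sun-moon-Earth angle.

    import math

    def illuminated_fraction(phase_angle_degrees):
        return (1.0 + math.cos(math.radians(phase_angle_degrees))) / 2.0

    for angle, name in [(0, "full moon"), (90, "quarter moon"), (180, "new moon")]:
        print(name, round(illuminated_fraction(angle), 2))

    # Prints 1.0, 0.5, and 0.0 respectively - the full cycle of phases falls out
    # of the geometry of the moon's own lit and unlit halves.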

So the interesting question is why, although the correct model is not hard to think up, people stick for so long with their initial erroneous model. The answer is that they did not even consider the possibility that the Earth-shadow explanation they believed in was just a hypothesis that ought to be compared with other, alternative, hypotheses to see which was more consistent with evidence. They simply accepted uncritically as true the first hypothesis they encountered and stayed with it. Why is this?

Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46), looks into why this kind of critical thinking is rare among people and his article (summarizing the insights gleaned from cognitive science research) sheds some light on my own puzzlement as to why it took me so long to question the implausible aspects of my beliefs in heaven and immortality.

van Gelder points out that critical thinking does not come naturally to people, that it is 'a highly contrived activity' that is hard and has to be deliberately learned and cultivated. He says that "[e]volution does not waste effort making things better than they need to be, and homo sapiens evolved to be just logical enough to survive, while competitors such as Neanderthals and mastodons died out."

But if we are not by nature critical thinkers, what kind of thinkers are we? To answer this question, van Gelder refers to Michael Shermer's 2002 book Why people believe weird things: Pseudoscience, superstition, and other confusions of our time and says:

We like things to make sense, and the kinds of sense we grasp most easily are simple, familiar patterns or narratives. The problem arises when we do not spontaneously (and do not know how to) go on to ask whether an apparent pattern is really there or whether a story is actually true. We tend to be comfortable with the first account that seems right, and we rarely pursue the matter further. Educational theorist David Perkins and colleagues have described this as a “makes-sense epistemology”; in empirical studies, he found that students tend to

act as though the test of truth is that a proposition makes intuitive sense, sounds right, rings true. They see no need to criticize or revise accounts that do make sense - the intuitive feel of fit suffices.

Since for most of us, the religious 'explanations' of the big questions of life, death, and meaning are the ones we are first exposed to as children, and they do provide a rudimentary explanatory pattern (even if in a selective and superficial way), we tend to accept them as true and thus do not actively look for, and even avoid, alternative explanations.

But what happens when alternative explanations thrust themselves on us, either in school or elsewhere? Do we then go into critical thinking mode, evaluating the alternatives, weighing the competing evidence and reasoning before forming a considered judgment?

Alas, no. But the reasons for that will be explored tomorrow.

POST SCRIPT: That bad old AntiChrist

I wrote before about the new video game Left Behind: Eternal Forces. Their website has an interesting FAQ page which says:

The storyline in the game begins just after the Rapture has occurred - when all adult Christians, all infants, and many children were instantly swept home to Heaven and off the Earth by God. The remaining population - those who were left behind - are then poised to make a decision at some point. They cannot remain neutral. Their choice is to either join the AntiChrist - which is an imposturous one world government seeking peace for all of mankind, or they may join the Tribulation Force - which seeks to expose the truth and defend themselves against the forces of the AntiChrist.

So the goal of the AntiChrist is to create a one world government seeking peace for all of mankind! What a dastardly plan. So naturally they must be massacred in the name of Jesus to prevent this awful fate from occurring.

For those who might be concerned that this game goes counter to the message of love preached by Jesus in the Bible, Jesus' General thoughtfully provides the relevant text of the inexplicably overlooked Gospel of Left Behind, which provides the justification for the violent philosophy of the game.

Also, don't forget to check out the animation "Don't dis Elisha!" which shows the story of how the prophet Elisha cursed children who teased him, who were then killed by bears sent by god. (Again, thanks to the ever-vigilant General.)

Who knows, the Elisha story could form the basis for another video game, marketed for Christmas 2007 by the same people behind the Left Behind: Eternal Forces game. In the new game the players would represent bears and the goal would be to attack and kill as many children as possible.

June 05, 2006

Seeing the world through Darwin's eyes

It is good to be back and blogging again!

On my trip to Australia, I had the chance to see some of the marsupial animals that are native to that continent, and as I gazed at these strange and wondrous creatures, I asked myself the same question that all visitors to the continent before me must have asked: Why are these animals so different from the ones I am familiar with? After all, Australia's environment is not that different from that found in other parts of the world, but the fact that most marsupials (like kangaroos, wallabies, koalas, and wombats) are found only on that continent is remarkable. I was stunned to learn that when a kangaroo is born, it weighs less than one gram. This is because much of the development of the newborn (which occurs in other animals inside the womb of the mother) takes place in the pouch for marsupials.

The Encyclopaedia Britannica says that marsupials are:

a mammalian group characterized by premature birth and continued development of the newborn while attached to the nipples on the lower belly of the mother. The pouch, or marsupium, from which the group takes its name, is a flap of skin covering the nipples. Although prominent in many species, it is not a universal feature - in some species the nipples are fully exposed or are bounded by mere remnants of a pouch. The young remain firmly attached to the milk-giving teats for a period corresponding roughly to the latter part of development of the fetus in the womb of a placental mammal (eutherian).

The largest and most varied assortment of marsupials - some 200 species - is found in Australia, New Guinea, and neighbouring islands, where they make up most of the native mammals found there. In addition to the larger species such as kangaroos, wallabies, wombats, and the koala, there are numerous smaller forms, many of which are carnivorous, the Tasmanian devil being the largest of this group (family Dasyuridae). About 70 species live in the Americas, mainly in South and Central America, but one, the Virginia opossum (Didelphis virginiana), ranges through the United States into Canada.

The significance of the way that animals are distributed in the world was a key insight that Charles Darwin obtained as a result of his voyage on the Beagle from 1831 to 1836. He noted that although the environment of the Galapagos Islands was very similar to that of the Cape Verde islands (off the coast of West Africa), the animal life found in each of these island groups was quite dissimilar to the other's, and more similar to the wildlife of the immediately neighboring continent (South America and Africa respectively). This made him speculate that a few animals had arrived at the islands from the nearby continents and then changed over time to become distinctive species.

This line of reasoning caused him to doubt the dominant belief of his time (called 'special creation') that said that god had created each species to fit into their environmental niches. (Darwin had at one time been contemplating joining the priesthood and one can assume that he would have initially been quite comfortable with this belief.)

What would have further fuelled Darwin's doubts about special creation was the increasing awareness, even in his own time, that large numbers of species had already gone extinct. It is now estimated that over 90% of all species that ever existed are no longer around. If god was creating each species specially to suit the available environmental niches, explaining extinction becomes problematic.

On a side note, the nature parks I visited in Australia were surprisingly relaxed about visitors. They did not keep the animals in pens separated from people, except for dangerous animals like the Tasmanian Devil. You walked around in the same area as the animals and could get up close and pet wallabies and wombats and koalas if you so wished and they were nearby. You could even enter the cages housing birds, and there was no one checking to see that the doors were kept closed to prevent the birds from escaping. The rangers assumed that park visitors would not leave the doors open.

I could not imagine such a relaxed attitude in the US where people are scared that if a bird pecked someone or an animal bit or scratched a visitor, lawsuits would follow. A park ranger told me that if an animal showed signs of aggression or unwonted interest in people, they would take some action but they did not, as a rule, try to shield themselves from any chance of being sued by putting up barriers, as is the case here. He asked me where I was from and when I said the USA he nodded understandingly and said that Australia was not as litigious a country as the US, although he feared that eventually Australian nature parks would follow the US model and put up barriers between animals and visitors. (I did see tremendous American cultural dominance in their TV stations, where the programs and news formats seemed indistinguishable from their US counterparts, except for the accents.)

Seeing strange new animals in their natural habitat was very intriguing for me, provoking different feelings than seeing them in a zoo here. I can well understand how Darwin's trip to the Galapagos Islands would have triggered similar questions in his own mind and led to his own investigations and groundbreaking theory of evolution.

POST SCRIPT: Barry's new blog

If you have been reading the comments to this blog, you would have found some interesting and thought-provoking comments by Barry. Barry has now started his own blog called Those Who Can't Teach Wish They Could, where he chronicles the path of his career switch from engineering to teaching, and his observations about how the whole certification process may be discouraging otherwise talented and knowledgeable teachers from entering the classroom.

Barry's comments on my blog were always thoughtful and lively, and his blog is the same. You should visit.

June 02, 2006

Why scientific theories are more than explanations

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

At its heart, the main strategy of intelligent design creationism (IDC) advocates is to find phenomena that are not (at least in their eyes) satisfactorily explained by evolutionary theory and to argue that natural selection is therefore a failed theory. They say that adding the postulate of an 'intelligent designer' (which is clearly a pseudonym for God) as the cause of these so-called unexplained phenomena means that they are no longer unexplained. This, they claim, makes IDC the better 'explanation.' Some (perhaps for tactical reasons) do not go so far and instead say that it is at least a competing explanation and thus on a par with evolution.

As I discussed in an earlier posting, science does purport to explain things. But a scientific explanation is more than that. The explanations also carry within themselves the seeds of new predictions, because whenever a scientist claims to explain something using a new theory, the first challenge that is thrown invariably takes the form "Ok, if your theory explains X under these conditions, then it should predict Y under those conditions. Is the prediction confirmed?"

If the prediction Y fails, then the theory is not necessarily rejected forever but the proponent has to work on it some more, explain the failure to predict Y, and come back with an improved theory that makes better predictions.

If the prediction Y is borne out, then the theory is still not automatically accepted but at least it gains a little bit of credibility and may succeed in attracting some people to work on it.

Theories become part of the scientific consensus when their credibility increases by these means until they are seen by the scientific community to be the exclusive framework for future investigations. A scientist who said things like "My new theory explains X but makes no predictions whatsoever" would be ignored or face ridicule. Such theories are of no use for science.

And yet this is precisely the kind of thing that IDC proponents are saying. To see why this cannot be taken seriously, here is something abridged from the book Physics for the Inquiring Mind by Eric Rogers (p. 343-345), written way back in 1960. In it Rogers looks at competing claims for why an object set in motion on a surface eventually comes to rest:


The Demon Theory of Friction

How do you know that it is friction that brings a rolling ball to a stop and not demons? Suppose you answer this, while a neighbor, Faustus, argues for demons. The discussion might run thus:

You: I don't believe in demons.
Faustus: I do.
You: Anyway, I don't see how demons can make friction.
Faustus: They just stand in front of things and push to stop them from moving.
You: I can't see any demons even on the roughest table.
Faustus: They are too small, also transparent.
You: But there is more friction on rough surfaces.
Faustus: More demons.
You: Oil helps.
Faustus: Oil drowns demons.
You: If I polish the table, there is less friction and the ball rolls further.
Faustus: You are wiping the demons off; there are fewer to push.
You: A heavier ball experiences more friction.
Faustus: More demons push it; and it crushes their bones more.
You: If I put a rough brick on the table I can push against friction with more and more force, up to a limit, and the block stays still, with friction just balancing my push.
Faustus: Of course, the demons push just hard enough to stop you moving the brick; but there is a limit to their strength beyond which they collapse.
You: But when I push hard enough and get the brick moving there is friction that drags the brick as it moves along.
Faustus: Yes, once they have collapsed the demons are crushed by the brick. It is their crackling bones that oppose the sliding.
You: I cannot feel them.
Faustus: Rub your finger along the table.
You: Friction follows definite laws. For example, experiment shows that a brick sliding along a table is dragged by friction with a force independent of velocity.
Faustus: Of course, the same number of demons to crush however fast you run over them.
You: If I slide a brick along a table again and again, the friction is the same each time. Demons would be crushed on the first trip.
Faustus: Yes, but they multiply incredibly fast.
You: There are other laws of friction: for example, the drag is proportional to the pressure holding the surfaces together.
Faustus: The demons live in the pores of the surface: more pressure makes more of them rush out and be crushed. Demons act in just the right way to push and drag with the forces you find in your experiments.

By this time Faustus' game is clear. Whatever properties you ascribe to friction he will claim, in some form, for demons. At first his demons appear arbitrary and unreliable; but when you produce regular laws of friction he produces a regular sociology of demons. At that point there is a deadlock, with demons and friction serving as alternative names for sets of properties - and each debater is back to his first remark.


Faustus's arguments are just like those of the IDC advocates, and they show why such arguments are consistently rejected by the scientific community. Scientists ask for more than just explanations from their theories. They also need mechanisms that make predictions. They know that this is the only way to avoid being drowned in an ocean of 'explanations' that are of no practical use whatsoever.
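
To make the contrast concrete, here is a small sketch, not from Rogers' book and with purely made-up numbers, of the kind of commitment the friction law makes and the demon theory never does:

# The empirical friction law F = mu * N, with mu fitted from a couple of
# measurements, commits you to a testable number for a load you have not
# yet tried. All values below are illustrative, not real data.
measurements = [(2.0, 0.8), (4.0, 1.6)]  # (normal force in newtons, measured drag in newtons)

# Estimate the single parameter mu as the average of F/N over the measurements.
mu = sum(f / n for n, f in measurements) / len(measurements)

# The law now predicts the drag for a 10 N load before anyone measures it.
predicted_drag = mu * 10.0
print(f"mu = {mu:.2f}, predicted drag at a 10 N load: {predicted_drag:.1f} N")

The point is not the arithmetic but the commitment: the friction law stakes out a number in advance that the next experiment could prove wrong, whereas Faustus's demons can be adjusted to accommodate any result after the fact.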

You can't really argue with people like Faustus who are willing to create ad hoc models that have no predictive power. Such explanations as he gives have no value to the practicing scientist. But when you walk away from this kind of fruitless pseudo-debate, you do allow the other side to charge that you are afraid to debate them, at which point, they may jump up and down and shout "See they cannot refute us. We win! We win!", however illogical the charge.

It reminds me of the duel scene in Monty Python and the Holy Grail in which King Arthur chops off the arms and legs of the Black Knight, leaving just his torso and attached head on the ground, totally defenceless. The Black Knight, however, refuses to concede defeat and offers a compromise: "Oh? All right, we'll call it a draw." When Arthur and his assistant walk away from this offer, the Black Knight taunts them for running away, as if their refusal to keep fighting were itself an admission of defeat.