
Entries for January 2008

January 31, 2008

Extra Terrestrial Intelligence-4: What if we get a signal?

One of the big problems with ETIs is that it is very unlikely that we can make actual physical contact with them. One reason is just statistics, as I said earlier. While the odds of life existing elsewhere in the universe need not be too small, the chances that any one ETI will cross signals or even paths with another are very small, due simply to the immense size of the universe compared to the speed of our travel and communications.

But there is another problem working against an actual meeting between an alien life form and ours. Although we believe that the laws of physics and chemistry are universal in their application, the laws of biology are not believed to be so. All the life forms on Earth have evolved in its peculiar mix of an oxygen-nitrogen atmosphere, along with its abundance of water. Life on other planets would have evolved in completely different environments and is unlikely to resemble the forms we are familiar with, except for the broad constraints laid down by the laws of physics and chemistry. It would be an absolutely stunning discovery if the life forms we encounter were also oxygen-breathing, water-drinking, cell-phone-using beings like us. That would imply that the range of conditions under which life can occur is far more restrictive, and the laws of biology far more universal, than we had anticipated. It would also mean that the probability of life originating on other planets is even lower, since they would require environments similar to ours in many ways.

Even universal laws like gravity can cause problems. If an organism has evolved on a planet whose gravitational field is much different from ours, that could pose problems for an actual meeting. Organisms that have evolved to survive in a field of a certain strength would find it hard to move and maneuver in fields that are much stronger.

In any event, even if the time-space-technology barriers are somehow overcome and an actual direct encounter takes place, any face-to-face encounter between an ETI and us will likely require either or both parties to be encased in spacesuits that simulate their required environments.

For an atheist, the discovery that ETIs exist, like any other scientific discovery, brings with it only wonder and curiosity. There is no dogma to be disturbed. But for religious people, questions about life and origins are inextricably bound up with religious doctrines and are bound to cause problems. Most religions, although making claims of universality, are really quite parochial, basing their entire theology on claims of what has happened here. There will have to be some scrambling to try and incorporate the new facts of the existence of other intelligences into an Earth-based theology.

Nowadays we tend to forget the fact that it was much easier during the pre-Copernican times to believe in a personal god with whom one was in direct contact. A finite and fairly small universe with the star-embedded heavens not too far away made it easy to think of god as a human-like entity keeping an eye on us from heaven. All such a god would need were heightened human powers, like extremely good eyesight to be able to see everything and some form of ESP to read our minds. Since the distance from heaven to Earth was not that great, it was possible for god to act quickly and easily everywhere.

The realization that the universe was vast and possibly infinite raised issues that were far trickier, and this has been dealt with by emphasizing more the notion that 'god is everywhere.' While this solves the problem of how god can know everything instantaneously, it also makes it harder to visualize a human-like personal god. The advance of science and the notion that everything must obey the laws of science has caused other problems for the idea of a personal, human-like god. For example, the restriction that no information can travel faster than the speed of light means that a god who is everywhere and knows everything 'at the same time' must be violating this law somehow, even if one overcomes the problem that simultaneity is no longer a universal quality under the laws of relativity. So we now have the conundrum of a god who violates his own laws. This is why religion needs to indoctrinate children into religious beliefs at an early age and surround them with communities where such questions are not raised, and where meaningless platitudes such as 'god is everywhere' are accepted as deep truths, beyond the reach of reason and logic.

It seems to me that if life were to be discovered on distant planets, and not just any old life but a society with vastly superior capabilities, surely the man-made nature of religion and god would be obvious to everyone?

But that may be just my prejudice. I suspect that the discovery of ETIs would cause theologians to put in overtime to come up with some rationale as to why this is consistent with whatever their respective religious texts say. Organized religion is too much of a profitable business for its beneficiaries to allow their cash cow to go under due to the emergence of inconvenient facts. They will dust off the writings of some previously obscure religious mystic whose words could be construed to mean that he had anticipated this discovery, and the mystic's words would be used to show how the religious texts are correct and even prophetic and scientific. Thus the discovery of ETIs will be portrayed as a triumph for religion. This is similar to the way that St. Augustine's words are now interpreted by some to suggest that he had anticipated the big-bang model of the universe.

I actually do hope that we receive a signal from outer space. To my mind, it will confirm what I have long held: that all the differences that we dwell on here such as ethnicity, religion, geography, nationality, are just tiny and superficial and largely artificial, not worth fighting and killing over. Furthermore, it should give us hope that societies can deal effectively with advanced technology and need not end up destroying themselves with it, either by blowing themselves up or by slowly strangling their own planet, the way we are currently risking things.

But while that is my hope, that may not happen. It is possible that while there may be a spurt of such forward thinking in the immediate aftermath of receiving a signal, eventually the discovery will simply become part of our background knowledge. When people realize that there is going to be no practical consequence to this discovery and that we will not be able to actually meet the aliens, they will go back to their usual ways, listening to their preachers explaining how all this fits in with god's mysterious plan, and why their own group of people is still very special in god's eyes, so special that killing people who are different is a virtuous act.

The only benefit we may get from receiving ETI signals is that we might be able to decipher them and extract information providing insights into new scientific and technological breakthroughs, which could help us deal with some problems on Earth, such as global warming or the rapid depletion of energy and other natural resources.

That is not as exciting as being able to meet and chat with other intelligences, but it is not an insignificant benefit.

POST SCRIPT: The deep mind of George Bush

British comedians John Bird and John Fortune explain how everything is going according to George Bush's grand plan.


January 30, 2008

Extra Terrestrial Intelligence-3: The most likely contact scenario

What is likely to be our reaction if we did receive an unambiguous signal that there existed ETI somewhere else in the universe?

The reaction would be hard to predict because it is not a topic that is publicly discussed much. This is a bit surprising because it is not such a stretch to think that we could wake up one morning to find out that we have received some signal from an alien civilization. I suspect that the reason we don't speculate on this question is that any such occurrence might be extremely difficult for most people to absorb into their existing worldviews, so they avoid thinking about it.

Take religious believers. If there is life elsewhere, what does that do to the common idea that humans somehow have a special relationship to god? For Christians, if Jesus died on Earth as a redemptive act for all humankind, did a similar event take place with these other alien civilizations? Would Jews still be able to see themselves as some god's chosen people? Why did the angel Gabriel not reveal this information of ETI to Mohammed during one of their chats?

I suspect that religious apologists would quickly get to work to come up with new doctrines that would keep the faithful loyal. After all, very similar theological challenges have been encountered before although they are not remembered now. After Copernicus's ideas about a heliocentric solar system sank in and it became clear that the Earth was merely one of several planets, theologians worried that this meant that other planets could also have inhabitants and this would cause problems for religious doctrines.

As Thomas Kuhn pointed out in his book The Copernican Revolution:

When it was taken seriously, Copernicus' proposal raised many gigantic problems for the believing Christian. If, for example, the earth were merely one of six planets, how were the stories of the Fall and of the Salvation, with their immense bearing on Christian life, to be preserved? If there were other bodies essentially like the earth, God's goodness would surely necessitate that they, too, be inhabited. But if there were men on other planets, how could they be descendents of Adam and Eve, and how could they have inherited the original sin, which explains man's otherwise incomprehensible travail on an earth made for him by a good and omnipotent deity? Again, how could men on other planets know of the Savior who opened to them the possibility of eternal life? … Worst of all, if the universe is infinite, as many of the later Copernicans thought, where can God's Throne be located? In an infinite universe, how is man to find God or God man? (p. 193)

The Copernican model, once its implications were fully appreciated by the theologians, thus raised some serious problems. Fortunately for the theologians, no life was found on the other planets so they did not have to deal with the implications of original sin, of Adam and Eve as being created in god's image, of the fall from grace, of Jesus as savior, and so on.

Similar problems were encountered with the early exploration of the world. Before the arrival of Europeans in the New World, for example, St. Augustine went so far as to argue that there could not be human beings already there because the Bible said that all humans were descended from Adam and Eve, and since there was no way that their descendants could have got to the other side of the Earth (what they called the Antipodes), that meant there could not be any humans there. Such was the typical approach of one of the Catholic Church's great thinkers: start with a doctrinal belief and then use that to predict what the data should reveal. In science, we may start with a paradigmatic model to make a prediction but we never depend upon any revelation or a special text.

Of course, Augustine was wrong. But as always, the theologians managed to absorb the discovery that the New World was indeed inhabited and devise ways to incorporate these new and awkward scientific facts in ways to keep the faithful loyal.

Next: The potential benefit of receiving signals from an ETI.

POST SCRIPT: But seriously,…

Our wise pundits start to discuss the really important issues.

January 29, 2008

Extra Terrestrial Intelligence-2: The chances of ETI existing

I thought of ETIs because of a sudden spate of recent reports of their appearance. About 40 residents of the town of Stephenville in Texas reported seeing a UFO a few weeks ago. And then the website Machines Like Us highlighted the reception of a mystery signal by a radio telescope in Puerto Rico. Nothing definitive has been said about the source of either report, leaving the field ripe for speculation by ETI believers.

Now I think it is likely that there is life somewhere out there in the universe. The huge number of stars in the universe seems to imply that as long as the probability of life emerging spontaneously is not zero (and we know it is not, since we are here), we should not be surprised at it occurring in other places, perhaps in many places. The catch is that, given the size of the universe, the probability of any one of these forms of life encountering another is very small. The most likely way that we will detect their presence is by accident, if they happen to send out a signal strong enough in all directions that it is detectable by us even at these huge distances. Even then, although we would know the direction from which the signals came, it would be hard to know how far away the source is. The premise of Contact was that a planet containing ETI fairly close to us (orbiting the star Vega, just 25.3 light years away) had received our old TV signals, thus discovering our existence, and then decided to reach out to us.
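
To get a feel for the numbers, here is a toy back-of-the-envelope calculation. Every figure in it is an illustrative assumption of mine, not a measurement; the point is only that even an absurdly tiny per-star probability of life, multiplied by the enormous number of stars, still yields a large expected number of life-bearing sites.

# All numbers here are illustrative assumptions, not measured values.
stars_in_observable_universe = 1e22   # rough order-of-magnitude star count
p_life_per_star = 1e-15               # hypothetical, deliberately tiny odds

expected_sites = stars_in_observable_universe * p_life_per_star
print(f"Expected life-bearing sites: {expected_sites:.0e}")  # about 1e7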

But although I think that it is likely that ETIs exist, what I am really skeptical about are the usual reports of UFOs and other sightings, where alien spacecraft dart hither and thither at high speed, playing peek-a-boo with us. If intelligent life evolved near other stars long enough ago to have made the journey of likely millions of years necessary to get to Earth, they must possess a science and technology vastly superior to ours simply in order to even find us.

After going to all that trouble, why would they then start playing the fool, scaring the daylights out of rural Americans? And why is it that it seems like it is mostly rural Americans who get these visits? Why don't they drop in on Central Park in New York City?

While it seems likely that the present kinds of UFO sightings are nothing more than misidentifications, the idea that we could receive a signal from ETIs is intriguing and worth mulling over. The most likely thing to happen is that we get some sort of identifiable, non-noise, intelligently created electromagnetic signal from outer space, broadcast by the inhabitants of some distant planet without any specific intention of contacting anyone, just the way our own radio signals have been beamed out to the universe for the last 100 years or so and TV signals for about 70 years. Electromagnetic waves have some huge advantages as communication media: they can carry detailed information, can travel through the vacuum of empty space, and travel at the fastest possible speed allowed by the laws of science, the speed of light. But that very fact shows how limited our reach is, since it would take about 100,000 years for these waves just to cross our own Milky Way galaxy.
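
To make the arithmetic behind these distances explicit, here is a minimal sketch (the galaxy figure is the rough round number used above, not a precise measurement):

# Light-travel times for the distances mentioned above.
vega_distance_ly = 25.3           # distance to Vega, as cited in Contact
milky_way_diameter_ly = 100_000   # rough diameter of our galaxy, in light years

# A radio signal covers one light year per year, by definition.
print(f"Round trip to Vega: {2 * vega_distance_ly:.1f} years")
print(f"Time to cross the Milky Way: {milky_way_diameter_ly:,} years")
# Our ~100 years of radio leakage fills a sphere only about 100 light
# years in radius, a tiny bubble inside a 100,000-light-year galaxy.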

Even if we did get such an unambiguous signal about the existence of an ETI from some source and could decipher it, there is little that we could do with it, just the way that a distant civilization would be baffled if, millions of years from now, they were to pick up the weak signal from a broadcast of American Idol. We would not be able to communicate back and the long times involved in sending and receiving messages would sap the enthusiasm of the most ardent believer in ETI. In science fiction, this limitation is overcome by invoking speculative scientific exotica like black holes and worm holes that enable space travelers to circumvent the speed-of-light limitation and somehow 'tunnel' to distant locations in very short times. But while that meets the plot needs of authors, there is no hard evidence that such things exist or, if they do, could be used for such kinds of travel.

But if we leave all these kinds of exotica aside, what intrigues me is what would happen if we simply experience the absolute minimum, which is the receipt of some signal that unambiguously indicates that somewhere out there, however far away and unreachable, there exists intelligent life. Would that change anything here? Would it influence the way we think and behave amongst ourselves, even if there was no possibility of actually communicating with that intelligent life? Or would the novelty soon wear off, and we go back to our usual practice of killing each other?

Next: How should we react to receiving a signal?

POST SCRIPT: Wisdom beyond any price

What would we do without our profoundly wise national commentariat?

January 28, 2008

Extra Terrestrial Intelligence-1: Getting a signal

In the years 2002 and 2003, during the peak of the intelligent design creationism (IDC) movement, I was invited to a few meetings of that movement to provide the opposing view. This was the time when the IDC side was promoting such debates as a means of increasing visibility for IDC ideas.

During those meetings I heard over and over again about the significance of the film Contact, based on the novel of the same name by astronomer Carl Sagan. This surprised me because I knew Sagan was a self-described agnostic. Why was the work of such a well-known skeptic being shown so much love at gatherings of religious believers? I was intrigued by this question but didn't get around to reading the book or seeing the film until I did both last month.

I now understand the IDC people's fascination with Contact. The book and film deal with extra-terrestrials making contact with people on Earth. The signal of their existence is that radio telescopes on Earth start receiving a series of pulsed signals from outer space that represent the sequence of prime numbers, which are numbers greater than one that are divisible only by themselves and one (i.e., 2, 3, 5, 7, 11, 13, 17, 19, 23, …).

While prime numbers are a source of great fascination for mathematicians and are used by them in a wide variety of ways (cryptography being one), there is no known naturally occurring physical process that generates those numbers. Hence the reception of prime numbers is an unambiguous signal of a real intelligence out there manufacturing these artifacts, unlike the earlier false alarms created by the detection of pulsars in 1967. Those earlier signals consisted of regular pulses of energy with very precise times between each pulse and were initially thought to be signals sent by an extra-terrestrial intelligence (ETI), but were later found to be caused by rotating neutron stars. One would be hard pressed, however, to find a naturally occurring physical explanation for signals that follow the pattern of the prime numbers.
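
The point can be made concrete with a small sketch (purely illustrative; real SETI signal analysis is far more involved): the prime sequence is trivial to generate algorithmically, but unlike a pulsar's evenly spaced pulses, its gaps follow no simple periodic rule.

def is_prime(n: int) -> bool:
    # Trial division; fine for small n.
    if n < 2:
        return False
    return all(n % d for d in range(2, int(n ** 0.5) + 1))

def prime_pulses(count: int) -> list[int]:
    # The first `count` primes, the sequence received in Contact.
    primes, n = [], 2
    while len(primes) < count:
        if is_prime(n):
            primes.append(n)
        n += 1
    return primes

print(prime_pulses(9))  # [2, 3, 5, 7, 11, 13, 17, 19, 23]
# A pulsar, by contrast, emits pulses whose gaps are all equal.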

The IDC people used this idea from Contact to argue that certain biological systems could not have arisen naturally and hence were similarly unambiguous signals of an 'outside' intelligence. While this intelligence could also be extra-terrestrial (as postulated by the Raelians), the IDC people preferred to believe that it was god. This was Paley's watch and the Mount Rushmore metaphor, modernized.

I found both book and film interesting but mildly dissatisfying. Sagan's weaknesses as a novelist show, though his knowledge and command of science help to make the book readable.

All books and films that deal with contact with ETIs suffer from the same problem: the really exciting part is the thrill of discovering the existence of extra-terrestrial intelligence and the anticipation of what the aliens look like, what they are like, and what their attitudes towards us might be. But we just do not have any data at all on which to base our conceptions of these alien beings, so any choice the authors make is bound to be seen as deficient. Whatever the creators dream up about the actual encounter cannot help but be a bit of an anticlimax.

All the novels that I have read on this topic (admittedly not that many) suffer from the fact that the plot's dynamic requires a revelation of the ETI at the end but the actual realization of the concept is almost always disappointing. I don't see any way around it.

Next: More on ETIs.

POST SCRIPT: How to become a New York Times columnist

All you have to do is be consistently wrong.

January 25, 2008

Anniversary reflections on this blog

Today's post marks the completion of three years since this blog began. Although I tend to ignore anniversaries of any kind, they do provide convenient points at which to step back and look at the big picture, to reflect on what was achieved, what was not, and where one should be going.

I have been on a regimen of writing five op-ed type essays a week, resulting over the last three years in over 700 essays and close to 900,000 words. The blog has registered about three million hits.

While it is not easy to produce this level of output, it is not that hard either, provided one is interested in what one is writing about. One of the consequences of producing this output is that I now have extreme contempt for most of the well-known columnists (David Brooks, Maureen Dowd, Charles Krauthammer, David Broder, Richard Cohen, etc.) who occupy the pages of newspapers and magazines. Many of the better known ones are employed full time and have paid researchers to help them gather material for their columns. Given all those resources, it is remarkable how vapid and lacking in content their columns are.

Let me make clear that I am not saying that I am better than them. But given that I have a full-time job and have to do all my own research and edit my own work on my own time, I feel that these columnists should be producing far better output, instead of the superficial dreck they currently produce that wastes so much newsprint. In fact, there are very many writers on the web (Glenn Greenwald, Matt Taibbi, Matthew Yglesias, Steve Benen, Juan Cole, Stephen Zunes, Robert Jensen, the pseudonymous Digby, Justin Raimondo, Jim Lobe, Ray McGovern, Greg Sargent, Paul Craig Roberts, Alexander Cockburn, to name just a few off the top of my head) who produce far, far better political analyses than the so-called elite columnists, and many of them are also writing on their own and on their own time. These good web writers not only have sharper intellects and more biting prose styles, they also provide links to their sources so one can see whether the facts on which they base their analyses warrant their conclusions. In contrast, the well-known op-ed writers tend to rely on Villager cocktail party chatter and unnamed sources, making their output more like political gossip columns.

On the basis of the quality of their content, traditional columnists should long ago have become extinct. But we must remember that these columnists serve a much more important purpose than informing readers, and it is this that keeps them around. These columnists are like the goal posts on a football field: they define the boundaries within which the political game must be played, with the so-called liberals at one end and the so-called conservatives at the other. To be considered 'respectable' and be invited to play, one must tacitly agree to stay within these defined boundaries. Step outside those boundaries, or even question the rules of the game, and you are summarily excluded. You are no longer 'serious', just some kind of wild-eyed, irrational ideologue.

Furthermore, the so-called liberals and so-called conservatives are both part of the one pro-business/pro-war party that rules this country. They are all Villagers.

Writing this blog has been of benefit to me personally. The sheer discipline it forces on me to write daily has resulted in greater productivity. Last year I had five articles accepted for publication, three of which started out as extensive blog entries, which meant that I had done much of the research, writing, and editing long before I considered submitting them to a journal. The other two articles were also easy to write because of the discipline imposed on me by trying to meet the demands of the blog. The blog has made me a far more efficient writer, if not necessarily a better one.

There is one thing about the blog that I have still not quite come to terms with, and that is the personal exposure. I am by nature a private person and initially saw myself only writing about abstract ideas in a coolly analytical way, without revealing much about myself. But it is hard to maintain that level of detachment when one is passionate about something. Although I do not dwell on the details of my personal life (which is very boring anyway), I have discovered what writers know, that you cannot help but reveal things about yourself whenever you write about anything you care about. You inevitably reveal your attitudes and values.

I have tried to come to terms with the fact that regular readers of this blog must have a pretty good idea about what drives me as a person. I still find it disconcerting, however, when I meet someone for the first time and that person says "Oh, I read your blog", because I realize that that person knows quite a bit about me while I know nothing about that person.

But that is a minor discomfort. The blog has been a source of intellectual stimulus for me. It has not yet reached a stage where I have run out of new ideas to write about and start repeating myself, although reading some of the old entries I find myself surprised at some of the things I had forgotten I said. But so far, I have not regretted anything that I have posted or found anything completely wrong, except for predictions for the winners of political contests where I am almost always wrong.

The blog is still fun for me, which is why I keep writing. Thanks for reading.

January 24, 2008

Review: God Is Not Great by Christopher Hitchens

I finally got around to reading Hitchens' book debunking all forms of religion. I must say that I found it curiously unsatisfying. It is hard to put my finger on the reasons since I agreed with almost all the things he said.

The book seeks to show that religions (he focuses mainly on Judaism, Christianity, Islam, and Mormonism) are basically frauds initiated by charlatans and con-men, perpetrated on gullible people, and perpetuated by huge religious vested interests that either make a lot of money out of the religion racket and/or use it as a form of coercion to suppress dissent (both in thought and practice), often in collusion with corrupt governments.

The book looks at the sacred texts of these religions (the Bible, the Koran, the Book of Mormon) and shows how they are riddled with contradictions, inaccuracies, and downright barbarisms; are very parochial in their thinking; are of extremely doubtful historicity; and are the product of many writers and editors, polishing and changing to suit their own needs and to achieve largely self-serving political and social goals.

The book also looks at the founders of these religions (Moses, Jesus, Muhammad, Joseph Smith) and either finds little or no evidence for their actual existence (none at all for Moses and little for Jesus) or, for those who lived recently enough that their existence can be at least partially corroborated (Muhammad in the 7th century) or fully corroborated (Joseph Smith in the 19th century), finds contemporaneous records indicating that they were likely self-serving con-men who founded movements and doctrines that conveniently coincided with their own interests and personal gain.

All this is well and good and I have no quarrel with any of it. I think that what bothered me about the book was the unevenness of its writing, coupled with a certain amount of pretentiousness. Everyone, including critics of his views, says that Hitchens is a brilliant writer and I get the feeling that this has gone to his head, so that he tries too hard to live up to that reputation, dropping esoteric references to erudite works and inserting unfamiliar phrases in French and Latin without translations. I find him to be a good writer when he is in good form but have never been overwhelmed by his alleged brilliance. In this book, there are some very good passages mixed with others that seem to lack coherence, a product of either laziness or bad editing.

He also has some annoying verbal tics. For example, he frequently refers to human beings (especially those he does not approve of) as 'mammals' instead of 'people'. This is, of course, true but it is still jarring to read.

The book also flits from topic to topic, not going into much depth, and taking shots all over the place. It is a polemical book, which is fair enough. But it seems to be simply a collection of pot shots taken at religion. Let's face it, religion is an easy target: it is full of internal contradictions, free of evidence for its preposterous claims, lacking contact with reality, riddled with barbarities, profoundly anti-science, and its history is awful. Taking broad swipes at it as Hitchens does is bound to hit the target somewhere, just like firing a shotgun at a dense flock of birds is sure to bring down something as long as one aims in the general direction. But it is not pretty.

I personally prefer the rapier skills of writers like Richard Dawkins or Daniel Dennett or Victor Stenger. They are the authors of more tightly argued books, which carefully lay out the premises and claims of religion, and then proceed to systematically demolish them.

Perhaps it is no accident that these other writers are scientists while Hitchens is not, and I am partial to science-based critiques of religion. I believe that it is science that is steadily demolishing the case for religion and god, and thus scientists are best situated to deliver these blows. Science is advancing all the time, explaining the previously inexplicable and giving ever more reasons not to believe in god. In contrast, religious apologists have no new arguments and still trot out those proposed by religious philosophers centuries or millennia ago, people who could only plausibly make their cases at a time before Newton and Darwin and Einstein, when the world seemed a lot less comprehensible than it does now. Even then, these philosophers' claims have to be reinterpreted and limited to take into account modern scientific developments.

So while Hitchens' book is a quick and easy read (I finished its nearly 300 pages over a weekend) and I can recommend it, it is not a book that will be on my reference shelf to be periodically sought for fresh insights.

When reading a book I like to mark out for future reference good passages that make a point tellingly. There are some in Hitchens' book that are very good and I have used them in previous posts. But sadly, there were only a very few passages that struck me as worth preserving.

God is Not Great is a good book, worth reading, but I expected much better. Perhaps that is my fault.

POST SCRIPT: Dan Savage in South Carolina

Dan Savage reports from South Carolina just before the Republican primary, and then has an amusing discussion about his experiences there with religion on Bill Maher's show.

January 23, 2008

Our inner fish and other evolution fun facts

Even though I am not a biologist, I find evolution to be an endlessly fascinating subject, constantly throwing up intriguing new facts. Here are some recent items that caught my eye.

Stephen Colbert has a fascinating interview with evolutionary biologist Neil Shubin, discoverer of the fish-land animal transitional fossil Tiktaalik, about how much of our human biology came from fish. In his 2008 book Your Inner Fish: A Journey Into the 3.5 Billion-Year History of the Human Body, Shubin points out that although superficially we may look very different, many of our human features can be found to have analogous forms in fish and thus probably existed from the time that we shared common fish-like ancestors with them. (Incidentally, Shubin was one of the expert witnesses in the Dover intelligent design trial, in which he discussed the theory of evolution and the role that Tiktaalik played in clarifying the link between fish and land animals.)

For me, one of the most surprising things in learning about evolution was that whales, dolphins, and porpoises evolved from land mammals that returned to the sea from which their own ancestors had emerged. In fact, hippos are the animals most closely related to modern day whales.

Researchers have now discovered in the Kashmir region the fossils of a land-based ancestor of whales, dolphins, and porpoises. The fox-sized Indohyus, as it has been called, lived 48 million years ago and is an even closer relative of the whales than hippos are, shedding more light on how whales came to be.

"The new model is that initially they were small deer-like animals that took to the water to avoid predators," Professor Thewissen told BBC News. "Then they started living in water, and then they switched their diet to become carnivores."

And then there was the New Scientist report last week of the discovery of a two-million-year-old Uruguayan fossil of a rodent (Josephoartigasia monesi) that weighed about a thousand kilograms, making it the largest rodent known, about the size of a large bull. Of course, this species of giant rodent is extinct. The largest rodent alive today is the capybara, also found in South America, which clocks in at a mere 50 kilos.

In reading the report, I discovered something else that I had not known: that North and South America had once split apart, and that this may explain how the giant rodent came into being. Later the huge landmasses joined again, which may have helped cause its extinction.

South America saw a huge explosion in the diversity of rodents after the continent split from North America and became an island some 65 million years ago. Dinosaurs had just been wiped out and many animal groups were filling the void they left behind.

Without competition from other mammals which were diversifying on the other side of the water in North America, rodents of all sizes emerged in South America.
. . .
Around the time that the recently discovered J. monesi was alive, the two Americas were joined once more.

Sánchez speculates that the connecting land bridge may have helped bring about the demise of the giant rodents. Animals, among them the sabre-toothed cat, crossed the bridge in both directions bringing diseases, and competition for food and territory.

It is likely that changes in the climate will have also rendered the rodents' home less hospitable. J. monesi was found in what is now an arid region, but was then lush and forested.

"Our work suggests that 4 million years ago in South America, 'mice' that were larger than bulls lived with terror birds, sabre-toothed cats, ground sloths, and giant armoured mammals," say the Uruguayan researchers.

Of course, these explanations for the rise and fall of the giant rat are speculative and need to be corroborated with further research.

This is what I love about science: the constant discovery of exciting new findings, the challenge of fitting them into a theoretical framework while maintaining consistency with other scientific theories. All these things stimulate new research and ideas.

POST SCRIPT: Scott Ritter and Edward Peck

Scott Ritter is the US Marine who was a member of Hans Blix's UN team that searched Iraq for WMDs prior to the invasion. He concluded that Iraq did not have any and repeatedly said so. For being correct, he was vilified by those anxious to go to war and almost completely banished from the media, while those who were wrong about everything are still there, now pushing for war with Iran.

Scott Ritter and Edward Peck (former chief of mission for the US in Iraq) will speak tomorrow (Thursday) at 7:30 pm at Trinity Cathedral in Cleveland. They have returned from a fact-finding mission in Iran.

Suggested donation: $10 general, $5 students. Trinity Cathedral is at 2230 Euclid Ave., across from CSU. Free parking is available in the Trinity Cathedral lot: entrance on Prospect Ave at E. 22nd.

January 22, 2008

Religion and gullibility

Here are some video clips of people claiming to have supernatural powers.

In the first, magicians Penn and Teller debunk a person who claims that she can talk to dead people. (Language advisory)

Notice how, when she interviews the black man at the end about whom she has no inside information, she resorts to inferences based on racial stereotypes and simple hereditary similarities in order to make her guesses. She is clearly hoping that he has a father, uncle, or other father figure who died from heart disease. Such 'mediums' often play the odds this way.

In the next clip, Penn and Teller take a look at someone who claims that she can talk to animals using telepathy.

In the third clip, Penn and Teller and fellow magician James Randi debunk Nostradamus-based predictions.

In the final one Penn and Teller take a look at an exorcist at work. (Language advisory)

What do all these things have in common? They all share one feature and that is that unscrupulous people are taking advantage of people's gullibility about the existence of the supernatural and using their emotional needs to con them. A lot of people would love to talk with their dead loved ones, they would love to talk to their pets, they would love to know what lies in the future, they would love to think that their problems are caused by demons that can be removed by a simple procedure. Thus they are only too eager to believe charlatans who promise them that they can do these things.

But all this rampant naïve credulity about the supernatural has to have a source. Why are there so many people who are so willing to believe things for which there is no evidence? I think that it is because religion has softened their minds up since childhood, weakening their powers of reasoning and logic. It has taught them that there are mysterious things out there that are beyond the reach of normal logic and evidence and science, and that one must simply believe in them. Such people are easy prey to all the charlatans out there, out to make a quick buck.

It is necessary for their very survival that religious organizations cultivate a deliberate naivete in their flock. They may say they are appealing to the virtues of unthinking faith for noble reasons, but they are effectively making their religious followers susceptible to fraud.

In his book God is Not Great, Christopher Hitchens describes how religions depend upon and take advantage of people's credulity.

It is not snobbish to notice the way in which people show their gullibility and their herd instinct, and their wish, or perhaps their need, to be credulous and to be fooled. This is an ancient problem. Credulity may be a form of innocence, and even innocuous in itself, but it provides a standing invitation for the wicked and the clever to exploit their brothers and sisters, and is thus one of humanity's great vulnerabilities. No honest account of the growth and persistence of religion, or the reception of miracles and revelations, is possible without reference to this stubborn fact. (p. 160)

Without people being indoctrinated early on by religion, these other fraudsters would have a much harder time making a go of it. They depend on the dulling of reason and the intellect produced by religion in order to ply their trade.

January 21, 2008

The later Martin Luther King

(Today is the official day to commemorate the life of Dr. Martin Luther King, Jr. I am reposting (updated and edited) something I wrote two years ago.)

In reflecting on the life and message of Martin Luther King, I feel there is a need to resurrect an essential aspect of his message that he articulated during the last phase of his life. Over time, layers of gauze have covered this portion of his legacy and blurred the increasingly hard-edged and accurate vision that characterized the last years of his life.

Most people focus primarily on his "I have a dream" speech, given at the March on Washington in 1963. It is important to realize that he did not retire after that oratorical triumph but went on to speak and act in ways that were often different from the emphases of his pre-1963 positions. His later emphasis on a class-based analysis of American society, his drive to unite the problems of black people with those of poor and working-class white people, coupled with his opposition to the war in Vietnam, were a radical departure from a purely race-based civil rights struggle. All these moves cost him some support and alienated some former allies, and are what some believe precipitated his assassination.

Since his death in 1968, the mass media have increasingly portrayed King as primarily a visionary and a dreamer of a non-racial America, and some have even argued that his dream has essentially come true, apart from some minor remaining problems. To read his last book Where Do We Go From Here: Chaos or Community is to be jolted by the piercing clarity of his wide-ranging analysis of the real problems, what needed to be done to resolve them, and the immense obstacles that lay in the way of reaching the goal of a free and fair society. It is also important (and rather chastening) to note that nearly everything that he said four decades ago is still relevant today.

What is particularly striking about King's writings is his ability to keep in balance the tension between a hard-eyed and realistic appraisal of the problems faced in trying to achieve justice (derived from his study of politics, economics, history, and philosophy) and his deep-rooted optimism in the innate decency of human beings (derived from his religious faith).

He saw that the successful multiracial coalitions that formed in the civil rights struggles and which culminated in Selma and the Voting Rights Act were just the first phase of the struggle and that these focused around the issues of treating African-Americans decently but not necessarily equally. People of all races were appalled at the lynchings and beatings, and the legal remedies that were proposed to address these horrors did not cost anything and could be supported fairly easily. It did not cost much to repeal Jim Crow segregation laws either. "There are no expenses, and no taxes are required, for Negroes to share lunch counters, libraries, parks, hotels, and other facilities with whites." But he pointed out that "the absence of brutality and unregenerate evil is not the same thing as the presence of justice."

King noted that when the issue switched to the second phase, from that of simple decency to one of equality, much of the multiracial support evaporated as the cost of the remedies for generations of injustice became clear. "The discount education given to Negroes will in the future have to be purchased at full price if quality education is to be realized. Jobs are harder and costlier to create than voting rolls. The eradication of slums housing millions of people is complex far beyond integrating lunch counters."

King praised the thousands who rushed to battle the brutalities of Selma, "heedless of danger and of differences in race, class, and religion." But he also realized that they represented "the best of America, not all of America" and "Justice at the deepest level had but few stalwart champions. . .The great majority of Americans are uneasy with injustice but unwilling yet to pay a significant price for eradicating it." He realized that while equality was the common goal of everyone, even the word was interpreted differently by whites and blacks. "Negroes have proceeded from the premise that equality means what it says. . .but most whites. . .proceed from the premise that equality is a loose expression for improvement."

It is startling to see how well King's analyses of the status of African-Americans in US society hold up four decades later, despite all the other changes that have taken place during that time. King realized that generations of slavery and other forms of discrimination and subjugation had taken their toll on the financial, intellectual, and other resources of African-Americans and thus required an enormous and concerted effort from within their own community in order to "overcome his deficiencies and his maladjustments." But he rejected out of hand the suggestion (currently enjoying a resurgence) that the poor conditions under which they lived "can be explained by the myth of the Negro's innate incapacities, or by the more sophisticated rationalization of his acquired infirmities (family disorganization, poor education, etc.)."

He was no sentimental believer that this appalling state of affairs would disappear by itself once the institutionalized roadblocks had been removed and a legally 'color blind' society had been created. He saw that the problems went much deeper than that. "Depressed living standards for Negroes are not simply the consequence of neglect. . .They are a structural part of the economic system in the United States. Certain industries and enterprises are based upon a supply of low paid, under-skilled and immobile nonwhite labor. Hand assembly factories, hospitals, service industries, housework, agricultural operations using itinerant labor would suffer economic trauma, if not disaster, with a rise in wage scales."

In other words, powerful economic and political interests benefited from the depressed state of poor people and would strenuously resist any attempts to improve things.

He realized that achieving equality for African Americans required a massive expenditure in education, housing, and employment for blacks, but always emphasized that this must be done within the context of a general anti-poverty program meant for all poor people, of all races and religions. It is a big mistake to think of King as a leader of only black people. When he was killed, he was becoming an outspoken progressive national leader of all people, which was what made him really dangerous.

The late 60s was a time of ferment and there was a divergence between those who called for violent action to combat repression and those advocating non-violence. The main criticism leveled against the non-violence movement led by King (by critics such as those in the Black Power movement) was that it reinforced the stereotype of African-Americans as passive and meek. They argued that changing this perception required African-Americans to separate from whites and forge a more militant identity. King disagreed strongly with this analysis. In an interview, King said that "there is a great deal of difference between nonresistance to evil and nonviolent resistance." He pointed out that anyone who had been involved in the civil rights struggles would know that nonviolent resistance, far from being passive, was a strong, determined, and effective response to injustice.

He pointed out that violent resistance was futile because its ultimate goal, the total separation of blacks and whites in the US, was absurdly unrealistic. The power of the state was overwhelming and could brutally crush any serious challenge to its authority. If the general public, black and white, did not personally identify with the struggle for justice, then they would passively stand by while this power was unleashed to crush any opposition. He knew from the history of wars in general (and World War II and the Vietnam war in particular) that the general public could and would passively accept massive injustice and cruelty and horrific destruction, even of innocent civilians, unless they identified in some way with those at the receiving end of the violence. We see that even now in the way most Americans are unperturbed by the deaths, injuries, and displacements of millions of Iraqis caused by the US invasion of their country. He felt that you needed public opinion on your side and that the only way "that the pressure of public opinion becomes an ally in your just cause" was if people themselves were touched by the struggle, at some deep level.

King argued that while some notable victories had been won by violence (for example, the American revolution among many independence struggles in former colonial countries), such models were not applicable to the civil rights struggle because "those fighting for independence have the purpose to drive out their oppressors." King argued that blacks and whites had to live together in a post-racist US, and the only way they could do that with any sense of common community was if they joined together in the struggle to create such a just society. And he saw a united, non-violent struggle as the way to get everyone involved.

It is this firm conviction in the power of non-violence as an effective strategy, coupled with a basic sense of generosity and fairness in his outlook, his desire to see the best in even those who opposed him, that was the key to his success as a coalition builder. He was always inclusive in his thinking, trying to find ways in which to form a common cause with those who shared his basic belief in justice and equality. But he could also be scathing in his appraisal of those with whom he felt he had nothing in common, and fierce in his denunciation of the few deep-rooted racists who could not be won over.

Martin Luther King was always conscious of the importance of trying to maintain balance between the tensions pulling in different directions. He said that "a strong man must be militant as well as moderate. He must be a realist as well as an idealist." Even the subtitle of his book Chaos or Community shows his realization that the future of society lay in a delicate balance. King's murder removed from our midst someone who could hold people and movements together while moving towards a common goal and thus take us towards community. While we have not quite reached chaos in his absence, there is an urgent and deep need for a new generation of leadership that can point us towards community again.

Martin Luther King seemed to draw his strength from two sources: his wide reading and scholarship, which enabled him to always place people and events in a deeper and more meaningful context; and his ability to see the best in people. After the march in Montgomery, observing the demonstrators who were crowded together in an airport terminal, he noted "As I stood with them and saw white and Negro, nuns and priests, ministers and rabbis, labor organizers, lawyers, doctors, housemaids and shopworkers brimming with vitality and enjoying a rare comradeship, I knew I was seeing a microcosm of the mankind of the future in this moment of luminous and genuine brotherhood."

His vision of what a society should be and what must be done to achieve it is as relevant and vibrant as ever. His call to action is as compelling now as it was when he first made it.

January 18, 2008

The candidates that corporate executives like

In my two posts titled Meet the Villagers (see here and here), I argued that the oligarchy that runs the US decides early on whom it approves of as political leaders and then uses the media to make sure that everyone else is eliminated from the race early. The high-handedness of the media in deciding what views we should be exposed to was on extraordinary display when MSNBC first invited Dennis Kucinich to appear in the Nevada debate because he had met their criteria, and then at the last minute changed their criteria to exclude him. (I suspect that when they first set the rule about including only the top four candidates, they assumed that the fourth would be a Villager-acceptable candidate like Biden or Dodd or Richardson.) They thus ensured that issues like single-payer health insurance, the immediate withdrawal of US troops from Iraq, and the closing of bases there (to name just a few) would not be raised in the debate.

Recall that I said that the Villagers want election campaigns in which there is a consensus amongst the candidates to support the issues that the oligarchy cares about. We are almost there.

Now, in a January 11, 2008 Reuters article, Kevin Drawbaugh has actually asked corporate executives which candidates they like and fear, and the results bear out what I said. Of the leading candidates, they like Clinton, McCain, and Romney, and think they can deal with Obama.

The corporate suits fear John Edwards most. In fact, the title of the piece is "Corporate Elite Fear Candidate Edwards." Mike Huckabee is the most feared candidate on the Republican side. This is because, although neither man's policy platform is remotely radical, neither is strictly following the pro-business script that gets Villager approval.

The media has, as always, dutifully picked up on these cues, especially on the Democratic side. The Washington Post dismisses Edwards as "angry" (anyone who highlights and attacks the corporate control of US politics is almost invariably described as "angry" or as otherwise irrational) and insists that this is already a two-person race between Clinton and Obama. Unsurprisingly, the more overtly right-wing corporate mouthpiece Fox News also attacks Edwards.

But the best way to undermine a candidacy is by ignoring it, especially in the early stages, the way that the Edwards, Dodd, Biden, Richardson, Kucinich, and Gravel candidacies were largely ignored. If not for the televised debates, these people would have been largely invisible. Another way you eliminate those whom you don't feel deserve to be in the race is to give extraordinary attention to trivial differences among the Villager-approved candidates so that all the oxygen is used up discussing absurdly unimportant issues.

For example, by focusing heavily on spats between Obama and Clinton (Did she 'play the race card'?), and on trivialities (Did she really cry after Iowa? Is he secretly a Muslim?), and on topics like gender and ethnicity and Clinton's 'likability' and even the way she laughs (!) rather than on policy issues, the media effectively avoid talking about other candidates and thus give voters the impression that this is now a two-person contest. This despite the fact that some polls suggest that Edwards is the Democrat most likely to beat any Republican in the presidential race, while Hillary Clinton fares much worse than Edwards and Obama.

As the pro-Democratic blog Firedoglake summarizes: "If Hillary's the Democratic nominee, we could very easily lose to any likely GOP nominee. If Obama's the nominee, he does OK so long as he doesn't face McCain. But if Edwards is the nominee, we're sitting pretty. Which, I suspect, is one reason why Big Media hates John Edwards so much and does everything it can to destroy him. (Speaking of which: KingOneEye at DailyKos pointed out this morning how the NYT is ignoring a key result of its own poll on the race -- namely, that as more people get to know him, Edwards' favorability rating keeps going up.)" Greg Sargent also notes a study that supports the contention that the media is underreporting Edwards.

On the Republican side, the Villagers have not been able to narrow the contest as effectively, to focus just on McCain and Romney. Huckabee keeps bobbing up to the surface, and while they seem to have effectively buried Ron Paul's candidacy, the latter is doing as well as, or better than, the Villager-approved candidates Fred Thompson and Rudy Giuliani, who still get a lot of media play. It will be interesting to see if and for how long Huckabee can withstand the media pressure to disappear.

David Sirota tries to combat the "just a two-person Democratic race" narrative fostered by the Villagers:

For those of you who think the Democratic presidential nomination fight is just a two-way race between Obama and Clinton, check out this brand new poll from the Reno Gazette-Journal. Yup, that's right - it shows the Nevada caucus race [which will be held on Saturday-MS] a three-way, dead heat with John Edwards right in the mix.

Interestingly, this poll comes right on the heels of the Establishment viciously ratcheting up its angry attacks on the Edwards candidacy. Late last week, we saw a Reuters story headlined "Corporate Elite Fear Candidate Edwards" detailing how Wall Street moneymen and K Street lobbyists are frightened about Edwards populist, power-challenging message against greed and corruption. We also saw self-anointed Democratic "expert" Lawrence O'Donnell pen a fulminating screed demanding Edwards get out of the race - not surprising coming from a man who made his name running the U.S. Senate Finance Committee - long the most corrupt, lobbyist-ravaged panel in all of Washington (somehow, running the U.S. Congress's version of a pay-to-play casino now makes people credible "experts" in campaign strategy and political morality).

According to the nonpartisan Project for Excellence in Journalism, Edwards has long faced a media blackout - one that at least some honest media brokers like Keith Olbermann have noted. As I said a long time ago, that Edwards has even been able to compete in such a hostile environment is a testament to the power of his message.

The question we should ask is what the hostility and media blackout is really all about? I'd say the media's behavior is motivated by the same impulses that move lobbyists to whine and cry to Reuters and self-important bloviators like O'Donnell to publicly burst a blood vessel on the Huffington Post - the people who have gotten used to the status quo are truly terrified by any candidates who they really believe will change things and threaten their power and status. Edwards is just such a candidate - one who threatens to muck up what the media and political elite want to be a race between two "nonthreatening," Wall Street-approved candidates. Obviously, it's a three-way race at this very moment - whether the Establishment likes that or not.

Incidentally, this is why efforts to broaden the base of voters are almost always done by grass-roots activist groups working independently of the major parties. These new voters are unpredictable and hence undesirable to the Villagers. The pro-business/pro-war single party is quite comfortable with the way the current political system works since it gives a huge advantage to the status quo.

POST SCRIPT: Religion and politics in the US

British comedian Pat Condell gives us his take on this topic, in a clip he calls "Pimping for Jesus." Condell's home page cheerfully describes his attitude to religion: "Hi, I'm Pat Condell. I don't respect your beliefs and I don't care if you're offended. Cheers."


January 17, 2008

Trying to assuage guilt

On my return from Sri Lanka last week, I read the back issues of the Cleveland newspapers and found that a big story was the vicious beating of a middle-aged white man by a group of six black teenagers who had accosted him while he was on a walk in his neighborhood. The man was saved from possible death by a resident (a faculty member at Case) who observed the assault from his front window and raised the alarm, causing the attackers to flee.

The neighborhood happens to be in the same community of Shaker Heights that I live in, an inner-ring suburb of Cleveland. This neighborhood is a rarity in the US, one that is ethnically integrated and has been so since the era of civil rights legislation. The youths lived a few miles away in the city of Cleveland.

The assault was cowardly and deplorable and was condemned by everyone. But what really caused a furor was a Plain Dealer newspaper column on Sunday, January 6, 2008 written by local media personality Dick Feagler in which he argued that the message of such events was clear: integrated neighborhoods were an impossible concept in practice and white people who could afford to should simply move out of places like Shaker Heights and into almost exclusively white communities, where they would be safe from such attacks. He said that such an attitude should be called 'realism', not 'racism.'

Condemnation of his column has been swift and widespread both from his fellow columnists in the Plain Dealer and from the general public, including the victim, the person who raised the alarm, and neighborhood community leaders.

While the attack itself was indisputably awful, the reaction of people to such incidents is a kind of Rorschach test, revealing a lot about them.

Feagler's reaction was a classic example of someone using external events to assuage their own guilt. To understand it one must know that Feagler is an old-style journalist who models himself on legendary columnists Mike Royko in Chicago and Jimmy Breslin in New York. They were people from gritty, urban, working class backgrounds, hard-drinking and smoking, who would frequent the bars and other nightspots of their town and be friendly and familiar with the people of the streets, such as construction and other blue-collar workers, beat cops, petty criminals, pimps, and hustlers. From this wealth of diversity, they would draw the stories and language that filled their columns and gave their newspaper readers a glimpse into a rich world that lay just beneath the surface.

Feagler is a Royko/Breslin wannabe who proudly recounts his childhood experiences growing up in a working class Cleveland neighborhood and still tries to portray himself as a Clevelander. Many of his columns are complaints about the decline of the city and the schools from the time when he was young. But his problem is that he long ago moved away from the working class neighborhoods of his upbringing and into the well-to-do and predominantly white suburb of Bay Village, which is far from Cleveland, both literally and figuratively. He thus faces the dilemma of all those who like to portray themselves as just regular working class folks, men of the people, at home with all classes and ethnicities, but who have chosen to live in places that have little such diversity and are more like enclaves for the wealthy. Such people feel guilt at abandoning the places they grew up in. Running away from a situation rather than staying and trying to improve things makes one feel like a coward, and that is hard to live with.

For many such people, the only way to salvage their self-image is to argue that they were forced into this action and that it was eminently sensible to move. When others take the same action that you did, you feel a little more vindicated. So Feagler's call for other white people to leave integrated areas is really a plea for others to not judge him harshly for having left.

Although I disagree strongly with what Feagler said, I think I understand what is driving it because I live with feelings of guilt and cowardice similar to Feagler's. I had always wanted to live and work in Sri Lanka, amongst the family and friends that I had grown up with, to try and improve the conditions in that country. But in 1983, following an attack on an army truck, a vicious anti-Tamil pogrom was unleashed by the then government of Sri Lanka in which unchecked mobs rampaged the streets, killing Tamils and setting fire to their homes and buildings, while the government's police and security forces stood idly by, sometimes even egging the mobs on. I saw these things first hand, and they required my family and me (because we were Tamils) to go into hiding for about a week to escape possible death.

I was furious at the government for abandoning its basic function of maintaining order and instead handing power over to its mobs and goons. (When I saw the film Hotel Rwanda I relived the awful sensation of what it feels like when you are completely powerless and unprotected from mobs who had been your neighbors the previous day, and when even the government is against you, although what happened in Rwanda was a massacre on a far larger scale than occurred in Sri Lanka.) I felt that such a society was not one in which I could bring up my older daughter, who was then just three months old. So in sorrow and anger I emigrated to the US and have stayed here ever since, returning to Sri Lanka only for brief visits. The ethnic violence in Sri Lanka continues to this day, ebbing and (currently) flowing.

But when you leave a bad situation, you are essentially weakening the side that is trying to make the situation better, and strengthening the hand of the bad elements. Those who leave may try to justify it as 'realism' but there is undoubtedly an element of cowardice involved. The guilt I felt for essentially running away from a problem and leaving others to deal with it rather than facing up to it personally has never gone away, although I have come to terms with it. I still wonder if I did the right thing, although the people I know in Sri Lanka keep saying that I was smart to move away. I did notice that when others I knew also left the country, it seemed to vindicate my decision and made me feel I had done the right thing. When others stayed or even returned to Sri Lanka, it made my decision seem wrong.

For people who leave a bad situation, there is a temptation to eagerly highlight reports of horrible events because it seems to retrospectively justify their decision to leave. In my case, it is easy to resist that temptation because I still have close family and friends living in Sri Lanka and every incident of violence there triggers alarm about their safety. But if you have no remaining real links to the place you left behind, it is tempting to view events through the lens of one's own emotional needs and focus only on the bad things that happen.

I think that this is what is driving Feagler's views. Every person who continues to live in integrated neighborhoods is a silent living rebuke to his decision to move away from them into an ethnically and economically exclusive enclave, while every person who moves away is a vindication of his own decision. Every sign that Cleveland is getting better implies that he made a mistake in moving. Every sign of its decline shows his prescience in abandoning the city.

Feagler is old enough that he should have the self-awareness to realize that his advice to other white people to move out of integrated communities is largely self-serving. He has drawn the wrong lesson from this deplorable event. It is not about him and his need to assuage his own guilt.

The vicious beating was a terrible thing to have happened, but it was an isolated incident, and the youths involved were not even people from the neighborhood where the assault took place, so it was not a reflection on ethnic relations within the community. Unfortunately, such things can, and do, happen everywhere.

POST SCRIPT: Jesus Christ Superstar

If you haven't seen this 1973 rock musical, you have missed a treat. This is Andrew Lloyd Webber's best work, raised to a high level by the superb lyrics of Tim Rice and the magnificent performance of Carl Anderson, who sang the part of Judas. The film exploded into life whenever Anderson appeared, and he stole every scene he was in. Anderson died in 2004 of leukemia.

Here is Anderson in the opening sequence:

And here he is singing the title song:


January 16, 2008

Jury service and jury nullification-2

The fourth time I was empanelled was for a criminal case involving charges of felonious assault where the defense said that it would argue self-defense. Once again, there was an oral voir dire, which included questions about whether we had ever been involved in any physical altercation.

It was during the voir dire that I ran into a problem. One of the prosecuting counsel asked if the jurors would be willing to convict a person on the facts of the case even if they felt the law under which the person was being prosecuted was unjust. It was clear that he expected you to answer 'yes' to this question. We have all seen at least some courtroom dramas where the judge instructs the jury on the law to be applied and the jury is asked to judge based only on the facts of the case, and not to judge the validity of the law itself.

What is not well known is that the jury has the right, in criminal cases, to acquit the accused even if he or she is clearly guilty on the facts, if the jury feels that the law that was used to convict is unjust. This procedure is known as jury nullification and I have written about it before. (See here and here.) In the past, juries have nullified laws and brought in acquittals in cases involving freedom of assembly, freedom of the press, harboring fugitive slaves, and so on, and their repeated refusal to convict has led to the repeal of those unjust laws and given us some of the basic freedoms we now take for granted. But despite this fundamental right that juries possess, courts do not inform juries of this right and are actively hostile to doing so.

I was placed in a quandary by the prosecutor's question. What I knew about the case at hand made it seem highly unlikely that it would involve an unjust law. But since I knew about jury nullification, I could not in good conscience agree to a blanket statement that I would convict even if I felt the law to be unjust. Yet if I said in open court that I could not convict based on an unjust law, I would have to explain the whole business of jury nullification. While that might have been educational for my fellow jurors, it might also have prejudiced the entire jury panel and thrown a spanner in the works for a case that did not involve a high principle. So I asked the judge if I might talk to him privately. This was allowed, and the judge, all the counsel, the court recorder, and I moved to his chambers next door, where I explained my problem. We then went back to the courtroom, where the prosecutor asked me a few more questions. I was then dismissed from the panel.

I expected this to happen. Prosecutors do not like jury nullification because it works only one way, and that is against them, since it only gives jurors the right to acquit on the basis of an unjust law.

This is the current problem with jury nullification. It is a right of juries that is not only not publicized but actively hidden from jurors by the court system. If someone is aware of it and says so, he or she is likely to be struck from the pool of potential jurors in criminal cases. There are apparently ways around this, based on the fact that the oath or affirmation one takes during voir dire is not enforceable: one can say that one will convict on the basis of the law even if one has no intention of doing so. Whether one takes this route has to be up to the individual. But I am uncomfortable doing this, especially in a case where there is no high principle involved. When one swears or affirms that one is going to tell the truth, one is obliged to follow the spirit as well as the letter of the law. Politicians use the careful parsing of statements to lie to us and make us think they mean one thing while intending to do another. But it is this kind of behavior that leads to bad governments and gets us into wars. There is no reason for ordinary people to copy that kind of disgraceful behavior. I would much prefer that jury nullification become public knowledge so that juries routinely know about it, even without being told by the judge, the way we currently know about our Miranda rights. The best way to do this might be for popular courtroom dramas on TV and in film to deal with it frequently. For example, I think the William Penn trial would make a terrific film that would put the focus on jury deliberations even better than the classic Twelve Angry Men, both for its dramatic content and for its educational value.

As things stand now, it looks like I will never be able to serve on a criminal jury because of my knowledge of jury nullification.

It is important that everyone know about jury nullification because we have entered an era of increasing violations of rights we have long taken for granted. Laws are being passed that take away many cherished and hard-won rights, such as habeas corpus. We are already having trials in which ordinary people are being subjected to harsh treatment and even torture and tried under draconian and unjust laws, all in the name of security and fighting the so-called war on terror, but in reality to serve the power needs of the state.

If we are called as jurors for such trials, we have to be willing to uphold our constitutional right to acquit people who are accused of crimes under unjust laws, when in reality what they may have been doing is standing up for fundamental rights. If we are asked to convict someone on the basis of evidence that has been obtained under torture, we should be willing to acquit, simply on the grounds that using torture to acquire evidence is cruel and unjust and any information gleaned from such practices is inherently suspect.

POST SCRIPT: Catholic priests caught in the lingerie section

I came across this funny clip from the 1990s British TV comedy series called Father Ted.

January 15, 2008

Jury service and jury nullification-1

By a coincidence, while writing and posting my series on the law and religion in public schools, I was also called for jury service and spent the better part of the week of November 5, 2007 in the Cuyahoga County Common Pleas Court in downtown Cleveland.

I feel strongly that the jury system is one of the greatest inventions of modern society and has been the foundation for democracy and creating and preserving freedoms. So I feel that to serve on a jury is a privilege and do not resent doing my time on the jury though it does involve some minor inconveniences and disruptions in work and home routine.

This was the third time I have been called for jury duty but I have yet to actually sit in on a case. For those not familiar with how it works, at least in Cuyahoga County where I live, when you are called for jury duty to the Common Pleas Court, it is not for a particular case but to be part of a large pool of jurors that serves many courts. So much of the time is spent waiting until your name is randomly called for a case that cannot be settled and has to go to trial.

There are forty courtrooms in the building so the pool of potential jurors is quite large. All the jurors wait in a large room until such time as they are called to serve on a panel in a particular trial. The court system treats the jurors well. The jury pool room is well-lighted and spacious, has comfortable chairs, carrels with electrical outlets for people to use computers (but no internet access), plenty of newspapers and magazines to read, jigsaw puzzles, three TVs tuned to different channels in different corners of the room, vending machines and a nearby reasonably-priced cafeteria and, most importantly, a quiet room for those who simply want silence in order to read or take a nap. The court employees who run the operation are courteous, friendly, and helpful and the whole system works very smoothly.

Furthermore, I have been impressed with my fellow jury panel members. They come from all walks of life, occupations, and backgrounds, and although there is a lot of joking and kidding around in the jury pool room about what they would prefer to be doing, they all seem to have a sense of duty and seriousness about what they have been called to do. I always feel good about the experience and would not hesitate to put my own fate in the hands of a jury should I ever be put on trial.

I have been called for four jury panels so far. They usually call a panel of eighteen potential jurors for a civil trial (from which eight jurors and two alternates are finally selected) and twenty-two for a criminal trial (from which twelve jurors and two alternates are selected). Civil trials require only a three-fourths majority (i.e., at least six of the eight votes) for a verdict, while criminal trials require a unanimous verdict.

Once a panel is randomly selected from the large pool in the room, we first assemble in the jury deliberation room for that particular court, and the bailiff tells us the order in which to line up to enter the court and where to sit once we get there. As we march in, the judge, the attorneys, and the litigants are already present and standing, and once we are all in our places, the judge tells us to sit. The courts have a sense of friendly formality and dignity, with the judge in his robes and the counsel in suits.

Then the voir dire ("to speak the truth") process begins with the judge asking us to swear or affirm (the latter for the benefit of us atheists) an oath to tell the truth. He then tells us very briefly what the case is about and how long he expects it to last, and then he and the two attorneys ask each juror a lot of probing questions about our lives (such as where we work and what we do, what our hobbies are, how many children we have and what they do) plus questions about any life experiences or opinions we may have that are relevant to the particular case we are about to judge. For example, in one civil case involving an employee being fired, we were asked if any of us had ever had problems with our employers or been fired or sued. In an assault case, we were asked if we had ever been assaulted. This voir dire process can take quite a while, and the rest of the jury panel listens while each potential juror is questioned. On the basis of the answers, jurors can be dismissed either for cause (because, say, they know someone involved with the case) or for no reason. The latter can be done by the attorneys for either side, but each side has only a limited number of such peremptory challenges at its disposal.

In my very first panel about eight years ago, the two sides made a deal and the case was settled just before the voir dire process even began. The second panel I was called for, about four years ago, was for a murder trial. An exceptionally large panel was called (about 40 people), suggesting that the judge felt it was going to be difficult to select an impartial jury. The voir dire in that case was a very detailed written questionnaire that ran to over twenty pages. I was dismissed from that panel. I had requested to be excused because the judge had said that the case would last at least three weeks, and this was the week before the semester began, which made it awkward for me. The fact that I had stated that I opposed the death penalty may also have contributed to my dismissal.

The third time I was empanelled (which was last November) involved a civil case, a contract dispute involving an employee who had been fired. I did not request to be excused but after the oral voir dire, I was the first person to be dismissed, by the attorney for the employer. No reasons need be given for such peremptory dismissals so I have no idea what reasons he might have had.

It was in the fourth case (also last November) that I ran into a problem because of my knowledge of the legal system and jury nullification. I will write about that in the next post.

POST SCRIPT: Textbook disclaimers

Some readers will recall how in Cobb County, GA the school board inserted stickers saying, "This textbook contains material on evolution. Evolution is a theory, not a fact, regarding the origin of living things. This material should be approached with an open mind, studied carefully, and critically considered" into the biology textbooks. This was ruled unconstitutional.

But why do advocates of such disclaimers limit themselves only to biology? Here are some other textbook disclaimer stickers that can be used. Here's one suitable for a physics textbook:

This textbook asserts that gravity exists. Gravity is a theory, not a fact, regarding a force that cannot be directly seen. This material should be approached with an open mind, studied carefully, and critically considered.

And here's a disclaimer that is suitable for almost any textbook:

This book teaches kids the difference between facts and myths. Because this erodes belief in Santa Claus, the Easter Bunny, and, well, other things, parents should homeschool their kids until the age of 27.

January 14, 2008

Why rush election results?

Ever since election day on November 6, 2007 news reports in Cleveland have been obsessing over the fact that the results were delayed by a couple of hours due to a systems crash that required the backup to kick in.

I am puzzled by this obsession with speed in elections. Why is there such a rush to get election results out so quickly? This drive for speed seems particularly paradoxical in the US where election campaigns are dragged out longer than in any other country and where there is a long time interval between voting day and the newly elected person actually taking office. New office holders typically take over at the beginning of the following year, allowing for a two-month transition period. The new president does not take the oath of office until January 20.

Because election day is fixed by law long in advance, politicians can plan their campaigns years ahead. The campaign for the presidency in 2008 began nearly two years before the election, and even before we have a single primary election, candidates and voters are already experiencing election fatigue. And yet, as soon as the polls close, there is a desperate stampede to get the election results declared as soon as possible. Even though the actual results themselves are usually released within twelve hours of the polls closing, the media cannot wait even for that and set up elaborate exit polling systems so that they can call the results almost immediately after (or even before) the polls close.

The election debacle of 2000 showed what happens when people are in such a rush to declare the winner. But exactly the wrong lesson seems to have been drawn from it. We are now moving toward even more high-tech computerized voting systems that will presumably deliver the actual results even sooner.

I think this is the wrong way to go. We should go back to a completely paper ballot system, where people mark an X in a box next to their favored candidate. Then we should have human beings count and, if necessary, recount the votes. The counters should be given plenty of time, a week, two weeks, even a month, to do a thorough job and the rest of us should simply go about our normal business until they are ready to declare the winners. As I said before, nothing at all hinges on a quick release of the results.

In fact, in our desire for speed, we are decreasing public confidence in the credibility of elections. After all, anyone remotely familiar with computers knows that while their arithmetic powers are far superior to those of humans, they are highly susceptible to hacking and thus to fraud. What is worse, computer fraud is hard to detect and can be almost invisible except to very expert eyes looking closely. I have far more faith in humans counting paper ballots to produce an honest result than I do in computers.

Of course, no system is perfect. Paper ballot elections can be manipulated too. One can have ballot box stuffing, stolen ballot boxes, and counting errors. But to do those things on a scale significant enough to sway the results requires the collusion of a lot of people at a very low level, and such conspiracies are hard to keep secret and fairly easy to detect. Electronic fraud requires just a few sophisticated people working at a high level of expertise.

I am not a Luddite who wants to go back to the old days for misplaced romantic reasons. I think elections are far too important to have anything but the best system. And in this particular case, the best system just happens to be one of the oldest.

POST SCRIPT: Paper ballots in Northeast Ohio?

I wrote the above post a couple of months ago when I was called for jury duty during the week of November 5, 2007 just after election day and was hanging around in the waiting room. I did not post it immediately due to the long 'evolution and the law' series that was running at that time. Hence I was pleasantly surprised to see a front page news headline in the Plain Dealer of December 15, 2007 that said that Ohio's secretary of state is pushing for paper ballots for our area because of all the trouble we have had with the electronic systems. On my return to the US last week from a trip to Sri Lanka, I read that the push for paper ballots in Ohio is gaining ground.

I think this is a good idea. Paper ballots are better, and this might be a chance to demonstrate that fact to the nation.

January 11, 2008

What is science?

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Because of my interest in the history and philosophy of science I am sometimes called upon to answer the question "what is science?" Most people think that the answer should be fairly straightforward. This is because science is such an integral part of our lives that everyone feels that they intuitively know what it is and think that the problem of defining science is purely one of finding the right combination of words that captures their intuitive sense.

But as I said in my previous posting, strictly defining things means having demarcation criteria, which involves developing a set of necessary and sufficient conditions, and this is extremely hard to do even for seemingly simple things like (say) defining what a dog is. So it should not be surprising that it is even harder to do for an abstract idea like science.

But just as a small child is able, based on its experience with pets, to distinguish between a dog and a cat without any need for formal demarcation criteria, so can scientists intuitively sense what is science and what is not science, based on the practice of their profession, without any need for a formal definition. So scientists do not, in the normal course of their work, pay much attention to whether they have a formal definition of science or not. If forced to define science (say for the purpose of writing textbooks) they tend to make up some kind of definition that sort of fits with their experience, but such ad-hoc formulations lack the kind of formal rigor that is strictly required of a philosophically sound demarcation criterion.

The absence of an agreed-upon formal definition of science has not hindered science from progressing rapidly and efficiently. Science marches on, blithely unconcerned about its lack of self-definition. People start worrying about definitions of science mainly in the context of political battles, such as those involving so-called intelligent design creationism (or IDC), because advocates of IDC have been using this lack of a formal definition to try to define science in such a way that their pet idea is included as science, and thus taught in schools as part of the science curriculum and as an alternative to evolution.

Having a clear-cut demarcation criterion that defines science and is accepted by all would settle this question once and for all. But finding this demarcation criterion for science has proven to be remarkably difficult.

To set about trying to find such criteria, we do what we usually do in all such cases: we look at all the knowledge that is commonly accepted as science by everyone, and see if we can find similarities among these areas. For example, I think everyone would agree that the subjects that come under the headings of astronomy, geology, physics, chemistry, and biology, and which are studied by university departments in reputable universities, all come under the heading of science. So any definition of science that excluded any of these areas would be clearly inadequate, just as any definition of 'dog' that excluded a commonly accepted breed would be dismissed as inadequate.

This is the kind of thing we do when trying to define other things, like art (say). Any definition of art that excluded (say) paintings hanging in reputable museums would be considered an inadequate definition.

When we look back at the history of the topics studied by people in those named disciplines and which are commonly accepted as science, two characteristics stand out. The first thing that we realize is that for a theory to be considered scientific it does not have to be true. Newtonian physics is commonly accepted to be scientific, although it is not considered to be universally true anymore. The phlogiston theory of combustion is considered to be scientific though it has long since been overthrown by the oxygen theory. And so on. In fact, since all knowledge is considered to be fallible and liable to change, truth is, in some sense, irrelevant to the question of whether something is scientific or not, because absolute truth cannot be established.

(A caveat: Not all scientists will agree with me on this last point. Some scientists feel that once a theory is shown to be incorrect, it ceases to be part of science, although it remains a part of science history. Some physicists also feel that many of the current theories of (say) sub-atomic particles are unlikely to be ever overthrown and are thus true in some absolute sense. I am not convinced of this. The history of science teaches us that even theories that were considered rock-solid and lasted millennia (such as the geocentric universe) eventually were overthrown.)

But there is a clear pattern that emerges about scientific theories. All the theories that are considered to be science are (1) naturalistic and (2) predictive.

By naturalistic I mean methodological naturalism and not philosophical naturalism. The latter, I argued in an earlier posting where these terms were defined, is irrelevant to science.

By predictive, I mean that all theories that are considered part of science have some explicit mechanism or structure that enables their users to make predictions, to say what one should see if one did some experiment or looked in some place under certain conditions.

Note that these two conditions are just necessary conditions and by themselves are not sufficient. (See the previous posting for what those conditions mean.) As such they can only classify things into "may be science" (if something meets both conditions) or "not science" (if something fails either condition). So these two conditions together do not make up a satisfactory demarcation criterion. For example, the theory that if a football quarterback throws a lot of interceptions his team is likely to lose meets both the naturalistic and predictive conditions, but it is not considered part of science.
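
To make that two-way screening concrete, here is a minimal sketch in Python. The boolean flags are stand-ins of my own invention: deciding whether a real theory actually is naturalistic or predictive is, of course, the hard part.

    def screen_theory(naturalistic, predictive):
        # Apply the two necessary conditions discussed above. Because the
        # conditions are only necessary, passing both cannot certify a
        # theory as science; failing either definitely rules it out.
        if naturalistic and predictive:
            return "may be science"
        return "not science"

    print(screen_theory(True, True))    # Newtonian physics: "may be science"
    print(screen_theory(True, True))    # the quarterback 'theory' also passes, which is
                                        # exactly why the conditions are not sufficient
    print(screen_theory(False, True))   # invokes non-natural causes: "not science"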

But even though we do not have a rigorous demarcation criterion for science, the existence of just necessary conditions still has interesting implications.

January 10, 2008

Necessary and sufficient conditions

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

The problem of finding definitions for things that clearly specify whether an object belongs in that category or not has long been recognized to be a knotty philosophical problem. Ideally what we would need for a good definition is to have both necessary and sufficient conditions, but it is not easy to do so.

A necessary condition is one that must be met if the object is to be considered even eligible for inclusion in the category. If an object meets this condition, then it is possible that it belongs in the category, but not certain. If it does not meet the condition, then we can definitely say that it does not belong. So necessary conditions for something can only classify objects into "maybe belongs" or "definitely does not belong."

For example, let us try to define a dog. We might say that a necessary condition for some object to be considered as a possible dog is that it be a mammal. So if we know that something is a mammal, it might be a dog or it might be another kind of mammal, say a cat. But if something is not a mammal, then we know for sure it is not a dog.

A sufficient condition, on the other hand, acts differently. If an object meets the sufficient condition, then it definitely belongs. If it does not meet the sufficient condition, then it may or may not belong. So the sufficient condition can be used to classify things into "definitely belongs" or "maybe belongs."

So for the dog case, if an animal has papers certified by the American Kennel Club, then we can definitely say it is a dog. But if something does not have such papers it may still be a dog (say a mixed breed) or it may not be a dog (it may be a table).

A satisfactory demarcation criterion would have both necessary and sufficient conditions because only then can we say of any given object that it either definitely belongs or definitely does not belong. Usually these criteria take the form of a set of individually necessary conditions that, taken together, are sufficient; that is, each condition by itself is not sufficient, but if all are met they become sufficient.
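
For readers who like to see the logic spelled out, here is a minimal sketch in Python of how necessary and sufficient conditions partition candidates into the three possible verdicts. The toy predicates are hypothetical, echoing the dog example above.

    def classify(candidate, necessary, sufficient):
        # Fails any necessary condition: definitely not a member.
        if not all(cond(candidate) for cond in necessary):
            return "definitely does not belong"
        # Meets any sufficient condition: definitely a member.
        if any(cond(candidate) for cond in sufficient):
            return "definitely belongs"
        # Otherwise the conditions cannot decide either way.
        return "maybe belongs"

    def is_mammal(x):      # a necessary condition
        return x["mammal"]

    def has_papers(x):     # a sufficient condition
        return x["papers"]

    print(classify({"mammal": False, "papers": False},
                   [is_mammal], [has_papers]))   # definitely does not belong
    print(classify({"mammal": True, "papers": False},
                   [is_mammal], [has_papers]))   # maybe belongs (a mixed breed?)
    print(classify({"mammal": True, "papers": True},
                   [is_mammal], [has_papers]))   # definitely belongs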

It is not easy to find such conditions, even for such a seemingly simple category as dogs, and that is the problem. For the dog, we might try to define it by saying that it is a mammal, with four legs, that barks, etc. But people who are determined to challenge the criteria can find problems. (What exactly defines a mammal? What is the difference between an arm and a leg? What constitutes a bark? We can end up in an infinite regress of definitions.)

This is why philosophers like to say that we make such identifications ("this is a dog, that is a cat") based on an intuitive grasp of the idea of "similarity classes," things that share similarities that may not be rigidly definable. So even a little child can arrive at a pretty good idea of what a dog is without formulating a strict definition, by encountering several dogs and being able to distinguish what separates dog-like qualities from non-dog-like qualities. It is not completely foolproof. Once in a while we may come across a strange looking animal, some exotic breed that baffles us. But most times it is clear. We almost never mistake a cat for a dog, even though they share many characteristics, such as being small four-legged mammals with tails that are domestic pets.

Anyway, back to science: a satisfactory demarcation would require that we be able to find both necessary and sufficient criteria that can be used to define science, and then use those conditions to separate ideas into science and non-science. Do such criteria exist? To answer that question we need to look at the history of science and see what common features are shared by those bodies of knowledge we confidently call science.

This will be discussed in the next posting.

January 09, 2008

Improving the quality of our snap judgments

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

In a previous post, I mentioned that my Race IAT results indicated that I had no automatic preference for black or white people. This surprised me, frankly. Although I am intellectually committed to thinking of people as equal, I am still subjected to the same kinds of images and stereotypes as everyone else in society, so I expected to have at least a small automatic preference for white people. But the section in Malcolm Gladwell's book Blink on 'priming' experiments might give an explanation for the null result.

The priming experiments were done by psychologist John Bargh. What he did was give two randomly selected groups of undergraduate students a small test involving words. The results of the word test itself were not relevant. What was relevant was that the first set of students encountered words like "aggressively", "bold", "rude", "bother", etc. in their test while the second set encountered words like "respect", "considerate", "patiently", "polite", etc.

After they had done the word test, the students were asked to go down the hall to the person running the experiment to get their next assignment. This was the real experiment because it had been arranged to have a confederate blocking the doorway, carrying on an inane and seemingly endless conversation with the experimenter. The experiment was designed to see if the set of students who had been unknowingly 'primed' with aggressive words would take longer to interrupt this conversation than those who had been primed with polite words. Bargh expected to see a difference, but expected that difference to be measured in milliseconds. He said "I mean, these are New Yorkers. They aren't going to just stand there. We thought maybe a few seconds, or a minute at most."

What he found was that the people primed to be rude eventually interrupted after an average of five minutes, but 82% of the people primed to be polite did not interrupt at all, even after ten minutes, which was the cut-off time that had been pre-set for the experiment on the assumption that no one would ever wait that long.

What these and other priming experiments suggest is that the kinds of experiences we have carry their effects subconsciously over to the next events, at least for some time.

This may explain my negative result because for some time now I have been studying the achievement gap between black and white students in the US. The more I looked at it, the more I became convinced that the concept of race is biologically indefensible, that it cannot be the cause of the gap, and that the reasons for the gap have to be looked for elsewhere.

My book on the subject, The Achievement Gap in US Education: Canaries in the Mine, came out in June 2005, so I had been thinking a lot about these ideas around the time I took the test. I was thus probably 'primed' to think that there is no fundamental difference between the races, and hence my null result on the Race IAT.

This ties in with other research I quote in my book that deals with the role teacher expectations play in student achievement. Teacher expectations are an important factor, but a lot of the efforts to improve teacher expectations of low-achieving students have been along the lines of "All children can learn!" sloganeering. But having teachers just say this, or plastering it on school walls, may not help much if they are not convinced of its truth. If people are conscious that they are being primed, then the priming effect disappears.

What is needed for teachers to improve their overall expectations of students is for them to have opportunities to actually see for themselves traditionally underachieving students excelling. If they can have such experiences, then the inevitable snap judgments they make about students, which can have an effect on student performance, may be more equitable than they are now.

I have long been in favor of diversity in our educational environments but my reasons were more social, because I felt that we all benefit from learning with, and from, those whose backgrounds and experiences differ from our own. But it seems that there is an added bonus as well. When we have a broader base of experience on which to base our judgments, our snap judgments tend to be better.

January 08, 2008

Snap judgments and prejudices

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

In an earlier post, I described Malcolm Gladwell's book Blink about the way we instinctively make judgments about people. The way we make snap judgments is by 'thin-slicing' events. We take in a small slice of the phenomena we observe and associate the information in those slices with other measures. People who make good snap judgments are those people who associate the thin-slice information with valid predictors of behavior. People who make poor or prejudicial judgments are those people who associate the thin-slice information with poor predictors.

Think about what you observe about a person immediately as that person walks into your view. Gender, ethnicity, height, weight, color, gait, dress, hair, demeanor, eyes, looks, physique, gestures, voice, the list just goes on. We sweep up all these impressions in a flash. And based on them, whether we want to or not, we make a judgment about the person. Different people will weigh different elements in the mix differently.

If someone comes into my office wearing a suit, my initial impression of the person is different than if she had come in wearing jeans. (If you were mildly surprised by my using the pronoun 'she' towards the end of the last sentence, it is because, like me, you implicitly associate suits with male attire, so that the first part of the sentence made you conjure up a mental image of a man.)

A personal example of snap judgments occurs when I read Physics Today, which I get every month. The obituary notices in the magazine have a standard form. There is a head-shot of the person, with the name as the header, and one or two column inches describing the person.

Almost all of the obituaries are of old white men, not surprising for physicists of the generation that is now passing away. I found myself looking at the photo and immediately identifying whether the person was of English nationality or not. And I was right a surprising number of times. I was not reasoning it through in any conscious way: as soon as the picture came into view, I'd find myself thinking "English" or "not English". I don't know the basis of my judgments but, as I said, I was right surprisingly often.

Gladwell describes a very successful car salesman who over the years has realized that gender, ethnicity, clothes, etc. are not good predictors of whether the person is likely to buy a car or not. Someone who his fellow salespeople might ignore or dismiss because he looks like a rustic farmer, this salesman takes seriously. And because this salesman has been able to shape his intuition to ignore superficial or irrelevant things, his senses are better attuned to pick up on those cues that really matter.

Some of the strongest associations we make are those based on ethnicity, gender, and age. We immediately associate those qualities with generalizations associated with those groupings.

People are not always comfortable talking about their attitudes on race, gender, and other controversial topics. This is why surveys on such topics are unreliable, because people can 'psyche out' the tests, answering in the way they think they are expected to, the 'correct' way, rather than what they actually feel. This is why opinion polls on such matters, or in elections where the candidates are of different races or ethnicities, are hard to rely on.

There is a website, developed by researchers at Harvard University, that recognizes this problem. They have designed a survey instrument that tries to overcome this feature by essentially (as far as I can tell) measuring the time taken to answer their questions. In other words, they are measuring the time taken for you to psyche out the test. Since we have much less control over this, the researchers believe that this survey gives a better result. They claim that you cannot change your score by simply taking the test over and over again and becoming familiar with it.
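
As far as I can tell, the mechanics are roughly as follows. Here is a minimal sketch in Python of the latency idea, with prompts invented purely for illustration; the real test uses carefully counterbalanced categorization trials, not these made-up questions.

    import time

    def timed_answer(prompt):
        # Record how long the respondent takes to answer. The measurement
        # of interest is the response time, not the answer itself, since
        # latency is much harder to consciously control.
        start = time.perf_counter()
        answer = input(prompt + " ")
        return answer, time.perf_counter() - start

    # If one pairing of categories is consistently answered faster than
    # the other, the difference is read as an implicit association.
    for prompt in ["Type g if 'joy' is good, b if bad:",
                   "Type g if 'agony' is good, b if bad:"]:
        answer, elapsed = timed_answer(prompt)
        print("answered %r in %.2f seconds" % (answer, elapsed))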

If you want to check it out for yourself, go to the test site, click on "Demonstration", then on "Go to Demonstration Tests", then on "I wish to proceed". This takes you to a list of Implicit Association Tests (or IAT) and you can choose which kinds of associations you wish to check that you make.

I took the Race IAT because that was what was discussed in Gladwell's book, and it took me less than five minutes to complete. This test looks at the role that race plays in making associations. In particular it looks at whether we instinctively associate black/white people with good/bad qualities.

It turns out that more than 80% of people who have taken this test have pro-white associations, meaning that they tend to associate good qualities with white people and bad qualities with black people. This does not mean that such people are racists. They may well be very opposed to any kind of racist thinking or policies. What these tests are measuring are unconscious associations that we pick up (from the media, the people we know, our community, etc.) without being aware of them, that we have little control over.

Gladwell himself says that the test "always leaves me feeling a bit creepy." He found himself being rated as having a moderate automatic preference for whites although he labels himself half black because his mother is Jamaican.

I can see why this kind of test is unnerving. It may shake our image of ourselves and reveal to us the presence of prejudices that we wish we did not have. But if we are unconsciously making associations of whatever kind, isn't it better to know this so that we can take steps to correct for them if necessary? The successful car salesman became so because he realized that people in his profession made a lot of the unconscious associations that were not valid and had to be rejected. And he used that knowledge in ways that benefited him and his customers.

Although you cannot change your Race IAT scores by simply redoing the test, there are other things that can change your score. When I took the Race IAT, the results indicated that I have no automatic preference for blacks or whites. In a later posting, I will talk about the effects that 'priming' might have on the test results, and how that might have affected my results.

POST SCRIPT: Saying Iraq and Iran

I noticed that President Bush pronounces Iran the same way that I do ("E-rahn") but pronounces Iraq as "Eye-rack" (instead of "E-rahk"), which really grates on me. He is not the only one who does this.

I don't know how the people who live in those two countries pronounce the names but it seems reasonable to me to pronounce the two names similarly except for the last letter. Merriam-Webster's online dictionary, which provides audio as well, agrees with me on this.

January 07, 2008

Snap judgments

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

I just finished reading Malcolm Gladwell's book Blink. It deals with how we all make snap judgments about people and things, sometimes within a couple of seconds or less. Gladwell reports on a whole slew of studies that suggest that we have the ability to 'thin-slice' events, to make major conclusions from just a narrow window of observations.

I first read about this as applied to teaching in an essay by Gladwell that appeared in the New Yorker (May 29, 2000) where he described research by psychologists Nalini Ambady and Robert Rosenthal who found that by showing observers silent videoclips of teachers in action, the observers (who had never met the teachers before) were able to make judgments of teacher effectiveness that correlated strongly with the evaluations of students who had taken an entire course with that teacher. (Source: Half a Minute: Predicting Teacher Evaluations From Thin Slices of Nonverbal Behavior and Physical Attractiveness, Journal of Personality and Social Psychology, 1993, vol. 64, No. 3, 431-441.)

This result is enough to give any teacher the heebie-jeebies. The thought that students have formed stable and robust judgments about you before you have even opened your mouth on the very first day of the very first class is unnerving. It seems so unfair that you are being judged before you can even begin to prove yourself. But, for good or bad, this seems to be supported by other studies, such as those done by Robert Boice in his book Advice for New Faculty Members.

The implication of this is that the cliché "You never get a second chance to make a first impression" is all too true. And what Gladwell's New Yorker article and book seem to suggest is that this kind of thin-slicing is something that all of us do all the time. But not all of us do it well. Some people use thin-slicing to arrive at conclusions that are valid, others to arrive at completely erroneous judgments.

Those who do it well tend to be people who have considerable experience in that particular area. They have distilled that experience into some key variables that they then use to size up the situation at a glance, often without even consciously being aware of how they do it.

Seen in this way, the seemingly uncanny ability of people to identify at a glance who the good and bad teachers are might not seem that surprising. Most people have had lots of experience with many teachers in their lives, and along the way have unconsciously picked up subtle non-verbal cues that they use to correlate with good and bad teaching. They use these markers as predictors and seem to be quite good at it.

I was self-consciously reflecting on this last week when I ran two mock-seminars for visiting high-school seniors as part of "Experience Case" days. The idea was to have a seminar class for these students so that they could see what a seminar would be like if they chose to matriculate here. I found that just by glancing around the room at the assembled students at the beginning, I could tell who was likely to be an active participant in the seminar and who was not.

It was easy for me to make these predictions and I was pretty confident that I would be proven right, and I usually was. But how did I do it? Hard to tell. But I have taught for many years and encountered thousands of students and this wealth of experience undoubtedly played a role in my ability to make snap judgments. If pressed to explain my judgments I might say that it was the way the students sat, their body language, the way they made eye contact, the expression on their faces, and other things like that.

But while I am confident about my ability to predict the students' subsequent behavior in the seminar, I am not nearly as confident in the validity of the reasons I give. And this is consistent with what Gladwell reports in his book. Many of the experts who made good judgments did not know how they arrived at their conclusions or, when they did give reasons, the reasons could not stand up to close scrutiny.

He gives the example of veteran tennis pro and coach Vic Braden. Braden found that when watching tennis players about to make their second serve, he could predict with uncanny accuracy (close to 100%) when they would double fault. This is amazing because he was watching top players (who very rarely double fault) perform on television, and many of the players were people he had never seen play before. But what drove Braden crazy was that he could not say how he made his predictions. He just knew in a flash of insight that they would, and no amount of watching slow-motion replays enabled him to pinpoint the reasons.

But Gladwell points out that we use thin-slicing techniques even in situations where we do not have much experience or expertise, and these judgments can lead us astray. In later postings, I will describe the kinds of situations where snap judgments are likely to lead us to shaky conclusions and where we should be alert.

POST SCRIPT: Charlie Wilson's War

The film with the above name tries to make a comedy out of the role that the US played in creating the Taleban in Afghanistan. Stanley Heller points out that this was no laughing matter for the million Afghans who died as a result of the geostrategic games played by the Soviet Union and the Carter-Reagan governments.

January 04, 2008

Atheism and Agnosticism

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

In an interview, Douglas Adams, author of The Hitchhiker's Guide to the Galaxy, who called himself a "radical atheist," explains why he uses that term (thanks to onegoodmove):

I think I use the term radical rather loosely, just for emphasis. If you describe yourself as "Atheist," some people will say, "Don't you mean 'Agnostic'?" I have to reply that I really do mean Atheist. I really do not believe that there is a god - in fact I am convinced that there is not a god (a subtle difference). I see not a shred of evidence to suggest that there is one. It's easier to say that I am a radical Atheist, just to signal that I really mean it, have thought about it a great deal, and that it's an opinion I hold seriously…

People will then often say "But surely it's better to remain an Agnostic just in case?" This, to me, suggests such a level of silliness and muddle that I usually edge out of the conversation rather than get sucked into it. (If it turns out that I've been wrong all along, and there is in fact a god, and if it further turned out that this kind of legalistic, cross-your-fingers-behind-your-back, Clintonian hair-splitting impressed him, then I think I would choose not to worship him anyway.) . . .

And making the move from Agnosticism to Atheism takes, I think, much more commitment to intellectual effort than most people are ready to put in. (italics in original)

I think Adams is exactly right. When I tell people that I am an atheist, they also tend to suggest that surely I must really mean that I am an agnostic. (See here for an earlier discussion of the distinction between the two terms.) After all, how can I be sure that there is no god? In that purely logical sense they are right, of course. You cannot prove a negative, so there is always the chance not only that a god exists but also that, if you take radical clerics Pat Robertson and Jerry Falwell seriously, he has a petty, spiteful, vengeful, and cruel personality.

When I say that I am an atheist, I am not making that assertion based on logical or evidentiary proofs of non-existence. It is that I have been convinced that the case for no god is far stronger than the case for god. It is the same reasoning that makes me convinced that quantum mechanics is the theory to use for understanding sub-atomic phenomena, or that natural selection is the theory to be preferred for understanding the diversity of life. There is always the possibility that these theories are 'wrong' in some sense and will be superseded by other theories, but those theories will have to have convincing evidence in their favor.

If, on the other hand, I ask myself what evidence there is for the existence of a god, I come up empty. All I have are the assurances of clergy and assertions in certain books. I have no personal experience of it and there is no scientific evidence for it.

Of course, as long time readers of this blog are aware, I used to be quite religious for most of my life, even an ordained lay preacher of the Methodist Church. How could I have switched? It turns out that my experience is remarkably similar to that of Adams, who describes why he switched from Christianity to atheism.

As a teenager I was a committed Christian. It was in my background. I used to work for the school chapel in fact. Then one day when I was about eighteen I was walking down the street when I heard a street evangelist and, dutifully, stopped to listen. As I listened it began to be borne in on me that he was talking complete nonsense, and that I had better have a bit of a think about it.

I've put that a bit glibly. When I say I realized he was talking nonsense, what I mean is this. In the years I'd spent learning History, Physics, Latin, Math, I'd learnt (the hard way) something about standards of argument, standards of proof, standards of logic, etc. In fact we had just been learning how to spot the different types of logical fallacy, and it suddenly became apparent to me that these standards simply didn't seem to apply in religious matters. In religious education we were asked to listen respectfully to arguments which, if they had been put forward in support of a view of, say, why the Corn Laws came to be abolished when they were, would have been laughed at as silly and childish and - in terms of logic and proof - just plain wrong. Why was this?
. . .
I was already familiar with and (I'm afraid) accepting of, the view that you couldn't apply the logic of physics to religion, that they were dealing with different types of 'truth'. (I now think this is baloney, but to continue...) What astonished me, however, was the realization that the arguments in favor of religious ideas were so feeble and silly next to the robust arguments of something as interpretative and opinionated as history. In fact they were embarrassingly childish. They were never subject to the kind of outright challenge which was the normal stock in trade of any other area of intellectual endeavor whatsoever. Why not? Because they wouldn't stand up to it.
. . .
Sometime around my early thirties I stumbled upon evolutionary biology, particularly in the form of Richard Dawkins's books The Selfish Gene and then The Blind Watchmaker and suddenly (on, I think the second reading of The Selfish Gene) it all fell into place. It was a concept of such stunning simplicity, but it gave rise, naturally, to all of the infinite and baffling complexity of life. The awe it inspired in me made the awe that people talk about in respect of religious experience seem, frankly, silly beside it. I'd take the awe of understanding over the awe of ignorance any day.

What Adams is describing is the conversion experience that I described earlier, when suddenly switching your perspective seems to make everything fall into place and make sense.

For me, like Adams, I realized that I was applying completely different standards for religious beliefs than I was for every other aspect of my life. And I could not explain why I should do so. Once I jettisoned the need for that kind of distinction, atheism just naturally emerged as the preferred explanation. Belief in a god required much more explaining away of inconvenient facts than not believing in a god.

POST SCRIPT: The Noah's Ark horror

One of the great triumphs of Judeo-Christian propaganda is getting their followers to overlook the fact that the Biblical Noah story, which many of them believe to be true, would be the worst act of genocide ever, and committed by god to boot. Hellbound Alleee tries to correct this.

January 03, 2008

Precision in language

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Some time ago, a commenter to this blog sent me a private email expressing this view:

Have you ever noticed people say "Do you believe in evolution?" just as you would ask "Do you believe in God?" as if both schools of thought have equal footing? I respect others' religious beliefs as I realize I cannot disprove God just as anyone cannot prove His existence, but given the amount of evidence for evolution, shouldn't we insist on asking "Do you accept evolution?"

It may just be semantics, but I feel that the latter wording carries an implied affirmation just as "Do you accept that 2+2=4?" carries a different meaning than "Do you believe 2+2=4?"

I guess the point I'm trying to make is that by stating something as a belief, it opens the debate to the possibility that something is untrue. While this may be fine for discussions of religion, shouldn't the scientific community be more insistent that a theory well supported by physical evidence, such as evolution, is not up for debate?

It's a good point. To be fair, scientists themselves are partly responsible for this confusion because we also say that we "believe" in this or that scientific theory, and one cannot blame the general public for picking up on that terminology. What is important to realize, though, is that the word 'believe' is being used by scientists in a different sense from the way it is used in religion.

The late and deeply lamented Douglas Adams, author of The Hitchhiker's Guide to the Galaxy, who called himself a "radical atheist" puts it nicely (thanks to onegoodmove):

First of all I do not believe-that-there-is-not-a-god. I don't see what belief has got to do with it. I believe or don't believe my four-year old daughter when she tells me that she didn't make that mess on the floor. I believe in justice and fair play (though I don't know exactly how we achieve them, other than by continually trying against all possible odds of success). I also believe that England should enter the European Monetary Union. I am not remotely enough of an economist to argue the issue vigorously with someone who is, but what little I do know, reinforced with a hefty dollop of gut feeling, strongly suggests to me that it's the right course. I could very easily turn out to be wrong, and I know that. These seem to me to be legitimate uses for the word believe. As a carapace for the protection of irrational notions from legitimate questions, however, I think that the word has a lot of mischief to answer for. So, I do not believe-that-there-is-no-god. I am, however, convinced that there is no god, which is a totally different stance. . .

There is such a thing as the burden of proof, and in the case of god, as in the case of the composition of the moon, this has shifted radically. God used to be the best explanation we'd got, and we've now got vastly better ones. God is no longer an explanation of anything, but has instead become something that would itself need an insurmountable amount of explaining…

Well, in history, even though the understanding of events, of cause and effect, is a matter of interpretation, and even though interpretation is in many ways a matter of opinion, nevertheless those opinions and interpretations are honed to within an inch of their lives in the withering crossfire of argument and counterargument, and those that are still standing are then subjected to a whole new round of challenges of fact and logic from the next generation of historians - and so on. All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others.

When someone says that they believe in god, they mean that they believe something in the absence of, or even counter to, the evidence, and even to reason and logic. When scientists say they believe a particular theory, they mean that they believe that theory because of the evidence and reason and logic, and the more evidence there is, and the better the reasoning behind it, the more strongly they believe it. Scientists use the word 'belief' the way Adams says, as a kind of synonym for 'convinced,' because we know that no scientific theory can be proven with 100% certainty and so we have to accept things even in the face of this remaining doubt. But the word 'believe' definitely does not carry the same meaning in the two contexts.

This can lead to the kind of confusion the commenter warns about, but what can we do about it? One option, as was suggested, is to use different words, with scientists avoiding the word 'believe.' I would have agreed with this some years ago but I am becoming increasingly doubtful that we can control the way that words are used.

For example, there was a time when I used to be on a crusade against the erroneous use of the word 'unique'. The Oxford English Dictionary is pretty clear about what this word means:

  • Of which there is only one; one and no other; single, sole, solitary.
  • That is or forms the only one of its kind; having no like or equal; standing alone in comparison with others, freq. by reason of superior excellence; unequalled, unparalleled, unrivalled.
  • Formed or consisting of one or a single thing
  • A thing of which there is only one example, copy, or specimen; esp., in early use, a coin or medal of this class.
  • A thing, fact, or circumstance which by reason of exceptional or special qualities stands alone and is without equal or parallel in its kind.

It means, in short, one of a kind, so something is either unique or it is not. There are no in-betweens. And yet, you often find people saying things like "quite unique" or "very unique" or "almost unique." I used to try and correct this but have given up. Clearly, people in general think that unique means something like "rare" and I don't know that we can ever change this even if we all become annoying pedants, correcting people all the time, avoided at parties because of our pursuit of linguistic purity.

Some battles, such as the one over the word 'unique', are, I believe, lost for good, and I expect the OED to add the new meaning of 'rare' some time in the near future. It is a pity because we would then be left with no word carrying the unique meaning of 'unique', but there we are. We would have to say something like 'absolutely unique' to convey the meaning once reserved for just 'unique.'

In science too we often use words with precise operational meanings while the same words are used in everyday language with much looser meanings. For example, in physics the word 'velocity' is defined operationally: an object moves along a ruler and, at two points along its motion, you take ruler readings and clock readings, where the clocks are located at the points where the ruler readings are taken and have been previously synchronized. The velocity of the moving object is then the number you get when you take the difference between the two ruler readings and divide it by the difference between the two clock readings.
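
In symbols (the labels x1, x2, t1, t2 are mine, introduced just to compress that description): if the two ruler readings are x1 and x2, and the corresponding synchronized clock readings are t1 and t2, then

    v = (x2 - x1) / (t2 - t1)

which is all that the elaborate-sounding procedure above amounts to.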

Most people (especially sports commentators) have no idea of this precise meaning when they use the word velocity in everyday language, and often use the word synonymously with speed or, even worse, acceleration, although those concepts have different operational meanings. Even students who have taken physics courses find it hard to use the word in its strict operational sense.

Take, for another example, the word 'theory'. By now, as a result of the intelligent design creationism (IDC) controversy, everyone should be aware that the way this word is used by scientists is quite different from its everyday use. In science, a theory is a powerful explanatory construct. Science depends crucially on its theories because they are the things that give it its predictive power. "There is nothing so practical as a good theory," as Kurt Lewin famously said. But in everyday language, the word theory is used as meaning 'not factual,' something that can be false or ignored.

I don't think that we can solve this problem by putting constraints on how words can be used. English is a wonderful language precisely because it grows and evolves and trying to fix the meanings of words too rigidly would perhaps be stultifying. I now think that we need to change our tactics.

I think that once the meanings of words enter mainstream consciousness we will not be successful in trying to restrict their meanings beyond their generally accepted usage. What we can do is to make people aware that all words have varying meanings depending on the context, and that scientific and other academic contexts tend to require very precise meanings in order to minimize ambiguity.

Heidi Cool has a nice entry where she talks about the importance of being aware of when you are using specialized vocabulary, and the need to know your audience when speaking or writing, so that some of the pitfalls arising from the imprecise use of words can be avoided.

We have to realize though that despite our best efforts, we can never be sure that the meaning that we intend to convey by our words is the same as the meaning constructed in the minds of the reader or listener. Words always contain an inherent ambiguity that allows the ideas expressed by them to be interpreted differently.

I used to be surprised when people read the stuff I wrote and got a different meaning than I had intended. No longer. I now realize that there is always some residual ambiguity in words that cannot be overcome. While we can and should strive for maximum precision, we can never be totally unambiguous.

I agree with philosopher Karl Popper when he said, "It is impossible to speak in such a way that you cannot be misunderstood." The best we can hope for is to have some sort of negotiated consensus on the meanings of ideas.

POST SCRIPT: Huckabee and Paul

Alexander Cockburn discusses why Mike Huckabee and Ron Paul are the two most interesting candidates on the Republican side.

January 02, 2008

Atheism and meaning

(As is my custom this time of year, I am taking some time off from writing new posts and instead reposting some old favorites (often edited and updated) for the benefit of those who missed them the first time around or have forgotten them. The POST SCRIPTS will generally be new. New posts will start again on Monday, January 5, 2009. Today's post originally appeared in October 2007.)

People often think that atheists do not have a life affirming philosophy. They have sometimes taken the quote by prominent atheist Richard Dawkins (Scientific American November 1995, p. 85) that "The universe that we observe has precisely the properties we should expect if there is, at bottom, no design, no purpose, no evil, no good, nothing but pitiless indifference" to argue that atheism leads to a philosophy of hopelessness and despair. I have heard several talks by intelligent design creationism advocate Michael Behe and he repeatedly uses the quote to get a laugh at the expense of atheism by saying that Dawkins must be a real downer at parties. But anyone who has seen interviews with Dawkins and read his writings will come away with the contrary impression, that he is a witty, courteous, and engaging man with a mischievous sense of humor. One can well imagine him livening up any party. Dawkins was merely making a factual observation about the nature of the universe, saying that it is futile to try and obtain our meaning and purpose externally from the universe, although we can observe it with awe and wonder. We can, and should, construct meaning and purpose for our lives.

The idea that atheists "suffer" from a "lack of meaning" is a curious preoccupation of religious apologists. For example, a Catholic priest called Jonathan Morris talks with sympathetic interviewers on Fox News and trots out the same old tired and discredited arguments for the existence of god, including the eye. Although he seems to have very little understanding of science himself, he has the audacity to suggest that people like Richard Dawkins don't know science. He also suggests that atheists suffer because they know that "the world makes a whole lot more sense if god does exist". Morris does not, of course, provide any evidence that atheists are more unhappy than believers.

In actual fact, the world makes a lot more sense if you think that god does not exist. As this latest series of posts has repeatedly pointed out, it is religious people who have to repeatedly resort to the MWC ('mysterious ways clause') when confronted with the numerous awkward contradictions that arise in trying to understand a world that has a god in it.

Every atheist I know is relieved that they don't have to try and make sense out of absurd religious doctrines. When atheists do have regrets about the non-existence of god it is usually because it precludes the possibility of meeting one's dead loved ones again in the afterlife or, as philosopher Colin McGinn says, because it means that the people who do real evil and create suffering will likely escape punishment in this world. I admit that it would be nice to think that such people will get their comeuppance in the next. But the evidence is so overwhelmingly against the existence of god and life after death that to cling to it is to indulge in escapism. In the long run it is better not to take refuge in illusions but accept reality and use that knowledge as a spur to work for peace and justice in this world.

Religious people are given a philosophy of life and a sense of meaning packaged in with the religious teaching they imbibe from childhood. Atheism, on the other hand, is not itself a philosophy, any more than disbelief in fairies or unicorns (afairyism? aunicornism?) is a philosophy. But atheism has implications for philosophy.

Since atheists do not have off-the-shelf philosophies and meaning that they can adopt as a package the way that religious people do, they have to create their own. Thus atheists have to do some reflective introspection to construct a philosophy of life, and in that sense, being an atheist requires a certain level of intellectual effort. Most are naturally attracted to versions of humanist and existential philosophies. Ethicist Peter Singer in his book Writings on an Ethical Life (2000) outlines some ideas about what kinds of meanings and moral and ethical values an atheist might adopt. (I hope to write more about these some day.)

That search for meaning in the absence of god can produce wonderful results. In the British TV program The Root of All Evil, the writer Ian McEwan says:

We are the very privileged owners of a brief spark of consciousness and we therefore have to take responsibility for it. We cannot rely, as Christians or Muslims do, on a world elsewhere, a paradise to which one can work towards and maybe make sacrifices, or crucially make sacrifices of other people. We have a marvelous gift, and you see it develop in children, this ability to become aware that other people have minds just like your own and feelings that are just as important as your own. And this gift of empathy seems to me to be the building block of our moral system.

If you have a sacred text that tells you how the world began or what the relationship is between this sky god and you, it does curtail your curiosity. It cuts off a source of wonder. The loveliness of the world in its wondrousness is not apparent to me in Islam or Christianity or the other major religions.

Richard Dawkins adds:

By disclaiming the idea of a next life we can take more excitement in this one. The here and now is not something to be endured before eternal bliss or damnation. The here and now is all we have, an inspiration to make the most of it. So atheism is life affirming in a way religion can never be. Look around you. Nature demands our attention, begs us to explore, to question. Religion can provide only facile, unsatisfying answers. Science, in constantly seeking real explanations, reveals the true majesty of our world in all its complexity. People sometimes say "There must be more than just this world, than just this life." But how much more do you want?

Atheists have one huge advantage over religious people that more than compensates for the fact that they are not handed a philosophy of life by religion. It is that they do not have to deal with all the intractable logical problems that belief in god entails and for which religious believers have to repeatedly invoke the MWC and shut down further investigations. They are free to pursue intellectual inquiry with no restrictions. Unlike religious believers, on the road to increased knowledge they do not have to obey signs that cordon off some areas saying "No admittance by order of religion." They are free to go anywhere and explore anything.

And that is a wonderfully liberating feeling.

POST SCRIPT: Year in review

Here is the second part of Tom Tomorrow's year in review. (The first part is here.)

Opinion polls and statistics

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

As the 2008 election season gets into high gear, we will get inundated with the results of opinion polls. Many of our public policies are strongly influenced by these polls, with politicians paying close attention to them before speaking out.

But while people are inundated with opinion polls, there is still considerable misunderstanding about how they work. Especially during elections, when there are polls practically every day, one often hears people expressing skepticism about polls, saying that they feel the polls are not representative because they, personally, and all the people they know, have never been asked their opinion. Surely, they reason, if so many polls are done, every person should get a shot at answering these surveys? The fact that no pollster has contacted them or their friends and families seems to make the poll results suspect in their eyes, as if the pollsters are asking some highly selective group of people and leaving out 'ordinary' people.

This betrays a misunderstanding of statistics and the sampling size needed to get good results. The so-called "margin of error" quoted by statisticians is found by dividing 100 by the square root of the size of the sample. So if you have a sample of 100, then the margin of error is 10%. If you have a sample size of 625, then the margin of error drops sharply to 4%. If you have a sample size of 1111, the margin of error becomes 3%. To get to 2% requires a sample size of 2500.
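
To see that rule of thumb at work, here is a minimal sketch in Python (the function name is mine; the sample sizes are the ones quoted above, plus the 407-person subsample discussed later in this post):

    import math

    def margin_of_error(n: int) -> float:
        """Approximate 95% margin of error, in percentage points,
        for a simple random sample of n people."""
        return 100 / math.sqrt(n)

    for n in (100, 625, 1111, 2500, 407):
        print(n, round(margin_of_error(n), 1))
    # 100 -> 10.0, 625 -> 4.0, 1111 -> 3.0, 2500 -> 2.0, 407 -> 5.0

Applied to the 54% poll discussed below, a sample of 1000 gives 54 ± 3, which is where the 51% to 57% range comes from.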

Clearly you would like your margin of error to be as small as possible, which argues for large samples, but your sample sizes are limited by the cost and time involved in surveying people, so trade-offs have to be made. Most pollsters use samples of about 1000, and quote margins of error of 3%.

One interesting point is that there are statistical theorems that say that the sample size needed to get a certain margin of error does not depend on the size of the whole population (for large enough populations, say over 100,000). So a sample size of 1000 is sufficient for Cuyahoga County, the state of Ohio, or the whole USA. This explains why any given individual is highly unlikely to be polled. Since the population of the US is close to 300 million, any one of the 1000 people I may personally know has only a 0.00033% probability of being contacted.
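
Checking that arithmetic with a two-line sketch (300 million is the post's round figure for the US population):

    # Chance that one particular person is among the 1000 polled,
    # out of a population of roughly 300 million.
    p = 1000 / 300_000_000
    print(f"{p * 100:.5f}%")  # 0.00033%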

Take, for example, a poll that tells us that 54% of Americans say that "I do not think human beings developed from earlier species." The sample size was 1000, which means a margin of error of about 3%. Statistically, this means that there is a 95% chance that the "true" percentage of people who agree with that statement (i.e., the number we would get if we could actually ask each and every person in the country) lies somewhere between 51% and 57%.

Certain assumptions and precautions go into interpreting these results. The first assumption is that the people polled are a truly random sample of the population. If you randomly contact people, that may not be true. You may, for example, end up with more women than men, or you may have contacted more old people or registered Republicans than are in the general population. If, from census and other data, you know the correct proportions of the various subpopulations in your survey, then this kind of skewing can be adjusted for by changing the weight of the contributions from each subgroup to match the actual population distribution.
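
As a rough illustration of that reweighting step (the group names and numbers here are invented for the example; this is a sketch of the idea, not any pollster's actual procedure):

    # Suppose 65% of respondents were women, but census data say the
    # population is 51% women and 49% men.
    sample_counts = {"men": 350, "women": 650}
    population_shares = {"men": 0.49, "women": 0.51}

    n = sum(sample_counts.values())
    weights = {g: population_shares[g] / (sample_counts[g] / n)
               for g in sample_counts}
    print(weights)  # men weighted up (1.4), women weighted down (~0.78)
    # Each respondent's answer is then counted with their group's weight,
    # so the weighted sample matches the population's makeup.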

With political polls, people sometimes complain that the sample sizes of Democrats and Republicans are not equal and that the poll is thus biased. But that difference is usually because the numbers of people officially registered as belonging to those parties are not equal.

But sometimes pollsters also quote the results for the subpopulations in their samples, and since the subsamples are smaller, the breakdown data have a greater margin of error than the results for the full sample, though you are often not explicitly told this. For example, the above-mentioned survey says that 59% of people who had a high school education or less agreed that "I do not think human beings developed from earlier species." But the number of people in the sample who fit that description is 407, which means that there is a 5% uncertainty in the result for that subgroup, unlike the 3% for the full sample of 1000.

But a more serious source of uncertainty these days is that many people refuse to answer pollsters when they call and it is not possible to adjust for the views of those who refuse. So although the pollsters do have data on the numbers of persons who hang up on them or otherwise refuse to answer, they do not know if such people are more likely or less likely to think that humans developed from earlier species. So they cannot adjust for this factor. They have to simply assume that if those non-responders had answered, their responses would have been in line with those who actually did respond.

Then there may be people who do not answer honestly for whatever reason or are just playing the fool. They are also hard to adjust for. This is why I am somewhat more skeptical of surveys of teens on various topics. It seems to me that teenagers are just the right age to get enjoyment from deliberately answering questions in exotic ways.

These kinds of biases are hard, if not impossible, to compensate for, though in serious research the researchers try to put in extra questions that can help gauge whether people are answering honestly. But opinion polls, which have to be done quickly and cheaply, are not likely to go to all that trouble.

Because of such reasons, polls like the Harris poll issue this disclaimer at the end:

In theory, with probability samples of this size, one could say with 95 percent certainty that the overall results have a sampling error of plus or minus 3 percentage points of what they would be if the entire U.S. adult population had been polled with complete accuracy. Sampling error for subsamples is higher and varies. Unfortunately, there are several other possible sources of error in all polls or surveys that are probably more serious than theoretical calculations of sampling error. They include refusals to be interviewed (nonresponse), question wording and question order, and weighting. It is impossible to quantify the errors that may result from these factors.

For all these reasons, one should take the quoted margins of error, which are based purely on sample size, with a considerable amount of salt.

There is one last point I want to make concerning a popular misconception propagated by news reporters during elections. If an opinion poll says that a sample of 1000 voters has candidate A with 51% support and candidate B with 49%, then since the margin of error (3%) is greater than the percentage of votes separating the candidates (2%), the reporters will often say that the race is a "statistical dead heat," implying that the two candidates have equal chances of winning.

Actually, this is not true. What those numbers imply (using math that I won't give here) is that there is about a 75% chance that candidate A truly does lead candidate B, while candidate B has only a 25% chance of being ahead. So when one candidate is three times as likely as the other to win, it is highly misleading to say that the race is a "dead heat."
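
The calculation behind that 75%/25% split can be reconstructed with a normal approximation (a sketch of the kind of math alluded to above, not necessarily the exact method; the function name is mine):

    import math

    def prob_ahead(share_a: float, n: int) -> float:
        """Probability that candidate A's true support is above 50%,
        given an observed share share_a in a sample of size n."""
        se = math.sqrt(share_a * (1 - share_a) / n)    # standard error
        z = (share_a - 0.5) / se                       # lead, in standard errors
        return 0.5 * (1 + math.erf(z / math.sqrt(2)))  # normal CDF at z

    print(round(prob_ahead(0.51, 1000), 2))  # ~0.74, i.e. roughly 75%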

POST SCRIPT: Inflated value of religion

Many people have an inflated sense of the value of religion that simply falls apart on close examination. For example, Mike Huckabee said the following: "The Ten Commandments form the basis of most of our laws and therefore, you know if you look through them does anybody find anything there that would be all that objectionable? I don't think most people would if they actually read them."

He says this as if it is obviously true. But Ed Brayton shows how absurd this is.

January 01, 2008

The joy of free thinking

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Scarcely a week passes without some interesting new scientific discovery about the nature of life. You open the newspaper and read of observations of light emitted by distant stars from the very edges of the known universe, light that must have been emitted almost at the very beginning, over ten billion years ago. Such research puts us in touch with our own cosmic beginnings.

Just recently there was the discovery of the fossils of a possible new Hobbit-like people who lived on a remote island in the Indonesian archipelago about 18,000 years ago. Then there was the discovery in China of an almost perfectly preserved bowl of noodles that is about 4,000 years old. Discoveries like these shed light on how evolution works and how human society evolved. And then there was the discovery of Tiktaalik, the 375-million-year-old fossil that seems to be an intermediate between sea and land animals.

Similarly, the discoveries that come from studies of DNA tell us a lot about where humans probably originated, how we are all related to one another and how, despite our common origins, the species spread over the Earth and diversified. The fact (according to the September 21, 2005 issue of The Washington Post) that we share nearly 99 percent of our DNA with chimpanzees lends further strong support (not that it needed it) to the evolutionary idea that chimpanzees and humans share a common ancestry. (The approximately one percent difference, according to The Daily Show, is what causes human beings to kill each other!)

I enjoy reading things like this because it reminds me of Charles Darwin's central idea, that we are all linked together in one great biological evolutionary tree, with the various animal species being our cousins, and even seemingly insignificant things like worms and bacteria having common ancestors with us, however distantly in the past that might have happened. Some people may find the idea of being related to a monkey repulsive but I think it is fascinating. The ability of science to investigate, to find new relationships, to explore and conjecture and come up with answers to old questions as well as create new questions to investigate is one of its greatest qualities.

And for me, personally, being an atheist makes that joy completely unalloyed. Shafars (i.e., secularists, humanists, atheists, freethinkers, agnostics, and rationalists), as well as religious people who interpret their religious texts metaphorically and not literally, do not have any concerns when new headlines describing a new scientific discovery are reported in the news. They do not have to worry whether any new fact will contradict a deeply held religious belief. They do not have to worry about whether they need to reconcile the new information with any unchanging religious text.

On the other hand, the same news items that give us fascinating glimpses of scientific discoveries undoubtedly create fresh headaches for religious people, especially those whose beliefs are based on literal readings of religious texts, because each new discovery has to be explained away if it disagrees with some dogma. There are people who devote their entire lives to this kind of apologetics, to ensure that their religious beliefs are made compatible with science. The website Answers in Genesis, for example, is devoted to making Young-Earth creationism (YEC) credible. So it goes to great lengths to show that the earth is less than 10,000 years old, that all the animals could have fitted into Noah's Ark, and that dinosaurs lived at the same time as humans.

One has to admire the tenacity of such people, their willingness to devote enormous amounts of time, sometimes their whole lives, to find support for a belief structure that is continuously under siege from new scientific discoveries. It must feel like trying to hold back the tide. (See this site which heroically tries to fit into a 10,000 year old universe model the astrophysical data received from light emitted by stars that are billions of light years away.)

Of course, scientific discoveries come too thick and fast for even the most determined religious apologists to keep up. So they tend to focus only on explaining away a few questions, the kinds of questions that the lay public is likely to be most concerned about, such as whether dinosaurs existed concurrently with humans, the ages of the universe and the Earth, whether the size of the Ark was sufficient to accommodate all the species, how Noah coped with the logistical problems of feeding all the animals and disposing of the waste, how Adam and Eve's children could multiply without there already being other people around or indulging in incest, and so on.

But the rest of us don't have to worry about any of that stuff and so can enjoy new scientific discoveries without any cares, and follow them wherever they lead. It is nice to know that one can throw wide open the windows of knowledge and let anything blow in, clearing out the cobwebs of old ideas and freshening up the recesses of the mind.

It is a wonderful and exhilarating feeling.

So for this new year, I wish all the readers of this blog the joys of free thinking. May your thoughts not be hobbled by superstitions ancient or modern.

POST SCRIPT: The 50 Most Loathsome People in America

I usually avoid reading all the lists of best, worst, etc. that come out this time of year, but this one is actually very good.

Here's #29 on the list Dinesh D'Souza:

Charges: Wrote a book blaming 9/11 on -- who else? -- liberals, because if we didn't live in a free society, then fundamentalists wouldn't dislike us so. Even conservative nuts blasted D'Souza's empathy for poor al Qaeda. Lately, he's been engaging prominent atheists in debates, revealing himself to be a pseudointellectual ass, and then declaring victory. D'Souza's master plan for attacking atheism is the ridiculous Pascal's wager: Atheists could be wrong, and then they'd go to hell, but if the religious are wrong, then they suffer no ill effect -- aside from living their lives in delusion, of course. And possibly going to someone else's hell for believing the wrong religion. D'Souza seems to think that if he speaks more loudly and rapidly than his opponent, he is winning, but his arguments are weak and idiotic, and he never even attempts to truly debate the existence of any god, which is the ostensible point of these debates. Instead, he likes to compare body counts -- Stalin and Mao killed more than the religious leaders of their time -- rather than actually debate whether there is a God, or for that matter a Jesus. This, of course, is because there is no case to be made.

Exhibit A: "[Atheists] are God-haters... I don't believe in unicorns, but then I haven't written any books called The End of Unicorns, Unicorns are Not Great, or The Unicorn Delusion." But what if everyone you met did believe in unicorns, and not only that, but worshiped a unicorn, held a book about unicorns to be the divine truth of the universe, invoked unicorns in political contexts, and speechified about how non-believers were indecent people waging a war on morality, which could only be predicated on the unquestioning belief in unicorns? Then, maybe, D'Souza would think about writing that book. But of course, that's not really true, because if that was the world we lived in, then Dinesh D'Souza would believe in unicorns.

Sentence: Spanish inquisition.