
Entries for June 2006

June 30, 2006

Algeria and Iraq

I just saw a remarkable film, The Battle of Algiers. Made in black and white (French with English subtitles) in 1966 by the Italian filmmaker Gillo Pontecorvo, the story is about the Algerian struggle for independence and the battle between the rebels and the French colonial powers in the capital city of Algiers in the period 1954-1960.

In order to deal with the increasing violence during this period, the French government sends in elite paratroopers led by Colonel Mathieu. Mathieu sets about ruthlessly identifying the structure of the insurgent network, capturing and torturing members to get information on others, and killing rebels and blowing up buildings in his pursuit of them, even when those buildings contain civilians. And yet, he is not portrayed as a monster. In one great scene where he is giving a press conference, he is asked about his methods of getting information and the allegations of torture. He replies quite frankly that the French people must decide if they want to stay in Algeria or leave, and if they want to halt the violence against them or let it continue. He says that if they want to stay and stop the violence, then they must be prepared to live with the consequences of how that is achieved. It is the French people's choice.

One gets the sense that Mathieu does not torture and kill suspects because he enjoys it. He is simply an amoral man who has been given a job to do, and he will get it done using whatever means he deems necessary. This is the kind of military person that political leaders want. They don't want people who worry about the niceties of human rights and human dignity. But when you train people to deny their normal human feelings, you get the kind of people who carry out the tortures reported at Abu Ghraib and Guantanamo, and who are even surprised when there is an outcry that what they did was wrong.

And Mathieu does succeed in his task, at least in the short run. By his ruthless methods he destroys the rebel network. But all that this buys is some time. After a lull in the violence for a couple of years, a sudden eruption of mass protests results in Algeria becoming independent in 1962. The French win the battle of Algiers but lose the war of independence.

The film gives a remarkably balanced look at the battle, avoiding the temptation to fall into easy clichés about good and evil. It shows the FLN (National Liberation Front) using women and children to carry out its bombing campaign against French civilians living in the French areas of the city. In one memorable sequence, three young Muslim women remove their veils, cut their hair, put on makeup, and dress like French women to enable them to carry bombs in their bags and pass through military checkpoints that surround the Muslim sector of the city (the Casbah). They place those bombs in a dance hall, coffee shop and Air France office, bombs that explode with deadly effect killing scores of civilians who just happen to be there.

In one scene:

Pontecorvo deals with the issue of the killing of innocents by an army vs. such killing by an irregular force. During a press conference, a reporter asks a captured official of the FLN: "Isn’t it a dirty thing to use women’s baskets to carry bombs to kill innocent people?" To which the official answers, "And you? Doesn’t it seem even dirtier to you to drop napalm bombs on defenseless villages with thousands of innocent victims? It would be a lot easier for us if we had planes. Give us your bombers, and we’ll give you our baskets."

The parallels between Algeria and Iraq are striking, so much so that US policy makers and the military are reported to have viewed this film in the hope of learning how to combat the Iraq insurgency.

As in Iraq, the rebels are Muslims, and their objection to being ruled by non-Muslims plays an important role in their motivation to revolt. The French had just suffered a humiliating defeat in Vietnam in 1954, and their military was anxious to rehabilitate its reputation by winning elsewhere. In other words, they had their own 'Vietnam syndrome' to deal with, just like the US.

In the film, you see how the ability of the insurgents to blend in with the urban population enables them to move around and carry out attacks on the French police and citizenry, with women and children playing important roles. We see how the privileged and western lifestyle of the French people in Algeria makes them easy targets for attacks. We see how the attacks on French people and soldiers in Algeria provoke great fury amongst the French citizenry, leading them to condone the torture, killing, and other brutal methods of the French troops.

One major difference between the French involvement in Algeria and US involvement in Iraq is that Algeria had been occupied by the French for 130 years, since 1830. They had been there for so long that they considered it part of France and refused to consider the possibility of independence. The long occupation also resulted in a significant number of French people living in the city of Algiers, thus making them vulnerable targets. In Iraq, there are very few US civilians and almost all of them are in the heavily fortified so-called 'green zone.'

The film takes a balanced look at what an urban guerrilla war looks like, and those who wish to see what might be currently happening in cities like Ramadi and Falluja and Baghdad can get a good idea by seeing this film. The scenes of mass protest by huge crowds of Algerians and their suppression by the occupying French forces are so realistic that the filmmakers put a disclaimer at the beginning stating that no documentary or newsreel footage had been used. And amazingly, this realism was achieved with all novice actors, people who were selected off the streets of Algiers. Only the French Colonel Mathieu was played by a professional actor, but you would not guess it from seeing the film, since the actors give such natural and polished performances, surely a sign of a great director.

For a good analysis of the film and background on its director, see here. The film is available at the Kelvin Smith Library.

POST SCRIPT: Documentary about Rajini Rajasingham-Thiranagama

Today at 10:00 pm WVIZ Channel 25 in Cleveland is showing No More Tears Sister. I wrote about this documentary earlier.

June 29, 2006

The strange game of cricket

I am a lifelong fan of cricket and devoted (many would say wasted) an enormous amount of my youth to the game. As a boy, much of my free time was spent playing it, reading about it, watching it, or listening to it on the radio. I was such a devoted fan that I would set the alarm to wake up in the middle of the night to listen to crackly and indistinct short wave radio commentary of games from distant time zones in England, Australia, and the West Indies. Such was my fanaticism towards the game that I was going to all this trouble to listen to games involving other countries, Sri Lanka achieving international Test playing status only in 1981. And now with the internet, I have been able to renew my interest in the game since the lack of coverage in the US media is no longer a hindrance, so the time wasting has begun anew.

But the game seems to leave Americans cold, just like baseball is hard to appreciate for those from countries that do not play the game. I have become convinced that indoctrination into the joys of slow games like cricket or baseball is something that must occur very early in childhood and is difficult to develop later in life.

To help Americans to understand the game (and thus appreciate the film Lagaan even more), I'll provide a brief description. For more details, see here.

It is easiest for Americans to understand the traditional form of the game by comparing its features with that of baseball, its closest relative.

The classical form of cricket consists of two teams of eleven players each (nine in baseball). Each team has two innings (nine in baseball). An inning ends when there are ten outs (three in baseball). As in baseball, the team that has the most runs at the end of the game wins.

There are two chief differences with baseball that give cricket its quirky features. The first is that, unlike baseball and like football, the game is time-limited. International games, called 'Tests', last for five consecutive days, six hours a day, with breaks for lunch and afternoon tea. Several shorter forms of the game also exist. In Lagaan, for example, the game is limited to three days and one inning for each side.

This time-limited feature means that even after five days, a game can (and often does) end in a no-decision because time has run out and the game ends before either or both teams have had a chance to complete their two innings. The thought that such a result is even possible, let alone not unusual, boggles the mind of Americans, who are used to obtaining a definite result.

The second distinctive feature is that the batsman (batter) who hits the ball is not obliged to run but has the option of staying put, running only if he is sure that he can complete the run safely. Because of this option, it is in theory possible for the opening batsmen to stay out there for five full days, neither scoring runs nor getting out, with the game ending in a no-decision with no runs scored, no outs, and no innings completed by either side. This has never happened; it would be career suicide for the batsmen concerned, the crowds would riot if anyone tried it, and their teammates would be incensed.

The reason this theoretically possible scenario does not play out is the combination of the players' natural competitive desire to win a contest and the time-limited nature of the game. In order to win, one side must score a high enough total of runs quickly enough to leave sufficient time to get the other team out for fewer runs before time runs out. This requires each team to take chances, either to score runs or to get the opponents out. It is this continuous balancing of risk and reward that gives the game most of its appeal and thrills.

It is only when winning is seen as a hopeless task for one side that it becomes a good (and even required) strategy to try and play safe for a no-decision. There have been many memorable games in which one side was hopelessly outscored in runs and had no chance to win but salvaged a no-decision by digging in and not trying to score runs but simply not allowing their opponents to get ten outs. That strategy is considered perfectly appropriate and in such situations those batsmen who successfully adopt this dour defensive strategy are hailed. Weather sometimes plays a role in creating no-decisions by reducing the time available for the game.

Paradoxically, the fact that batsmen are not obliged to run after hitting the ball results in cricket being a high scoring game (since they run only when it seems reasonably safe to do so) with a single innings by one team often lasting for more than a day and a five day game producing typically 1,500 runs or so.

A cricket field requires a large amount of land: it is roughly elliptical and about three to four times the size of a baseball field. Unlike in baseball, where the action takes place in one corner of the field where home plate is, cricket action takes place at the center of the field in a strip about 22 yards long, called the 'pitch.' There is no foul territory. At each end of the pitch are the 'wickets', three vertical sticks about knee high and spanning a width of about 9 inches. There are always two batsmen on the pitch simultaneously, one standing at each end. A bowler (pitcher) runs up and delivers the ball (with a straight-arm delivery, no throwing allowed) from near the wickets at one end to the batsman guarding the wickets at the other end (the striker). If the ball hits the wicket, the batsman is out (said to be 'bowled'), and is replaced by the next one.

If the batsman hits the ball away from a fielder and decides it is safe to run, he and the batsman at the other end (the non striker) run towards the opposite wickets, crossing paths. If the ball is returned to either end, and the wicket there broken before the batsman running towards it reaches it, then that batsman is out ('run out'). If the batsmen run an odd number (1,3,5) of runs safely then the non-striker becomes the striker. If an even number (2,4) of runs, then the same batsman retains the strike. Four runs are scored if the striker hits the ball so that it crosses the boundary of the field, and six runs are scored if it does so without first touching the ground. The boundary is not a high wall (as in baseball) but simply a line marked on the ground, usually by a rope.

In addition to getting out by being bowled or run out, a batsman is also out if a hit ball is caught by a fielder before it hits the ground. These are familiar forms of getting out to baseball fans but there are seven additional (and rarer) ways of getting out in cricket that are too complicated to get into here.

After one bowler has made six consecutive deliveries (called an 'over'), the ball is given to a different bowler, who bowls an over from the opposite end, while the batsmen stay put during the changeover. Thus the non-striker at the end of one over becomes the striker at the beginning of the next. (A fairly recent development has been that of one-day games where each team has just one inning that lasts for a maximum of 50 overs, with the team that scores the most runs winning. This format guarantees a result and aggressive batting, and has proven to be very popular with the general public, though cricket purists look down on it.)
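
For readers who find it easier to see rules spelled out mechanically, here is a minimal Python sketch of the strike-rotation rules just described: an odd number of runs swaps the striker, an even number keeps him on strike, and the striker also changes at the end of each six-ball over because bowling switches ends. The batsmen's names and the sample sequence of runs are invented purely for illustration.

```python
def simulate_strike(batsmen, runs_per_ball):
    """Track which batsman is on strike for each delivery."""
    striker, non_striker = batsmen
    for ball_number, runs in enumerate(runs_per_ball, start=1):
        print(f"Ball {ball_number}: {striker} faces the delivery and scores {runs}")
        if runs % 2 == 1:            # an odd number of runs leaves the batsmen at swapped ends
            striker, non_striker = non_striker, striker
        if ball_number % 6 == 0:     # end of an over: bowling switches ends,
            striker, non_striker = non_striker, striker  # so the other batsman is now on strike
    return striker, non_striker

# Two overs' worth of (made-up) deliveries
simulate_strike(("Batsman A", "Batsman B"), [0, 1, 4, 2, 0, 3, 1, 0, 6, 0, 0, 2])
```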

To play cricket well requires developing a lot of technique (especially for batting) and thus fairly extensive coaching. Simply having raw talent is not sufficient to make it to the top. This is why the villagers in the film Lagaan, having never played the game before, faced such an uphill task in challenging the British team, who presumably had been playing the game since childhood.

I still enjoy watching cricket being played by good teams, although I no longer have the opportunity. There is no question that it is a strange game, and I can understand why newcomers to the game find its appeal highly elusive. It is slow moving and its delights are subtle. It is a game where good technique can give the spectator pleasure, even when displayed by the opponents. A batsman who hits the ball with style and grace, a bowler whose run-up and delivery are fluid and skilful, and a great piece of fielding tend to be appreciated by all spectators, not just those supporting that team.

Cricket is not a game that would have been invented in the modern age. It could only have been conceived in a different, more leisurely era, when people had the time and the money to while away whole days chasing after a ball on a grassy field. The fact that it has survived and even flourished in modern times, with more and more countries taking it seriously, is somewhat amazing.

POST SCRIPT: Class warfare in America

It always amazes me that it is the comedy shows that understand and report on policy best. Catch Stephen Colbert's look at the minimum wage and class warfare.

June 28, 2006

Cricket and the politics of class

Whenever I read the novels of (say) Jane Austen or P. G. Wodehouse, which deal with the life of the British upper classes around the dawn of the twentieth century, one thing that always strikes me is that the characters who inhabit those books never seem to do any work. Beneficiaries of a class-ridden feudal system, they seem to live on inherited income and property that enables them to spend their days not having to worry about actually making a living. There is rarely any description of people's jobs. Work seems to be something that the lower classes do and is vaguely disreputable. Even in Charles Dickens' novels, which often dealt with characters who were desperately poor, the happy ending usually took the form of the hero obtaining wealth by means of an inheritance or otherwise, and then promptly stopping work and hanging around at home, even if he was still young.

These rich people seemed to spend all their time visiting each other's homes for weeks on end, going for walks, riding horses, writing long letters to each other, playing cards and board games, and throwing elaborate parties. In short, these were rich idle people with plenty of time on their hands.

This kind of life was not entirely unproductive. Some of these people used their time to become amateur scientists, using their freedom from having to earn a living to devote their lives to studying scientific questions, often coming up with major discoveries. Charles Darwin's voyage on the Beagle was not a job. He was not paid to go to the Galapagos Islands. His was an unpaid expedition, made possible by his lack of financial need. Nobel-prize winning physicist Lord Rayleigh was also somewhat of an amateur scientist. Even now the idea of the 'gentleman scholar' is quite strong in England, with people developing very detailed expertise in areas of knowledge on their own purely for the love of it and using their own money.

But many members of the idle rich classes were preoccupied with purely recreational activities and only such a class of people could have enjoyed the game of cricket. After all, who else has the time to play or watch a game that goes on for days on end? International games, called 'Tests', last for five consecutive days, six hours a day, with breaks for lunch and afternoon tea. Furthermore, the cricket field requires a large amount of land (an elliptical shape about three to four times the size of a baseball field), with carefully tended turf, and the equipment required to play is expensive, again making it a rich person's game.

Despite the fact that the economics of the game and the time commitment it required made it hard for working people to play it, it gained in popularity, and shortened versions of the game that lasted only one day enabled even working people to play it on Sundays; eventually people even started being paid for playing the game. Such professionals were looked down upon by the amateurs, those who could play without being paid because they were independently wealthy. The latter learned the game at exclusive private schools like Eton and Harrow and later at prestigious universities like Oxford and Cambridge, and the ranks of the English national team tended to be filled with the products of these elite institutions.

But the class system in England is very strong, and even after professional players became part of cricket teams, some distinctions were maintained. In a manner strikingly similar to Jim Crow laws in the US (although not nearly as severe in intent or implementation), until the mid twentieth century the amateurs (who were called 'Gentlemen') and the professionals (who were called 'Players') had separate dressing rooms and entered and left the cricket field by different entrances. Teams were usually captained by an amateur, even if the amateur was the worst player in terms of skill, presumably so that an amateur would not have to take orders from a professional. (Unlike in American sports, where a non-playing coach or manager controls the game and tells players what to do, in cricket it is the playing captain who makes all the decisions on the field, and his orders must be obeyed unquestioningly.) Len Hutton was an exception in that he was a professional who captained the England national team in the 1950s.

This Wikipedia entry shows the state of affairs as late as 1950, in a story about Brian Close who came from a working class background and in 1966 became the first professional (i.e. Player) to captain England after the amateur/professional distinction was finally and formally abolished in the mid-1960s.

At that time class status was still important: professionals, known as Players, were expected to show deference to the amateurs, who were the Gentlemen. Gentlemen did not share changing rooms with Players, and cricket scorecards would differentiate between the two of them, with the names of Gentlemen being prefixed "Mr", the names of the professionals being styled by their surnames and then their initials. This was a time when it was considered necessary to announce on the tannoy errors such as "for F.J. Titmus read Titmus, F.J.".

Close did well for the Players and top-scored with 65. When he reached his fifty, he was congratulated by the Gentlemen's wicket-keeper, Billy Griffith, and in a conversation that now seems innocuous, Griffith congratulated Close by saying, "Well played, Brian", with Close replying, "Thank you, Billy". However, Close had not referred to Griffith as "Mister", and ten days later was called to see Brian Sellers, a former captain and member of the Yorkshire committee, who reprimanded Close for the effrontery.

In societies that practice domination by one class or ethnicity over another, we often forget the important role that seemingly petty indignities play. In order to achieve complete domination over someone, it is not sufficient to just have total legal or even physical control over that person. It is important to also have psychological power and this is done by destroying their sense of dignity and self-worth. The British imperialists understood this well and never missed an opportunity to rub their 'superiority' in to their 'inferiors,' whether it was the people of their colonies or the working classes at home. People who have little or no dignity or sense of personal self-worth are defeated right from the start and thus easy to control.

This is why developing pride in oneself and dignity-building are usually an important part of getting any group to rise up and organize to improve themselves.

The petty practices that arose from such an approach seem bizarre now, and thankfully most of us have not encountered such behavior. But it is sobering to realize that the worst such features were commonplace just fifty years ago and subtle forms still exist.

Next: So what is cricket all about anyway?

June 27, 2006

Lagaan and the Bollywood film tradition

In watching Lagaan, I was reminded of the increasing interest in the west in Bollywood films. For those not familiar with it, 'Bollywood' is a generic term for films produced mostly in the prolific studios of Mumbai (formerly Bombay), an industry that rivals Hollywood in size. But a Bollywood film is not merely defined by where it is produced but also by the nature of its content. (A caveat: I have never been a fan of Bollywood films and my following comments should not be taken too seriously because I have not seen many such films, and the few I did see were many, many years ago when I was an undergraduate in Sri Lanka. It is quite possible that my perceptions are out of date and that these films have changed and improved considerably over time.)

Bollywood films were immensely popular in Sri Lanka despite being in Hindi (a language not spoken there) and with no subtitles. The lack of understanding of the dialogue did not seem to pose a problem for audiences because the strict formula and conventions of these films made the general features of plot transparent and the details immaterial. The formula required many things. The films had to be long, at least three hours. Cinema was the chief form of popular entertainment and poor people wanted their money's worth. The films were also outrageously escapist. The male and female leads were invariably young and good looking and middle or upper class, with lifestyles beyond the reach of most of their audiences. The plot was always boy-meets-girl, boy-and-girl-lose-each-other, boy-and-girl-overcome-problems-and-get-married.

The plot usually involved some misunderstanding that could have been easily resolved if someone had simply spoken up at the right time, but inexplicably did not. A daft woman was often the culprit. Providing light relief and humor is a comic sub-plot, usually involving servants or working class or stupid people, that runs in parallel with the main story line. The villain in the film usually has no redeeming qualities. In fact, the main romantic leads and the villain lack complexity and depth of character, being just types. This makes it easy to root for the heroes and hiss the villain. It is usually the supporting characters who are allowed to display some complexity and development. And a Bollywood film must end happily, with the villain getting his (it is usually a man) just deserts.

And of course, there have to be songs. Lots of songs. Combined with dancing. Lots of dancing. These are combined into big production numbers that break out often for no discernible reason and seem to go on and on and serve no purpose in the story other than to jazz up the proceedings. The song-and-dance scenes usually involve rapid changes of clothes and location. Just within one song, the couple might be singing wearing one outfit in their local town, then the location will shift to London in another outfit, then to the Alps, then Tokyo, and so on. Why? No dramatic reason. Just to give the audience the sheer escapist pleasure of seeing the world's tourist spots. The romantic leads sing and dance in parks and play peek-a-boo behind the trees.

The songs, songwriters, and the singers of the songs (called "playback singers") are the actual stars of a Bollywood film. The singers are not seen and the actors lip-synch to them, transparently so. Little effort is made to match the actor's own voice with that of the playback singer. It is not unusual in a big ensemble song-and-dance scene for several characters who have vastly different speaking voices to 'sing' different lines, while the same playback singer is used for all of them. Verisimilitude is not a high priority for Bollywood film audiences, who seem to subscribe to Duke Ellington's dictum: "If it sounds good, it is good."

Lagaan sticks to the Bollywood formula in many areas but deviates from it in significant ways. It is very long but it is a tribute to the screenwriters and director that I did not feel it dragging at all. There is no comic sub-plot. The song-and-dance numbers are still there but thankfully much fewer (I think there were only six) and they were integrated into the story and advanced the plot. In fact, the last song, a devotional one sung by the villagers during the night before the third and final day of the match when they had their backs to the wall and were asking god to help them, was extraordinarily beautiful and very moving.

One Bollywood tradition that was retained in Lagaan was that the male and female leads must always be good looking and well-groomed and very buff, whatever the circumstances. Here they play two young people in an impoverished village that is baking in the heat, suffering from drought, and the people close to starving. You would expect such people to look somewhat emaciated and haggard, and yet the two leads always look like they have just come from a spa, with hair in place, clean-shaven, clean clothes, and make up done just so. Only the supporting characters sweat and wear torn and shabby clothes.

Another tradition that was retained was that the villain had no redeeming qualities. Here the villain is the British Captain Russell who offered the wager that could not be refused. He always has a sneer on his face and never seems to miss an opportunity to be nasty. In order to do a trivial favor for the raja (prince), he insists that the raja (who is a vegetarian) must eat meat. He kicks and whips a villager who accidentally hurts his horse while shoeing it. He yells at a subordinate because he did not see him salute. And he kills a deer and a rabbit for fun. You can be sure that the director chose those particular animals for their cuteness appeal and to increase the repulsion of the audience. The closeups of those two animals just prior to their death show them looking like Bambi and Thumper. I am surprised that Russell was not shown kicking a puppy.

But all that pales before the unmistakable sign of Russell's bad character, which is that he indulges in unsportsmanlike behavior at cricket! In British tradition, cricket is the ultimate venue for fair play and anyone who does not play by the spirit of the rules, let alone the letter, is undoubtedly a bad person. George Orwell in his essay Raffles and Miss Blandish highlights this peculiarly British belief that someone who is good at cricket and upholds its spirit of sportsmanship is automatically assumed to be a good person, whereas someone who acts unsportingly, let alone (gasp!) cheats at the game, is considered a bounder, a cad, a scoundrel, a blackguard, completely beyond the pale. (Raffles is a fictional character in British literature, a thief who uses his acceptance in high society and invitations to their parties to steal people's valuable possessions from their homes. No one suspects him because he played for the English national cricket team so how could he possibly be a thief?) To do something, anything, that is branded as 'not cricket' is to be accused of violating the spirit of fair play.

Although Lagaan retains some of the Bollywood and cricket clichés, it is a tribute to the film that it is also able to rise above them and tell a good story well.

Next: Cricket and the class system.

POST SCRIPT: So that explains it

New Scientist magazine reports on the results of a new study that finds that "Overconfident people are more likely to wage war but fare worse in the ensuing battles". It also finds that "Those who launched unprovoked attacks also exhibited more narcissism."

The study, done by Dominic Johnson of Princeton University involving 200 volunteers playing war games, was published in the Proceedings of the Royal Society B.

Bertram Malle of the University of Oregon says the study raises worrying questions about real-world political leaders. "Perhaps most disconcerting is that today's leaders are above-average in narcissism," he notes, referring to an analysis of 377 leaders published in King of the Mountain: The nature of political leadership by Arnold Ludwig.

Peter Turchin of the University of Connecticut comments: "One wishes that members of the Bush administration had known about this research before they initiated the invasion of Iraq three years ago. I think it would be fair to say that the general opinion of political scientists is that the Bush administration was overconfident of victory, and that the Iraq war is a debacle."

I think it is naïve to think that things might have been different if the Bush administration had known of this study. I can't recall the source now but there was an earlier study that found that the prime reason that some people are so incompetent is that they are unaware that they are incompetent! They do not think that negative indicators apply to them and thus do not seek to improve themselves. Such a lack of realistic self-awareness seems to be a hallmark of the current leadership.

June 26, 2006

Lagaan

I recently watched the film Lagaan (2001) (Hindi and English with English subtitles) on DVD and was very impressed. Although the film is very long (3 hours, 45 minutes!) it did not drag at all which, for me, puts its director (Ashutosh Gowarikar) in the same class as David Lean (Lawrence of Arabia, Bridge on the River Kwai) as one of those rare filmmakers who can make me overcome my feeling that films should not exceed two hours, and preferably should be 90 minutes.

Lagaan takes place in a remote village region in India in 1893 during British colonial rule. The area has been hit by a drought for several years and the impoverished villagers are unable to pay the tax ('lagaan') to their British military rulers.

In seeking relief from the tax, some of the villagers try to ask for a temporary amnesty, but run afoul of the local British military head Captain Russell who, in a fit of pique because of a prior run-in with one of the villagers (the hero of the film) actually doubles the tax instead. When the appalled villagers protest, Russell raises the stakes even more. He says that he will now triple the tax but offers them the following wager: he will exempt the village from any tax at all for three years if the villagers can field a cricket team that beats the team comprised of the British military officers. Since the British officers grew up playing the game and even in India play cricket all the time, while the villagers have never even seen the game, the villagers seem doomed. But having no option but to agree to this unbalanced wager, the villagers set about trying to learn to play cricket within the three months allocated to them, and this preparation and the actual climactic game forms the main storyline of the film.

The villagers who form the cricket team are made up of Hindus, Muslims, Sikhs, handicapped, and members of the so-called 'untouchable' caste, and they have to learn to overcome their traditional animosities for the sake of the village. This rag-tag group, lacking proper equipment or coaching (except for some guidance from the sympathetic sister of Captain Russell who is appalled by her brother's cruelty), have to resort to unorthodox training methods, such as catching chickens to improve their reflexes and fielding technique.

Clearly the cricket match is a metaphor for the independence struggle waged by India against the British, which resulted in the British being forced to leave in 1947. That struggle was a landmark in national liberation struggles, with people like Jawaharlal Nehru and Mahatma Gandhi successfully managing to forge a highly diverse and large population, riddled with religious, ethnic, language, caste, and class differences, into a cohesive force against a common enemy. Unfortunately, that unity was short-lived, with ongoing Hindu-Muslim clashes, the partitioning into India and Pakistan, the Kashmir area still under dispute, Sikh dissatisfaction, the isolation of the so-called 'untouchable' caste, and so on. But they managed to work together enough during crucial periods to make continuing British control impossible. Like the village cricket team being forced to learn how to play the game of their oppressor, the Indian independence leaders had to learn the 'game' of British politics and public opinion in order to advance their goals.

Cricket as a metaphor for the anti-imperialist and anti-colonial struggle against the British is extended when we realize that the demise of the British Empire after World War II correlated with the decline in the dominance of their cricket. Now India and Pakistan are dominant cricket nations, regularly beating England in international contests (called 'Test' matches), and two current players who are easily among the best batsmen of all time (Sachin Tendulkar of India and Brian Lara of West Indies) come from former British colonies. Sri Lanka also fields competitive international teams. While in Lagaan the villagers were totally ignorant of the game and amused by the Englishmen's passion for what they considered a childish pastime, nowadays the Indian subcontinent has arguably the most enthusiastic cricket fans in the world and there is probably no corner, however remote, where children are not enthusiastically playing it.

You don't really need to understand cricket in order to appreciate this fine film, but in a later posting, I will provide a Cliffs notes version for those who are bewildered by the appeal of this very strange game.

(Note: If you are a member of the Case community, you can borrow the Lagaan DVD from the Kelvin Smith Library.)

Next: Lagaan and the Bollywood film tradition.

POST SCRIPT: The real reason why the attack on Iraq was wrong

Periodically, some defender of the invasion of Iraq will resurrect the idea that Iraq did possess so-called weapons of mass destruction. The latest people to do this are Senator Rick Santorum and Congressman Peter Hoekstra, whose claims have been disavowed even by Defense Department officials and Fox News.

Although their claims have been discredited, it is easy for such discussions to obscure an important and fundamental fact. The immorality and illegality of the invasion of Iraq has nothing to do with whether such weapons existed, so whether they are found or not is not central to the question of whether the attack was justified. The war was wrong because Iraq had neither attacked nor even threatened to attack the US. What the US engaged in was an unjustified war of aggression.

What the lack of discovery of weapons shows is that the Bush administration lied even about their unjustified rationale for the attack.

June 23, 2006

Free will

Belief in a god rests on a foundation that requires one to postulate the existence of a mind/soul that can exist independently of the body (after all, the soul is assumed to live on after the physical death of the body) and freely make decisions. The idea that the brain is all there is, that it creates our consciousness and that the mind/soul are auxiliary products of that overall consciousness, strikes at the very root of belief in god.

So what about the role of free will? Where does that fit in with this? If the mind is an entity that exists independently of the brain and which can influence the brain, then one can think of free will as a product of the mind. But is free will compatible with the idea that the brain is all there is?

The idea that we have free will came under attack with the development of materialistic models of the universe. With the success of Newtonian physics in explaining and predicting the motion of celestial and terrestrial objects, and with the rise of a materialistic philosophy of nature (that everything consists of matter in motion under the influence of natural laws), it became inevitable for people to suppose that the mechanical universe was all there is.

According to the Newtonian model, all you needed to be able to predict the future state of an object was (1) exact knowledge of the current state of the object (known as the initial conditions), and (2) the forces of interaction between the object and its environment, because it is these forces, and only these forces, that influence its subsequent behavior. Since there was no reason to think that these two types of information were unknowable in principle, this implied that the future of that object was predetermined. If everything that existed in the universe (including the brain and mind) had this same material basis and consisted of objects in motion, then the logical implication is that the future is predetermined.
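
As a toy illustration of this Newtonian recipe, the little Python sketch below marches an object's state forward step by step from nothing more than its initial conditions and the force acting on it. The one-dimensional setup and the numbers are invented purely for illustration.

```python
def predict_state(position, velocity, acceleration, dt, steps):
    """March a one-dimensional state forward in time with simple Euler steps."""
    for _ in range(steps):
        position += velocity * dt   # the current velocity determines the next position
        velocity += acceleration * dt  # the force (per unit mass) determines the next velocity
    return position, velocity

# A ball thrown straight up at 20 m/s under constant gravity (-9.8 m/s^2):
# its state one second later follows entirely from the initial conditions and the force.
print(predict_state(position=0.0, velocity=20.0, acceleration=-9.8, dt=0.001, steps=1000))
```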

Of course, the mere fact of predetermination did not imply that the future was predictable in practice. Since any object other than a few elementary particles is composed of a vast number of constituent elements such as atoms, no program of prediction can be actually carried out, simply because of the enormous complexity of the calculations involved. Since we are not able to predict the future with 100% accuracy in the absence of perfect information, the belief in an undetermined future for anything but elementary particles can be preserved from actual experimental contradiction.

But at a philosophical level, the fact that predetermination existed in the deterministic Newtonian world pretty much killed the idea of free will and the existence of an independent mind, and hence god. So in order to preserve those concepts, one has to find flaws in either or both of the two underpinnings of the Newtonian system given above.

One approach is to argue that we can never know all the forces acting on an object. This is essentially the idea behind the concept of god (or an intelligent designer, which is the same thing), whose actions do not conform to any natural laws and who hence can intervene in any system in unpredictable ways. There has been no real evidence that such an unknown and unpredictable force exists.

The other approach is to argue that we cannot know, even in principle, what the initial conditions are. This latter view actually has experimental support (at least in some situations) in quantum mechanics and the Heisenberg uncertainty principle, which says that there is an underlying limit, inherent in nature, on the precision with which we can know the initial state of a system. The quantum world is not totally unpredictable, of course. In fact, there exists a very high degree of predictability, but it is a statistical predictability that says we can state with some certainty what will happen on average, while each individual event is unpredictable. A classical analog is the case of tossing a coin. If I toss a coin a million times, I can predict with a very high degree of confidence that the number of heads will be very close to 50%, but I have only a 50-50 chance of guessing the result of any individual toss. And as I said before, almost everything in nature is made up of a vast number of constituent elements, so it is the average motions of all these things that actually matter. This is why the predictions of science tend to be so accurate.
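
The coin-toss analogy is easy to check for oneself. The short Python sketch below (the toss counts are arbitrary) shows that the average over many tosses is highly predictable even though no individual toss is.

```python
import random

def fraction_of_heads(n_tosses):
    """Toss a fair coin n_tosses times and return the fraction that come up heads."""
    return sum(random.random() < 0.5 for _ in range(n_tosses)) / n_tosses

print(fraction_of_heads(10))         # varies a lot from one run to the next
print(fraction_of_heads(1_000_000))  # reliably very close to 0.5
```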

But the fact that there is even this small inherent uncertainty in nature has led some religious scientists to argue that quantum mechanics provides a non-deterministic niche that allows god to act and they have seized on it. For example, Brown University biology professor Ken Miller is a devout Catholic who has been a very strong opponent of the intelligent design movement. In his book Finding Darwin's God he reconciles his belief in god with his belief in the sufficiency of natural selection by invoking the uncertainty principle as the means by which god can act in the world and yet remain undetectable. He doesn't actually suggest a mechanism, he just asserts that quantum mechanics allows a window through which god can act.

So in some sense, the uncertainty principle is playing the role that the pineal gland played for Descartes, providing a point of intersection between the nonmaterial world and the material world.

Those, like Jeffrey Schwartz and William Dembski, who are looking for new ways to preserve their intelligent design idea, have also tried to use the uncertainty principle to create room for it.

Frankly, this is not convincing. Although the uncertainty principle does assert an inherent limit, set by nature, on some kinds of knowledge, the limitation is highly restricted in its operation, significant only for very small objects at very low temperatures, and does not allow for the wide latitude required to believe in the kind of arbitrary intervention of god in the physical world that is favored by religious people. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Last year Dr. Schwartz and two colleagues published a paper on their quantum theory in the Philosophical Transactions of the Royal Society B. They are not the first to try linking quantum mechanics to concepts of consciousness, but such efforts have failed to win over either physicists or neuroscientists, who discount the role that quantum effects would play at the size and temperature of the human brain. In discussions of consciousness, "the only reason people involve quantum mechanics is because of pure mysticism," says Christof Koch, a professor of cognitive and behavioral biology at the California Institute of Technology.

Using the quantum mechanical uncertainty principle to sneak in god into the world is not tenable. Those who know anything about quantum mechanics, even those sympathetic to religion, see this as a futile maneuver, serving only to awe those who are intimidated by quantum mechanics.

Many other scientists have been highly critical of Dr. Schwartz; even some researchers interested in exploring spirituality discount his theory. The Templeton Foundation, a philanthropy devoted to forging links between science and religion, rejected a grant proposal by Dr. Schwartz, says Charles L. Harper Jr., senior vice president of the foundation. A cosmologist by training, Mr. Harper says the proposal was turned down because "it had to do with a lot of hocus-pocus on quantum mechanics."

So that is where things stand. To retain a belief in god and free will and soul requires one to postulate not just one non-material entity (god) interacting with the material world, but to suggest that each one of us also possesses a non-material entity (the soul/mind) that exists independently of us and interacts only with our own material brain (and with no one else's brain) in some unspecified way. The mind-body interaction must have a blocking mechanism that prevents such cross-over since, if one person's mind/soul can interact with another person's brain, that can cause all kinds of problems.

Is this a plausible picture? Again, plausibility is something that each person must judge. For me personally, it just seems far too complicated, whereas assuming that the brain is all there is makes things simple.

In my own case, I had already begun to seriously doubt the existence of god before I even thought about the brain/mind relationship. When I started looking closely at how the brain works, I became convinced that the idea of a mind that has an existence independent of the brain was highly implausible. The dawning realization that the brain is all there is sealed the conviction that there is no god.

POST SCRIPT: Running on empty

Money was hard to borrow in Sri Lanka when I was growing up. So we got used to the idea that we had to live within our means or have to (embarrassingly) borrow from friends and relatives. One of the things that took me a long time to get used to in the US was the ease of credit and that people would go so willingly and easily into debt, even for things like unnecessary luxury goods or taking vacations. I am still not used to that actually, even after all these years here. I cannot imagine borrowing money except for absolute necessities.

As we all know, the saving rate in America is non-existent and even (by some reports) negative, which means that as a whole, the people in the nation are spending more than they earn. We also know that the government is racking up huge budget deficits, and record-breaking debt.

Why is this happening? How long can it continue? Why is everyone seemingly oblivious to this?

Danny Schechter has created a new documentary In Debt We Trust: America before the bubble bursts (coming out in June 2006) in which he talks about how the rise in debt is being deliberately driven by people who make money off increasing indebtedness.

You can read about it and see the trailer here.

June 22, 2006

What the neuroscience community thinks about the mind/brain relationship

The idea that the mind is purely a product of the material in the brain has profound consequences for religious beliefs, which depend on the idea of the mind as an independent controlling force. The very concept of 'faith' implies an act of free will. So the person who believes in a god is pretty much forced to reject the idea that the mind is purely a creation of the brain. As the article Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) says:

Pope John Paul II struck a similar theme in a 1996 address focusing on science, in which he said theories of evolution that "consider the mind as emerging from the forces of living matter, or as a mere epiphenomenon of this matter, are incompatible with the truth about man. Nor are they able to ground the dignity of the person."

As I wrote yesterday, the flagging intelligent design creationism (IDC) movement seems to be hoping for some fresh energy to emerge from the work of psychiatric researcher Dr. Schwartz. Or at the very least they may be hoping that they can persuade the public that the mind does exist independently of the brain. But they are going to have a hard time getting traction for this idea within the neurobiology community. There seems to be a greater degree of unanimity among them about the material basis of the mind than there is among biologists about the sufficiency of natural selection.

Stephen F. Heinemann, president of the Society for Neuroscience and a professor in the molecular-neurobiology lab at the Salk Institute for Biological Studies, in La Jolla, Calif., echoed many scientists' reactions when he said in an e-mail message, "I think the concept of the mind outside the brain is absurd."

But the ability of the neurobiology community to do their work unfettered by religious scrutiny may be coming to an end as increasing numbers of people become aware of the consequences of accepting the idea that the mind is purely a product of the brain. People might reject this idea (and be attracted to the work of Dr. Schwartz), not because they have examined and rejected the scientific evidence in support of it, but because it threatens their religious views. As I discussed in an earlier posting, people who want to preserve a belief system will accept almost any evidence, however slender or dubious, if it seems to provide them with an option of retaining it. As the article says:

Though Dr. Schwartz's theory has not won over many scientists, some neurobiologists worry that this kind of argument might resonate with the general public, for whom the concept of a soul, free will, and God seems to require something beyond the physical brain. "The truly radical and still maturing view in the neuroscience community that the mind is entirely the product of the brain presents the ultimate challenge to nearly all religions," wrote Kenneth S. Kosik, a professor of neuroscience research at the University of California at Santa Barbara, in a letter to the journal Nature in January.
. . .
Dr. Kosik argues that the topic of the mind has the potential to cause much more conflict between scientists and the general public than does the issue of evolution. Many people of faith can easily accept the tenets of Darwinian evolution, but it is much harder for them to swallow the assumption of a mind that arises solely from the brain, he says. That issue he calls a "potential eruption."

When researchers study the nature of consciousness, they find nothing that persuades them that the mind is anything but a product of the brain.

The reigning paradigm among researchers reduces every mental experience to the level of cross talk between neurons in our brains. From the perspective of mainstream science, the electrical and chemical communication among nerve cells gives rise to every thought, whether we are savoring a cup of coffee or contemplating the ineffable.
. . .
Mr. [Christof] Koch [a professor of cognitive and behavioral biology at the California Institute of Technology] collaborated for nearly two decades with the late Francis Crick, the co-discoverer of DNA's structure, to produce a framework for understanding consciousness. The key, he says, is to look for the neural correlates of consciousness - the specific patterns of brain activity that correspond to particular conscious perceptions. Like Crick, Mr. Koch follows a strictly materialist paradigm that nerve interactions are responsible for mental states. In other words, he says, "no matter, never mind."

Crick summed up the materialist theory in The Astonishing Hypothesis: The Scientific Search for the Soul (Scribner, 1994). He described that hypothesis as the idea that "your joys and your sorrows, your memories and your ambitions, your sense of personal identity and free will, are in fact no more than the behavior of a vast assembly of nerve cells and their associated molecules."

What many people may find 'astonishing' about Crick's hypothesis is that among neurobiologists it is anything but astonishing. It is simply taken for granted as the way things are. Is it surprising that religious believers find such a conclusion unsettling?

Next: What does "free will" mean at a microscopic level?

POST SCRIPT: Why invading Iraq was morally and legally wrong

Jacob G. Hornberger, founder and president of The Future of Freedom Foundation, has written a powerful essay that lays out very clearly the case for why the US invasion and occupation of Iraq is morally and legally indefensible, and why it has inevitably led to the atrocities that we are seeing there now, where reports are increasingly emerging of civilians being killed by US forces. Hornberger writes, "I do know one thing: killing Iraqi children and other such “collateral damage” has long been acceptable and even “worth it” to U.S. officials as part of their long-time foreign policy toward Iraq."

The article is well worth reading.

June 21, 2006

IDC gets on board the brain train

An article titled Religion on the Brain (subscription required) in the May 26, 2006 issue of the Chronicle of Higher Education (Volume 52, Issue 38, Page A14) examined what neuroscientists are discovering about religion and the brain. It is a curious article. The author (Richard Monastersky) seems to be trying very hard to find evidence in support of the idea that brain research is pointing to the independent existence of a soul/mind, but it is clear on reading it that he comes up short and that there is no such evidence, only the hopes of a very small minority of scientists.

He reports that what neuroscientists have been doing is studying what happens in the brain when religious people pray or meditate or think about god or have other similar experiences.

At the University of Pennsylvania, Andrew B. Newberg is trying to get at the heart - and mind - of spiritual experiences. Dr. Newberg, an assistant professor of radiology, has been putting nuns and Buddhist meditators into a scanning machine to measure how their brains function during spiritual experiences.

Many traditional forms of brain imaging require a subject to lay down in a claustrophobia-inducing tube inside an extremely loud scanner, a situation not conducive to meditation or prayer, says Dr. Newberg. So he used a method called single-photon-emission computed tomography, or Spect, which can measure how a brain acted prior to the scanning procedure. A radioactive tracer is injected into the subjects while they are meditating or praying, and the active regions of the brain absorb that tracer. Then the subjects enter the scanner, which detects where the tracer has settled.

His studies, although preliminary, suggest that separate areas of the brain became engaged during different forms of religious experience. But both the nuns and the meditators showed heightened activity in their frontal lobes, which are associated in other studies with focused attention.

The experiments cannot determine whether the subjects were actually in the presence of God, says Dr. Newberg. But they do reveal that religious experiences have a reality to the subjects. "There is a biological correlate to them, so there is something that is physiologically happening" in the brain, he says.

The finding that certain parts of the brain get activated during 'spiritual experiences' is not surprising. Neither is the fact that those experiences have a 'reality to the subjects.' All acts of consciousness, even total hallucinations, are believed to originate in the brain and leave a corresponding presence there, and why the researcher ever expected this to demonstrate evidence for god is not made clear in the article.

It is clear that intelligent design creationism (IDC) advocates are concerned about the implications of brain studies for religious beliefs. It seems plausible that as we learn more and more about how the brain works and about consciousness in general, the idea of a mind independent of the brain becomes harder to sustain. Hence IDC advocates are promoting meetings that highlight the work of those few researchers who think they see a role for god within the brain. But these meetings are being held in secret.

Organizers of the conference, called "Research and Progress on Intelligent Design," had hoped to keep its existence out of public view. The university held a well-advertised public debate about ID that same week, but Michael N. Keas, a professor of history and the philosophy of science at Biola who coordinated the private meeting, would not confirm that it was happening when contacted by a reporter, nor would he discuss who was attending.

But one of the people doing this work is not shy about talking about his research.

When the leaders of the intelligent-design movement gathered for a secret conference this month in California, most of the talks focused on their standard concerns: biochemistry, evolution, and the origin of the universe. But they also heard from an ally in the neurosciences, who sees his own field as fertile ground for the future of ID.

Jeffrey M. Schwartz, a research professor of psychiatry at the University of California at Los Angeles, presented a paper titled "Intelligence Is an Irreducible Aspect of Nature" at the conference, held at Biola University, which describes itself as "a global center for Christian thought." Dr. Schwartz argued that his studies of the mind provide support for the idea that consciousness exists in nature, separate from human brains.

Michael Behe, the author of Darwin's Black Box, which suggested five 'irreducibly complex' systems on which the IDC people have long hung their hopes for evidence of god, may be losing his status as the IDC movement's scientific standard bearer. His book came out in 1996 and nothing new has been produced since then. It is clear that you cannot dine forever on that meager fare, especially since evolutionary biologists keep churning out new results all the time. The need for a new poster child is evident, and it seems as if the IDC movement has found one in psychiatrist Schwartz.

Leaders of the intelligent-design movement, though, see clear potential for Dr. Schwartz's message to resonate with the public.

"When I read Jeff's work, I got in touch with him and encouraged him to become part of this ID community," says William A. Dembski, who next month will become a research professor in philosophy at the Southwestern Baptist Theological Seminary, in Texas. "I regard him as a soul mate," says Mr. Dembski.

This may be a sign that the real science-religion battle is shifting away from biological evolution to brain research. This new battle will not be as high profile as the evolution one simply because brain studies are not part of the school curriculum and thus not subject to the policies of local school boards. So the evolution battle will likely continue to dominate the news headlines for some time.

Tomorrow we will see what neurobiologists think of this attempt to find god in their area of study. If the IDC advocates thought that the biologists were a tough foe to convince, they are going to find that the brain research community is even more resistant to their overtures.

POST SCRIPT: War profiteers

One of the underreported stories of the Iraq invasion is the enormous amount of money that is being made by some people because of it. Coming in fall 2006 is a new documentary by Robert Greenwald titled Iraq for Sale: The War Profiteers.

Greenwald's marketing strategy has been to bypass the main distribution networks and put his documentaries out straight to video for a low price. He did this with his earlier productions Outfoxed: Rupert Murdoch's war on journalism (a look at the bias of Fox news), Uncovered: The war on Iraq (which exposed the fraudulent case made for the Iraq invasion), and Walmart: The high cost of low prices.

Look out for the release of Iraq for Sale. You can see the preview here.

June 20, 2006

Religion's last stand-2: The role of Descartes

In the previous posting, I discussed two competing models of the mind/brain relationship.

It seems to me that the first model, where the physical brain is all there is and the mind is simply the creation of the brain, is the most persuasive one since it is the simplest and accepting it involves no further complications. In this model, our bodies are purely material things, with the brain's workings enabling us to think, speak, reason, act, and so forth. The idea of 'free will' is an illusion due to the brain being an enormously complicated system whose processes and end results cannot be predicted. (A good analogy would be classically chaotic systems like the weather. Because of the non-linearity of the equations governing the weather, we cannot predict long-term weather even though the system is deterministic and materialistic.)
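
To make the weather analogy concrete, here is a toy illustration in Python (a sketch of my own, not anything from the chaos literature): the logistic map is a completely deterministic one-line rule, yet two starting values that differ by only one part in ten billion soon lead to entirely different long-term behavior.

```python
# The logistic map: a deterministic rule whose long-term behavior is
# nevertheless unpredictable in practice, because tiny differences in the
# starting value grow exponentially. Purely illustrative; the numbers are
# arbitrary.
def logistic(x, r=4.0):
    return r * x * (1.0 - x)

x1 = 0.2
x2 = 0.2 + 1e-10  # differs from x1 by one part in ten billion

for step in range(1, 61):
    x1, x2 = logistic(x1), logistic(x2)
    if step % 15 == 0:
        print(f"step {step:2d}: difference = {abs(x1 - x2):.3e}")
```

Within a few dozen steps the two trajectories bear no resemblance to each other, even though nothing random ever happened. That, in miniature, is why determinism does not buy predictability.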

The second model, that of an independently existing non-material mind/soul, separate from the brain and directing the brain, immediately raises all kinds of problems, which have long been recognized. The scientist-philosopher Rene Descartes (1596-1650) of "I think, therefore I am" fame was perhaps the first person to formulate this mind-body dualism (or at least he is the person most closely associated with the idea) and it is clear that he felt that it was necessary to adopt this second model if one was to retain a belief in god.

But he realized immediately that it raises the problem of how the non-material mind/soul can interact with the material brain/body to get it to do things. Princess Elizabeth of Bohemia, with whom Descartes had an extended correspondence, was unable to understand Descartes' explanation of this interaction and kept prodding him on this very question. Descartes had no adequate answer for her, even though both clearly wanted to believe in the existence of god and the soul. In the introduction to his translation of Descartes' Meditations and other Metaphysical Writings (which contains extended segments of the Elizabeth-Descartes correspondence), Desmond Clarke writes:

After repeated attempts to answer the question, how is it possible for something which is not physical to interact with something else which, by definition, is physical?, Descartes concedes that he cannot explain how it is possible.

But he tried, using the best scientific knowledge available to him at that time. He argued that the soul's interaction with the body took place in the pineal gland.

As is well known, Descartes chose the pineal gland because it appeared to him to be the only organ in the brain that was not bilaterally duplicated and because he believed, erroneously, that it was uniquely human. . . By localizing the soul's contact with body in the pineal gland, Descartes had raised the question of the relationship of mind to the brain and nervous system. Yet at the same time, by drawing a radical ontological distinction between body as extended and mind as pure thought, Descartes, in search of certitude, had paradoxically created intellectual chaos.

Although Descartes failed in his efforts to convincingly demonstrate the independent existence of the soul, research into the relationship of religious beliefs to the central nervous system of the brain has continued.

Descartes is an interesting character. Much of his scientific work, and even his temperament, seem to indicate a materialistic outlook. But at the same time, he took great pains to try and find proofs of god's existence. One gets the sense that he was a person trying to convince himself of something he did not quite believe in, and that, had he lived in a different time, he might have rejected god with some relief. The article on Descartes in Encyclopaedia Britannica Online (13 June 2006) says:

Even during Descartes's lifetime there were questions about whether he was a Catholic apologist, primarily concerned with supporting Christian doctrine, or an atheist, concerned only with protecting himself with pious sentiments while establishing a deterministic, mechanistic, and materialistic physics.

The article points to a likely reason for the ambiguity of his views: there was, at that time, considerable fear of the power of the Catholic Church, and this may have guided the way he presented his work.

In 1633, just as he was about to publish The World (1664), Descartes learned that the Italian astronomer Galileo Galilei (1564–1642) had been condemned in Rome for publishing the view that the Earth revolves around the Sun. Because this Copernican position is central to his cosmology and physics, Descartes suppressed The World, hoping that eventually the church would retract its condemnation. Although Descartes feared the church, he also hoped that his physics would one day replace that of Aristotle in church doctrine and be taught in Catholic schools.

Descartes definitely comes across as somewhat less than pious, and non-traditional in his religious beliefs.

Descartes himself said that good sense is destroyed when one thinks too much of God. He once told a German protégée, Anna Maria van Schurman (1607–78), who was known as a painter and a poet, that she was wasting her intellect studying Hebrew and theology. He also was perfectly aware of - though he tried to conceal - the atheistic potential of his materialist physics and physiology. Descartes seemed indifferent to the emotional depths of religion. Whereas Pascal trembled when he looked into the infinite universe and perceived the puniness and misery of man, Descartes exulted in the power of human reason to understand the cosmos and to promote happiness, and he rejected the view that human beings are essentially miserable and sinful. He held that it is impertinent to pray to God to change things. Instead, when we cannot change the world, we must change ourselves.

Clearly he was not orthodox in his thinking. Although he tried to believe in god, it was his insistence on applying the materialistic principles of his scientific work to the question of how the mind interacts with the brain that has the potential to create the big problem for religion.

To sum up Descartes' argument: following sound scientific (methodologically naturalistic) principles, he felt that if the mind interacted with the brain, then there had to be (1) some mechanism by which the non-material mind could influence the material brain, and (2) some place where this interaction took place. Although he could not satisfactorily answer the first question, he at least postulated a location for the interaction, the pineal gland. We know now that that is wrong, but the questions he raised are still valid and interesting ones that go to the heart of religion.

Next: What current researchers are finding about the brain and religion.

POST SCRIPT: Documentary on Rajini Rajasingham-Thiranagama

I have written before about the murder of my friend Rajini Rajasingham-Thiranagama, who had been an active and outspoken campaigner for human rights in Sri Lanka. I have learned that a documentary about her life called No More Tears Sister is the opening program in the 2006 PBS series P.O.V.

In the Cleveland area, the program is being shown on Friday, June 30, 2006 at 10:00pm on WVIZ 25. Airing dates vary by location, with some PBS stations showing it as early as June 27. The link above gives program listings for other cities. The synopsis on the website says:

If love is the first inspiration of a social revolutionary, as has sometimes been said, no one better exemplified that idea than Dr. Rajani Thiranagama. Love for her people and her newly independent nation, and empathy for the oppressed of Sri Lanka - including women and the poor - led her to risk her middle-class life to join the struggle for equality and justice for all. Love led her to marry across ethnic and class lines. In the face of a brutal government crackdown on her Tamil people, love led her to help the guerrilla Tamil Tigers, the only force seemingly able to defend the people. When she realized the Tigers were more a murderous gang than a revolutionary force, love led her to break with them, publicly and dangerously. Love then led her from a fulfilling professional life in exile back to her hometown of Jaffna and to civil war, during which her human-rights advocacy made her a target for everyone with a gun. She was killed on September 21, 1989 at the age of 35.

As beautifully portrayed in Canadian filmmaker Helene Klodawsky's "No More Tears Sister," kicking off the 19th season of public television's P.O.V. series, Rajani Thiranagama's life is emblematic of generations of postcolonial leftist revolutionaries whose hopes for a future that combined national sovereignty with progressive ideas of equality and justice have been dashed by civil war - often between religious and ethnic groups, and often between repressive governments and criminal rebel gangs. Speaking out for the first time in the 15 years since Rajani Thiranagama's assassination, those who knew her best talk about the person she was and the sequence of events that led to her murder. Especially moving are the memories of Rajani's older sister, Nirmala Rajasingam, with whom she shared a happy childhood, a political awakening and a lifelong dedication to fighting injustice; and her husband, Dayapala Thiranagama, who was everything a middle-class Tamil family might reject - a Sinhalese radical student from an impoverished rural background. Also included are the recollections of Rajani's younger sisters, Vasuki and Sumathy; her parents; her daughters, Narmada and Sharika; and fellow human-rights activists who came out of hiding to tell her story. The film rounds out its portrayal with rare archival footage, personal photographs and re-enactments in which Rajani is portrayed by daughter Sharika Thiranagama. The film is narrated by Michael Ondaatje, esteemed author of The English Patient and Anil's Ghost.

I knew Rajini well. We were active members of the Student Christian Movement in Sri Lanka when we were both undergraduates at the University of Colombo. It does not surprise me in the least that she threw herself with passion into the struggle for justice. She was brave and spoke the truth, even when it was unpalatable to those in power and with guns, and backed up her words with actions, thus putting her life on the line for her beliefs. Such people are rare. I am proud to have known her.

June 19, 2006

Religion's last stand: The brain

As almost everyone is aware, the science-religion wars have focused largely on the opposition of some Christian groups to the teaching of evolution. The religious objections to Darwin's theory of natural selection have been based on the fact that if the universe and the diversity of life that we see around us could have come about without the guidance of a conscious intelligence like god (even operating under the pseudonym of 'intelligent designer'), then what need would we have for believing in a god?

But while evolution has been the main focus of attention, I see that as more of a preliminary skirmish. The real final battleground for religion involves the brain.

The crucial question for the sustaining of religious beliefs is the relationship of the mind to the brain. Is the mind purely a creature of the brain, and our thoughts and decisions merely the result of the neurons firing in our neuronal networks? If so, the mind is essentially a material thing. We may have ideas and thoughts and a sense of consciousness and free will that seem to be nonmaterial, but that is an illusion. All these things are purely the products of interactions of matter in our brains. In this model, the mind is entirely the product of the physical brain. This premise underlies the articles selected for the website MachinesLikeUs.com.

Or is the mind a separate (and non-material) entity that exists independently of the brain and is indeed superior to it, since it is the agent that can cause the neurons in our brain to fire in certain ways and thus enable the brain to think and feel and make decisions? In this model, the 'mind' is who 'I' really am, and the material body 'I' possess is merely the vehicle through which 'I' am manifested. In this model, the mind is synonymous with the soul.

If we are to preserve the need for god, then it seems that one must adopt the second model, that human beings (at the very least among animals) are not merely machines operating according to physical laws. We need to possess minds that enable us to think and make decisions and tell our bodies how to act. Most importantly, our minds are supposed to have the capacity for free will. After all, what would be the value of an act of 'faith' if the mind were purely driven by mechanical forces in the brain?

It should be immediately obvious why the nature of the mind is a far more disturbing question for religion than evolution is or ever will be. With evolution, the question centers on whether the mechanism of natural selection (and its corollary principles) is sufficient to explain the diversity of life and its changes over time. As such, the debate boils down to weighing the evidence for and against and determining which is more plausible.

But plausibility lies in the eye of the beholder and we have seen in a previous posting how the desire to preserve beliefs one holds dear leads people to adopt intellectual strategies that enable them to do so.

Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46) says that the strategies adopted are: "1. We seek evidence that supports what we believe and do not seek, and avoid or ignore, evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender."

In the discussions about evolution, people who wish to preserve a role for god have plenty of viable options at their disposal. They can point to features that seem to have a low probability of occurring without the intervention of an external, willful, and intelligent guidance (aka god). These are the so-called 'irreducibly complex' systems touted by intelligent design creationism (IDC) advocates. Or they can point to the seeming absence of transitional fossils between species. Or they can point to seemingly miraculous events or spiritual experiences in their lives.

Scientists argue that none of these arguments are valid, that plausible naturalistic explanations exist for all these things, and that the overwhelming evidence supports evolution by natural selection as sufficient to explain things, without any need for any supernatural being.

But in one sense, that argument misses the point. As long as the debate is centered on weighing the merits of competing evidence and arriving at a judgment, van Gelder's point is that it does not matter if the balance of evidence tilts overwhelmingly to one side. People who strongly want to believe in something will take the existence of even the slenderest evidence as sufficient for them. And it seems likely that the evolution debate, seeing as it involves complex systems and long and subtle chains of inferential arguments, will always provide some room to enable believers to retain their beliefs.

But the mind/brain debate is far more dangerous for religion because it involves the weighing of the plausibility of competing concepts, not of evidence. The fundamental question is quite simple and easily understood: Is the brain all there is and the mind subordinate to it, a product of its workings? Or is the mind an independently existing entity with the brain subordinate to it?

This is not a question that scientific data and evidence have much hope of answering in the near future. Eliminating the mind as an independently existing entity has all the problems associated with proving a negative, and is similar to trying to prove that god does not exist.

But since the mind, unlike god, is identified with each individual and is not necessarily directly linked to god, discussing its nature carries less religious baggage, and it can be examined more clinically.

Next: Descartes gets the ball rolling on the mind and the brain.

POST SCRIPT: Choosing god

I came across this story (thanks to onegoodmove) that illustrates the point that I was trying to make on the way people choose what kind of god to believe in. I have no idea if the events actually occurred, though, or if the story has been embellished to make the point.

The subject was philosophy. Nietzsche, a philosopher well known for his dislike of Christianity and famous for his statement that 'god is dead', was the topic. Professor Hagen was lecturing and outside a thunderstorm was raging. It was a good one. Flashes of lightning were followed closely by ominous claps of thunder. Every time the professor would describe one of Nietzsche's anti-Christian views the thunder seemingly echoed his remarks.

At the high point of the lecture a bolt of lightning struck the ground near the classroom, followed by a deafening clap of thunder. The professor, non-plussed, walked to the window, opened it, and started jabbing at the sky with his umbrella. He yelled, "You senile son of a bitch, your aim is getting worse!"

Suffice it to say that some students were offended by his irreverent remark and brought it to the attention of the Department Head. The Department Head in turn took it to the Dean of Humanities who called the professor in for a meeting. The Dean reminded the professor that the students pay a lot of tuition and that he shouldn't unnecessarily insult their beliefs.

"Oh," says the professor, "and what beliefs are those?"

"Well, you know" the Dean says, "most students attending this University are Christians. We can't have you blaspheming during class."

"Surely" says the professor, "the merciful God of Christianity wouldn't throw lightning bolts. It's Zeus who throws lightning bolts."

Later the Dean spoke with the Department Head, and said, "The next time you have a problem with that professor, you handle it, and let him make an ass out of you instead."

June 16, 2006

The desire for belief preservation

In the previous post we saw how human beings are believed to not be natural critical thinkers, preferring instead to believe in the first plausible explanation for anything that comes along, not seeing these initial explanations as merely hypotheses to be evaluated against competing hypotheses.

But one might think that when we are exposed to alternative hypotheses, we might then shift gears into a critical mode. But Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46) argues that what foils this is the human desire for belief preservation.

He quotes seventeenth century philosopher Francis Bacon who said:

The mind of man is far from the nature of a clear and equal glass, wherein the beams of things should reflect according to their true incidence; nay, it is rather like an enchanted glass, full of superstition and imposture, if it be not delivered and reduced.

In other words, van Gelder says, "the mind has intrinsic tendencies toward illusion, distortion, and error." These arise from a combination of being hard-wired in our brains (because of evolution), natural growth of our brains as we grow up in the Earth's environment, and the influence of our societies and cultures. "Yet, whatever their origin, they are universal and ineradicable features of our cognitive machinery, usually operating quite invisibly to corrupt our thinking and contaminate our beliefs."

All these things lead us to have cognitive biases and blind spots that prevent us from seeing things more clearly, and one of the major blind spots is that of belief preservation. van Gelder says that "At root, belief preservation is the tendency to make evidence subservient to belief, rather than the other way around. Put another way, it is the tendency to use evidence to preserve our opinions rather than guide them."

van Gelder says that when we strongly believe something or desire it to be true, we tend to do three things: "1. We seek evidence that supports what we believe and do not seek, and avoid or ignore, evidence that goes against it. . . 2. We rate evidence as good or bad depending on whether it supports or conflicts with our belief. That is, the belief dictates our evaluation of the evidence, rather than our evaluation of the evidence determining what we should believe. . . 3. We stick with our beliefs even in the face of overwhelming contrary evidence as long as we can find at least some support, no matter how slender."

This would explain why (as vividly demonstrated in the popular video A Private Universe) people hold on to their erroneous explanations about the phases of the moon even after they have been formally instructed in school about the correct explanation.

This would also explain the question that started these musings: Why for so long had I not applied the same kinds of questioning to my religious beliefs concerning god, heaven, etc. that I routinely applied to other areas of my life? The answer is that since I grew up in a religious environment and accepted the existence of god as plausible, I did not seek other explanations. Any evidence in favor of belief (the sense of emotional upliftment that sometimes occurs during religious services or private prayer, or some event that could be interpreted to indicate god's action in my life or in the world, or scientific evidence that supported a statement in the Bible) was seized on, while counter-evidence (such as massive death and destruction caused by human or natural events, personal misfortunes or tragedies, or scientific discoveries that contradicted Biblical texts) was either ignored or explained away. It was only after I had abandoned my belief in god's existence that I was able to ask the kinds of questions that I had hitherto avoided.

Did I give up my belief because I could not satisfactorily answer the difficult questions concerning god? Or did I start asking those questions only after I had given up belief in god? In some sense this is a chicken-and-egg problem. Looking back, it is hard to say. Probably it was a little of both. Once I started taking some doubts seriously and started questioning, this probably led to more doubts, more questions, until finally the religious edifice that I had hitherto believed in just collapsed.

In the series of posts dealing with the burden of proof concerning the existence of god, I suggested that if we use the common yardsticks of law or science, then that would require that the burden of proof lies with the person postulating the existence of any entity (whether it be god or a neutrino or whatever), and that in the absence of positive evidence in favor of existence, the default assumption is to assume the non-existence of the entity.

In a comment to one of those postings, Paul Jarc suggested that the burden of proof actually lay with the person trying to convince the other person to change his views. It may be that we are both right. What I was describing was the way that I thought things should be, while Paul was describing the way things are in actual life, due to the tendency of human beings to believe the first thing that sounds right and makes intuitive sense, coupled with the desire to preserve strong beliefs once formed.

van Gelder ends his article with some good advice:

Belief preservation strikes right at the heart of our general processes of rational deliberation. The ideal critical thinker is aware of the phenomenon, actively monitors her thinking to detect its pernicious influence, and deploys compensatory strategies.

Thus, the ideal critical thinker
• puts extra effort into searching for and attending to evidence that contradicts what she currently believes;
• when “weighing up” the arguments for and against, gives some “extra credit” for those arguments that go against her position; and
• cultivates a willingness to change her mind when the evidence starts mounting against her.

Activities like these do not come easily. Indeed, following these strategies often feels quite perverse. However, they are there for self-protection; they can help you protect your own beliefs against your tendency to self-deception, a bias that is your automatic inheritance as a human being. As Richard Feynman said, “The first principle is that you must not fool yourself - and you are the easiest person to fool.”

The practice of science requires us to routinely think this way. But it is not easy to do and even scientists find it hard to give up their cherished theories in the face of contrary evidence. But because scientific practice requires this kind of thinking, this may also be why science is perceived as 'hard' by the general public. Not because of its technical difficulties, but because you are constantly being asked to give up beliefs that seem so naturally true and intuitively obvious.

POST SCRIPT: The people who pay the cost of war

I have nothing to add to this powerful short video, set to the tune of Johnny Cash singing Hurt. Just watch. (Thanks to Jesus' General.)

June 15, 2006

Why religious (and other) ideas are so persistent

When people are asked to explain the phases of the moon, the response given most frequently is that they are caused by the shadow of the Earth falling on the moon. They are not aware that this explanation holds true only for rare cases of eclipses, and not for the everyday phases.

When the people making these responses are asked to consider the alternative (and correct) model in which the phases are caused by one part of the moon being in the shadow thrown by the other part (which can be easily seen by holding up any object to the light and seeing that parts of it are in its own shadow), such people quickly recognize that this alternative self-shadow model is more plausible than the Earth-shadow model.
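
The self-shadow model can even be captured in a couple of lines of arithmetic. Here is a minimal sketch (my own illustration, assuming the sun is far enough away that its rays reach the moon essentially parallel): the lit fraction of the moon's disc depends only on the angle between the sun and the moon as seen from Earth, and the Earth's shadow never enters the calculation.

```python
# Lit fraction of the moon's disc as a function of elongation (the sun-moon
# angle as seen from Earth). Illustrative only; it ignores small corrections
# such as the moon's orbital inclination and varying distance.
from math import cos, radians

def illuminated_fraction(elongation_deg):
    """Approximate fraction of the lunar disc that appears sunlit."""
    return (1.0 - cos(radians(elongation_deg))) / 2.0

for name, angle in [("new moon", 0), ("first quarter", 90), ("full moon", 180)]:
    print(f"{name:13s} (elongation {angle:3d} deg): "
          f"{illuminated_fraction(angle):.2f} of the disc lit")
```

The formula gives zero at new moon, one half at first quarter, and one at full moon, all without any reference to the Earth's shadow.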

So the interesting question is why, although the correct model is not hard to think up, people stick for so long with their initial erroneous model. The answer is that they did not even consider the possibility that the Earth-shadow explanation they believed in was just a hypothesis that ought to be compared with other, alternative, hypotheses to see which was more consistent with evidence. They simply accepted uncritically as true the first hypothesis they encountered and stayed with it. Why is this?

Tim van Gelder, writing in the article Teaching Critical Thinking: Some Lessons from Cognitive Science (College Teaching, Winter 2005, vol. 53, No. 1, p. 41-46), looks into why this kind of critical thinking is rare among people and his article (summarizing the insights gleaned from cognitive science research) sheds some light on my own puzzlement as to why it took me so long to question the implausible aspects of my beliefs in heaven and immortality.

van Gelder points out that critical thinking does not come naturally to people, that it is 'a highly contrived activity' that is hard and has to be deliberately learned and cultivated. He says that "[e]volution does not waste effort making things better than they need to be, and homo sapiens evolved to be just logical enough to survive, while competitors such as Neanderthals and mastodons died out."

But if we are not by nature critical thinkers, what kind of thinkers are we? To answer this question, van Gelder refers to Michael Shermer's 2002 book Why people believe weird things: Pseudoscience, superstition, and other confusions of our time and says:

We like things to make sense, and the kinds of sense we grasp most easily are simple, familiar patterns or narratives. The problem arises when we do not spontaneously (and do not know how to) go on to ask whether an apparent pattern is really there or whether a story is actually true. We tend to be comfortable with the first account that seems right, and we rarely pursue the matter further. Educational theorist David Perkins and colleagues have described this as a “makes-sense epistemology”; in empirical studies, he found that students tend to

act as though the test of truth is that a proposition makes intuitive sense, sounds right, rings true. They see no need to criticize or revise accounts that do make sense - the intuitive feel of fit suffices.

Since, for most of us, the religious 'explanations' of the big questions of life, death, and meaning are the ones we are first exposed to as children, and since they do provide a rudimentary explanatory pattern (even if in a selective and superficial way), we tend to accept them as true and thus do not actively look for, and even avoid, alternative explanations.

But what happens when alternative explanations thrust themselves on us, either in school or elsewhere? Do we then go into critical thinking mode, evaluating the alternatives, weighing the competing evidence and reasoning before forming a considered judgment?

Alas, no. But the reasons for that will be explored tomorrow.

POST SCRIPT: That bad old AntiChrist

I wrote before about the new video game Left Behind: Eternal Forces. Their website has an interesting FAQ page which says:

The storyline in the game begins just after the Rapture has occurred - when all adult Christians, all infants, and many children were instantly swept home to Heaven and off the Earth by God. The remaining population - those who were left behind - are then poised to make a decision at some point. They cannot remain neutral. Their choice is to either join the AntiChrist - which is an imposturous one world government seeking peace for all of mankind, or they may join the Tribulation Force - which seeks to expose the truth and defend themselves against the forces of the AntiChrist.

So the goal of the AntiChrist is to create a one world government seeking peace for all of mankind! What a dastardly plan. So naturally they must be massacred in the name of Jesus to prevent this awful fate from occurring.

For those who might be concerned that this game goes counter to the message of love preached by Jesus in the Bible, Jesus' General thoughtfully provides the relevant text of the inexplicably overlooked Gospel of Left Behind, which provides the justification for the violent philosophy of the game.

Also, don't forget to check out the animation "Don't dis Elisha!" which shows the story of how the prophet Elisha cursed children who teased him, who were then killed by bears sent by god. (Again, thanks to the ever-vigilant General.)

Who knows, the Elisha story could form the basis for another video game, marketed for Christmas 2007 by the same people behind the Left Behind: Eternal Forces game. In the new game, the players could represent bears, and the goal would be to attack and kill as many children as possible.

June 14, 2006

The dubious appeal of immortality

During the time I was a Christian, I took it for granted that immortality was not only a Good Thing, it was the thing that mattered most. The idea that if one believes in Jesus (or in some other way meets the needs of Christian doctrine), one is saved and has eternal life is a central tenet of Christianity. The formulation "For God so loved the world that he gave his only begotten son that whosoever believes in him shall not perish but have eternal life" is something that any Christian can recite. It makes up the famous verse John 3:16 which you will often see written on a bed sheet and draped over railings at big sporting events. (This passage is so familiar to Christians that I was able to type it out accurately after all these years without even looking it up.)

What is surprising is that despite all the emphasis on going to heaven as the main point of living, the Bible contains very few actual descriptions of the place and what people there actually do. Even the good folks at Rapture Ready, who are counting the minutes until the world ends and they get taken up, admit that they don't have much data on this key question. Their page What Heaven Will Be Like is very brief. (Disturbingly, for me personally at least, it says that in heaven the laws of physics do not apply. Why is this information not given to students when they are deciding what to major in? In the unlikely event that I am raptured, all my years of study and work will have been wasted and in heaven I will have to learn a new trade.)

The one really concrete description comes from (where else?) the Book of Revelation, and it says that everyone in heaven will live in a place called New Jerusalem, which consists of a cube of side 1,500 miles. Although large (roughly the size of the moon), it should be easy to visit friends since the Rapture Ready website says that people in heaven will be able to travel instantaneously, presumably because of their ability to circumvent the laws of physics that are such restrictive nuisances for us on Earth.
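
Taking those figures at face value, the size comparison is easy to check. Here is a quick back-of-the-envelope calculation (my own arithmetic, assuming a mean lunar radius of about 1,080 miles):

```python
# Compare the volume of a 1,500-mile cube with that of the moon.
# Rough figures only; the moon's mean radius is taken as roughly 1,080 miles.
from math import pi

side = 1500.0          # miles, side of the New Jerusalem cube
moon_radius = 1080.0   # miles, approximate mean radius of the moon

cube_volume = side ** 3                            # about 3.4 billion cubic miles
moon_volume = (4.0 / 3.0) * pi * moon_radius ** 3  # about 5.3 billion cubic miles

print(f"cube volume: {cube_volume:.2e} cubic miles")
print(f"moon volume: {moon_volume:.2e} cubic miles")
print(f"cube/moon ratio: {cube_volume / moon_volume:.2f}")
```

The cube works out to a bit under two-thirds of the moon's volume, so 'roughly the size of the moon' is a fair description.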

Islam is more detailed than Christianity in its descriptions of heaven. Ibn Warraq writes in Virgins? What virgins? that the Koran gives the following description:

What of the rewards in paradise? The Islamic paradise is described in great sensual detail in the Koran and the Traditions; for instance, Koran sura 56 verses 12-40; sura 55 verses 54-56; sura 76 verses 12-22. I shall quote the celebrated Penguin translation by NJ Dawood of sura 56 verses 12-39: "They shall recline on jewelled couches face to face, and there shall wait on them immortal youths with bowls and ewers and a cup of purest wine (that will neither pain their heads nor take away their reason); with fruits of their own choice and flesh of fowls that they relish. And theirs shall be the dark-eyed houris, chaste as hidden pearls: a guerdon for their deeds... We created the houris and made them virgins, loving companions for those on the right hand. . ."

So basically heaven for Muslims consists of your choice of food and drink and sex, with no negative after-effects. Ibn Warraq's article describes Muslim commentators who go into even greater detail about the sexual pleasures of heaven, in accounts seemingly written exclusively from the male perspective.

I wrote previously that asking questions like where heaven is located and how it is related to life on Earth can make belief complicated because of the scientific problems it creates. First off, how come we cannot detect heaven's existence although we are now able to probe the far reaches of the universe? Is heaven in some parallel universe, with impenetrable barriers? But they cannot be totally impenetrable since people can get there from here. Since most people believe that people in heaven can see and hear us as we go about on Earth, that means that light and sound waves can travel from Earth to heaven, crossing the barrier. So must it be a one-way barrier? How would such a barrier work to prevent two-way transmission? (This is again the kind of question a physicist would ask, because I have trouble accepting that the laws of physics don't apply in heaven.)

But another problem is: what is it about heaven that is supposed to make it so attractive? Most people, even if they have no explicit model to work from, envisage eternal life in heaven as a place where everything is very pleasant and discouraging words are never heard. But surely if everything is perfect, and people in heaven live forever experiencing neither pain nor sorrow, it also has to be dull?

And that is the key problem. I cannot conceive of any way of conceptualizing heaven that is not also mind-numbingly boring. The only way to overcome that is to think that our personalities in heaven also change so that we never get tired of unchanging perfection. ("Wow, this grape is delicious! Wow, so is the next one! And the next one!...") But then people become boring.

For example, suppose you are an avid golfer and your vision of heaven is a place where you can play every day in perfect weather. Does being in heaven mean that you hit perfect shots each time? But if you do and your opponent does too, wouldn't that take the fun out of the game? Golf is trivial, but I cannot think of anything at all that would not get tedious very quickly if one was assured of constant success. Pleasure in life goes along with failure. Take away failure and pain and loss and I am not sure what pleasure means.

The only thing that I personally can see that is good about immortality is that I may learn the answers to some difficult and unanswered questions that may elude me in my lifetime: Is quantum mechanics the ultimate theory or is there a deeper underlying theory? What exactly happens when the quantum wave function collapses in its interaction with the observer? How can one unambiguously draw the quantum/classical system boundary? How does the brain produce consciousness and the appearance of free will? Why do so many people find Julia Roberts attractive?

So basically, my own idea of heaven is to have the equivalent of unlimited high-speed internet access and subscriptions to science journals. But even that would be boring if I had to wait around a long time for Earth-bound scientists to find (if they ever do) the answers to those questions. On the other hand, if people in heaven already know the answers to these questions and told me as soon as I got there (not that there's much chance of that), then there would be nothing to look forward to.

I can think of many, many things that would be wonderful to experience for a very short time but all of them would bore me totally if they went on indefinitely. It reminds me of the time, soon after high school in Sri Lanka, when I had a temporary job working in a chocolate factory. I was told that we could eat all the chocolates we wanted and since I loved chocolate, this sounded like heaven, and everyone envied me. But after a week of eating chocolate, I was sick of it.

[Mr. Boffo cartoon]

I am becoming convinced that we have pleasure on Earth precisely because it is unpredictable and transient, because it is mixed with pain and failure, and because we know that everything, including our lives, will eventually come to an end. We experience happiness and pleasure at moments in time, but for those moments to occur they must be preceded by periods of anticipation, disappointment, and failure. Take those things away and there is no pleasure either.

It amazes me that I never asked any of these questions or thought of any of these things until now. Even during the many years I was religious, I never questioned then what form eternal life would take and whether it is such an unequivocal Good Thing after all. This is surprising because I was always curious about other things and trying to make connections.

Is there something in the way we are taught our religious beliefs that gently steers us away from these questions because they are so problematic? Why had I never probed deeply into what heaven might be like?

In the next posting, I will look at what research in cognitive science says about why we don't ask questions or look for answers to certain questions, even when it might seem obvious that we should do so.

POST SCRIPT: Our tax dollars at work in the DHS

Ray LeMoine writes about his weird experience at the hands of Department of Homeland Security officials on his return to the US after traveling in the Islamic world.

June 13, 2006

Choosing the god we want

The series of postings on the burden of proof in relation to the existence of god (see part 1, part 2, and part 3) produced some very thoughtful comments by readers that explore many facets of the issue, and I would urge those interested to read those comments.

What initiated that series of posts was Laplace's comment that he had no need to hypothesize the existence of god to understand the workings of the universe. I agree with that point, that whether or not one believes in god is a matter of choice and that there is no evidence for the existence of god that is compelling in the way that science requires in support of its hypotheses. In the absence of such compelling positive evidence, I simply proceed on the assumption of non-existence of god.

But the issue of choice is not just between the existence and non-existence of god. Religious people who personally feel that there exists evidence for the existence of god also have to make a choice, except that in their case they have to choose what kind of god to believe in. Believing in a Christian god means rejecting a Jewish or Muslim or Hindu or other vision of god.

But the need for making choices does not end there. Even if one has chosen to believe in a Christian god, one has to make further choices. The fact is that there are many different kinds of god portrayed in the Bible - vengeful, loving, murderous, merciful, just, capricious, cruel, generous, and so on. A god who can order every living thing in the world to be drowned except for one family and two representatives of each species (in Noah's flood) is revealing quite a different attitude to life and death from a god who tells Abraham (Genesis 18:16-33) that he cannot bring himself to destroy the wicked town of Sodom because of the possibility that it might contain even as few as ten righteous persons who did not deserve to die. It is impossible to make the case that there is a single vision of god in the Bible, unless one also asserts that human comprehension is too weak to understand and resolve the different portrayals into one non-contradictory whole.

It is clear that what most religious believers have done is choose what type of god they wish to believe in and what type to reject. In the contemporary political context, some Christians have chosen the gay-lifestyle-hating god, while others have chosen a gay-lifestyle-accepting god, and so on. Whatever choice is made, each person then has to explain away those features that seem to contradict the view of god they have chosen. This partly explains why churches tend to splinter into so many different denominations and why there are so many disagreements about what god expects from people and how god expects people to behave.

If you want to believe in a kind and loving god, you have some stiff challenges to overcome, not limited to the appalling massacre of people in the great flood. For example, take the story of Abraham and Isaac. For those not familiar with this story (Genesis chapters 21 and 22), Abraham and Sarah did not have children for a long time and finally (when Abraham was 100 years old) she gave birth to Isaac. But then god decides to 'test' Abraham and asks him to sacrifice Isaac as a burnt offering. Abraham obeys, making all the preparations for this horrendous sacrifice until at the very last moment, just as he is about to kill the boy, god stops him. God is impressed by Abraham's unquestioning obedience and rewards him.

This story is disturbing on a whole host of levels. What kind of god would ask a parent to kill his child as a test of faith? And what kind of person would be willing to kill his own child to prove his faith? If we knew of anyone today who was planning to sacrifice his child to prove his worthiness to god, would we not feel justified in labeling that person as dangerously hallucinating and do everything we could to stop him, including forcible restraint and even incarceration? So why is Abraham's behavior seen as somehow noble? And why is god given a pass for asking someone to commit murder? Even if one were to assume that god and Abraham were engaged in some monstrous game of chicken, not believing that the sacrifice will be actually carried out but simply playing mind games, waiting for the other to relent first so that the murder is avoided, this episode still does not reflect well on either party.

Or take the tsunami which killed hundreds of thousands of people in South East Asia in December 2004. I moderated a panel of faculty members from the major religions on the question of theodicy (theories to justify the ways of God to people, or understanding why bad things happen to good people). But the very topic of theodicy assumes that the things we think of as bad (such as the deaths of children) are in fact not deliberate acts of god. Why should we think that? How do we know that god did not deliberately kill all these people out of a sense of whimsy or out of callousness or because he was bored or because he likes seeing people suffer?

The answer is that we don't really know the answers to these questions or to the ones raised about Abraham and Isaac because we have no way of knowing the true nature of god even if we believed in god's existence. What most people have done is choose to believe in a god who would not casually murder people. They are not compelled to make such a choice by anything in the Bible.

This illustrates a paradox. Believers in a god will often explain away disturbing facts by arguing that we mere mortals cannot really understand god's ineffable plan, but at the same time argue that they know god's nature. The reality is that people are choosing a god that is congenial to their world-view.

Choice is always involved whether one is a believer or not. While believers choose one vision of god and reject all others, atheists go just one step further and reject all visions of god. It is not such a big step.

POST SCRIPT: Update on net neutrality

As I wrote earlier, the net neutrality amendment was defeated in the House of Representatives. The issue now goes to the Senate, which is where there is the best chance of writing it into law. The excellent website Talking Points Memo is maintaining a list of which way each senator is leaning on this issue for those of you who want to try and exert pressure on your own senator.

For more information on this issue, updates, and contact information to take action, see SaveTheInternet.com.

June 09, 2006

War and Death

I always liked Chandi. He was my cousin's cousin, not a near relative, but his family and my family and the family of cousins in-between have been close since childhood. Sri Lanka is a small country, which made it easy for children to spend a lot of time with one another, and thus one became very close with one's childhood friends. Although Chandi was five years older than I was, and I was closer in age to his younger brother and sister, age gaps among children in Sri Lanka are not as distancing as they seem to be in the US, and Chandi had an easy-going, friendly, warm, and generous nature that made people like him.

On my return from Sri Lanka last year, we stopped for a few days in London and Chandi and his wife Anula (who also happened to be visiting London for a family wedding) came to visit us and we caught up on all that happened to our respective families in the decades since we had last met. Chandi was that same gentle and fun-loving person he had always been. As is usually the case when you meet up with good friends whom you've known for a long time, we just picked up from where we had left off and it was as if no time had elapsed since our previous meeting.

Hence it was a shock to me on the first day of my return from Australia last week to learn that Chandi, Anula, and five of their friends and relatives had been blown up by a powerful landmine while they were all visiting a nature reserve in Sri Lanka. The Sri Lankan government alleges that it was the Tamil Tigers who planted that mine in a national park, hoping to discourage tourism. The Tigers deny responsibility, counter-alleging that it was the work of the government.

This is the kind of killing in wartime where the truth will never be discovered, nor the perpetrators punished, just like the case of Rajini Rajasingham-Thiranagama many years ago. Chandi and Anula will join Rajini as another statistic, a 'casualty of war,' 'collateral damage,' 'innocent bystanders caught in the crossfire,' and all the other soothing phrases that lull us into forgetting that wars kill real people, people with families, and children, and parents, and friends. And in modern wars, an increasing number of casualties are innocent people, just trying to make the best of the one life we have.

Chandi and Anula were the latest victims in the long-running civil war, now experiencing a highly shaky and frequently violated truce, between the Sinhala majority government and Tamil Tiger guerrillas, a bitter irony since in their own personal lives (Chandi being Tamil and Anula being Sinhalese) they had seen through the shallowness of ethnic divisions and knew each other as simply human beings.

But the same holds true for the wars in Iraq, Afghanistan, Somalia, and East Timor. Most people live their lives without thinking of themselves and their families as fulfilling some grand ethnic or religious destiny. They have simple ambitions about creating better lives for themselves and improving their corner of the world. But such innocent people die as part of the schemes and ambitions of so-called leaders and their war-hungry supporters, who make sure that they themselves and their own loved ones are carefully sheltered from the consequences of their actions.

War is a brutal, cruel thing, destroying the lives of people and societies and leaving scars that last generations. I have little patience with those who have never known what it is like to live through war and have no idea what it does to people, and thus find it easy to act as cheerleaders for it, urging others to fight and die as if it were some kind of game, as if it were some clinical strategy exercise rather than seeing it for what it actually is: blood, guts, limbs torn apart, brains scattered, children orphaned and scarred for life, bereaved parents and spouses.

I am not a pacifist. I can see that there can be rare occasions when war may be the only option left. But what appalls me is when decisions to go to war are taken casually, rather than after much careful deliberation and the elimination of all other options, as the absolute last resort that it should be.

I do not wish war on anyone. The only positive thing I can see about anyone experiencing war at close range is that it would so disgust them that they would recoil from it and not wish it on anyone else. The US has been fortunate in not experiencing a protracted and bloody war on its own soil since the civil war. Thus people have been spared the sight of war up close, seeing their friends, family and neighbors killed and maimed and their homes and neighborhoods destroyed. This may also explain why there is such a casual and unconcerned response here to the frequent decisions of the US government to wage war in other countries. Even as deadly violence occurs on a daily basis in Iraq, it is scarcely even a topic of discussion here. It just doesn't seem real, except for the families of the US troops who are killed and injured.

Veteran Australian war correspondent John Pilger describes what war really looks like and how the media sanitizes the carnage of war to make it more acceptable:

In Vietnam, where more than a million people were killed in the American invasion of the 1960s, I once watched three ladders of bombs curve in the sky, falling from B52s flying in formation, unseen above the clouds.

They dropped about 70 tons of explosives that day in what was known as the "long box" pattern, the military term for carpet bombing. Everything inside a "box" was presumed destroyed.

When I reached a village within the "box", the street had been replaced by a crater.

I slipped on the severed shank of a buffalo and fell hard into a ditch filled with pieces of limbs and the intact bodies of children thrown into the air by the blast.

The children's skin had folded back, like parchment, revealing veins and burnt flesh that seeped blood, while the eyes, intact, stared straight ahead. A small leg had been so contorted by the blast that the foot seemed to be growing from a shoulder. I vomited.

I am being purposely graphic. This is what I saw, and often; yet even in that "media war" I never saw images of these grotesque sights on television or in the pages of a newspaper.

The response to the recent killing of al-Zarqawi in Iraq illustrates the callousness to death that has overtaken us. (This report by Patrick Cockburn, one of the best journalists covering Iraq, gives the background on the rise and fall of this "little known Jordanian petty criminal turned Islamic fundamentalist fanatic.") One of the contributors on the website DailyKos wrote this:

CHEERS to finding a really evil needle in a really big haystack. U.S. forces rocked terrorist Abu Musab "Dick" al-Zarqawi's world last night when they tossed a thousand pounds of explosive whupass down his gullet. They found his body in the bedroom. And the kitchen. And the den. And the garage. And the neighbor's apartment. And I think I found an eyebrow in my Cocoa Puffs this morning. My only regret: he didn't know what hit him.

What causes people to respond in such a gleeful and barbaric manner? Even though al-Zarqawi himself is charged with appallingly violent and brutal crimes that were committed with no respect for human life, how can anyone respond to another's death with such frivolousness? The writer is clearly reveling in his ghastly descriptions in a manner that makes me suspect that for him these are just words, that sudden violent death is nothing that he has seen personally or had happen to anyone he knew.

Contrast this with the response of Michael Berg (the father of Nicholas Berg, who was beheaded by al-Zarqawi) on hearing the same news:

Well, my reaction is I'm sorry whenever any human being dies. Zarqawi is a human being. He has a family who are reacting just as my family reacted when Nick was killed, and I feel bad for that.

I feel doubly bad, though, because Zarqawi is also a political figure, and his death will re-ignite yet another wave of revenge, and revenge is something that I do not follow, that I do not ask for, that I do not wish for against anybody. And it can't end the cycle. As long as people use violence to combat violence, we will always have violence.

The incredulous CNN interviewer tries to get a more vindictive and bloodthirsty response by reminding Berg of the brutal way his own son died, and asks him "[A]t some point, one would think, is there a moment when you say, 'I'm glad he's dead, the man who killed my son'?" Michael Berg replies "No. How can a human being be glad that another human being is dead?"

Michael Berg's reaction to the death of his son's murderer will be incomprehensible to many people because we have got so used to thinking that revenge killing is an honorable thing, something to be excused, or desired, even exulted over. The targeted killing of political enemies is now routinely celebrated, and the possibility of capturing them alive is not even considered, dismissed as a sign of pusillanimity. 'Real men,' it is believed, kill the 'bad guys' and 'evildoers' and then laugh and boast about it. Our childish language matches our cartoonish attitude towards grave issues of life and death.

But let's pause and think about this for a minute. If we celebrate and justify summary executions when it is done by our own government, how can we condemn it when it is done against us by suicide bombers? Just as much as the level of our commitment to free speech is truly measured by how willing we are to protect the speech of those whose ideas we despise, so is our humanity measured by how we respond to the deaths of those whose actions we loathe.

Each war has its iconic pictures and the ones that affect me most are not the headshots of the corpses of well known figures like Saddam Hussein's sons (Uday and Qusay) and al-Zarqawi that are splashed in large color photographs across the front pages of newspapers, as if they were trophies. What moves me are the pictures of children affected by war. The picture that I will always remember about Vietnam is the one that shows a crying young child running away with others from the scene of a bombing, her clothes and skin burned from the napalm dropped on her village, smoke from its ruins billowing in the background.

For Iraq, the picture that haunts me is the one that shows a blood-spattered terrified Iraqi child just after US soldiers had killed her parents as they were traveling in their car near a checkpoint. I have seen some truly grisly and stomach-churning pictures of the casualties of the shootings and bombings of the Iraq war, of children and old people, men and women, their dead and mutilated bodies captured in indelible images that have never been seen by most people. But there is something about this picture of a grief-stricken little crying child, cowering under the gun of a heavily armed soldier, her upturned supplicating hands stained with the blood of her parents, that fills me with an almost unbearable sadness.

I will never forget these pictures. They are seared in my memory as symbols of the atrocity of war. Each time I see or remember them I feel sick at the level of brutality to which we have sunk.

War is not a game. It creates monsters while at the same time destroying people and societies in unspeakable ways. The cycle of killing and counter killing, death and retribution, revenge and counter-revenge usually ends up with mostly the innocent dying.

And now it has taken the lives of my friends Chandi and Anula. Ordinary people. Living ordinary lives. Just like you and me.

Saving the internet: The importance of net neutrality

[UPDATE: Read this Democracy Now transcript for clarifications on the net neutrality issue.]

After singing the praises of the internet in the last three posts, it is now time to sound the alarm. There are serious threats underway to undermine the very features of the internet that have made it the democratizing force it has been so far, and these efforts should be resisted strongly. Last night, the House of Representatives voted down (268-152) an amendment that would have placed into law a provision that would ensure something called 'net neutrality.' The issue now goes before the Senate. Founders of the web like Tim Berners-Lee argue that we could be entering a 'dark period' in which a few suppliers would be able to determine what users could do and see on the web.

Here's the issue. Currently, you (the end user) can use any browser you like and go to any site that you want, and the speed and ease with which you can access them are largely determined by the content creators and consumers: i.e., the server at the other end and your own computer. The general features of the connecting medium (whether cable, phone line, or wireless) play a neutral role in this process. Think of the medium as playing the role that roads play in transport. Everyone can use them equally, although each user may use a different kind of vehicle.

But the big telecommunication companies (telcos) that own that connecting medium (AT&T, Verizon, BellSouth) are arguing that since they are the ones who own that infrastructure, they should be able to use that control to generate additional revenue by providing different levels of service (affecting speed and quality) depending on how much people pay. It is as if all roads become toll roads, and how much access you get to them, how quickly you can get on them, and how fast you can go on them are all determined by how much you pay the road owners.
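To make the toll-road analogy concrete, here is a toy sketch in Python. It is purely hypothetical: the site names and the notion of a "paid list" are my own illustration, not a description of any actual carrier's system. It simply contrasts neutral delivery with paid prioritization.

from collections import deque

def serve(requests, paid_sites=None):
    """Return the order in which requests are delivered.
    requests is a list of (site, arrival_order) pairs; paid_sites is the
    set of sites that have paid the carrier for priority (None = neutral)."""
    if not paid_sites:
        # a neutral network: first come, first served
        return [site for site, _ in requests]
    # paid prioritization: the carrier drains paid traffic before anything else
    paid = deque(r for r in requests if r[0] in paid_sites)
    unpaid = deque(r for r in requests if r[0] not in paid_sites)
    order = []
    while paid or unpaid:
        queue = paid if paid else unpaid
        order.append(queue.popleft()[0])
    return order

traffic = [("google.com", 1), ("smallblog.org", 2), ("yahoo.com", 3)]
print(serve(traffic))                            # neutral: arrival order preserved
print(serve(traffic, paid_sites={"yahoo.com"}))  # yahoo.com now jumps the queue

Under the neutral rule the small blog is treated exactly like the giants; under the paid rule it waits at the back of the line no matter when its readers asked for it.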

As the Washington Post reported on December 1, 2005:

A senior telecommunications executive said yesterday that Internet service providers should be allowed to strike deals to give certain Web sites or services priority in reaching computer users, a controversial system that would significantly change how the Internet operates.

William L. Smith, chief technology officer for Atlanta-based BellSouth Corp., told reporters and analysts that an Internet service provider such as his firm should be able, for example, to charge Yahoo Inc. for the opportunity to have its search site load faster than that of Google Inc.

Or, Smith said, his company should be allowed to charge a rival voice-over-Internet firm so that its service can operate with the same quality as BellSouth's offering.

This has huge ramifications for the internet, as the website SaveTheInternet.com points out. Here's a sample of the threats (more complete list here):

Google users - Another search engine could pay dominant Internet providers like AT&T to guarantee the competing search engine opens faster than Google on your computer.
iPod listeners - A company like Comcast could slow access to iTunes, steering you to a higher-priced music service that it owned.
Political groups - Political organizing could be slowed by a handful of dominant Internet providers who ask advocacy groups to pay "protection money" for their websites and online features to work correctly.
Online purchasers - Companies could pay Internet providers to guarantee their online sales process faster than competitors with lower prices - distorting your choice as a consumer.
Bloggers - Costs will skyrocket to post and share video and audio clips - silencing citizen journalists and putting more power in the hands of a few corporate-owned media outlets.

The telcos are using their money (and correspondingly huge lobbying muscle) to try and get legislation through Congress to enable them to do this, and are being fought by grassroots groups. It is speculated that one reason that phone companies so easily (and secretly) gave phone records over to the government in its NSA phone monitoring program was because they were trying to curry favor with the administration concerning this legislation, hoping for this big payoff in return.

This is an important issue and could determine whether the internet remains an egalitarian force or goes down the road of big corporation control the way that newspapers, radio and TV did. In the early days of each of those media forms, it was relatively easy for people to enter the field. It did not cost a lot of money to start a newspaper or radio station, although TV was more expensive. But then big companies, aided by a friendly Congress, started dominating the field, and nowadays one has to have enormous wealth to start up. In the case of radio and TV, the government has colluded with the big companies by taking the public airwaves (the broadcast spectrum) and giving them away free to private companies to make exorbitant profits. If you or I were to start a radio or TV station and broadcast over the public airwaves, we would be prosecuted.

Newspapers, radio, and TV have ceased to be representative of the interests of ordinary people because they are not owned by them. They now represent the interests of their owners and shareholders. It is the internet, still an embryonic medium, that has the ease of entry to make it a democratizing force because, at least in principle, anyone can gain access to it to spread ideas. It is this that is threatened by the attacks on net neutrality. History has shown that once we let the big companies muscle in and dominate a media system, we cannot get it back.

Considering how much we all use the internet, this issue has been surprisingly below the radar. People seem to assume that the internet will always be the way it is now. But just as the democratic aspects of the internet were not an accident but deliberately designed to be so by its pioneers like Tim Berners-Lee, keeping it that way will also require deliberate efforts by us. We cannot take it for granted.

Case has many tech-savvy people who have a much better idea of the implications of surrendering net neutrality to the big telcos. Lev Gonick, Case's Vice President for Information Technology, had a very detailed and informative post on this topic as early as last year. We need to build more awareness of this important issue. Perhaps we should have a concerted effort, with more bloggers expressing their views on it.

For more information on this topic, see the very helpful FAQ put out by the Save The Internet coalition.

June 08, 2006

Why I love the internet-3: How blogs have changed the pundit game.

In the previous post, I discussed how the main role of columnists and pundits is to act as sheepdogs for us, herding us into pens that limit the range of opinions we are allowed to express and be taken 'seriously.' To be frank, I rarely read any of the newspaper columnists anymore. However, since they do appear regularly in the Plain Dealer, I occasionally glance at them while reading the paper. I can usually predict what they are going to say on any given issue and the first paragraph usually confirms my prediction. There is almost never any new information or data or perspective that I find enlightening, whether it be from the 'liberal' or 'conservative' columnists. But those columns do give me one useful piece of information, and that is to tell me what the acceptable range of conventional wisdom is, what I am supposed to think.

Blogs have changed this world of news commentary and analysis. What the internet has revealed is two important things. The first is that there exists a whole host of knowledgeable and astute analysts of the news out there in cyberspace, people who care passionately about specific issues and are willing to put in the time and effort to really study things in detail. The second is that those of us whose views are outside the 'acceptable' range of opinions defined by the traditional newspaper columnists are not alone. In fact, there are quite a lot of us, and with the internet we can discover one another's existence, talk with each other, share information, and build alliances that transcend the conventional political labels.

Take, for example, blogger Glenn Greenwald. Unknown a year ago, he burst on the scene with his sharp and critical analyses of the Bush administration's electronic surveillance programs. I read his blog if I want analysis by someone who understands constitutional law and who reads legislation and other documents carefully. He has become so influential so quickly that he has even been invited to give commentary on TV shows, and his book How Would a Patriot Act? Defending American Values From a President Run Amok debuted last week at #11 on the New York Times non-fiction best-seller list, powered in part by the enthusiastic support he received from fellow bloggers.

Similarly, I read Juan Cole if I want to understand what is really going on in Iraq, how events are being viewed in the Arab media, and the backgrounds of the people involved. I read Justin Raimondo for generally astute and informed analysis on issues of war and peace, coupled with a sharp, no-nonsense writing style. Daily Kos and Atrios are good for alerting me to news items that I would otherwise miss. And there are always Joshua Micah Marshall and Kevin Drum for commentary that is similar to that of traditional columnists and pundits but is usually much better informed and more perceptive. All these bloggers link to other bloggers on specific issues.

Almost none of these people have editors checking on them to make sure that what they write is accurate. I don't know any of them personally either. So how do I know they are any good? How do I know they are reliable? The answer is their record. Bloggers are mercilessly quick to point out when a fellow blogger makes an error, and you quickly learn to distinguish between the people who are careful about what they write and the people who are merely glib. Of course, blogging is a fast-paced activity and errors are bound to creep in. But good bloggers respond well to having errors pointed out and you can easily tell the difference between those who make the occasional error and those who are trying to mislead readers in order to push an agenda. The deliberate misleaders, or those whose message is purely driven by ideology and undeterred by contradictory facts, end up with only partisan supporters (although there may be many of these).

While bloggers have no editors or other external quality control mechanisms like newspaper and TV and radio columnists do, they do have a far more powerful internal quality control mechanism. This is because bloggers know that the only thing they have to offer is the content they provide. People do not stumble across them while looking for sports news, or department store sales, or comics. People have to actually seek them out. If bloggers do not provide good content, they are out of business.

I first realized the sheepherding or thought control role of newspaper columnists in the US soon after I first came here for graduate studies. In Sri Lanka as a student, I had read the sharp and incisive analyses of global politics of Noam Chomsky. Any person interested in politics there had heard of Chomsky, who is a distinguished professor of linguistics at MIT and became well known as a political analyst during the Vietnam war.

Chomsky is widely read everywhere in the world. He has been ranked in the top ten of the most cited scholars who have ever lived and recently was voted (by a landslide) the world's top public intellectual in a poll conducted by Prospect and Foreign Policy magazines. (See Robin Blackburn's article for why he deserves this recognition.)

But when I came to the US for graduate studies in the late 1970s, I found him to be completely absent from the mainstream media here. In order to read his take on current events, I had to go to the library and read newspapers and magazines from other countries. What was even more surprising to me was that many people in the US had not even heard of Chomsky.

I now know why. Chomsky had made the cardinal 'error' of going outside the boundaries of acceptable thought. He had argued that the Vietnam war was an act of aggression by the US against that country, with the aim of making sure that that country's economy was destroyed along with its socialist program of trying to provide education and housing and health to all its citizens. Such a good example, he argued, would be tempting for other developing nations to follow and thus dangerous to US business interests. This went against the conventional view that Vietnam was a well-intentioned attempt to prevent the spread of Communism, taken on behalf of the Vietnamese people with their best interests in mind.

Chomsky has proceeded to elaborate on his analyses, arguing that the mainstream consensus idea of US foreign policy being benevolent in intent but undermined by incompetent execution or events beyond its control is a myth, and that its foreign policy is governed by ruthless self-interest on the part of a small group of US elites, carried out mercilessly, and dependent for its success on keeping the vast majority of American people in the dark about their true intentions. Controlling the range of debate and opinions in the mainstream media is an important tool towards achieving this goal. For stepping outside the mainstream consensus, and showing how that fraudulent mainstream consensus is created, he was banished from the op-ed pages of US newspapers and his articles could not be found in US mainstream magazines. (See the book Manufacturing Consent by Chomsky and Edward Herman (professor of economics at the University of Pennsylvania) for a sharp analysis of how the media functions.)

Whether you agree with Chomsky or not, there is no denying the fact that he does his research and can back up his claims with historical facts, actual data, and clear, logical reasoning. And yet Chomsky cannot be found anywhere in the mainstream media in the US while fact-free ranters of the Ann Coulter variety seem to be all over the place. If that is not in itself a good reason to celebrate the death of establishment punditry, I don't know what is. (See here for the kinds of things that Coulter says.)

Despite this shunning by the mainstream US media, Chomsky's prolific output and seemingly unlimited energy enabled him to become one of the world's most influential intellectuals. But in pre-internet days, he was a rare exception, like I. F. Stone. With the internet, it will not be as hard for people with similar ideas to reach an audience. The internet no longer allows for the kind of thought policing that Chomsky experienced, and that is why I think blogs will drive traditional media columnists out of business. They have become redundant.

I for one will not miss them.

POST SCRIPT: Man mauled by lioness

Here's a disturbing story:

A man shouting that God would keep him safe was mauled to death by a lioness in Kiev zoo after he crept into the animal's enclosure, a zoo official said on Monday.

"The man shouted 'God will save me, if he exists', lowered himself by a rope into the enclosure, took his shoes off and went up to the lions," the official said.

"A lioness went straight for him, knocked him down and severed his carotid artery."

This is the kind of tragedy that happens when people take the Bible and god too seriously. This unfortunate person probably had read the story in the Book of Daniel (chapter 6) where some enemies of the god-worshipping Daniel trick the king into throwing him overnight into the lions' den. The Bible says that god closed the mouths of the lions to prevent harm coming to Daniel. The next morning, the king finds Daniel unharmed and, on discovering that he has been tricked into endangering him, is enraged at the people who had tried to use him to destroy Daniel:

At the king's command, the men who had falsely accused Daniel were brought in and thrown into the lions' den, along with their wives and children. And before they reached the floor of the den, the lions overpowered them and crushed all their bones.

This story is one of the many Biblical stories that, although ostensibly meant to show god in a good light by demonstrating his power and responsiveness to those who worship him, actually create even more problems for those who believe in a benevolent god. Why didn't god (as he did with Daniel) protect the wives and children who, after all, were not accused of any wrongdoing (even assuming that you like the idea of a god who approves of wrongdoers being torn apart by lions)?

June 07, 2006

Why I love the internet-2: Bypassing the official pundits

Yesterday I discussed how blogs and other forms of alternative media on the internet prevented Stephen Colbert's speech to the White House Correspondents Association Dinner from being ignored. But that is not the only benefit of the internet. The more important innovation may be the rise of blogs as alternative and better sources of news analysis and commentary.

Some time ago, I was on the Cleveland PBS show Feagler and Friends along with Plain Dealer editor Doug Clifton discussing the future of newspapers in the age of the internet and blogs. Neither Clifton nor Feagler seemed very knowledgeable about blogs (for example, they seemed to think that Wikipedia was a blog), which surprised me, since blogs are rapidly becoming a major force in, for want of a better name, the alternative media.

Clearly these two people with long histories in traditional newspapers were worried that the internet would speed the demise of newspapers, which are already suffering declines in readership, especially among younger readers. But their criticisms of blogs were somewhat ill-informed and seemed to be based on a stereotype of bloggers as ignorant ranters. They did (correctly) point out that anyone can create a website and self-publish, even anonymously, and that there was no quality control as to whether what was said on a blog was reliable or not, whereas newspaper reports and columns had to pass through several editorial layers before seeing the light of day. But their inference that blogs should therefore not be taken seriously and might even be harmful was not justified.

In my response, I said that there would always be room for the traditional journalist, the person who gets the primary information. We need people with trained reporting skills to be out there interviewing people, witnessing events, asking questions, obtaining documents, etc. So this role of the traditional media will likely remain, although even here there are independent people who are taking advantage of the access that the internet provides to become independent journalists providing first-hand reports. (I am thinking of people like Dahr Jamail who has been doing some good original reporting from Iraq.) Of course, such freelancers are more limited in their access to official figures because of their lack of credentials and uncertain financial support, but this might conversely work in their favor since they are more likely to go off the beaten track and report non-official news.

But I think that where the internet and blogs are really going to change things is with the traditional national newspaper columnists. People like George Will, Maureen Dowd, Charles Krauthammer, Thomas Friedman, David Brooks, and Richard Cohen are rapidly becoming dinosaurs whose days are numbered.

To see why this is so, we need to understand why the media hire and support these pundits. The standard reason is that columnists are expected to provide perspective and insights on the news, and be able to translate complex policy issues into more readily understandable form. It is assumed that these are people with broad experience who study news events, have access to background information on them, and thus can tell the rest of us (who are presumably too busy with our lives to study the issues) what the news means and what should be done.

In actual practice, none of the above-named columnists have any more expertise on the news than you or me. It is not obvious to me that they even study the issues more than the rest of us. There are rare exceptions. Paul Krugman is a professional economist and thus is in a good position to analyze complex budgetary and fiscal issues and reports. But most columnists do not have that kind of expert and detailed knowledge. They just glibly pontificate.

Maureen Dowd's snarky humor quickly wears thin and is downright irritating. Has David Broder, the supposed dean of newspaper columnists and a so-called 'liberal,' said anything of real interest in the last twenty years? Can anyone follow David Brooks's leaps of logic? Isn't it obvious that Charles Krauthammer's extremely partisan ideology colors everything he says? For how long can George Will's bow tie and pompous phrasings hide the vacuousness of his thought? And what on earth are Thomas Friedman's banalities supposed to mean?

Listen to such people closely as they discuss things like tax cuts. They give only a quick nod to the actual details of the policy or its impact. They rarely talk hard numbers or work through detailed implications of actual policies. They quickly shift the debate to personnel, politics, and style, addressing such questions as: Will the new policy (whatever it is) help the fortunes of the President, the Republicans, or the Democrats? Will the public support it? How should they sell it? How will it affect the next elections? What do the polls say and what does it mean? And so on.

But the real purpose served by such columnists is to act as guardians of the boundaries of acceptable debate, and thus of acceptable thought. Think of them as sheepdogs, with us, the public, as the sheep. Their job is to make sure that all our articulated opinions stay within a certain range. So people like Cohen and Dowd and Broder, by being identified as liberals, serve as the 'liberal' goal posts, and Will, Krauthammer, and Brooks similarly serve as the 'conservative' goal posts. (Friedman occupies his own weird space.) They are the people who define 'mainstream' or 'moderate' opinion. So liberals are supposed to take their cues from liberal commentators and conservatives from their standard bearers. As long as we stay within the boundaries of thought defined by these people, we are allowed to participate in the discussion. But step outside these defined boundaries, and you are labeled an extremist and kicked out of the game.

Take, for example, Iraq. Before the war began, the acceptable range of opinion was that Iraq and its leaders were undoubtedly evil and needed to be replaced, that the motives of the Bush administration were good and honorable, and that the only issue up for debate was whether more diplomacy and time should be allowed for Hussein's overthrow or an immediate attack launched. Cohen and William Raspberry (another so-called 'liberal' columnist) both swooned with admiration over Colin Powell's disgraceful and now thoroughly discredited speech to the UN and announced that they were now convinced that attacking Iraq was the right thing to do, thus serving notice to all people who considered themselves liberals that they should get on the war-wagon or be considered 'outside the mainstream.'

Now that the Iraq debacle has occurred, the range of allowed opinion has shifted slightly to say that the information on which the war was based was flawed and the implementation was bad, but what we should debate now is how to solve the problem that has been created.

It was not allowed at any time to make a more fundamental case and argue that the attack on Iraq was an act of unprovoked aggression on a country that had never attacked, or even threatened, the US, that the motives of the Bush administration were never honorable, that they repeatedly and deliberately lied and misled the public about the evidence, and that the key perpetrators should be impeached and tried for war crimes. Such talk was, and still is, not allowed in polite company. Say things like that and you are shunned and outside the game.

Thus the role of the columnists is to keep the discussion within 'safe' boundaries. As a result people who had sharper criticisms of policies tended to keep quiet about them for fear of being labeled an extremist or worse. And before the days of the internet, such people were completely isolated and thus it was easy to keep them quiet.

But not anymore.

Next: How blogs have dramatically changed the pundit game

POST SCRIPT: And now, the Rapture video game!

About a year ago, I posted a series of items (here, here, and here) about the blood and gore aspects of the rapture based on the Left Behind series of books and suggested in a comment on Mark Wilson's blog that it contained all the elements necessary to make a violent video game. Mark subsequently reported that such a game was actually in the works.

Well, it turns out that the game Left Behind: Eternal Forces has been created and is going to be marketed for the coming Christmas season. This website reviews the game and its creators and describes the goals of the game:

Imagine: you are a foot soldier in a paramilitary group whose purpose is to remake America as a Christian theocracy, and establish its worldly vision of the dominion of Christ over all aspects of life. You are issued high-tech military weaponry, and instructed to engage the infidel on the streets of New York City. You are on a mission - both a religious mission and a military mission -- to convert or kill Catholics, Jews, Muslims, Buddhists, gays, and anyone who advocates the separation of church and state - especially moderate, mainstream Christians.

Ah, yes, there's nothing that captures the spirit of Christmas more than murdering all those who disagree with your own extreme vision of it. I don't know about the wisdom of their choice of city, though. In real life, Christian warriors might be hopelessly outnumbered by their enemies in New York City. I'm guessing that the number of gays alone would be enough to rout the rapturites. They should perhaps start with a more realistic location (say Topeka, Kansas) and hone their killing skills before taking on the core of the Big Apple.

Here's the official website for the game. Its creators are apparently connected to Rick Warren, author of the book The Purpose Driven Life.

June 06, 2006

Why I love the internet

Stephen Colbert's speech at the White House Correspondents Association Dinner, where he ripped into the President and the assembled insider media right to their faces, was broadcast only on C-Span and initially buried by the offended media. When it became clear that many people were talking about it, the elite commentators sniffed and said that they had not thought much of the speech.

Washington Post columnist Richard Cohen, who can invariably be counted on for conventional wisdom, parroted the standard line in an unintentionally hilarious piece where he said that Colbert wasn’t funny and was in fact rude and a bully to say mean things to that nice Mr. President. In writing this, Cohen was demonstrating again how craven the mainstream press is, so anxious to curry favor with the powerful.

What made Cohen's column so humorous was that he started out by asserting that he was an expert on comedy, saying: "First, let me state my credentials: I am a funny guy. This is well known in certain circles, which is why, even back in elementary school, I was sometimes asked by the teacher to "say something funny"- as if the deed could be done on demand."

It is well known in every circle that anyone who actually has to say that he is funny is already pretty pathetic, and to appeal to one's reputation in elementary school as evidence is to enter the world of self-parody and to practically beg to be made fun of. And few do ridicule better than Penn State professor Michael Berube, who has been having fun at Cohen's expense for some time now, at one time issuing an appeal to his readers to come to the aid of Cohen because he was in danger of running out of ways to be wrong. You simply must read Berube's brutally funny takedown of Cohen's Colbert column.

What added to the general merriment in the blog world was that Cohen then wrote a subsequent column complaining about how so many nasty people were now being mean to him by ridiculing his original Colbert column. This brought on another round of ridicule, this time aimed at his whiny self-pity. Ah, the fun never ends with young Richard! I do not doubt Cohen's claim that the other children in his elementary school were in stitches when he was around, but I think he misinterprets the reasons why.

In days gone by, very few of us (especially people like me with no cable) would have heard about Colbert's speech and even then would only have had the opinions of gatekeepers like Cohen to enlighten us. Those of us who disagreed with Cohen's supercilious tone and suspected that there was more to the story would have seethed but would have had no recourse. He would have remained secure in his media bubble, blissfully thinking that people actually took his pontificating seriously. But with the internet, Cohen received his comeuppance swiftly and widely, and there is no doubt that he is aware that there is a different world out there. He cannot simply say ridiculous stuff and think that having an august perch in one of the major news outlets will protect him. He may not like this new state of affairs, but he has to deal with it.

In the pre-internet days, Colbert's actual speech would have disappeared, leaving behind barely a ripple. But thanks to the internet, the story of that dinner speech spread like wildfire through the blogs, and millions have seen it online (start at about the 51:30 mark), read the transcript, commented on the speech, and passed it on. The ignoring of the speech by most of the traditional media only made the story even more interesting in the world of blog-driven political readership.

Read Arianna Huffington's summary of the impact the speech has had. See here for my take on it.

This is why I love the internet and the blogs. They have broken the stranglehold of elite opinion makers who can pontificate without content and close ranks around each other and the political establishment. People can now get news and information from many more sources and have access to people who can analyze the news critically and piercingly, people who have no interest in ingratiating themselves with those in power, and thus can say what they mean, even if they become pariahs.

I.F. Stone would have loved it.

POST SCRIPT: Interesting website

I have been introduced to a fascinating new website called MachinesLikeUs. The site's welcome message pretty clearly lays out its premises, all of which I enthusiastically agree with.

MachinesLikeUs.com is a resource for those interested in evolutionary thought, cognitive science, artificial life and artificial intelligence. It encourages relevant scientific research and analysis, posts current news and disseminates articles that promote the following concepts: 1) Evolution is the guiding principle behind life on earth; 2) Religions and their gods are human constructs, and subject to human foibles; 3) Life and intelligence are emergent properties based upon fundamental mechanics, and, as such, are reproducible; 4) Living organisms are magnificent machines - robust, dynamic, self-sufficient, precisely tuned to their environment - and deserving our respect and study. You are invited to participate in the venerable quest.

The site provides a great set of links to recent news items about scientific findings in these areas and articles that deal with the above issues. The editor has generously included some of my own blog entries on his site.

June 05, 2006

Seeing the world through Darwin's eyes

It is good to be back and blogging again!

On my trip to Australia, I had the chance to see some of the marsupial animals that are native to that continent, and as I gazed at these strange and wondrous creatures, I asked myself the same question that all visitors to the continent before me must have asked: Why are these animals so different from the ones I am familiar with? After all, Australia's environment is not that different from that found in other parts of the world, and yet most marsupials (like kangaroos, wallabies, koalas, and wombats) are found only on that continent, which is remarkable. I was stunned to learn that when a kangaroo is born, it weighs less than one gram. This is because for marsupials, much of the development of the newborn (which in other animals occurs inside the mother's womb) takes place in the pouch.

The Encyclopedia Britannica says that marsupials are:

a mammalian group characterized by premature birth and continued development of the newborn while attached to the nipples on the lower belly of the mother. The pouch, or marsupium, from which the group takes its name, is a flap of skin covering the nipples. Although prominent in many species, it is not a universal feature - in some species the nipples are fully exposed or are bounded by mere remnants of a pouch. The young remain firmly attached to the milk-giving teats for a period corresponding roughly to the latter part of development of the fetus in the womb of a placental mammal (eutherian).

The largest and most varied assortment of marsupials - some 200 species - is found in Australia, New Guinea, and neighbouring islands, where they make up most of the native mammals found there. In addition to the larger species such as kangaroos, wallabies, wombats, and the koala, there are numerous smaller forms, many of which are carnivorous, the Tasmanian devil being the largest of this group (family Dasyuridae). About 70 species live in the Americas, mainly in South and Central America, but one, the Virginia opossum (Didelphis virginiana), ranges through the United States into Canada.

The significance of the way that animals are distributed in the world was a key insight that Charles Darwin obtained as a result of his voyage on the Beagle from 1831 to 1836. He noted that although the environment in the Galapagos Islands was very similar to that of the Cape Verde islands (off the coast of West Africa), the animal life found on each of these islands was quite dissimilar from that of the other, and more similar to the wildlife of the immediately neighboring continent (South America and Africa respectively). This made him speculate that a few animals had arrived at the islands from the nearby continents and then changed over time to become distinctive species.

This line of reasoning caused him to doubt the dominant belief of his time (called 'special creation'), which said that god had created each species to fit into its environmental niche. (Darwin had at one time been contemplating joining the priesthood and one can assume that he would have initially been quite comfortable with this belief.)

What would have further fueled Darwin's doubts about special creation was the increasing awareness, even in his own time, that large numbers of species had already gone extinct. It is now estimated that over 90% of all species that ever existed are no longer around. If god was creating each species specially to suit the available environmental niches, explaining extinction becomes problematic.

On a side note, the nature parks I visited in Australia were surprisingly relaxed about visitors. They did not keep the animals in pens separated from people, except for dangerous animals like the Tasmanian Devil. You walked around in the same area as the animals and could get up close and pet wallabies and wombats and koalas if you so wished and they were nearby. You could even enter the cages housing birds and there was no one checking to see that the doors were kept closed to prevent the birds from escaping. The rangers assumed that park visitors would not leave the doors open.

I could not imagine such a relaxed attitude in the US where people are scared that if a bird pecked someone or an animal bit or scratched a visitor, lawsuits would follow. A park ranger told me that if an animal showed signs of aggression or unwonted interest in people, they would take some action but they did not, as a rule, try to shield themselves from any chance of being sued by putting up barriers, as is the case here. He asked me where I was from and when I said the USA he nodded understandingly and said that Australia was not as litigious a country as the US, although he feared that eventually Australian nature parks would follow the US model and put up barriers between animals and visitors. (I did see tremendous American cultural dominance in their TV stations, where the programs and news formats seemed indistinguishable from their US counterparts, except for the accents.)

Seeing strange new animals in their natural habitat was very intriguing for me, provoking different feelings than seeing them in a zoo here. I can well understand how Darwin's trip to the Galapagos Islands would have triggered similar questions in his own mind and led to his own investigations and groundbreaking theory of evolution.

POST SCRIPT: Barry's new blog

If you have been reading the comments to this blog, you would have found some interesting and thought-provoking contributions by Barry. Barry has now started his own blog called Those Who Can't Teach Wish They Could where he chronicles the path of his career switch from engineering to teaching, and his observations about how the whole certification process may be discouraging otherwise talented and knowledgeable teachers from entering the classroom.

Barry's comments on my blog were always thoughtful and lively, and his blog is the same. You should visit.

June 02, 2006

Why scientific theories are more than explanations

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

At its heart, the main strategy of intelligent design creationism (IDC) advocates is to find phenomena that are not (at least in their eyes) satisfactorily explained by evolutionary theory and to argue that natural selection is therefore a failed theory. They say that adding the postulate of an 'intelligent designer' (which is clearly a pseudonym for God) as the cause of these so-called unexplained phenomena means that they are no longer unexplained. This, they claim, makes IDC the better 'explanation.' Some (perhaps for tactical reasons) do not go so far and instead say that it is at least a competing explanation and thus on a par with evolution.

As I discussed in an earlier posting, science does purport to explain things. But a scientific explanation is more than that. The explanations also carry within themselves the seeds of new predictions, because whenever a scientist claims to explain something using a new theory, the first challenge that is thrown invariably takes the form "Ok, if your theory explains X under these conditions, then it should predict Y under those conditions. Is the prediction confirmed?"

If the prediction Y fails, then the theory is not necessarily rejected forever but the proponent has to work on it some more, explain the failure to predict Y, and come back with an improved theory that makes better predictions.

If the prediction Y is borne out, then the theory is still not automatically accepted but at least it gains a little bit of credibility and may succeed in attracting some people to work on it.

Theories become part of the scientific consensus when their credibility increases by these means until they are seen by the scientific community to be the exclusive framework for future investigations. A scientist who said things like "My new theory explains X but makes no predictions whatsoever" would be ignored or face ridicule. Such theories are of no use for science.

And yet this is precisely the kind of thing that IDC proponents are saying. To see why this cannot be taken seriously, here is something abridged from the book Physics for the Inquiring Mind by Eric Rogers (p. 343-345), written way back in 1960. In it Rogers looks at competing claims for why an object set in motion on a surface eventually comes to rest:


The Demon Theory of Friction

How do you know that it is friction that brings a rolling ball to a stop and not demons? Suppose you answer this, while a neighbor, Faustus, argues for demons. The discussion might run thus:

You: I don't believe in demons.
Faustus: I do.
You: Anyway, I don't see how demons can make friction.
Faustus: They just stand in front of things and push to stop them from moving.
You: I can't see any demons even on the roughest table.
Faustus: They are too small, also transparent.
You: But there is more friction on rough surfaces.
Faustus: More demons.
You: Oil helps.
Faustus: Oil drowns demons.
You: If I polish the table, there is less friction and the ball rolls further.
Faustus: You are wiping the demons off; there are fewer to push.
You: A heavier ball experiences more friction.
Faustus: More demons push it; and it crushes their bones more.
You: If I put a rough brick on the table I can push against friction with more and more force, up to a limit, and the block stays still, with friction just balancing my push.
Faustus: Of course, the demons push just hard enough to stop you moving the brick; but there is a limit to their strength beyond which they collapse.
You: But when I push hard enough and get the brick moving there is friction that drags the brick as it moves along.
Faustus: Yes, once they have collapsed the demons are crushed by the brick. It is their crackling bones that oppose the sliding.
You: I cannot feel them.
Faustus: Rub your finger along the table.
You: Friction follows definite laws. For example, experiment shows that a brick sliding along a table is dragged by friction with a force independent of velocity.
Faustus: Of course, the same number of demons to crush however fast you run over them.
You: If I slide a brick along a table again and again, the friction is the same each time. Demons would be crushed on the first trip.
Faustus: Yes, but they multiply incredibly fast.
You: There are other laws of friction: for example, the drag is proportional to the pressure holding the surfaces together.
Faustus: The demons live in the pores of the surface: more pressure makes more of them rush out and be crushed. Demons act in just the right way to push and drag with the forces you find in your experiments.

By this time Faustus' game is clear. Whatever properties you ascribe to friction he will claim, in some form, for demons. At first his demons appear arbitrary and unreliable; but when you produce regular laws of friction he produces a regular sociology of demons. At that point there is a deadlock, with demons and friction serving as alternative names for sets of properties - and each debater is back to his first remark.


Faustus's arguments are just like those of the IDC advocates, and they illustrate why such arguments are consistently rejected by the scientific community. Scientists ask for more than just explanations from their theories. They also need mechanisms that make predictions. They know that that is the only way to prevent being drowned in an ocean of 'explanations' that are of no practical use whatsoever.

You can't really argue with people like Faustus who are willing to create ad hoc models that have no predictive power. Such explanations as he gives have no value to the practicing scientist. But when you walk away from this kind of fruitless pseudo-debate, you do allow the other side to charge that you are afraid to debate them, at which point, they may jump up and down and shout "See they cannot refute us. We win! We win!", however illogical the charge.

It reminds me of the duel scene in Monty Python and the Holy Grail in which King Arthur chops off the arms and legs of the Black Knight, leaving just his torso and attached head on the ground, totally defenseless. The Black Knight, however, refuses to concede defeat and offers a compromise: "Oh? All right, we'll call it a draw." When Arthur and his assistant walk away from this offer, the Black Knight starts taunting him saying "Oh. Oh, I see. Running away, eh? You yellow bastards! Come back here and take what's coming to you. I'll bite your legs off!"

At some point, in order to save your time (and your sanity) you have to simply walk away and ignore them. This explains why so many scientists refuse to get involved in the IDC battles.

POST SCRIPT: Unlearning ideas

Worried about children learning dangerous ideas (like evolution) in science classes and elsewhere? See the advertisement for the Unlearning Annex and learn how to protect them!

June 01, 2006

Why IDC is not science

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

In the previous posting, I pointed out that if one looks back at the history of science, all the theories that are considered to be science are both (1) naturalistic and (2) predictive. Thus these two things constitute necessary conditions.

This is an important fact to realize when so-called intelligent design creationism (IDC) advocates argue that theirs is a 'scientific' theory. If so, the first hurdle IDC must surmount is to meet both those necessary criteria, if it is to be even eligible to be considered science. It has to be emphasized that meeting those conditions is not sufficient for something to be considered science, but the question of sufficiency does not even arise because IDC does not meet either of the two necessary conditions.

I issued this challenge to the IDC proponents when I debated them in Kansas in 2002. I pointed out that nowhere did they provide any kind of mechanism that enabled them to predict anything that anyone could go out and look for. And they still haven't. At its essence, IDC strategy is to (1) point to a few things that they claim evolutionary theory cannot explain; (2) assert that such phenomena have too low a probability to be explained by any naturalistic theory; and (3) draw the conclusion that those phenomena must have been caused by an 'unspecified designer' (with a nudge, nudge, wink, wink to the faithful that this is really God) whose workings are beyond the realm of the natural world explored by science.

Thus they postulate a non-natural cause for those phenomena and cannot predict anything that any person could go and look for. (This is not surprising. The designer is, for all intents and purposes, a synonym for God, and it would be a bit bizarre to our traditional concept of God to think that his/her actions should be as predictable as those of blocks sliding down inclined planes.) When I asked one of the IDC stalwarts (Jonathan Wells) during my visit to Hillsdale College for an IDC prediction, the best he could come up with was that there would be more unexplained phenomena in the future, or words to that effect.

But that is hardly what is meant by a scientific prediction. I can make that same kind of vague prediction about any theory, even a commonly accepted scientific one since no theory ever explains everything. A scientific prediction takes the more concrete form: "The theory Z encompassing this range of phenomena predicts that if conditions X are met, then we should be able to see result Y."

IDC advocates know that their model comes nowhere close to meeting this basic condition of science. So they have adopted the strategy of: (1) challenging the naturalism condition, arguing that it is not a necessary condition for science and that it has been specifically and unfairly adopted to exclude IDC from science; and (2) trying to create a new definition of science so that IDC can be included. This takes the form of arguing that a scientific theory is one that 'explains' phenomena.

There are variations and expansions on these arguments by the various members of the IDC camp but I have tried to reduce it to its skeletal elements. These variations that IDC proponents adopt are designed to blur the issues but are easy to refute. (See this cartoon by Tom Tomorrow (thanks to Daniel for the link) and this cartoon (thanks to Heidi) and this funny post by Canadian Cynic about the possible consequences of using IDC-type reasoning in other areas of life.)

The rejection by IDC advocates of naturalism and predictivity as necessary conditions for science goes against the history of science. Recall, for example, the struggle between the Ptolemaic and Copernican models of the universe. Remember that both sides of this debate involved religious believers. But when they tried to explain the motions of the planets, both sides used naturalistic theories. To explain the retrograde motion of Mercury and other seemingly aberrant behavior, they invoked epicycles and the like. They struggled hard to find models that would enable them to predict future motion. They did not invoke God by saying things like "God must be moving the planets backwards on occasion" or "This seemingly anomalous motion of Mercury is due to God." Such an explanation would not have been of any use to them because allowing God into the picture would preclude the making of predictions.

In fact, the telling piece of evidence that ended the geocentric model was that the Rudolphine Tables using Kepler's elliptical orbits and a heliocentric model were far superior to any alternative in predicting planetary motion.
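As an aside, it is worth seeing how concrete such a prediction is. Kepler's third law ties a planet's orbital period to its mean distance from the sun, so anyone can compute the predicted period and compare it against observation. Here is a minimal sketch in Python (the function name and the textbook figures are mine, added purely for illustration):

def orbital_period_years(semi_major_axis_au):
    # Kepler's third law: T squared is proportional to a cubed, so with T in
    # Earth years and a in astronomical units, T = a ** 1.5
    return semi_major_axis_au ** 1.5

# Mars orbits at roughly 1.524 AU; the predicted period of about 1.88 years
# can be checked directly against what astronomers observe.
print(round(orbital_period_years(1.524), 2))   # prints 1.88

That is the form a scientific prediction takes: a definite number, computed in advance, that the world is free to contradict.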

While the underlying beliefs that drove people of that time to support the Ptolemaic or Copernican model may have been influenced by their religious outlook, they did not seem to invoke God in a piecemeal way, as an explanation for this or that isolated phenomenon, as is currently done by IDC advocates. Instead they were more concerned with posing the question of whether the whole structure of the scientific theory was consistent with their understanding of the working of God. In other words, they were debating whether a geocentric model was compatible with their ideas of God's role in the world. The detailed motions of specific planets, however problematic, seemed to have been too trivial for them to invoke God as an explanation, although they would probably not have excluded this option as something that God was capable of doing.

It may also well be true that some scientists of that time thought that God might be responsible for such things but such speculations were not part of the scientific debate. For example, Newton himself is supposed to have believed that the stability of the solar system (which was an unexplained problem in his day and remained unsolved for about 200 years) was due to God periodically intervening to restore the initial conditions. But these ideas were never part of the scientific consensus. And we can see why. If scientists had said that the stability was due to God, and closed down that avenue of research, then scientists would never have solved this important problem by naturalistic means and thus advanced the cause of science. This is why scientists, as a community, never accept non-natural explanations for any phenomena, even though individual scientists may entertain such ideas.

So the attempts by IDC advocates to redefine science to leave out methodological naturalism and predictivity fly completely in the face of the history of science. But worse than that, such a move would result in undermining the very methods that have made science so successful.

In the next posting, we will see why just looking for 'good' explanations of scientific phenomena (the definition of science advocated by the IDC people) is not, by itself, a useful exercise for science.