Entries in "Education and learning"

November 28, 2011

Teenager faces down Kansas governor and school principal

High schooler Emma Sullivan refuses to apologize to Kansas governor Sam Brownback for criticizing him on Twitter. The governor's staff, which apparently scours the internet for unflattering mentions of him, noticed her tweet and reported her to her school principal who, rather than standing up for Emma's free speech rights (after all, if making fun of a politician isn't protected by the First Amendment, what is?), demanded that she apologize.

You knew from the beginning that this could not help but end badly for Brownback, and so it has. He has now been forced to apologize for his staff's overreaction. The school district has also backed off its demand for an apology.

Once again, I think that this is a victory for the internet. Emma received a lot of support from the blogosphere and it may have helped her stand firm against the bullying.

Way to go, Emma.

February 08, 2011

Case Connection Zone

One of the best things about working at Case Western Reserve University is that it has been very forward-looking and supportive in providing technology to serve the needs of its students, employees, and the community.

In the early days of the internet, CWRU, with its Freenet system, was the first in the nation to provide free internet access to anyone who had a dial-up modem. It was later the first university campus to have an entirely fiber-optic network reaching every office, classroom, and dorm room on the campus.

In partnership with other local non-profit groups, CWRU has been expanding free broadband access for city dwellers. This video (admittedly also a plug for the university) shows a new initiative to provide free gigabit fiber-optic network access to the campus community and an adjacent neighborhood to research what kinds of new uses might emerge, with an eye to expanding the reach of the network.

December 28, 2010

The secrets of an academic ghostwriter

The Chronicle of Higher Education recently had an article by someone who has made a good living (about $66,000 this year) by writing custom research papers on almost any topic for undergraduate and graduate students who hire him to do their assignments.

I've written toward a master's degree in cognitive psychology, a Ph.D. in sociology, and a handful of postgraduate credits in international diplomacy. I've worked on bachelor's degrees in hospitality, business administration, and accounting. I've written for courses in history, cinema, labor relations, pharmacology, theology, sports management, maritime security, airline services, sustainability, municipal budgeting, marketing, philosophy, ethics, Eastern religion, postmodern architecture, anthropology, literature, and public administration. I've attended three dozen online universities. I've completed 12 graduate theses of 50 pages or more. All for someone else.

His strategy was to collect the minimal information necessary from Wikipedia and other online sources and simply write it all down, cutting and pasting quotes and using filler language to reach the necessary word count, without rewriting, editing, or polishing.

After I've gathered my sources, I pull out usable quotes, cite them, and distribute them among the sections of the assignment. Over the years, I've refined ways of stretching papers. I can write a four-word sentence in 40 words. Just give me one phrase of quotable text, and I'll produce two pages of ponderous explanation. I can say in 10 pages what most normal people could say in a paragraph.

I've also got a mental library of stock academic phrases: "A close consideration of the events which occurred in ____ during the ____ demonstrate that ____ had entered into a phase of widespread cultural, social, and economic change that would define ____ for decades to come." Fill in the blanks using words provided by the professor in the assignment's instructions.

The reason he gets away with this is that this is what some students do on their own. For them too, the first version is the one they hand in as their 'finished' work, so the roughness of the submitted manuscript must seem familiar to the teacher. As the author says:

I don't ever edit my assignments. That way I get fewer customer requests to "dumb it down." So some of my work is great. Some of it is not so great. Most of my clients do not have the wherewithal to tell the difference, which probably means that in most cases the work is better than what the student would have produced on his or her own. I've actually had customers thank me for being clever enough to insert typos. "Nice touch," they'll say.

As a writing generalist myself, I was vaguely curious about whether I could be as successful a ghostwriter, assuming that I could overcome any scruples. I don't think I could, simply because over the years I have developed habits that would give me away immediately. I would not be able to avoid being opinionated, and this would undoubtedly arouse suspicion. I am also somewhat obsessive about avoiding typos and grammatical errors, repeatedly rewriting and editing even my blog posts. My books may not be great works of literature but they are 'clean' in the sense that they have very few or no basic errors of this sort. All this attention to detail would slow me down too much, while also likely setting off alarm bells for the reader. As an academic hired gun, I would be a bust.

I was of course bothered by students passing off other people's work as their own and wondered how widespread the practice was. But I was also impressed by the writer's ability to churn out papers on topics for which he had no training, and yet fool the students' teachers, and even their graduate thesis advisors, into thinking the students had written them.

This article makes for fascinating but disturbing reading and is as much an indictment of the way our educational system is structured, which enables such practices to pass undetected, as it is of the students who use ghostwriters.

August 25, 2009

College as a Disney World of Learning

(Talk given at Case Western Reserve University's Share the Vision program, Severance Hall, Friday, August 21, 2009, at 1:00 pm. This program welcomes all incoming first year students. My comments centered on the common reading book selection Three Cups of Tea by Greg Mortenson and David Oliver Relin. Mortenson will be the speaker at the annual fall convocation to be held on Wednesday, August 26, 2009, in Severance Hall at 4:30 pm.)

As I read the book Three Cups of Tea, two stories struck me. One begins on page 202 and is that of the little boy Mohammed Aslam Khan, who was sent alone by his father on a perilous journey downriver in frigid waters, all so that he might get a chance at an education. Despite all the odds against him, he not only survived the trip but got a good education and returned to the village to become an educational leader.

The other story is on page 31, where Mortenson describes his amazement at seeing eighty-two children assemble by themselves and do their lessons on their own in the open, in the cold, some writing on the ground with sticks, since the village could only afford a teacher for three days a week; on the other days they were on their own.

As Mortenson said, "Can you imagine a fourth-grade class in America, alone, without a teacher, sitting there quietly and working on their lessons?"

Why were the people in that remote region of Pakistan willing to go through so much in order to get an education? Compare that with the situation in the US, where learning is often seen as something to be avoided, and where some teachers get complaints when they cover too much ground. When schools are closed or lessons cancelled due to some emergency, it is usually a cause for cheering among students. As a colleague of mine here said recently, education may be the only thing in the US where people actually want less than what they pay for.

There are of course classes, teachers, and students in the US where learning for its own sake is valued. But these are unfortunately few. Still, I do not believe that there is any fundamental difference between the children in those remote villages of Pakistan and Afghanistan and those in the US that explains this difference in attitude.

What may be true is that America suffers, if that is the right word, from too-easy access to education. Schooling is fairly easily available and, at least in the K-12 sector, is free. A good analogy is with food, which is also freely and cheaply available in the US compared with other countries. And we waste and throw away vast amounts of it. I am sure your mothers pleaded with you to eat your vegetables, invoking images of starving children in China who would gladly eat with relish the food that you wanted to dump in the trash. Actually, given the economic crisis in the US and the rapidly rising economic power of China, soon Chinese mothers might be pleading with their spinach-rejecting children to think of poor starving children in the US.

Students in the US, because of the ease and abundance of educational opportunities, have to be exhorted to take advantage of these resources, just as they have to be coaxed to eat their broccoli, and this may be devaluing education in students' eyes, because people tend not to value things that are easily available.

This is why the story of the immense struggles and sacrifices made by the villagers that Mortenson worked with to build their schools is so inspiring. They realized that education is a precious gift to be cherished, not something whose availability can be taken for granted.

All of you are now embarking on four years of education here at Case Western Reserve University. Some people may tell you that college will be the happiest time in your lives. I disagree. In fact, it would be very sad if the happiest years of your life were over by the age of twenty-two. So I hope that you will have much happier times in the future.

But there is one aspect in which these four years will be a unique experience that you must take advantage of to the fullest. It is the one time in your life when you will be surrounded by people who want nothing else but to help you learn. The world-class faculty here, who are experts on all manner of things, will share their knowledge and expertise freely and willingly. Here you will get free access to incredible libraries full of books, journals, magazines, audio-visual materials, and newspapers, and to librarians who are positively eager to help you use them. And it is all available to you just for the asking. Once you graduate and go out, that opportunity is gone.

Of course, all this is not technically 'free' since you are paying tuition that, despite the extraordinary fund-raising abilities of our president, is still considerable. But the way to think of tuition fees is the way you would the admission price to Disney World or other amusement parks. It is not cheap to get in but once you are in, people try to get as much out of their time there as possible. It would be absurd to spend all your time sitting on a bench eating ice cream or surfing the web or sleeping.

You should have that attitude during the years you spend here. Think of Case Western Reserve University as the Disney World of learning. You have paid the admission fee in terms of grades and tuition. Now that you are in, rather than get by with minimal work, you should try to get in as much learning as possible, formally in classes, and informally in all the talks and seminars and casual discussions with teachers and fellow students. Once you develop that attitude towards learning, you will find that it is much more fun than roller coaster rides and with none of the accompanying motion sickness.

I am lucky in that I actually work here and take full advantage on a daily basis of the knowledge that is so freely available. And I would urge you to do the same. In fact, as soon as this program is over, and you have some free time, you should go over to the library and see what they offer, and you should go to all the museums that are right here in University Circle, as the first steps in a four-year adventure of learning.

Trust me, you will never regret it.

POST SCRIPT: The story of Genesis as told by Eddie Izzard

Much more interesting than the original. Makes more sense, too.

July 08, 2008

Collective good versus private profit

One of the clichés of academia that even non-academics know is "publish or perish." In its most common understanding, it implies that those who publish more are perceived as productive scholars, worthy of recruitment and promotion.

But there are other reasons for publishing. One is to establish priority for one's ideas. In academia, ideas are the currency that matters, and those who have good ideas are seen as creative people. So people publish to ensure that they receive the appropriate credit.

Another reason for publishing is to put the ideas into public circulation so that others can use them and build on them to create even more knowledge. Knowledge thrives on the open exchange of information and the general principle in academia is that all knowledge should be open and freely available so that everyone can benefit from it.

This is not, of course, the case in the profit-driven private sector, where information is jealously guarded so that the maximum profit can be obtained. This is not unreasonable in many cases. After all, without being profitable, companies would go out of business and many of the innovations we take for granted would not occur. So the knowledge is either guarded jealously (like the formula for Coca-Cola, say) or is patented so that other users have to pay for the privilege of using it.

But the open-information world of academia can collide with the closed, profit-making corporate world. Nowhere is this more apparent than in the drug industry. Much of the funding for medical and drug research comes from the government via agencies like the National Institutes of Health, and is channeled through university and hospital researchers. These researchers then publish their results. But that knowledge is then often built on by private drug companies that manufacture drugs that are patented and sold for huge profits. These companies often use their immense legal resources to extend the effective lifetime of their patents so that they can profit even more.

Another example of a collision between the public good and private profit was the project to completely map the human genome. This government-funded project was designed to be open, with the results published and put into the public domain. Both heads of the Human Genome Project, first James Watson and then Francis Collins, strongly favored the open release of whatever was discovered, because of the immense potential benefits to the public. They created a giant public database into which researchers could insert their results, enabling others to use them. (To see what is involved in patenting genomic information, see here.)

But then Craig Venter, head of the private biotechnology company Celera Genomics, decided that his company would try to map the genome, make it proprietary information, and create a fee-based database. This was fiercely resisted by the scientific community, which accelerated its efforts to map the genome first and make the information open to all. The race was on, and the scientific community succeeded in its goal of making the information public. Information on how to access the public database can be found here.

Many non-academics, like the journalist writing about faculty cars, simply do not understand this powerful desire amongst academics for open-access to information. I recall the discussion I had with my students regarding the film Jurassic Park. I hated the film for many reasons and said how bizarre it was that the discoverer of the process by which dinosaurs had been recreated from their DNA, a spectacular scientific achievement, had kept his knowledge secret in order to create a dinosaur theme park and make money. I said that this was highly implausible. A real scientist would have published his results to establish his claim as the original discoverer and made the information public so that others could build on it. But some of my students disagreed. They thought that it was perfectly appropriate that the first thought of the scientist was how to make a lot of money off his discovery rather than spread knowledge.

It is true that nowadays scientists and universities are increasingly seeking to file patents and create spin-off companies to benefit financially from their discoveries. Michael Moore talks about how things have changed and how the drive to make money is harming the collective good:

Thinking about that era, back in the first half of the 20th century, where you had for instance the man who invented the kidney-dialysis machine. He didn't want the patent for it, he felt it belonged to everybody. Jonas Salk and the polio vaccine, again, he wouldn't patent it. The famous quote for him is, "Would you patent the sun? It belongs to everyone." He wasn't doing this to become a millionaire. He was doing it because it was the right thing to do. During that era, that's the way people thought.

It may be that I am living in the past, and that those students who thought I was crazy for rejecting money as the prime motivator for scientists and other academics have a better finger on the pulse than I do. Perhaps new knowledge is no longer seen so clearly as a public good, belonging to the world, to be used for the benefit of all. If so, it is a pity.

POST SCRIPT: Nelson Mandela, terrorist

Did you know that all this time, the US government considered Nelson Mandela to be a terrorist?

January 03, 2008

Precision in language

(I am taking a break from original posts due to the holidays and because of travel after that. Until I return, here are some old posts, updated and edited, for those who might have missed them the first time around. New posts should appear starting Monday, January 14, 2008.)

Some time ago, a commenter to this blog sent me a private email expressing this view:

Have you ever noticed people say "Do you believe in evolution?" just as you would ask "Do you believe in God?" as if both schools of thought have equal footing? I respect others' religious beliefs as I realize I cannot disprove God just as anyone cannot prove His existence, but given the amount of evidence for evolution, shouldn't we insist on asking "Do you accept evolution?"

It may just be semantics, but I feel that the latter wording carries an implied affirmation just as "Do you accept that 2+2=4?" carries a different meaning than "Do you believe 2+2=4?"

I guess the point I'm trying to make is that by stating something as a belief, it opens the debate to the possibility that something is untrue. While this may be fine for discussions of religion, shouldn't the scientific community be more insistent that a theory well supported by physical evidence, such as evolution, is not up for debate?

It's a good point. To be fair, scientists themselves are partly responsible for this confusion because we also say that we "believe" in this or that scientific theory, and one cannot blame the general public for picking up on that terminology. What is important to realize, though, is that the word 'believe' is being used by scientists in a different sense from the way it is used in religion.

The late and deeply lamented Douglas Adams, author of The Hitchhiker's Guide to the Galaxy, who called himself a "radical atheist," puts it nicely (thanks to onegoodmove):

First of all I do not believe-that-there-is-not-a-god. I don't see what belief has got to do with it. I believe or don't believe my four-year old daughter when she tells me that she didn't make that mess on the floor. I believe in justice and fair play (though I don't know exactly how we achieve them, other than by continually trying against all possible odds of success). I also believe that England should enter the European Monetary Union. I am not remotely enough of an economist to argue the issue vigorously with someone who is, but what little I do know, reinforced with a hefty dollop of gut feeling, strongly suggests to me that it's the right course. I could very easily turn out to be wrong, and I know that. These seem to me to be legitimate uses for the word believe. As a carapace for the protection of irrational notions from legitimate questions, however, I think that the word has a lot of mischief to answer for. So, I do not believe-that-there-is-no-god. I am, however, convinced that there is no god, which is a totally different stance. . .

There is such a thing as the burden of proof, and in the case of god, as in the case of the composition of the moon, this has shifted radically. God used to be the best explanation we'd got, and we've now got vastly better ones. God is no longer an explanation of anything, but has instead become something that would itself need an insurmountable amount of explaining…

Well, in history, even though the understanding of events, of cause and effect, is a matter of interpretation, and even though interpretation is in many ways a matter of opinion, nevertheless those opinions and interpretations are honed to within an inch of their lives in the withering crossfire of argument and counterargument, and those that are still standing are then subjected to a whole new round of challenges of fact and logic from the next generation of historians - and so on. All opinions are not equal. Some are a very great deal more robust, sophisticated and well supported in logic and argument than others.

When someone says that they believe in god, they mean that they believe something in the absence of, or even counter to, the evidence, and even to reason and logic. When scientists say they believe a particular theory, they mean that they believe that theory because of the evidence and reason and logic, and the more evidence there is, and the better the reasoning behind it, the more strongly they believe it. Scientists use the word 'belief' the way Adams says, as a kind of synonym for 'convinced,' because we know that no scientific theory can be proven with 100% certainty and so we have to accept things even in the face of this remaining doubt. But the word 'believe' definitely does not carry the same meaning in the two contexts.

This can generate the kind of confusion the commenter warns about, but what can we do about it? One option is, as was suggested, to use different words, with scientists avoiding the word 'believe.' I would have agreed with this some years ago, but I am becoming increasingly doubtful that we can control the way words are used.

For example, there was a time when I used to be on a crusade against the erroneous use of the word 'unique'. The Oxford English Dictionary is pretty clear about what this word means:

  • Of which there is only one; one and no other; single, sole, solitary.
  • That is or forms the only one of its kind; having no like or equal; standing alone in comparison with others, freq. by reason of superior excellence; unequalled, unparalleled, unrivalled.
  • Formed or consisting of one or a single thing
  • A thing of which there is only one example, copy, or specimen; esp., in early use, a coin or medal of this class.
  • A thing, fact, or circumstance which by reason of exceptional or special qualities stands alone and is without equal or parallel in its kind.

It means, in short, one of a kind, so something is either unique or it is not. There are no in-betweens. And yet you often find people saying things like "quite unique" or "very unique" or "almost unique." I used to try to correct this but have given up. Clearly, people in general think that unique means something like "rare," and I don't know that we can ever change this even if we all become annoying pedants, correcting people all the time, avoided at parties because of our pursuit of linguistic purity.

Some battles, such as the one over the word unique, are, I believe, lost for good, and I expect the OED to add the new meaning of 'rare' some time in the near future. It is a pity because we would then be left with no word with the unique meaning of 'unique', but there we are. We would have to say something like 'absolutely unique' to convey the meaning once reserved for just 'unique.'

In science too we often use words with precise operational meanings while the same words are used in everyday language with much looser meanings. For example, in physics the word 'velocity' is defined operationally by the situation when you have an object moving along a ruler and, at two points along its motion, you take ruler readings and clock readings, where the clocks are located at the points where the ruler readings are taken, and have been previously synchronized. Then the velocity of the moving object is the number you get when you take the difference between the two ruler readings and divide by the difference between the two clock readings.

Most people (especially sports commentators) have no idea of this precise meaning when they use the word velocity in everyday language, and often use the word synonymously with speed or, even worse, acceleration, although those concepts have different operational meanings. Even students who have taken physics courses find it hard to use the word in its strict operational sense.
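
To make the operational distinctions concrete, here is a minimal symbolic sketch (the notation is my own, not part of the definitions above: x1 and x2 stand for the two ruler readings, t1 and t2 for the corresponding synchronized clock readings):

    \[
    v = \frac{x_2 - x_1}{t_2 - t_1} \qquad \text{(velocity: a signed quantity, so direction matters)}
    \]
    \[
    \text{speed} = \lvert v \rvert \qquad \text{(speed: the magnitude of the velocity)}
    \]
    \[
    a = \frac{v_2 - v_1}{t_2 - t_1} \qquad \text{(acceleration: the same kind of ratio, formed from velocity readings rather than ruler readings)}
    \]

Velocity can be negative while speed cannot, and an object moving at high speed can have zero acceleration, which is why using the three words interchangeably erases real distinctions.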

Take, for another example, the word 'theory'. By now, as a result of the intelligent design creationism (IDC) controversy, everyone should be aware that the way this word is used by scientists is quite different from its everyday use. In science, a theory is a powerful explanatory construct. Science depends crucially on its theories because they are the things that give it its predictive power. "There is nothing so practical as a good theory," as Kurt Lewin famously said. But in everyday language, the word theory is used to mean 'not factual,' something that can be false or ignored.

I don't think that we can solve this problem by putting constraints on how words can be used. English is a wonderful language precisely because it grows and evolves and trying to fix the meanings of words too rigidly would perhaps be stultifying. I now think that we need to change our tactics.

I think that once the meanings of words enter mainstream consciousness we will not be successful in trying to restrict their meanings beyond their generally accepted usage. What we can do is to make people aware that all words have varying meanings depending on the context, and that scientific and other academic contexts tend to require very precise meanings in order to minimize ambiguity.

Heidi Cool has a nice entry where she talks about the importance of being aware of when you are using specialized vocabulary, and the need to know your audience when speaking or writing, so that some of the pitfalls arising from the imprecise use of words can be avoided.

We have to realize though that despite our best efforts, we can never be sure that the meaning that we intend to convey by our words is the same as the meaning constructed in the minds of the reader or listener. Words always contain an inherent ambiguity that allows the ideas expressed by them to be interpreted differently.

I used to be surprised when people read the stuff I wrote and got a different meaning than I had intended. No longer. I now realize that there is always some residual ambiguity in words that cannot be overcome. While we can and should strive for maximum precision, we can never be totally unambiguous.

I agree with philosopher Karl Popper when he said, "It is impossible to speak in such a way that you cannot be misunderstood." The best we can hope for is some sort of negotiated consensus on the meanings of ideas.

POST SCRIPT: Huckabee and Paul

Alexander Cockburn discusses why Mike Huckabee and Ron Paul are the two most interesting candidates on the Republican side.

December 06, 2007

Reflections on writing the posts on evolution and the law

When I started out to write the series of posts on evolution and the law, I originally intended it to be about ten posts in all, divided roughly equally between the Scopes trial, the Dover trial, and the period of legal evolution in between them. As those readers who have stayed with the series are painfully aware, the subject matter carried me away and the final result is much longer.

Part of the reason is that I always intend my blog posts to have some useful and reliable information and not just be speculative rants (though those can be fun), which meant that I needed to research the subject. Fortunately, I love the subject of constitutional law because it is a spin-off of my interest in how one creates a just society. If one traces people's constitutional protections to their source, they tend to be rooted in questions about power and control, the nature of liberty, who gets to make decisions that govern all of us, and what constraints we impose on them.

As I started to research the subject more deeply, I became fascinated by the interplay of political, social, and religious factors surrounding the question of what the role of public schools in a democratic society is and how we decide what should be taught in them. I could see that the legal history involved in the teaching of evolution in public schools was more complicated and fascinating than I had originally conceived.

I had two choices. I could close off some avenues of discussion and stick only to the main points. That would be like driving to some destination while keeping to the highway for maximum speed. Or I could take some detours off the beaten track, to get a better flavor of the country I was passing through. I felt that the former option, while making for quicker reading, would result in posts that were a little too glib and did not have enough supporting evidence for some of my assertions.

So I chose the latter option, feeling confident that those who read this blog tend to be those who are looking for at least some substantiation of arguments even if they disagree with my views.

The way these posts grew made me reflect on my philosophy of teaching as well. In my seminar courses, students have to write research papers on some topic. Usually a course requires two five-page papers and a final ten-page paper. Students have been through this drill of writing papers many times in many courses and they usually find that they do not have enough to say and struggle to fill what they see as a quota. They use some time-tested techniques: wide margins, large fonts and spacing, and when those things have reached their limit, unnecessary verbiage. Superfluous words and phrases are inserted, ideas are repeated, pointless examples and non sequitur arguments are brought in, and so forth.

The reason for this is that in most cases students are writing about things that they do not really care about and are just going through the motions to meet someone else's needs, not their own. The result is painful for both the student (who has to construct all this padding without it being too obvious that that is what it is) and for the instructor (who has to cut through all the clutter to find out what the author is really trying to say). It is largely a waste of time for both, and often unpleasant to boot.

To help overcome this problem, I give my students as much freedom as possible to choose a research topic within the constraints of the overall course subject matter. I tell students that the most important thing they will do in the course is choose a topic that they care passionately about and want to learn more about. Once they do that, and start investigating and researching such a subject, it is almost inevitable that they will get drawn in deeper and deeper, like I was with evolution and the law.

Once they are on that road, the problem is not how to fill the required number of pages but how to cut the paper down so that it does not exceed the page limits by too much. This has the added bonus of teaching students how to edit to tighten their prose, to use more judicious language, and to keep only those things that are essential to making their case.

The passion for the subject and the desire to know more about it is what makes genuine researchers carry out difficult and sometimes tedious tasks, because they really care about learning more.

The way this series of posts has grown is an example of this phenomenon at work. Because it is a blog without length restrictions, I have been able to indulge myself a bit. But if I had to restrict the length because of publication needs, then I would go back and do some serious pruning.

POST SCRIPT: The bullet trick

Penn and Teller do another of their famous tricks.

August 28, 2007

Reflections on the working poor

(Text of the talk given by me to the first year class at the Share the Vision program, Severance Hall, Cleveland, OH on Friday, August 24, 2007 at 1:00 pm. The common reading for the incoming class was David Shipler's book The Working Poor: Invisible in America.)

Welcome to Case Western Reserve University! The people you will encounter here are very different from the people described in David Shipler's book The Working Poor: Invisible in America and I would like to address the question: what makes that difference?

Two answers are usually given. One is that we live in a meritocracy, and that we got where we are because of our own virtues, that we are smarter or worked harder or had a better attitude and work ethic than those who didn't make the cut. I am sure that everyone in this auditorium has been repeatedly told by their family and friends and teachers that they are good and smart, and it is tempting to believe it. What can be more gratifying than to be told that one's success is due to one's own ability and efforts? It makes it all seem so well deserved, that there is justice in the world.

Another answer is that luck plays an important role in educational success. I suspect that most of us were fortunate enough to be born into families that had most, if not all, of the following attributes: stable homes and families, good schools and teachers, safe environments, good health, and sufficient food and clothing. Others are not so fortunate and this negatively affects their performance in school.

But there is a third possibility that is not often discussed and that is that the educational system has been deliberately designed so that large numbers of people end up like the people in the book, people who not only have failed but more importantly have learned to think of themselves as failures.

This idea initially seems shocking. How can we want people to fail? Aren't our leaders always exhorting everyone to aim high and succeed in education? But let's travel back in time to the beginnings of widespread schooling in the US. In those early days, schooling was unplanned and focused more on meeting the needs of the learner and less on meeting the needs of the economy.

Recall that this was the time when the so-called robber barons were amassing huge personal wealth while workers endured appalling working conditions. There was increasing concern that as the general public became more educated, more and more people would realize and resent this unequal distribution of wealth.

This fear can be seen in an 1872 Bureau of Education document which speaks about the "problem of educational schooling", according to which, "inculcating knowledge" teaches workers to be able to "perceive and calculate their grievances," thus making them "more redoubtable foes" in labor struggles. (John Taylor Gatto, The Underground History of US Education (2003) p. 153, now available online.)

This was followed by an 1888 report that said, "We believe that education is one of the principal causes of discontent of late years manifesting itself among the laboring classes." (Gatto, p. 153)

The rising expectations of the general public had to be dampened and this was done by creating an education system that would shift the focus away from learning and more towards meeting the needs of the economy. And the economy then, like now, does not need or want everyone to be well educated.

After all, think what would happen if everyone got a good education and a college degree. Where would we get enough people like those in the book, willing to work for low wages, often with little or no benefits, at places like Wal-Mart so that we can buy cheap goods? Or at McDonalds so that we can get cheap hamburgers? Or as cleaning staff at restaurants and hotels so that we can eat out often? Or in the fields and sweatshops so that we can get cheap food and clothes? As the French philosopher Voltaire pointed out long ago: "The comfort of the rich depends upon the abundance of the poor."

One of the most influential figures in shifting education to meet the needs of the work force was Ellwood P. Cubberley, who wrote in 1905 that schools were to be factories "in which raw products, children, are to be shaped and formed into finished products... manufactured like nails, and the specifications for manufacturing will come from government and industry." (Gatto, footnote on page 39 in the online edition of the book.)

He also wrote: “We should give up the exceedingly democratic idea that all are equal and that our society is devoid of classes.”

The natural conclusion of this line of reasoning was spelled out in a speech that Woodrow Wilson gave in 1909, three years before he was elected President of the United States. He said: "[W]e want to do two things in modern society. We want one class to have a liberal education. We want another class, a very much larger class of necessity, to forgo the privilege of a liberal education and fit themselves to perform specific difficult manual tasks." (The Papers of Woodrow Wilson, vol. 18, 1908-1909, Princeton University Press, Princeton NJ, 1974, p. 597.)

So a third possible answer to why all of us are different from the people described in Shipler's book is that the educational system is designed to make sure that only a small percentage (us) will succeed and a much larger percentage (like the people in the book) will fail.

But it is not enough simply to exclude people from success, since they will resent it and rebel. After all, all people have dreams of a good life. As Shipler writes on page 231: "Virtually all the youngsters I spoke with in poverty-ridden middle schools wanted to go on to college. . . Their ambitions spilled over the brims of their young lives." They dreamed of becoming doctors, lawyers, nurses, archeologists, and policemen. But those dreams have to be crushed to meet the needs of the economy, and crushing people's dreams carries risks.

The poet Langston Hughes warned what might happen in his poem A Dream Deferred:

What happens to a dream deferred?

Does it dry up
like a raisin in the sun?
Or fester like a sore--
And then run?
Does it stink like rotten meat?
Or crust and sugar over--
like a syrupy sweet?

Maybe it just sags
like a heavy load.

Or does it explode?

In order to prevent people with crushed dreams from exploding, you have to make them resigned to their fate, to think it is their own fault, to consider themselves failures and unworthy. How do you do that? By making them repeatedly experience failure and discouragement, so that by the time they reach high school or even middle school, their love of learning has been destroyed. They have been beaten down, their hopes and dreams crushed by being told repeatedly that they are lazy and no good, so that they do not aim high and instead think of themselves as so worthless and invisible that it does not even matter whether they show up for work or not.

And we have done that. Currently we have an educational system in which people do primarily blame themselves for failure. As Shipler writes in his preface: "Rarely are they infuriated by their conditions, and when their anger surfaces, it is often misdirected against their spouses, their children, or their co-workers. They do not usually blame their bosses, their government, their country, or the hierarchy of wealth, as they reasonably could. They often blame themselves, and they are sometimes right."

So does this mean that everything that our proud parents and teachers have told us about how smart we are is false? No, that is still true. What is false is the widespread belief that all the other people are poor because they are intrinsically stupid or lazy or incompetent.

You are now in a place that values knowledge and inquiry and has the resources to satisfy your curiosity about almost anything. And all this knowledge is freely shared with you, limited only by your own desire to learn. But all the knowledge you gain should not be used to distance yourself even further from those who have not been as fortunate as you, or to think of yourself as superior to them.

All this knowledge is given to you so that you can become a better steward of the planet, so that you will try to create the kind of world where more people, in fact all people, can live the same kind of life that you will lead.

POST SCRIPT: Bye, Bye, Fredo

Alberto Gonzales surely must rank as a front-runner for the worst Attorney General ever, despite strong competition from people like President Nixon's John Mitchell. In fact, the administration of George W. Bush has strong candidates for the worst ever nods in all the major categories: President, Vice President, Secretary of Defense, Secretary of State, and National Security Advisor.

Truly this is an administration that can only be described in superlatives.

August 22, 2007

What makes us change our minds?

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today's one is from March 28, 2005, edited and updated.)

In an earlier post, I described the three kinds of challenges teachers face. Today I want to discuss how teachers might deal with each case.

On the surface, it might seem that the first kind of challenge (where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don't have strong feelings about it either way) is the easiest one. After all, if students have no strong beliefs or prior knowledge about what is being taught, then they should be able to accept the new knowledge more easily.

That is true, but the ease of acceptance also has its downside. The very act of not caring means that the new knowledge goes in easily but is also liable to be forgotten easily once the course is over. In other words, it might have little lasting impact. Since the student has little prior knowledge in that area, there is little in the brain to anchor the new knowledge to. And if the student does not care about it one way or the other, then no effort will be made by the student to really connect to the material. So the student might learn this material by mostly memorizing it, reproduce it on the exams, and forget it a few weeks later.

The research on the brain indicates that lasting learning occurs when students tie new knowledge to things they already know, when they integrate it with existing material. So teachers of even highly technical topics need to find ways to connect it with students' prior knowledge. They have to know their students, what interests them, what concerns them, what they care about. This is why good teachers tie their material in some way to stories or topics that students know and care about or may be in the news or to controversies. Such strategies tap into the existing knowledge structures in the brain (the neural networks) and connect the new material to them, so that it is more likely to 'stick.'

The second kind of challenge is where students' life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. A teacher who does not take these existing beliefs into account when designing teaching strategies is likely to be wasting her time. Because these beliefs are so strongly, but unconsciously, held, they are not easily dislodged or modified.

The task for the teacher in this case is to make students aware of their existing knowledge structures and their implications for understanding situations. A teacher needs to create situations (say, experiments or cases) and encourage students to explore the consequences of their prior beliefs and see what happens when they are confronted by these new experiences. This has to be done repeatedly, in newer and more enriched contexts, so that students realize for themselves the existence and inadequacy of their prior knowledge structures and become more accepting of the new knowledge structures and theories.

In the third case, students consciously reject the new ideas because they are aware that they conflict with views they value more (for whatever reason). This is the situation with those religious people who reject evolutionary ideas because they conflict with their religious beliefs. In such cases, there is no point trying to force or browbeat them into accepting the new ideas.

Does this mean that such people's ideas never change? Obviously not. People do change their views on matters that they may have once thought were rock-solid. In my own case, I know that I now believe things that are diametrically opposed to things that I once thought were true, and I am sure that my experience is very common.

But the interesting thing is that although I know that my views have changed, I cannot tell you when they changed or why. It is not as if there was an epiphany where you slap your forehead and exclaim, "How could I have been so stupid? Of course I was wrong and the new view is right!" Rather, the process seems more like being on an ocean liner that is turning around. The process is so gentle that you are not aware that it is even happening, but at some point you realize that you are facing in a different direction. There may be a moment of realization that you now believe something that you did not before, but that moment is just an explicit acknowledgment of something that you had already tacitly accepted.

What started the process of change could be one of many factors – something you read, a news item, a discussion with a friend, some major public event – whose implications you may not be immediately aware of. But over time these little things lodge in your mind, and as your mind tries to integrate them into a coherent framework, your views start to shift. For me personally, I enjoy discussions of deep ideas with people I like and respect. Even if they do not have any expertise in this area, discussions with such people tend to clarify one's ideas.

I can see that process happening to me right now with the ideas about the brain. I used to think that the brain was quite plastic, that any of us could be anything given the right environment. I am not so sure now. The work of Chomsky on linguistics, the research on how people learn, and other bits and pieces of knowledge I have read have persuaded me that it is not at all clear that the perfectly-plastic-brain idea can be sustained. It seems reasonable that some structures of the brain, especially the basic ones that enable it to interpret the input from the five senses, and perhaps even learn language, must be pre-existing.

But I am not completely convinced by the sociobiological views of people like E. O. Wilson and Steven Pinker, who seem to argue that much of our brains, attitudes, and values are biologically determined by evolutionary adaptation. I am also not convinced of the value of much of the popular literature on gender-related differences, such as claims that men are better than women at math or that women are more nurturing than men. That seems to me a little too pat. I am always a little skeptical of attempts to show that the status quo is 'natural', since that has historically been used to justify inequality and oppression.

But the works of cognitive scientists are interesting and I can see my views on how the brain works changing slowly. One sign of this is my desire to read widely on the subject.

So I am currently in limbo as regards the nature of the brain, mulling things over. At some point I might arrive at some kind of unified and coherent belief structure. And after I do so, I may well wonder if I ever believed anything else. Such are the tricks the brain can play on you, to make you think that what you currently believe is what is correct and what you always believed.

POST SCRIPT: The Church of the Wholly Undecided

Les Barker has a funny poem about agnosticism.

August 21, 2007

The purpose of teaching

(I am taking a short vacation from new blog posts. I will begin posting new entries again, on August 27, 2007. Until then, I will repost some early ones. Today's one is from March 24, 2005, edited and updated.)

I have been teaching for many years and encountered many wonderful students. I remember in particular two students who were in my modern physics courses that dealt with quantum mechanics, relativity, and cosmology.

Doug was an excellent student, demonstrating a wonderful understanding of all the topics we discussed in class. But across the top of his almost perfect final examination paper, I was amused to see that he had written, "I still don't believe in relativity!"

The other student was Jamal, who was not as direct as Doug. He came into my office a few years after the course was over (and just before he was about to graduate) to say goodbye. We chatted awhile, I wished him well, and then as he was about to leave he turned to me and said hesitantly, in his characteristically shy way: "Do you remember that stuff you taught us about how the universe originated in the Big Bang about 15 billion years ago? Well, I don't really believe all that." After a pause he went on, "It kind of conflicts with my religious beliefs." He looked apprehensively at me, perhaps to see if I might be offended or angry or think less of him. But I simply smiled and let it pass. It did not bother me at all.

Why was I not upset that these two students had, after having two semester-long courses with me, still not accepted the fundamental ideas that I had been teaching? The answer is simple. The goal of my teaching is not to change what my students believe. It is to have them understand what practitioners in the field believe. And those are two very different teaching goals.

As I said, I have taught for many years. And it seems to me that teachers encounter three kinds of situations with students.

One is where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don't have strong feelings about it either way. This is usually the case with technical or highly specialized areas (such as learning the symptoms of some rare disease or applying the laws of quantum mechanics to the hydrogen atom). In such cases, students have little trouble accepting what is taught.

The second type of situation is where students' life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. The physics education literature is full of examples of how our life experiences conspire to create in people an Aristotelian understanding of mechanics. This makes it hard for them to accept Newtonian mechanics. Note that this difficulty exists even though the students have no particular attachment to Aristotle's views on mechanics and may not have the faintest idea what they are. Overcoming this kind of implicit belief structure is not easy. Doug was an example of someone who had got over the first hurdle, from Aristotelian to Newtonian mechanics, but was finding the next transition, to Einsteinian relativistic ideas, much harder to swallow.

The third kind of situation is where the student has strong and explicit beliefs about something. These kinds of beliefs, as in the case of Jamal, come from religion or politics or parents or other major influences in their lives. You cannot force such students to change their views and any instructor who tries to do so is foolish. If students think that you are trying to force them to a particular point of view, they are very good at telling you what they think you want to hear, while retaining their beliefs. In fact, trying to force or bully students to accept your point of view, apart from being highly unethical teaching practice, is a sure way of reinforcing the strength of their original views.

So Doug's and Jamal's rejection of my ideas did not bother me and I was actually pleased that they felt comfortable telling me so. They had every right to believe whatever they wanted to believe. But what I had a right to expect was that they had understood what I was trying to teach and could use those ideas to make arguments within those frameworks.

For example, if I had given an exam problem that required that the student demonstrate his understanding of relativistic physics to solve, and Doug had refused to answer the question because he did not believe in relativity or had answered it using his own private theories of physics, I would have had to mark him down.

Similarly, if I had asked Jamal to calculate the age of the universe using the cosmological theories we had discussed in class, and he had instead said that the universe was 6,000 years old because that is what the Bible said, then I would have to mark him down too. He is free to believe what he wants, but the point of the course is to learn how the physics community interprets the world, and be able to use that information.
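
For readers curious what such a calculation looks like, the crudest version is the Hubble time, the reciprocal of the Hubble constant (a rough back-of-the-envelope sketch; the classroom exercise would presumably have used more detailed cosmological models, and the value of H0 quoted here is just the commonly cited round number):

    \[
    t_H = \frac{1}{H_0} \approx \frac{1}{70~\mathrm{km\,s^{-1}\,Mpc^{-1}}} \approx 14 \text{ billion years}
    \]

which is the kind of estimate, consistent with the "about 15 billion years" figure mentioned above, that the exam question would have called for.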

Understanding this distinction is important because of the periodic appearance of demagogues who try to frighten people by asserting that colleges are indoctrinating students to think in a particular way. Such people seem to assume that students are like sheep who can be induced to believe almost anything the instructor wants them to and thus require legal protection. Anyone who has taught for any length of time and has listened closely to students will know that this is ridiculous. It is not that students are not influenced by teaching and do not change their minds but that the process is far more complex and subtle than it is usually portrayed, as I will discuss in the next posting.

My own advice to students is to listen carefully and courteously to what knowledgeable people have to say, learn what the community of scholars thinks about an issue, and be able to understand and use that information when necessary. Weigh the arguments for and against any issue but ultimately stand up for what you believe and even more importantly know why you believe it. Don't ever feel forced to accept something just because some 'expert' or other authority figure (teacher, preacher, parent, political leader, pundit, or media talking head) tells you it is true. Believe things only when it makes sense to you and you are good and ready for it.

April 05, 2007

How to read scholarly works

Most of us in our lives will be required to read a lot of stuff and it will take a lot of time. To become more efficient at it, it helps to realize that there are many types of readings, and that you need to adopt different reading strategies for the different kinds of documents you will encounter. The purpose of the readings will also vary. Sometimes you will read for the gist, sometimes for the argument, and sometimes for certain details. Your reading strategy has to be adjusted accordingly.

For example, you don’t read a science textbook the same way you read a novel. (This may seem obvious but I am always surprised by the number of people who try to read such textbooks from beginning to end, just as they would a novel.) You don't read journal articles in the natural sciences the same way that you read articles in the history and philosophy of science.

In the case of science journal articles, expert readers tend to focus closely on the abstract, introduction, and conclusions, and much less on the background theory, methods, and even the data. Much of the theory and methods is boilerplate that can be skipped or skimmed over in the first pass.

When reading scholarly works in the history and philosophy of science (such as we encounter in my seminar course on the evolution of scientific ideas), the literature tends to take a particular form and it helps to read it with this form in mind. The form is as follows:

1. The author identifies the MAIN problem(s), explains why it is of interest, and why it is important to find a solution.
2. The previous solutions to the problem are discussed and reasons are given (in the form of evidence and arguments) why those solutions are unsatisfactory.
3. The author proposes a new solution to the problem and gives reasons (in the form of evidence and arguments) why the new solution should be accepted.
4. In the course of making the larger case, the author usually also identifies and addresses auxiliary problems.

So when reading these kinds of works, it is good to try to understand them using the above framework. While the underlying structure of the argument will be similar, different authors will present it in different sequences and styles, so these papers usually require several readings before the answers to the above four questions become clear. It takes a while to become comfortable reading papers this way, and practice helps.

This brings me to the question of how you respond to the things you read. In academic discussions, we place a high priority on first understanding what the author is trying to say, on trying to see the world through the author's eyes. This requires us to be in an accepting mode of mind. It does not mean that we have to agree with everything the author says. But you also have to be able to switch into a skeptical mode at times in order to critique the author, and expert readers switch between accepting and skeptical modes repeatedly and know when they are doing so.

If you disagree with the author's point of view, you need to state how your conclusions differ from the author's, and why. This can be done negatively (by pointing out flaws in the author's reasoning or challenging the validity of the evidence presented) or positively (by presenting a different line of reasoning and contrary evidence, and arguing why your approach is superior). In other words, you yourself have to go through the above four steps for your argument to be taken seriously in academic circles.

Notice that you usually have to conform to the canons of evidence and argument that are accepted in that particular field. For example, in physics and other sciences, evidence usually means experimental data or observations, but in the history and philosophy of science, evidence does not necessarily mean data or experimental results or surveys, though these are not excluded. Scholars in the latter field (such as Karl Popper, Thomas Kuhn, and Imre Lakatos) use the historical record, the ideas and writings of other authors, and appeals to everyday experience as evidence in structuring their arguments.

It is important to bear in mind that just saying that you do not agree with the author’s point of view does not carry much weight in academic discussions. However outrageous the author’s conclusions might seem to you, and however strongly you might disagree with them, you cannot assume that that is enough to discredit the argument. You still need to criticize it using the conventions of academic debate.

Criticizing the author's style (by saying that the author makes his or her case badly or even offensively) may help you develop your own distinctive writing style, but it is not sufficient as an argument against the author's ideas. You still have to address the substantive content of the writing.

Trying to understand the author's motivation can also help in understanding the structure of the argument, but just because the motivation is not agreeable does not automatically make the author's arguments invalid. For example, in the literature on the philosophy of science, it seems clear that Karl Popper wants to define science in such a way that it excludes the central ideas of Marx, Freud, and Adler. Popper seems to want to protect the prestige of science and, for some reason, dislikes these three particular fields of study and objects to their supporters claiming scientific status for them. Those who would like any or all three subjects included as part of science might disapprove of Popper's motivation, but that does not make Popper wrong. To challenge him on the substance, you will need to show why his definition of science does not work, propose another definition that meets your purposes, and provide evidence and arguments to persuade the reader to prefer your definition over Popper's. Again, you have to go through steps 1-4 above.

In short, to become better readers, we need to understand the modes of scholarly discourse in each discipline and the purpose of the reading, and use that knowledge to adjust our reading (and writing) strategies and styles accordingly.

Good reading and writing skills are two sides of the same coin and Heidi Cool has an excellent post on what makes for good writing, with lots of useful resource links.

POST SCRIPT: Rep. Ron Paul

Congressman Ron Paul (R-Texas) is running for the Republican presidential nomination. He is an old-style Libertarian-Republican (as opposed to the Authoritarian-Republicans who currently dominate the party) who has opposed the Iraq war from the beginning. Although I don't agree with some of the things he says, he is definitely a much more thoughtful person than the other Republican candidates, and his views should get a much wider hearing than they are currently receiving.

Here he is interviewed by Bill Maher.


March 09, 2007

A low-brow view of books

In yesterday's post, I classified the appreciation of films according to four levels. At the lowest level is just the story or narrative. The next level above that is some message that the director is trying to convey and which is usually fairly obvious. The third level is that of technique, such as the quality of dialogue and acting and directing and cinematography and sound and lighting. And then there is the fourth and highest level, which I call deep meaning or significance, where there is a hidden message which, unlike the message at the second level, is not at all obvious but which has to be unearthed (or even invented) by scholars in the field or people who have a keen sensitivity to such things. I classified people whose appreciation does not get beyond the first two levels as low-brow.

The same classification scheme can be applied to books, especially fiction. In recent years I have started reading mostly non-fiction, but when it comes to fiction, I am definitely low-brow. To give an example of what I mean, take the novels of Charles Dickens. I like them because the stories he weaves are fascinating. One can enjoy them just for that reason alone. The second level meanings of his books are also not hard to discern. Many of his books were attempting to highlight the appalling conditions of poor children at that time or the struggles of the petite bourgeoisie of England. That much I can understand and appreciate.

What about his technique, the third level that I spoke of? The fact that I (and so many others over so many years) enjoy his books means that his technique must be good, but I could not tell you exactly what it is. It is not that I am totally oblivious to technique. His habit of tying up every single loose end at the conclusion of his books, even if he has to insert extraordinary coincidences involving even minor characters, is a flaw that even I can discern, but this structural flaw is not fatal enough to destroy my enjoyment of the work.

There is probably a fourth level to Dickens that scholars have noticed but which I will never discover by myself. Here we get into the writer's psyche: whether certain characters reflect Dickens's own issues with his family's poverty, his father's time in a debtors' prison, his relationship with his mother, and so on. This is where really serious scholars of Dickens come into their own, mining what is known of his life to discover the hidden subtext of his novels.

My inability to scale these heights on my own is the reason there are some writers said to be geniuses whom I simply cannot appreciate. Take William Faulkner. I have read his novels The Sound and the Fury and As I Lay Dying and his short stories A Rose for Emily and Barn Burning, but I just don't get his appeal.

In fact, I find his writing sometimes downright annoying. At the risk of incurring the wrath of the many zealous Faulkner fans out there, I think that Faulkner does not play fair with his readers, deliberately misleading them for no discernible reason. In The Sound and the Fury, for example, he keeps abruptly switching narrators without warning, each with their own stream of consciousness, but you soon get the hang of that and can deal with it. What really annoyed me was that he gives two characters the same name even though they are of different genders and different generations, and this fact is not revealed until the very end. Since this character is central to the story and is referred to constantly by the different narrators, I was confused pretty much all the way through, because I had naively assumed that the references were to the same person and the allusions to that person did not fit any coherent pattern. As a result, I found it hard to make sense of the story, and that ruined it for me. I could not see any deep reason for this plot device other than to completely confuse the reader. I felt tricked at the end and had no desire to re-read the book with this fresh understanding in mind.

This is not to say that writers should never misdirect their readers but there should be good reasons for doing so. I grew up devouring mystery fiction and those novels also hide some facts from their readers and drop red herrings in order to provide the dramatic denouement at the end. But that genre has fairly strict rules about what is 'fair' when doing this and what Faulkner did in The Sound and the Fury would be considered out of bounds.

More sophisticated readers insist to me that Faulkner is a genius for the way he recreates the world of rural Mississippi, the people and places and language of that time. That may well be true, but it is not enough for me to like an author. When my low-level needs for story and basic message are not met, I simply cannot appreciate the higher levels of technique and deep meaning. Furthermore, there is rarely a sympathetic character in his stories. They all tend to be pathological and weird, which makes it even harder to relate to them.

I had similar problems with Melville's Moby Dick. For example, right at the beginning there are mysterious shadowy figures who board the ship and enter Captain Ahab's cabin, but they never appear afterwards, although there is no indication that they left the ship before its departure. What happened to them? What was their purpose? And what do all the details about whaling (which make the book read like a textbook on the whaling industry) add to the story? Again, the main characters were kind of weird and unsympathetic, and I finished the book feeling very dissatisfied.

James Joyce's Ulysses seems to me to be a pure exercise in technique and deep meaning, probably a delight for scholars to pick through and interpret in search of hidden meanings, but that kind of thing leaves me cold. I simply could not get through it, and I also failed miserably with A Portrait of the Artist as a Young Man.

Gabriel Garcia Marquez, in his book Love in the Time of Cholera, pulls a stunt similar to Melville's. His opening chapter introduces some intriguing and mysterious characters who then disappear, never to reappear or be connected with the narrative in even the most oblique way. I kept expecting them to become relevant to the story, to tie some strands together, but they never did, and I was left at the end feeling high and dry. Why were they introduced? What purpose were they meant to serve? Again, people tell me that Marquez is great at evoking a particular time and place, and I can see that. But what about the basic storytelling nature of fiction? When that does not make sense, I end up feeling dissatisfied.

I also have difficulty with the technique of 'magic realism' as practiced by Marquez in One Hundred Years of Solitude and by Salman Rushdie in The Satanic Verses. In this genre you have supernatural events, like ghosts appearing and talking to people, or people turning into animals and back again, and other weird and miraculous things, and the characters in the books treat these events as fairly routine and humdrum. I find that difficult to accept. I realize that these things are meant to be metaphors and deeply symbolic in some way, but I just don't get it. These kinds of literary devices simply don't appeal to me.

This is different from (say) Shakespeare's plays, which I do enjoy. He, too, invokes ghosts and spirits in some of his plays, but these things are easily seen as driving the story forward, so it is easy to assimilate their presence. Even though I don't believe in the existence of the supernatural, the people of his time actually believed in those things, and the reactions of the characters in his plays to the appearance of these ghosts and fairies seem consistent with their beliefs. But in a novel like The Satanic Verses, set in modern times, to have a character turn into a man-goat hybrid and back again while the other characters respond with only mild incredulity and never contact the medical authorities seems a little bizarre.

I would hasten to add that I am not questioning the judgment of experts that Faulkner and Melville and Joyce and Marquez and Rushdie are all excellent writers. One of the things about working at a university is that you realize that people who study subjects in depth usually have good reasons for their judgments, and that these are not mere opinions to be swept aside just because you happen not to agree with them. One does not go against an academic consensus without marshalling good reasons for doing so, and my critiques of these writers are at a fairly low level, coming nowhere close to a serious argument against them. What I am saying is that, for me personally, a creative work has to be accessible at the two lowest levels for me to enjoy it.

I think that there are two kinds of books and films. On the one hand, there are those that can be enjoyed and appreciated by low-brow people like me on our own; on the other, there are those that are best appreciated when accompanied by discussions led by people who have studied those books and authors and films and directors and know how to deal with them on a higher level.

August 31, 2006

Keeping creationism out of Ohio's science classes

Recall that the pro-IDC (intelligent design creationism) forces in Kansas received a setback in their Republican primary elections earlier this month. Now there is a chance to repeat that in Ohio.

I wrote earlier about a challenge being mounted to the attempt by Deborah Owens-Fink (one of the most active pro-IDC advocates in Ohio) to be re-elected to the Ohio Board of Education from Ohio District Seven. It seems as if the pro-science forces have managed to recruit a good candidate to run against her: Tom Sawyer, a former US congressman. I received the message below from Patricia Princehouse, who has been tireless in her efforts to keep religious ideas out of the science curriculum.

The worst creationist activist on Ohio's Board of Education is up for re-election (Deborah Owens Fink).

But now she has competition! And with your help, we can win!

We have recruited former congressman Tom Sawyer to run against her. His website is here.

Contributions are urgently needed for Congressman Sawyer's campaign.

(Credit cards accepted here or send check to address below.)

Fink has pledged to raise lots of money & we have no doubt that creationists across the country will pour tens of thousands of dollars into her campaign. We may not be able to match them, but Sawyer is an experienced politician who can make wise use of what he gets. We need to see he gets as much as possible.

HOW MUCH SHOULD I GIVE?

1) Remember that almost every Ohioan who pays Ohio income tax can take as a TAX CREDIT (not just a deduction) up to $50 ($100 for married couples filing jointly) in donations to Board of Ed candidates. So, please try to give at least the free $50 that you can get back on your taxes.

2) How much would you give if you could erase the past 4 years of damage to Ohio's public schools? $100? $1000? $5000? Please seriously consider giving more than you've ever given before. You stand poised to prevent worse damage over the next 4 years...

Fink is circulating a fund-raising letter in which she thumbs her nose at science & refers to America's National Academy of Sciences as a "group of so-called scientists."

We can protect Ohio from another 4 years of retrograde motion and put someone on the Board who can move Ohio forward toward solving real problems like school funding, literacy, and the achievement gap.

But your help is urgently needed...

www.votetomsawyer.com

I WANT TO DO MORE:

Great! Please spread the word about the web site --in & out of state! (Remember, what happens in Ohio gets exported around the country, so defeating creationism in Ohio benefits the entire country) You can do even more as a volunteer (at home, on the phone, or on the street, even 1 hour of your time can make a difference, especially as we get closer to the election) To volunteer, email Steve Weeks at eul1993@hotmail.com

For info on what Fink has done to science education in Ohio, see here.
For more info on Sawyer, see here.
For more info on other races in Ohio see the HOPE website.
For more info on races nationwide, see here.

To mail donations: Send a check made out to: Vote Tom Sawyer

and mail to:
Martin Spector, Treasurer
4040 Embassy Pkwy, Suite 500, Akron, OH 44333

I was not aware of this provision in Ohio's tax code that effectively gives you a full refund for up to $50 in contributions to campaigns like this. I have not been able to verify this information myself, to see what restrictions, if any, apply and whether it covers only school board elections or other elections as well.
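
If the provision does work as described, though, the distinction between a credit and a deduction matters. A $50 tax credit reduces your tax bill by the full $50, so a $50 donation would end up costing you nothing. A $50 deduction, by contrast, merely reduces your taxable income, so it would save you only your marginal tax rate times $50, which at typical state income tax rates amounts to just a few dollars.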

For more information on other School Board elections where the pro-science HOPE (Help Ohio Public Education) organization is supporting candidates, see their website.

It would be nice if Ohio voters followed the lead of Kansas voters and also rejected IDC-promoting candidates.

POST SCRIPT: Saying what needs to be said

Keith Olbermann on MSNBC's Countdown delivers a blistering commentary on Donald Rumsfeld and the rest of the Bush Administration. You can see it here.

May 19, 2006

What makes us good at learning some things and not others?

(I will be traveling for a few weeks and rather than put this blog on hiatus, thought that I would continue with my weekday posting schedule by reposting some of the very early items, for those who might have missed them the first time around.)

One of the questions that students ask me is why it is that they find some subjects easy and others hard to learn. Students often tell me that they "are good" at one subject (say writing) and "are not good" at another (say physics), with the clear implication that they feel that there is something intrinsic and immutable about them that determines what they are good at. It is as if they see their learning abilities as being mapped onto a multi-dimensional grid in which each axis represents a subject, with their own abilities lying along a continuous scale ranging from 'awful' at one extreme to 'excellent' at the other. Is this how it is?

This is a really tough question and I don't think there is a definitive answer at this time. Those interested in this topic should register for the free public lecture by Steven Pinker on March 14.

Why are some people drawn to some areas of study and not to others? Why do they find some things difficult and others easy? Is it due to the kind of teaching that one receives or parental influence or some innate quality like genes?

The easiest answer is to blame it on genes or at least on the hard-wiring of the brain. In other words, we are born the way we are, with gifts in some areas and deficiencies in others. It seems almost impossible to open the newspapers these days without reading that scientists have found the genes that 'cause' this or that human characteristic so it is excusable to jump to genes as the cause of most inexplicable things.

But that is too simple. After all, although the brain comes at birth with some hard-wired structures, it is also quite plastic, and the direction in which it grows is strongly influenced by the experiences it encounters. But it seems that most of the rapid growth and development occurs fairly early in life, so early childhood and adolescent experiences are important in determining future directions.

But what kinds of experiences are the crucial ones for determining future academic success? Now things get more murky and it is hard to say which ones are dominant. We cannot even say that the same factors play the same role for everyone. So for one person, a single teacher's influence could be pivotal. For another, it could be the parent's influence. The influences could also be positive or negative.

So there is no simple answer. But although this is an interesting question, I think the answer has little practical significance for students already in college. You are now what you are. The best strategy is not to dwell on why you are not something else, but to identify your strengths and use them to your advantage.

It is only when you get really deep into a subject (any subject) and start to explore its foundations and learn about its underlying knowledge structure that you begin to develop higher-level cognitive skills that will last you all your life. But this only happens if you like the subject, because only then will you willingly expend the intellectual effort to study it in depth. With things we do not care much about, we tend to skim the surface, doing just the bare minimum to get by. This is why it is important to identify what you really like to do and go for it.

You should also identify your weaknesses and dislikes and contain them. By "contain" I mean that there is really no reason at this stage to force yourself to like (say) mathematics or physics or Latin or Shakespeare and try to excel in them, if you do not absolutely need to. What's the point? What are you trying to prove, and to whom? If a really good reason arises, now or later in life, to know something about those areas, the higher-level learning skills you develop by charging ahead in the things you like now can be used to learn it then.

I don't think that people have an innate "limit", in the sense that there is some insurmountable barrier that prevents them from achieving more in any area. I am perfectly confident that some day if you needed or wanted to know something in those areas, you would be able to learn it. The plateau or barrier that students think they have reached is largely determined by their inner sense of "what's the point?"

I think that by the time they reach college, most students have reached the "need to know" stage in life, where they need a good reason to learn something. In earlier K-12 grades, they were in the "just in case" stage where they did not know where they would be going and needed to prepare themselves for any eventuality.

This has important implications for teaching practice. As teachers, we should make it our goal to teach in such a way that students see the deep beauty that lies in our discipline, so that they will like it for its own sake and thus be willing to make the effort. It is not enough to tell them that it is "useful" or "good for them."

In my own life, I now happily learn about things that I would never have conceived that I would be interested in when I was younger. The time and circumstances have to be right for learning to have its fullest effect. As Edgar says in King Lear: "Ripeness is all."

(The quote from Shakespeare is a good example of what I mean. If you had told me when I was an undergraduate that I would some day be familiar enough with Shakespeare to quote him comfortably, I would have said you were crazy because I hated his plays at that time. But much later in life, I discovered the pleasures of reading his works.)

So to combine the words from the song by Bobby McFerrin, and the prison camp commander in the film The Bridge on the River Kwai, my own advice is "Don't worry. Be happy in your work."

Sources:

John D. Bransford, Ann L. Brown, and Rodney R. Cocking, eds., How People Learn, National Academy Press, Washington, D.C., 1999.

James E. Zull, The Art of Changing the Brain, Stylus Publishing, Sterling, VA, 2002.

May 02, 2006

About SAGES -3: The difficult task of changing education

It is a natural human trait to confuse 'is' with 'ought,' to think that what currently exists is also how things should be, especially with long-standing practices. The same is true with teaching methods. Once a way of teaching reaches a venerable stage, it is hard to conceive that things could be any different.

This post will consist largely of excerpts from an excellent article titled Making the Case by David A. Garvin in the September-October 2003 issue of Harvard Magazine (Volume 106, Number 1, p. 56), showing how hard it is to change the way we teach. It takes as its example the way legal education changed into what it is today, the so-called case method. Although this has become the 'standard' way law schools operate, initial efforts to introduce the method faced enormous resistance from students, faculty, and alumni. This is because all of us tend to be most comfortable doing what we have always done and fear that change will be for the worse.

The article suggests that to succeed, the changes must be based on a deep understanding of education and require support and commitment over the long haul.

Christopher Columbus Langdell, the pioneer of the case method, attended Harvard Law School from 1851 to 1854 - twice the usual term of study. He spent his extra time as a research assistant and librarian, holed up in the school's library reading legal decisions and developing an encyclopedic knowledge of court cases. Langdell's career as a trial lawyer was undistinguished; his primary skill was researching and writing briefs. In 1870, Harvard president Charles William Eliot appointed Langdell, who had impressed him during a chance meeting when they were both students, as professor and then dean of the law school. Langdell immediately set about developing the case method.

At the time, law was taught by the Dwight Method, a combination of lecture, recitation, and drill named after a professor at Columbia. Students prepared for class by reading "treatises," dense textbooks that interpreted the law and summarized the best thinking in the field. They were then tested - orally and in front of their peers - on their level of memorization and recall. Much of the real learning came later, during apprenticeships and on-the-job instruction.

Langdell's approach was completely different. In his course on contracts, he insisted that students read only original sources - cases - and draw their own conclusions. To assist them, he assembled a set of cases and published them, with only a brief two-page introduction.

Langdell's approach was much influenced by the then-prevailing inductive empiricism. He believed that lawyers, like scientists, worked with a deep understanding of a few core theories or principles; that understanding, in turn, was best developed via induction from a review of those appellate court decisions in which the principles first took tangible form. State laws might vary, but as long as lawyers understood the principles on which they were based, they should be able to practice anywhere. In Langdell's words: "To have a mastery of these [principles or doctrines] as to be able to apply them with consistent facility and certainty to the ever-tangled skein of human affairs, is what constitutes a true lawyer…."

This view of the law shifted the locus of learning from law offices to the library. Craft skills and hands-on experience were far less important than a mastery of principles - the basis for deep, theoretical understanding
. . .
This view of the law also required a new approach to pedagogy. Inducing general principles from a small selection of cases was a challenging task, and students were unlikely to succeed without help. To guide them, Langdell developed through trial and error what is now called the Socratic method: an interrogatory style in which instructors question students closely about the facts of the case, the points at issue, judicial reasoning, underlying doctrines and principles, and comparisons with other cases. Students prepare for class knowing that they will have to do more than simply parrot back material they have memorized from lectures or textbooks; they will have to present their own interpretations and analysis, and face detailed follow-up questions from the instructor.

Langdell's innovations initially met with enormous resistance. Many students were outraged. During the first three years of his administration, as word spread of Harvard's new approach to legal education, enrollment at the school dropped from 165 to 117 students, leading Boston University to start a law school of its own. Alumni were in open revolt.

With Eliot's backing, Langdell endured, remaining dean until 1895. By that time, the case method was firmly established at Harvard and six other law schools. Only in the late 1890s and early 1900s, as Chicago, Columbia, Yale, and other elite law schools warmed to the case method - and as Louis Brandeis and other successful Langdell students began to speak glowingly of their law-school experiences - did it diffuse more widely. By 1920, the case method had become the dominant form of legal education. It remains so today.

What we see being tried in SAGES has interesting parallels with what Langdell was trying to achieve. The idea is for students, rather than receiving the distilled wisdom of experts and teachers and being told directly what they should know, to study something in depth and argue their way inductively to an understanding of basic but general principles. The Socratic format of the instructor interrogating students is not used in SAGES; it is replaced by the somewhat gentler method of peer discussions mediated by the instructor.

Taking a long view of past educational changes enables us to keep a sense of proportion. The way we teach now may seem natural, even the only way possible, but when we look back we usually find that it was deliberately introduced, often over considerable opposition, because of some development in our understanding of the nature of learning. As our understanding of the learning process changes and deepens over time, it is natural to re-examine the way we teach as well.

I believe that SAGES takes advantage of what we understand now to be important new insights into learning. But we need to give it a chance to take root. Abandoning it at the first sign of difficulty is absurd because all innovations invariably run into difficulty at the beginning as everyone struggles to adapt to the new ways.

POST SCRIPT: 'Mission Accomplished' by the numbers

As usual, I am tardy in my recognition of anniversaries. But here is a sobering reminder of what has transpired since the infamous photo-op three years ago yesterday on the aircraft carrier.

May 01, 2006

About SAGES -2: Implementation issues

When I talk about the SAGES program (see here for a description of the program and how it came about) to faculty at other universities, they are impressed that Case put into place something so ambitious. They immediately see how the program addresses the very same problems that all universities face but that few attempt to tackle as comprehensively as we have sought to do.

Of course, the very ambitiousness of the program meant that there would be challenges in implementation. Some of the challenges are institutional and resource-related. Creating more small classes meant requiring more faculty to teach them, more classrooms (especially those suitable for seminar-style interactions), more writing support, and so on. This necessarily imposed some strain on the system.

But more difficult than these resource issues was the fact that the SAGES program was taking both students and faculty out of their comfort zones. Faculty and students know how to deal with the more traditional knowledge-giver/knowledge-receiver model of teaching. They have each played these roles for a long time and can slip comfortably into them. Now both were being asked to shift into different modes of behavior in class.

Students were being asked to play a more active role in creating knowledge, to participate in class, and to take more responsibility for their own learning outside of class. Faculty were being asked to play a facilitator role: talking far less than they were used to, learning how to support students as they struggled to learn how to learn on their own, and generating and sustaining focused discussions.

It should have come as no surprise to anyone that when the program was made fully operational there would be many problems as both faculty and students adjusted to these changes. What surprises me is that both faculty and students seem to have had unrealistic expectations of how smooth the transition would be, and when there were breakdowns, as was inevitable, tended to take them as signs of the program's inadequacy rather than as the necessary growing pains of any bold innovation.

In my own teaching over these many years, I have tried all kinds of teaching innovations. The one common feature to all of them is that they rarely worked well (if at all!) the first time I tried them. Even though I read the literature on the methods and planned the changes carefully, I always made mistakes in implementation because the unexpected always occurs and until one has some experience with dealing with a new method of teaching, one does not always respond well to surprises. So even though I was an enthusiastic supporter of the seminar teaching mode, it still took me some time to work out some of the major kinks that occurred and to become comfortable with it. But even now, I keep thinking of many ways to improve it the next time I teach it.

I believe that teaching is a craft, like woodworking. One can and should learn the theory behind it but one only becomes good by doing it, and one has to anticipate that the first attempts are not going to be smooth. But during the period of transition from the old to the new, people tend to compare an old system that has been refined over many years with a new system that has not had its rough edges smoothed out. It was to be expected that faculty teaching in a new way would encounter situations that they had not anticipated and flounder a bit, even if they attended the preparatory sessions that were held for all faculty on how to teach in a seminar format.

I have noticed an odd feature whenever teachers are asked to try a new method of teaching. If the new method does not work perfectly right out of the box, it is jettisoned as a failure. But that is the wrong standard. The proper comparison is not to some ideal of perfection but to whatever we are currently doing: does the new method work better than that?

For example, I remember when I first introduced cooperative groups in my large (about 200 students) lecture classes and had them work in groups on problems in class. Some colleagues asked me whether it wasn't possible that some students were discussing other things during that time, instead of the physics problem I had assigned. The answer is that of course some do, and I knew that. But they could just as easily do these other things if I lectured non-stop. In fact, it would be easier, since when lecturing I would be busy at the blackboard and thus even more oblivious to what was going on in the auditorium. I felt that the active learning methods I introduced increased the amount and quality of student engagement compared with a pure lecture, and that was the relevant yardstick to measure things by, not whether I had perfect results. I would never go back now to teaching such large classes without groups.

The SAGES implementation problems, from the faculty point of view, arise from their being uncomfortable with not having complete control of the flow of information and discussion, uneasy about not constantly imparting authoritative knowledge, worried that students will learn incorrect things from their peers, concerned about time 'wasted' in discussions, uncomfortable with silences, and reluctant to trust students to be responsible for their own learning.

As a result of these concerns, faculty can succumb to the temptation to relapse into a lecture mode and students take their cue and relapse into the listener mode. This leaves both dissatisfied. Faculty (especially those in research universities who tend to be highly specialized) also sometimes worry that students in seminars will talk and write about topics in which the faculty are not experts, and they will thus not have the 'answers' at their fingertips.

Another major source of concern, especially for faculty in the sciences and engineering, was their feeling that they were not really competent to judge writing and give good feedback to their students on how to improve since they had had little prior experience with essay assignments.

Faculty also do not realize that it takes quite a bit of planning and organization on their part in order to create a free-flowing, substantive, and seemingly spontaneous discussion. Running good discussion seminars actually takes more preparatory work than giving a lecture. It involves a lot more than strolling into class and saying "So, what did you think about the readings?"

The problems that students had with SAGES again stem from discomfort with an unfamiliar mode of teaching. In seminar-discussion classes, much of the learning has to occur outside the classroom, in the form of reading and writing by the student. The classroom discussions are used to stimulate interest and provide focus, but students have to do a lot on their own. Some first-year students may not be able to handle this responsibility yet. After all, many of them have been very successful by simply going to class, listening to what the teacher said, and doing the assignments. It is natural for such students to prefer to be told what to do and how to do it, and this new responsibility thrust upon them may make them uneasy. Some are also shy; for them, speaking in class is difficult, if not an ordeal, and making a formal presentation may be quite terrifying. Reassuring such students and making them comfortable with different types of behavior in class is also a role that faculty may not know how to play.

In the long run, I think both faculty and students will grow from this experience. Personally, I have found teaching the SAGES seminars to be a profoundly rewarding and transformative experience. I have got to know all the students in my classes much better, and that has been delightful. I have learned a lot from the research topics they have selected (for their essays and presentations) in areas that are unfamiliar to me. And I have learned a lot about what makes for good writing and how to provide the kinds of feedback and structure that help students learn to write better.

Of course, there is still a lot more that I need to learn in order to run seminar classes better. But that is part of the fun of teaching: you are always learning along with your students. As I said, teaching is a craft, and it is characteristic of craftsmen, like say a violin maker, that they just get better with time, learning from their mistakes and acquiring new skills and techniques.

In time, I am confident that faculty and students in SAGES will shed their nervousness about it and embrace the seminar method of teaching as well. But it will require patience and perseverance.

In the next posting, I will look back in history to see how legal education was transformed in the US. The transition to the present system was extremely rough, even though the current mode is now seen as 'natural' and even inevitable.

POST SCRIPT: Mearsheimer and Walt Petition

As some of you may be aware, Professors John Mearsheimer (University of Chicago) and Stephen Walt (Harvard University) have written an article entitled The Israel Lobby and American Foreign Policy where they argue that this lobby has had too great an influence on American foreign policy. As a result of this, they have been subject to attacks and charges of anti-Semitism. You can see their article in the London Review of Books here and their longer and more detailed working paper on the same subject here, and read about the controversy generated by it here.

Professor Juan Cole (University of Michigan) has organized a petition to defend Mearsheimer and Walt from what he calls "baseless charges of anti-Semitism." Cole says "I feel it is time for teachers in higher education to stand up and be counted on this issue of the chilling of academic inquiry through character assassination. At a time when the use of congressional funding to universities to limit and shape curricula and research is openly advocated, all of us academics are on the line. And if scholars so eminent as Mearsheimer and Walt can be cavalierly smeared, then what would happen to others?"

You can read Cole's discussion of why he created the petition, its contents, and the signatories so far by going here. Cole is requesting that signatories be from those affiliated with universities, because of the way the petition is worded.

April 28, 2006

About SAGES -1: The genesis of the program

As might be expected, some people at Case are all atwitter about the snide op-ed in a newspaper supposedly called the New York Times by someone supposedly called Stephen Budiansky. (Note to novice writers hoping to develop their snide skills: putting words like 'supposedly' in front of an easily discernible fact is a weak attempt at sarcasm, insinuating that something is sneaky when no cause for suspicion exists. Like the way Budiansky says "SAGES (this supposedly stands for Seminar Approach to General Education and Scholarship)" when he has to know this for a fact, since he says he has been reading the SAGES website.)

But my aim here is not to dwell on the shallowness of Budiansky's article and make fun of it, although it is a good example of the kind of writing that uses selective quotes and data to support a questionable thesis, and a snippy tone to hide its lack of meaningful content. My purpose is to articulate why I think SAGES has been the best educational idea I have been associated with in all my years in education at many different institutions. It is clear to me that many people, even those at Case, have not quite understood what went into it and why it is such an important innovation, and this essay seeks to explain it.

I have been involved with SAGES from its inception in the summer of 2002 when I was appointed to the task force by then College of Arts and Sciences Dean Sam Savin, to investigate how to improve the college's general education requirements (GER). American colleges have these kinds of requirements in order to ensure that students have breadth of knowledge, outside their chosen majors. Case's GER were fairly standard in that they required students to select a distribution of courses from a menu classified under different headings, such as Global and Cultural Diversity.

The task force felt that, while better than nothing, these kinds of requirements did not have any cohesive rationale and resulted in students just picking courses so that they could check off the appropriate boxes. The task force wondered how we could make the distribution requirements more meaningful and part of a coherent educational philosophy. In the process of studying this question, we learned of other problems that were long-standing but just lurking beneath our consciousness. Almost all these problems are endemic to many universities, not just Case.

One of these was that students entering Case tended to come in with a sense of identity tied to a specific program rather than to the school as a whole. They saw themselves primarily as engineering students or nursing students or arts students and so on, rather than as Case students. This fragmented identity was reinforced by the fact that in the first year they had no common experience transcending these disciplinary boundaries. So we wondered what we could do to help create a sense of oneness among the student body, a sense of overall belonging.

Another problem we identified was that it was quite possible, even likely, for a first-year student to spend the entire year in large lecture classes as a largely passive recipient of lectures. This could leave students feeling quite anonymous and alone, and since this was also their first year away from home, it was not a good introduction to college life, let alone good for the emotional and intellectual health of the student. Furthermore, we know that first impressions can be very formative. If college students spend their first year passively listening in class, we feared that this might become their understanding of their role in the university, and that it would become harder to shift them into the mode of active engagement necessary when they got into the smaller upper-division classes in their majors.

Another problem was that students at Case did not seem to fully appreciate the knowledge-creation role that is peculiar to research universities. While they had chosen to attend a research university, many did not seem to know what exactly constitutes research, how it is done, or what its value is.

Another thing that surprised us was that even some seniors told us there was not a single faculty member they had encountered during their years at Case whom they felt they knew well, in the sense that if they walked into that faculty member's office, he or she would know the student's name and something about them. We felt this was a serious deficiency, because faculty-student interactions in and out of class should play an important role in a student's college experience. We felt it was a serious indictment of the culture of the university that a student could spend four years here and not know even one faculty member well.

Another very serious problem was that many students were graduating with poor writing and presentation skills. The existing writing requirement was being met by a stand-alone English course that students took in their first year. Students in general (not just at Case) tend to dislike these stand-alone 'skills' courses, and one can understand why. They are not related to any of the 'academic' courses and are thus considered inferior, merely an extra hoop to be jumped through. The writing exercises are necessarily de-contextualized, since they are not organically related to any course of study. Students tend to treat such courses as irritants, which makes teaching them unpleasant as well. Worse still, it is clear that a one-shot writing course cannot produce the kind of improvement in writing skills that is desired.

Furthermore, some tentative internal research seemed to suggest that the one quality universities seek above all to develop in their students, the level of 'critical thinking' (however that vague term is defined), was not only not being enhanced by the four years spent here; there were alarming hints that it was actually decreasing.

And finally, the quality of first-year advising that students received was highly variable. While some advisors were conscientious about their role and tried hard to get to know their students, others hardly ever met them, except for the minute or two it took to sign their course registration slips. Even the conscientious advisors found it hard to get to know students on the basis of a very few brief meetings. This was unsatisfactory because, in addition to helping students select courses, the advisor is also the first mentor a student has and should be able to help the student navigate the university and develop a broader and deeper perspective on education, learning, and life. This was unlikely to happen unless the student and advisor got to know each other better.

Out of all these concerns SAGES was born, and it sought to address them by providing a comprehensive and cohesive framework that, one way or another, dealt with each of the above issues.

The task force decided on a small-class seminar format early on because we saw that this would enable students to engage more, speak and write more, get more feedback from the instructor and fellow students, and thus develop crucially important speaking, writing and critical thinking skills.

Since good writing develops only with a lot of practice in writing and revising, we decided that one writing-intensive seminar was not enough. Furthermore, students need to like and value what they are writing if they are going to persevere in improving it. To achieve this, it was felt that the writing should be embedded in courses dealing with meaningful content that the students had some choice in selecting. So we decided that students should take four writing-intensive seminars: a common First Seminar in the first semester, followed by three thematic seminars, one in each subsequent semester, thus covering the first two years of college.

The need for a common experience for all students was met by having the First Seminar be based on a common theme (which we chose to be The Life of the Mind), with at least some readings and out-of-class programs experienced in common by all first-year students. The idea was that this would provide students with intellectual fodder to talk about amongst themselves in their dorms and dining halls and other social settings, irrespective of what majors they were considering or who they happened to be sitting next to. The common book reading program for incoming students was initiated independently of SAGES but fitted naturally into this framework. The First Seminar was also designed to familiarize students with academic ways of thinking, provide an introduction to what a research university does and why, and provide opportunities for them to access the rich variety of cultural resources that surround the university.

It was decided that the First Seminar instructor would also serve as the first-year advisor, so that the advisor and student would get to know each other well over the course of that first semester, enabling the kind of stronger relationship that makes mentoring more meaningful.

The University Seminars that followed the First Seminar were grouped under three categories (the Natural World, the Social World, and the Symbolic World), and students were required to select one from each category over the next three semesters. Each of these areas of knowledge has a different culture: each investigates different types of questions, uses different rules for what counts as evidence and how to use it in arriving at conclusions, constructs knowledge in different ways, and develops different modes of thinking and expression. These seminars would be designed around topics selected by the instructor and intended to help students better understand how practitioners in those worlds view knowledge.

It was hoped that by taking one seminar from each group, based on their own interests, students would learn to navigate the different academic waters they encounter at college. Taken together, we hoped, the seminars would complement each other and provide students with a global perspective on the nature of academic discourse.

In order to reduce the risk of the content overload that eventually engulfs many university courses, it was decided that the University Seminars would have no prerequisites and could not serve as prerequisites for other courses. This freed instructors from the oft-lamented burden of having to 'cover' a fixed body of material, which tends to cut off student participation. Now they were free to explore any question to any depth they wished.

For example, in my own SAGES course The Evolution of Scientific Ideas (part of the Natural World sequence) we explore the following major questions: What is science and can we distinguish it from non-science? What is the process that causes new scientific theories to replace the old? In the process of investigating these questions, I hope that students get a better understanding of how scientists see the world and interact with it. And I do not feel pressure to cover any specific scientific revolution. I can freely change things from year to year depending on the interests of the students.

A senior capstone experience was added to provide students with an opportunity to have a culminating activity and work on a project of their own choosing that would enable them to showcase the investigative, critical thinking, speaking, and writing skills developed over their years at Case.

Next in this series: Implementation issues

POST SCRIPT: Talk on Monday about Abu Ghraib and Guantanamo

On Monday, May 1 at 4:00pm in Strosacker Auditorium, two people will be speaking: Janis Karpinski, the Brigadier General who was commanding officer of the Abu Ghraib prison when the prisoner torture and abuse scandal erupted and who feels that she was made a scapegoat for that atrocity and demoted, and James Yee, the U.S. Army Muslim chaplain at Guantanamo who was arrested for spying and later cleared.

It should be interesting to hear their sides of the story.

The talk is free and open to the public. More details can be found here.

April 17, 2006

On writing-5: The three stages of writing

(See part 1, part 2, part 3, and part 4 in the series.)

I believe that part of the reason students end up plagiarizing, either inadvertently or otherwise, is that they underestimate the time it takes to write. They think that writing only occurs when they are actually putting words on paper or typing at a keyboard.

But writing really involves three phases: prewriting, writing, and post-writing.

Prewriting probably takes the most time and often does not involve the physical act of writing at all. This is the time when the author is mulling things over, sorting ideas out, trying to find the main point he is trying to make, asking what kinds of evidence are necessary and what documents should be read for background, and seeking out those sources of information. It also involves (for some) sketching out an outline and making rough notes. It is during this process of slow digestion that you start the important work of synthesizing the ideas you have gotten from many sources and making something of your own.

Prewriting is best done in a conscious but unrushed manner. For me, most of this prewriting is done in my head while doing other things such as walking or driving somewhere or doing routine chores or in those moments before falling asleep or just after waking. During those times, I am thinking of what to write, even to the extent of mentally creating actual lines of text, examples, and turns of phrase. I do this deliberately and consciously. In the SAGES Peer Writing Crew blog, Nicole Sharp says she thinks about writing while walking between classes, composing sentences in her head. This is an example of using time wisely, because some of the best ideas come to us when we are not consciously trying to generate them. It is to avoid interrupting this kind of prewriting that I have resisted carrying a cell phone or a Blackberry.

I think students may not appreciate how important this prewriting phase is. When given an assignment, they may wait until shortly before it is due and set aside a large block of time that they think is sufficient to write the five-page paper or whatever is required. But then they hit a block and don't know what to say or how to say it, because they have not gone through the important prewriting phase. Without being aware of it, they are trying to compress the prewriting and writing phases into one. When you try to do that, it is hard to find your own perspective on a topic. So you end up taking ideas from one or a few sources and mashing them together, paraphrasing them to make them look like your own, thus running the risk of plagiarizing.

Instructors are partly to blame for this. We may not be informing students of the importance of prewriting, and in fact may be undermining that practice by giving short deadlines that do not really allow much time for the kind of thoughtful contemplation it requires. I am not sure how to structure writing assignments in my courses so that students get in the habit of prewriting but it is definitely something I am going to pay more attention to in my next course.

The post-writing phase is equally important, but equally neglected. This involves much more than simply editing the work. Editing for me means simply tightening things up, checking for grammar, improving word choice, and avoiding stylistic ugliness. The more important aspect of post-writing is that once the writing phase has put my ideas into a concrete form, I can keep returning to it, probing it, looking to see how to make it better. This may involve restructuring the argument, providing more evidence, finding a fresh image to capture an idea, inventing a telling metaphor, or looking for better sources. I like to let time percolate through the words I have written, creating a richer text.

All these things are best done in a conscious but unrushed manner. Most of this post-writing takes place in my mind while doing other things, like the prewriting phase. But this requires that we set aside time for it after the writing phase. If we are rushing to meet a deadline, this will not occur.

It is only the writing and editing parts that actually take up any 'real' time. All the other things can be done while living one's life doing other things.

The pre-writing phase takes up the most time for me, followed by the post-writing phase, with the actual writing taking up the least time. When people ask me how long it took me to write either of my books, it is hard for me to answer. I usually say about six months because that is the time the actual writing phase took, and this is what people usually mean by 'writing.' But the prewriting phase that led up to each lasted much, much longer.

The same thing holds for these blog entries. The entire week's entries take me about five to ten hours total of actual writing, depending on the topic. But before I write them, I have done a lot of pre-writing on each topic, doing research, collecting notes and creating the structure in my mind, all done in bits and pieces scattered over time, so that when I actually sit down and write (the writing phase), the words and ideas come fairly easily.

I also write almost all the week's entries during the weekend prior to their posting. One reason for this practice is that the weekend is when I have more time to write. But the main reason is that after the writing is done, I have time to let my thoughts simmer and do some post-writing in my mind, enabling me to polish the entries during the week, before I actually post them.

The exceptions to this rule occur when something comes up in the news during the week that I feel impelled to respond to immediately, like the call center item last week or the Tiktaalik discovery the previous week. But even in these cases, the reason I can respond so promptly is that these topics have touched on something that I either care about a lot or know quite a bit about, which means that I have pretty much done the prewriting in my mind already, although I did not have a plan to actually write about it. I still leave some time for post-writing, even in these cases, usually by completing the writing the night before the morning posting.

But since students working on a short deadline do not have, or are not aware of the need to create, the time for pre- or post-writing, they end up producing work that is of lower quality than they are capable of. The challenge for instructors and students is how to help students become aware of the immense importance of the prewriting and post-writing phases, and how to structure assignments and deadlines to help them get used to doing both and have the time to do so.

Peter Elbow, in his book Everyone Can Write, gives some valuable advice. He recommends that writers create two distinct mindsets when writing. One mindset is a very accepting one, where any idea that comes into one's head, any sentence, any image or metaphor, is accepted by the author as being wonderful and written down or stored away for use. This attitude is great in the prewriting phase, because it enables you to generate a lot of ideas.

The second mindset is the critical one, where we evaluate what we have written and ask whether it is worth retaining, whether it should be improved upon, or phrased better. This is best done in the post-writing phase.

Many of us get stuck in our writing because we are trying to do both things simultaneously. An idea comes into our head and we immediately start to analyze or critique it wondering whether it should be included or not. This blocks our progress and we get stuck.

Of course, none of these distinctions can be really rigid. When we are critiquing an idea in the post-writing phase, that might generate a new idea, and we have to switch to an accepting phase. But being aware that an accepting attitude and a critical attitude each have to be adopted as the need arises can prevent one from having that awful feeling of thinking that one has 'nothing to say.' We all have something to say. It is just that we do not know if it is worth saying. It helps to postpone that judgment.

Realizing that we need to say whatever is on our minds and only later judge whether it is worth saying is a good habit to cultivate.

This series of postings on writing is, in itself, an illustration of how writing grows. I had initially only meant to write about the plagiarism issue, triggered by the Ben Domenech fiasco in the Washington Post. But as I wrote about it, the topic branched off into many related areas, and ideas occurred to me that were not there when I started.

So I guess the lesson to be taken from all this is that you should just start writing about anything you care about, and see where it goes. You will probably be surprised at where you end up.

POST SCRIPT: Where the religious people are

Ever wondered where Catholics are most concentrated in the US? How about Mennonites? Jews? Muslims? Lutherans? Well, now you can find out with this series of maps that shows, county by county, the density of populations of the various religious denominations.

It did not provide a breakdown for atheists. This is because they were getting their numbers from the membership lists of religious institutions in each area and atheists don't have formal groups. What was interesting, though, was that there was a surprisingly large number of counties where religious adherents of any stripe made up less than 50% of the population.

April 13, 2006

On writing-4: The role of originality

(See part 1, part 2, and part 3 in the series.)

So why do people sometimes end up plagiarizing? There are many reasons. Apart from the few who deliberately set out to do it because they are too lazy to do any actual writing of their own and lack any compunction about plagiarizing, I believe most end up doing it out of fear: they feel they are expected to say something that is interesting, original, and well written, usually (in the case of classroom assignments) about topics in which they have little or no interest.

This is a highly inflated and unrealistic expectation. I doubt that more than a few college or high school teachers really expect a high level of originality in response to classroom assignments, though that does not mean one should not try to achieve it.

A misplaced emphasis on originality creates unrealistic expectations that can cause insecure writers to plagiarize. I think that students who end up plagiarizing make the mistake of thinking that they must start by coming up with an original idea. Few people (let alone students who usually have very little writing experience) can reach such a high standard of originality. This is why they immediately hit a wall, lose a lot of time trying to get an idea, and in desperation end up plagiarizing by finding others who have said something interesting or relevant and "borrowing" their work. But since they want the reader to think that they have done the writing, they sometimes hide the borrowing by means of the 'pointless paraphrase' I wrote about previously.

Originality in ideas is often something that emerges from the writing and is not prior to the writing. A blindingly original idea may sometimes strike you, but this will be rare even for the most gifted and original writers. Instead, what you will usually find is a kind of incremental originality that emerges naturally out of the act of writing, where you are seemingly doing the mundane task of putting together a clear piece of writing using other people's (cited) ideas. If you are writing about things that interest you, then you will be surprised to find that the very act of writing brings about something original, where you discover new relationships between old ideas.

As an instructor, what I am really looking for in student writing is something that just meets the single criterion of being well written. As for being interesting, all I want is to see that at least the writer is interested in the topic, and the evidence for that takes the form of the writer making the effort to try and convince the reader of the writer's point of view. This seems like a modest goal but if followed can lead to pretty good writing.

In my experience, the most important thing is for writers to be interested enough in the topic that they want to say something about it, so the first condition for good writing is that the writer must care about the topic. The second condition is that the writer cares enough about it to want to make the reader care too. Once these two factors are in place, originality (to a greater or lesser degree) follows almost automatically from them.

It took me a long time to understand this. I had never written much in the earlier stages of my career (apart from scientific papers) because I was waiting for great new ideas to strike me, ideas that never came. But there came a time when I felt that a topic I cared a lot about (the nature of science) was one in which the point of view I held was not being articulated clearly enough by others. I began writing about it, not because I had an original idea, but because I felt a need to synthesize the ideas of many others into a simpler, more clearly articulated, position that I felt was missing from the discussion. In the process of creating that synthesis, some papers and my first book Quest for Truth: Scientific Progress and Religious Beliefs emerged. What turned out to be original (at least slightly) in them was the application of the ideas of certain classical philosophers and historians of science to the contemporary science-religion debate, something that I had not had in mind when I started writing. That feature emerged from the writing.

My second book The Achievement gap in US education: Canaries in the mine followed that same pattern. I was very concerned about what I felt were great misunderstandings about the causes of the achievement gap between black and white students in the US and how to deal with it. I felt that my experience and interests in science and education and politics and learning theory put me in a good position where I could bring ideas from these areas together. I did not have anything really original in mind when I started writing but whatever is original in the book emerged from the act of writing, the attempt to create a synthesis.

The same applies to these blog entries. I write about the things I care about, trying to make my point clear, without seeking to be original. After all, who can come up with original ideas five times per week? But very often I find that I have written things that I had not thought about prior to the writing.

To be continued. . .

POST SCRIPT: Is there no end to the deception?

One of the amazing things about the current administration is how brazen they are about misleading the public. The latest is that President Bush rushed to declare that "We have found [Iraq's] weapons of mass destruction" in the form of mobile biological weapons laboratories, even while some intelligence investigators were finding that there was nothing to that charge.

The defense being offered by the administration's spokespersons that these negative findings had not reached the president makes no sense. Before making a serious charge, it is the President and his staff's responsibility to check what information is being gathered and processed. To shoot off his mouth when there was no urgency to do so is to be irresponsible at best and deceitful at worst.

Kevin Drum of Washington Monthly is maintaining a list of the more egregious examples of things the administration knew were not true or for which there were serious doubts, but went ahead and declared them as 'facts' anyway, to justify decisions that they had already made about attacking Iraq.

He is up to #8 and there is no reason to think that the list will not keep growing.

April 11, 2006

On writing-3: Why do people plagiarize?

(See part 1 and part 2 in the series.)

Just last week, it was reported that twenty-one Ohio University engineering graduates had plagiarized their master's theses. Why would they do that?

I think it is rare that people deliberately set out to use other people's words and ideas while hiding the source. Timothy Noah in his Chatterbox column has a good article in Slate where he points to Harvard's guidelines to students which state that unintentional plagiarism is a frequent culprit:

Most often . . . the plagiarist has started out with good intentions but hasn't left enough time to do the reading and thinking that the assignment requires, has become desperate, and just wants the whole thing done with. At this point, in one common scenario, the student gets careless while taking notes on a source or incorporating notes into a draft, so the source's words and ideas blur into those of the student.

But lack of intent is not a valid defense against the charge of plagiarism. That has not prevented even eminent scholars like Doris Kearns Goodwin from trying to invoke it. But as Noah writes, the American Historical Association's (AHA) and the Organization of American Historians' (OAH) statement on plagiarism is quite clear on this point:

The plagiarist's standard defense-that he or she was misled by hastily taken and imperfect notes-is plausible only in the context of a wider tolerance of shoddy work. . . . Faced with charges of failing to acknowledge dependence on certain sources, a historian usually pleads that the lapse was inadvertent. This excuse will be easily disposed of if scholars take seriously the injunction to check their manuscripts against the underlying texts prior to publication.

Noah cites many authorities that say that citing the source does not always absolve you from the charge of plagiarism either.

Here's the MLA Guide:

Presenting an author's exact wording without marking it as a quotation is plagiarism, even if you cite the source [italics Chatterbox's].

Here's the AHA and the OAH:

Plagiarism includes more subtle and perhaps more pernicious abuses than simply expropriating the exact wording of another author without attribution. Plagiarism also includes the limited borrowing, without attribution, of another person's distinctive and significant research findings, hypotheses, theories, rhetorical strategies, or interpretations, or an extended borrowing even with attribution [italics Chatterbox's].

Noah gives an example of this. In the original FDR, My Boss, the author Grace Tully writes:

Near the end of the dinner Missy arose from her chair to tell me she felt ill and very tired. I urged her to excuse herself and go upstairs to bed but she insisted she would stay until the Boss left. He did so about 9:30 and within a few minutes Missy suddenly wavered and fell to the floor unconscious.

Doris Kearns Goodwin in her book In No Ordinary Time writes:

Near the end of the dinner, Grace Tully recalled, Missy arose from her chair, saying she felt ill and very tired. Tully urged her to excuse herself and retire to her room, but she insisted on staying until the president left. He did so at 9:30 p.m. and, moments later, Missy let out a piercing scream, wavered and fell to the floor unconscious.

Is this plagiarism? After all, she cites the original author in the text itself, and the wording has been changed slightly. Yes, plagiarism has occurred says Noah, citing Harvard's guidelines:

If your own sentences follow the source so closely in idea and sentence structure that the result is really closer to quotation than to paraphrase . . .you are plagiarizing, even if you have cited the source [italics Chatterbox's].

The whole point of a paraphrase is to make a point more clearly, to emphasize or clarify something that may be hidden or obscure in the original text. Russ Hunt gives a good example of the wrongful use of the paraphrase, which he takes from Northwestern University's website The Writing Place:

Original

But Frida's outlook was vastly different from that of the Surrealists. Her art was not the product of a disillusioned European culture searching for an escape from the limits of logic by plumbing the subconscious. Instead, her fantasy was a product of her temperament, life, and place; it was a way of coming to terms with reality, not of passing beyond reality into another realm. 
Hayden Herrera, Frida: A Biography of Frida Kahlo (258)

Paraphrase

As Herrera explains, Frida's surrealistic vision was unlike that of the European Surrealists. While their art grew out of their disenchantment with society and their desire to explore the subconscious mind as a refuge from rational thinking, Frida's vision was an outgrowth of her own personality and life experiences in Mexico. She used her surrealistic images to understand better her actual life, not to create a dreamworld (258).

As Hunt says:

What is clearest about this is that the writer of the second paragraph has no motive for rephrasing the passage other than to put it into different words. Had she really needed the entire passage as part of an argument or explanation she was offering, she would have been far better advised to quote it directly. The paraphrase neither clarifies nor renders newly pointed; it's merely designed to demonstrate to a sceptical reader that the writer actually understands the phrases she is using in her text.

I think that this kind of common excuse, that the authors did not know they were plagiarizing because they had used the 'pointless paraphrase' or because they cited the source, is disingenuous. While they may not have been aware that this kind of paraphrasing technically does constitute plagiarism, it is hard to imagine that the perpetrators were not aware that they were doing something wrong.

The lesson, as I see it, is to always prefer the direct quote with citation to the 'pointless paraphrase.' Changing wording here and there purely for the sake of thinking that doing so makes the passage one's own should be avoided.

POST SCRIPT: Discussing controversial ideas

Chris Weigold, who is a reader of this blog and also a Resident Assistant in one of Case's dorms, has invited me to a free-wheeling discussion about some controversial propositions that I have discussed previously in my blog as well as those that I will probably address in the future, such as:

  • Should military service be mandatory for all citizens?
  • Should everyone be required to work in a service-oriented job for two years?
  • Is torture warranted in some situations?
  • Why shouldn't Iran be allowed to become a nuclear power?
  • Should hospitals be allowed to refuse to keep a patient on life-support if the patient cannot pay?
  • Is patriotism a bad thing?
  • Are atheists more moral than religious people?
  • Why is killing innocent people in war not considered wrong?
  • If we can experiment on non-human animals, why not on humans?
  • How do people decide which religion is right?
or any other topic that people might raise.

The discussion takes place in the Clarke Tower lobby from 8:00-9:30pm on Wednesday, April 12, 2006. All are welcome.

April 05, 2006

On writing-2: Why do we cite other people's work?

In the previous post on this topic, I discussed the plagiarism case of Ben Domenech, who had lifted entire chunks of other people's writings and had passed them off as his own.

How could he have done such a thing? After all, all high school and college students get the standard lecture on plagiarism and why it is bad. And even though Domenech was home schooled, it seems unlikely that he thought this was acceptable practice. When he was confronted with his plagiarism, his defense was not one of surprise that it was considered wrong, but merely that he had been 'young' when he did it, or that he had got permission from the authors to use their words, or that the offending words had been inserted by his editors.

The cautionary lectures that students receive about plagiarism are usually cast in a moralistic way, that plagiarism is a form of stealing, that taking someone else's words or ideas without proper attribution is as morally reprehensible as taking their money.

What is often overlooked in this kind of approach is that there are many other reasons why writers and academics cite other people's works when appropriate. By focusing too much on this stealing aspect, we tend to not give students an important insight into how scholarship and research works.

Russ Hunt at St. Thomas University argues that writers cite others for a whole complex of reasons that have little to do with avoiding charges of plagiarism:

[P]ublished scholarly literature is full of examples of writers using the texts, words and ideas of others to serve their own immediate purposes. Here's an example of the way two researchers opened their discussion of the context of their work in 1984:

To say that listeners attempt to construct points is not, however, to make clear just what sort of thing a 'point' actually is. Despite recent interest in the pragmatics of oral stories (Polanyi 1979, 1982; Robinson 1981), conversations (Schank et al. 1982), and narrative discourse generally (Prince 1983), definitions of point are hard to come by. Those that do exist are usually couched in negative terms: apparently it is easier to indicate what a point is not than to be clear about what it is. Perhaps the most memorable (negative) definition of point was that of Labov (1972: 366), who observed that a narrative without one is met with the "withering" rejoinder, "So what?" (Vipond & Hunt, 1984)

It is clear here that the motives of the writers do not include prevention of charges of plagiarism; moreover, it's equally clear that they are not. . .attempting to "cite every piece of information that is not a) the result of your own research, or b) common knowledge." What they are doing is more complex. The bouquet of citations offered in this paragraph is informing the reader that the writers know, and are comfortable with, the literature their article is addressing; they are moving to place their argument in an already existing written conversation about the pragmatics of stories; they are advertising to the readers of their article, likely to be interested in psychology or literature, that there is an area of inquiry -- the sociology of discourse -- that is relevant to studies in the psychology of literature; and they are establishing a tone of comfortable authority in that conversation by the acknowledgement of Labov's contribution and by using his language --"withering" is picked out of Labov's article because it is often cited as conveying the power of pointlessness to humiliate (I believe I speak with some authority for the authors' motives, since I was one of them).

Scholars -- writers generally -- use citations for many things: they establish their own bona fides and currency, they advertise their alliances, they bring work to the attention of their reader, they assert ties of collegiality, they exemplify contending positions or define nuances of difference among competing theories or ideas. They do not use them to defend themselves against potential allegations of plagiarism.

The clearest difference between the way undergraduate students, writing essays, cite and quote and the way scholars do it in public is this: typically, the scholars are achieving something positive; the students are avoiding something negative. (my italics)

I think that Hunt has hit exactly the right note.

When you cite the works of others, you are strengthening your own argument because you are making them (and their allies) into your allies, and people who challenge what you say have to take on this entire army. When you cite reputable sources or credible authorities for facts or ideas, you become more credible because you are no longer alone and thus not easily dismissed, even if you personally are not famous or a recognized authority.

To be continued. . .

POST SCRIPT: It's now Daylight Saving Time. Do you know where your spiritual plane is?

It seems like idiotic statements attributing natural events to supernatural causes are not restricted to Christian radical clerics like Pat Robertson. Some Sri Lankan Buddhist clergy are challenging him for the title of Religious Doofus.

Since Sri Lanka sits very close to the equator, the length of the day is the same all year round, not requiring the 'spring-forward-fall-back' biannual adjusting of the US. Sri Lankan time used to be 5.5 hours ahead of Universal Time (UT), but in 1996 the government made a one-time shift to 6.5 hours in order to have sunset arrive later and save energy. The influential Buddhist clergy were not happy with the change, and as a compromise the clocks were adjusted again to just 6.0 hours ahead of UT. Now the government is thinking of going back to the original 5.5 hours.

Some of the country's Buddhist clergy are rejoicing at the prospect of a change because they say Sri Lanka's "old" time fitted better with their rituals.

They believe a decade living in the "wrong" time has upset the country's natural order with terrible effect.

The Venerable Gnanawimala says the change moved the country to a spiritual plane 500 miles east of where it should be.

"After this change I feel that many troubles have been caused to Sri Lanka. Tsunamis and other natural disasters have been taking place," he says.

This is what happens when you mix religion and the state. You now have to worry about what your actions are doing to the longitudinal coordinates of your nation's spiritual plane.

April 03, 2006

On writing-1: Plagiarism at the Washington Post

If you blinked a couple of weeks ago, you might have missed the meteor that was the rise and fall of the career of Ben Domenech as a blogger for WashingtonPost.com.

This online version of the newspaper is apparently managed independently of the print edition and has its own Executive Editor Jim Brady. For reasons that are not wholly clear, Brady decided that he needed to hire a "conservative" blogger for the website.

The problem with this rationale for the hiring was that no "liberal" counterpart blogger existed at the paper. They did have a popular blogger in Dan Froomkin, someone with a journalistic background, who wrote about politics for the Post and who had on occasion been critical of the Bush White House. As I have written earlier, Glenn Greenwald has pointed out that anything but unwavering loyalty to Bush has become the basis for identifying someone as liberal, and maybe Brady had internalized this critique, prompting him to hire someone who could be counted upon to support Bush in all his actions.

For reasons that are even more obscure, rather than choose someone who had serious journalistic credentials for this new column, Brady selected the untested 24-year old Ben Domenech. It is true that Domenech was something of a boy wonder, at least on paper. He had been home-schooled by his affluent and well-connected Republican family. He then went to William and Mary and wrote for their student newspaper The Flat Hat. He dropped out of college before graduating and co-founded a conservative website called Redstate, where he wrote under the pseudonym Augustine.

His father was a Bush political appointee and his new online column for the Washington Post (called Red America) said in its inaugural posting on March 21 that young Ben "was sworn in as the youngest political appointee of President George W. Bush. Following a year as a speechwriter for HHS Secretary Tommy Thompson and two as the chief speechwriter for Texas Senator John Cornyn, Ben is now a book editor for Regnery Publishing, where he has edited multiple bestsellers and books by Michelle Malkin, Ramesh Ponnuru, and Hugh Hewitt."

Not bad for a 24-year old without a college degree. And his bio lists even more accomplishments. But getting his own column in WashingtonPost.com was the peak. Soon after that things started going downhill very rapidly.

His decline began when bloggers looked into his writings and found that, as Augustine, he had written a column on the day of Coretta Scott King's funeral calling her a Communist. This annoyed a lot of people who then started looking more closely at his other writings. It was then that someone discovered that he had plagiarized. And the plagiarism was not subtle. Take for example this excerpt from his review of the film Bringing out the Dead.

Instead of allowing for the incredible nuances that Cage always brings to his performances, the character of Frank sews it all up for him.

But there are those moments that allow Cage to do what he does best. When he's trying to revive Mary's father, the man's family fanned out around him in the living room in frozen semi-circle, he blurts out, "Do you have any music?"

Now compare it with an earlier review posted on Salon.com:

Instead of allowing for the incredible nuance that Cage always brings to his performances, the character of Frank sews it all up for him. . . But there are those moments that allow Cage to do what he does best. When he's trying to revive Mary's father, the man's family fanned out around him in the living room in frozen semi-circle, he blurts out, "Do you have any music?"

Or this sampling from P. J. O'Rourke's book Modern Manners, which also found its way into Domenech's columns:

O'Rourke, p.176: Office Christmas parties • Wine-tasting parties • Book-publishing parties • Parties with themes, such as "Las Vegas Nite" or "Waikiki Whoopee" • Parties at which anyone is wearing a blue velvet tuxedo jacket

BenDom: Christmas parties. Wine tasting parties. Book publishing parties. Parties with themes, such as "Las Vegas Nite" or "Waikiki Whoopee." Parties at which anyone is wearing a blue velvet tuxedo jacket.

O'Rourke: It's not a real party if it doesn't end in an orgy or a food fight. • All your friends should still be there when you come to in the morning.

BenDom: It's not a real party if it doesn't end in an orgy or a food fight. All your friends should still be there when you come to in the morning.

These are not the kinds of accidental plagiarisms that anyone can fall prey to, where a turn of phrase that appealed to you when you read it a long time ago comes out of you when you are writing and you do not remember that you got it from someone else. These examples are undoubtedly deliberate cut-and-paste jobs.

Once the charges of plagiarism were seen to have some credibility, many people went to Google and the floodgates were opened, Kaloogian-style, with bloggers all over poring over his writings. Within the space of three days a torrent of further examples of plagiarism poured out. These new allegations dated back to his writings at his college newspaper and then later for National Review Online, and Domenech was found to have lifted material from Salon and even from National Review Online, the latter being the same publication for which he was writing, which adds the sin of ingratitude to the dishonesty.

On March 24, just three days after starting his Washington Post column, Ben Domenech resigned under pressure. Soon after, he also resigned as book editor at Regnery.

What can we learn from this? One lesson seemingly is that people can get away with plagiarism for a short while, especially if they are writing in obscurity for little known publications. While he was writing for his college newspaper and even for his own website, no one cared to look closely into his work. Even his future employers at WashingtonPost.com did not seem to have checked him out carefully. Apparently his well-connected family and sterling Bush loyalty were enough to satisfy them that he was a good addition to their masthead.

But as soon as a writer becomes high profile, the chances are very high these days that any plagiarism will come to light.

At one level, this is a familiar cautionary tale to everyone to cite other people's work when using it. For us in the academic world, where plagiarism is a big no-no, the reasons for citing are not just that there are high penalties if you get caught not doing it. The more important reasons arise from the very nature of scholarly academic activity, which I shall look at in a future posting.

To be continued. . .

March 24, 2006

Grade inflation-3: How do we independently measure learning?

Recall (see here and here for previous postings) that to argue that grade inflation has occurred, it is not sufficient to simply show that grades have risen. It must be shown that grades have risen without a corresponding increase in learning and student achievement. And that is difficult to do because there are really no good independent measures of student learning, apart from grades.

Some have argued that the SAT scores of matriculating classes could be used as a measure of student 'ability' and could thus be used to see if universities are getting 'better' students, thus justifying the rise in grades.

But the use of SAT scores as a measure of student quality or abilities has always been deeply problematic, so it is not even clear that any rise in SAT scores of incoming students means anything. One reason is that the students who take the SAT tests are a self-selected group and not a random sample, so one cannot infer much from changes in SAT scores. Second, SAT scores have not been shown to be predictive of anything really useful. There is a mild correlation of SAT scores with first year college grades but that is about it.

Even at Case, not all matriculating students have taken the SAT's. Also, the average total SAT score from 1985-1992 was 1271, while the average from 1993-2005 was 1321. This rise in SAT scores of incoming students at Case would be affected by two factors, the first being the re-centering of SAT scores that occurred in 1995. It is not known whether the pre-1995 scores we have at Case are the original ones or have been raised to adjust for re-centering. This lack of knowledge makes it hard to draw conclusions about how much, if at all, SAT scores have risen at Case.

Alfie Kohn cites "Trends in College Admissions" reports that say that the average verbal-SAT score of students enrolled in all private colleges rose from 543 in 1985 to 558 in 1999. It is also a fact that around 1991 Case instituted merit scholarships based on SAT scores and started aggressively marketing them as a recruiting tool. So it is tempting to argue that there has been a genuine rise in SAT scores for students at Case.

Another local factor at Case that would influence GPAs is the practice of "freshman forgiveness" that began in 1987. Under this program, students in their first year would be "forgiven" any F grades they received and this F would not be counted towards their GPA. This is bound to have the effect of increasing the overall GPA, although a very rough estimate suggests only a 1-2% increase. This practice was terminated in 2005.
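
To see why the effect on the overall GPA is small, here is a minimal sketch in Python; the 2% share of F grades below is an invented figure for illustration, not actual Case data:

  # Freshman forgiveness: F grades (worth 0.0) are dropped from both the
  # numerator and the denominator of the GPA calculation.
  f_fraction = 0.02      # assumed fraction of credit hours graded F
  non_f_average = 3.05   # assumed average over all non-F grades

  gpa_counting_f = (1 - f_fraction) * non_f_average + f_fraction * 0.0
  gpa_forgiving_f = non_f_average  # forgiven F's simply vanish from the average

  print(f"Counting F's:  {gpa_counting_f:.3f}")   # 2.989
  print(f"Forgiving F's: {gpa_forgiving_f:.3f}")  # 3.050, roughly a 2% rise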

The Rosovsky-Hartley monograph points to the fact that many more students in colleges are now enrolled in remedial courses than was the case in the past, arguing that this implies that students are actually worse now. But again, that inference is not clear. Over the recent past there has been a definite shift in emphasis in colleges toward retaining the students they recruit. The old model of colleges recruiting more students than they needed and then 'weeding' them out using certain courses in their first year is no longer in vogue, assuming that there was ever substance to that belief and it is not just folklore.

Now universities go to great lengths to provide assistance to their students, beefing up their advising, tutoring, and other programs to help students stay in school. So the increased enrollment of students in remedial courses may simply be the consequence of universities taking a much more proactive attitude to helping students, rather than a sign of declining student quality. All these measures are aimed at improving student performance and are another possible benign explanation for any rise in grades. In fact, all these remedial and assistance programs could be used to argue that a rise in grades could be due to actual improved student performance.

Alfie Kohn argues that taking all these things into account, there is no evidence for grade inflation, that this is an issue that has been blown way out of proportion by those who have a very narrow concept of the role of grades in learning. Kohn says there are many reasons why grades could rise:

Maybe students are turning in better assignments. Maybe instructors used to be too stingy with their marks and have become more reasonable. Maybe the concept of assessment itself has evolved, so that today it is more a means for allowing students to demonstrate what they know rather than for sorting them or "catching them out." (The real question, then, is why we spent so many years trying to make good students look bad.) Maybe students aren't forced to take as many courses outside their primary areas of interest in which they didn't fare as well. Maybe struggling students are now able to withdraw from a course before a poor grade appears on their transcripts. (Say what you will about that practice, it challenges the hypothesis that the grades students receive in the courses they complete are inflated.)

The bottom line: No one has ever demonstrated that students today get A's for the same work that used to receive B's or C's. We simply do not have the data to support such a claim.

In addition to the factors listed by Kohn, psychologist Steve Falkenberg points out a number of other reasons why average grades could rise. His essay is a particularly thoughtful one that is worth reading.

Part of the problem in judging whether grade inflation exists is that we don't know what the actual grade distribution in colleges should be. Those who argue that it should be a bell curve (or 'normal' distribution) with an average around C are mixing up a normative approach to assessment (as is used for IQ tests and SATs) with an achievement approach.

IQ tests and SATs are designed so that the results are spread out over a bell curve. They seek to measure a characteristic (called 'intelligence') that is supposedly distributed randomly in the population according to a normal distribution. (This assumption and the whole issue of what constitutes intelligence is the source of a huge controversy that I don't want to get into here.) So the goal of such tests is to sort students into a hierarchy, and they design tests that spread out the scores so that one can tell who is in the top 10% and so on.

But when you teach a class of students, you are no longer dealing with a random sample of the population. First of all, you are not giving your assessments to people off the street. The students have been selected based on their prior achievements and are no longer a random sampling of the population. Secondly, by teaching them, you are deliberately intervening and skewing the distribution. Thirdly, your tests should not be measuring the same random variable that things like the SATs measure. If they were, you might as well give your students their grades based on those tests.

Tests should not be measures of some intrinsic ability, even assuming that such a thing exists and can be measured and a number assigned to it. Tests are (or at least should be) measuring achievement of how much and how well a selected group of students have learned as a result of your instruction. Hence there is no reason at all to expect a normal distribution. In fact, you would expect to have a distribution that is skewed towards the high end. The problem, if it can be considered a problem, is that we don't know a priori what that skewed distribution should look like or whether there is a preferred distribution at all. After all, there is nothing intrinsically wrong with everyone in a class getting As, if they have all learned the material at a suitably high level.
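
As a toy illustration of these three points (all numbers invented), the following Python sketch selects students from a normally distributed pool, shifts everyone upward through instruction, and grades against fixed criteria rather than a curve; the resulting grade distribution is skewed toward the top, with no C's at all:

  import random

  random.seed(1)
  population = [random.gauss(50, 10) for _ in range(100_000)]  # random pool
  admitted = [x for x in population if x > 55]  # selection: no longer a random sample
  taught = [x + 15 for x in admitted]           # instruction: everyone's achievement rises

  # Criterion-referenced grading: fixed bars, not a curve.
  grades = ['A' if x >= 80 else 'B' if x >= 70 else 'C' for x in taught]
  for g in 'ABC':
      print(g, f"{grades.count(g) / len(grades):.0%}")  # roughly A 22%, B 78%, C 0%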

In fact, as Ohmer Milton, Howard Pollio, and James Eison write in Making Sense of College Grades (Jossey-Bass, 1986): "It is not a symbol of rigor to have grades fall into a 'normal' distribution; rather, it is a symbol of failure -- failure to teach well, failure to test well, and failure to have any influence at all on the intellectual lives of students."

There is nothing intrinsically noble about trying to keep average grades unchanged over the years, which is what those who complain about grade inflation usually want to do.

On the other hand, one could make the reasonable case that as we get better at teaching and in creating the conditions that make students learn better, and as a consequence we get students who are able to learn more, then perhaps we should raise our expectations of students and provide more challenging assignments, so that they can rise to greater heights. This is a completely different discussion. If we do so, this might result in a drop in grades. But this drop is a byproduct of a thoughtful decision to make learning better, not caused by an arbitrary decision to keep average grades fixed.

This approach would be like car manufacturers and consumers raising their standards over the years so that we now expect a lot more from our cars than we did fifty years ago. Even the best cars of fifty years ago would not be able to meet the current standards of fuel efficiency, safety, and emissions. But the important thing to keep in mind is that standards have been raised along with the ability to make better cars able to meet the higher standards.

But taking this approach in education requires teachers to think carefully about what and how we assess, what we can reasonably expect of our students, and how we should teach so they can learn more and learn better. Unfortunately much of the discussion of grade inflation short-circuits this worthwhile aspect of the issue, choosing instead to go for the quick fix, like putting limits on the number of grades awarded in each category.

It is perhaps worthwhile to remember that fears about grade inflation, that high grades are being given for poor quality work, have been around for a long time, especially at elite institutions. The Report of the Committee on Raising the Standard at Harvard University said: "Grades A and B are sometimes given too readily -- Grade A for work of no very high merit, and Grade B for work not far above mediocrity. ... One of the chief obstacles to raising the standards of the degree is the readiness with which insincere students gain passable grades by sham work."

That statement was made in 1894.

POST SCRIPT: Cindy Sheehan in Cleveland tomorrow

Cindy Sheehan will speak at a Cleveland Town Hall Meeting Saturday, March 25, 1-3 pm

Progressive Democrats of Ohio present Gold Star Mother and PDA Board Member Cindy Sheehan at a Town Hall Meeting on Saturday, March 25, 2006 from 1 - 3 p.m. at the Beachland Ballroom, 15711 Waterloo Road in Cleveland's North Collinwood neighborhood. (directions.)

Topic: Examining The Cost of Iraq: Lives, Jobs, Security, Community

Panelists include:

US Congressman Dennis Kucinich, OH-10
Cindy Sheehan - Gold Star mother & activist
Tim Carpenter, National Director, Progressive Democrats of America
Francis Chiappa, President, Cleveland Peace Action
Paul Schroeder, NE Ohio Gold Star Father and co-founder of Families of the Fallen For Change
Farhad Sethna, Immigration attorney and concerned citizen

March 23, 2006

Grade inflation-2: Possible benign causes for grade increases

Before jumping to the conclusion that a rise in average grades must imply inflation (see my previous posting on this topic), we should be aware of the dangers that exist when we are dealing with averages. For example, suppose we consider a hypothetical institution that has just two departments A and B. Historically, students taking courses in A have had average grades of 2.5 while those in B have had 3.0. Even if there is no change at all in the abilities or effort of the students and no change in what the faculty teach or the way that faculty assess and grade, so that the average grades in each department remain unchanged, it is still possible for the average grades of the institution to rise, simply because the fraction of students taking courses in B has become larger.
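
A quick sketch in Python makes the arithmetic concrete, using the hypothetical departmental averages above:

  # Two departments with fixed average grades; only the enrollment mix changes.
  def institution_average(frac_in_b, avg_a=2.5, avg_b=3.0):
      """Institution-wide GPA when frac_in_b of all grades come from department B."""
      return (1 - frac_in_b) * avg_a + frac_in_b * avg_b

  for frac in (0.3, 0.5, 0.7):
      print(f"{frac:.0%} in B -> institutional average {institution_average(frac):.2f}")
  # 30% -> 2.65, 50% -> 2.75, 70% -> 2.85: the average climbs even though
  # neither department's grading has changed.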

There is evidence that this shifting around in the courses taken by students is just what is happening. Those who are convinced that grade inflation exists and that it is evil tend to interpret this phenomenon as game playing by students, that they are manipulating the system, choosing courses on the basis of how easy it is to get high grades rather than by interest or challenge.

For example, the ERIC report says "In Grade Inflation: A Crisis in College Education (2003), professor Valen E. Johnson concludes that disparities in grading affect the way students complete course evaluation forms and result in inequitable faculty evaluations. . . Students are currently able to manipulate their grade point averages through the judicious choice of their classes rather than through moderating their efforts. Academic standards have been diminished and this diminution can be halted, he argues, only if more principled student grading practices are adopted and if faculty evaluations become more closely linked to student achievement."

This looks bad and the author obviously wants to make it look bad, as can be seen from his choice of the word 'manipulate' to describe the students' actions and the way he implies that faculty are becoming more unprincipled in their grading practices. But there is no evidence for the evil motivations attributed to such students and faculty. In fact, one can look at the phenomenon in a different way. It is undoubtedly true that students now have many more choices than they did in the past. There are more majors and more electives. When you offer more choices, students are more likely to choose courses they are interested in and thus are more likely to do better in them.

Furthermore, even if students are choosing courses partly based on the grade they expect to receive, we should not be too harsh in our judgments. After all, we have created a system in which grades seem to be the basis for almost everything: admission to colleges and graduate schools, honors, scholarships, and financial aid. As I said, grades have become the currency of higher education. Is it any wonder that students factor in grades when making their choices? If a student tries to balance courses they really want to take with those they know they can get a high grade in, in order to maintain the GPA they need to retain their scholarships, why is this to be condemned? This seems to me to be a sensible strategy. After all, faculty do that kind of thing all the time. When faculty learn that the NIH or NSF is shifting its grants funding emphasis to some new research area, many will shift their research programs accordingly. We do not pour scorn on them for this, telling them that they should choose research topics purely based on their interests. Instead, we commend them for being forward thinking.

It certainly would be wonderful if students chose courses purely on the basis of their interest or usefulness or challenge and not on grade expectations, but to put students in the current grade-focused environment and expect them to ignore grades altogether when making their course selection is to be hypocritical and send mixed messages.

What about the idea that faculty grading standards have declined and that part of the reason is that they are giving easy grades in order to get good evaluations? This is a very popular piece of folklore on college campuses. But this question has also been studied and the data simply do not support it. It does seem to be true that students tend to get higher grades in the courses they rate higher. But to infer a causal relationship, that if a faculty member gives higher grades they will get better evaluations, is wrong.

People who have studied this find that if a student likes a course and a professor (and thus gives good evaluations), then they will tend to work harder at that course and do better (and thus get higher grades), thus bringing about the grades-evaluations correlation that we see. But how much a student likes a course and professor seems to depend on whether the student feels that he or she is actually learning interesting and useful stuff. Students, like anybody else, don't like to feel they are wasting their time and money, and do not enjoy being with a professor who does not care for them or respect them.

Remember that these studies report on general trends. It is quite likely that there exist individual professors who give high grades in a misguided attempt to bribe students to give good evaluations, and that there exist students willing to be so bribed. But such people are not the norm.

To be continued. . .

POST SCRIPT: And the winner is. . .

Meanwhile, there are Americans who have already decided which country the US should invade next in the global war on terror, even if they haven't the faintest idea where that country is on the globe or why it should be invaded. Even Sri Lanka gets a shot at this particularly dubious honor.

Here's an idea for a reality show, along the lines of American Idol. It will be called Who's next?. The contestants will be the heads of states of each country and this time their goal will be to get voted off because the last remaining country gets bombed and invaded by the US. The judges could be Dick Cheney (to provide the sarcastic put-downs a la Simon Cowell. Sample: "You think we're going to waste our smart bombs on your dumb country?"), Donald Rumsfeld, and Condoleeza Rice.

Fox television, my contract is ready and I'm waiting for your call.

March 22, 2006

Grade inflation-1: What is it and is it occurring?

There is nothing that gets the juices flowing in higher education academic circles than the topic of grade inflation. Part of the reason for this passion may be because grades and test scores, and not learning, seem to have become the currency of education, dominating the thinking of both students and faculty. Hence some people monitor grades as an important symptom of the health of universities.

But what is curious is that much of the discussion is done in the absence of any hard data. It seems as if perceptions or personal anecdotes are a sufficient basis to draw quite sweeping conclusions and prescriptions for action.

One of the interesting things about the discussion is how dismayed some faculty get simply by the prospect that average grades have risen over time. I do not quite understand this. Is education the only profession where evidence, at least on the surface, of a rise in quality is seen as a bad thing? I would hope that like in every other profession we teachers are getting better at what we do. I would hope that we now understand better the conditions under which students learn best and have incorporated the results of that knowledge into our classrooms, resulting in higher achievement by students. Any other profession or industry would welcome the news that fewer people are doing poorly or that fewer products are rejected for not meeting quality standards. But in higher education, rising grades are simply assumed to be bad.

Of course, if grades are rising because our assessment practices are becoming lax, then that is a cause for concern, just as if a manufacturer reduces the rejection rate of their product by lowering quality standards. This is why having an independent measure of student learning and achievement to compare grades with has to be an important part of the discussion.

Grade inflation is a concept that has an analogy with monetary inflation, and to infer that inflation (as opposed to just a rise) in grades has occurred implies that grades have risen without a corresponding increase in learning and student achievement. But in much of the discussion, this important conditional clause is dropped and a rise in grades is taken as sufficient evidence by itself that inflation has occurred.

Let's take first the question of whether average grades have actually risen. At Case, as some of you may know, beginning January 2003, the GPA cutoffs to achieve honors were raised to 3.56 (cum laude), 3.75 (magna cum laude), and 3.88 (summa cum laude) so that only 35% of students would be eligible for honors. (The earlier values were 3.20, 3.50, and 3.80 respectively.) This measure was taken because the number of people who were graduating with honors had risen steadily over the years, well above the 35% originally anticipated when the earlier bars were set.

A look at grade point averages at Case shows that it was 2.99 in 1975 (the first year for which we have this data), dropped slowly and steadily to 2.70 in 1982, rose to 3.02 in 1987, stayed around that value until 1997, and since then has oscillated around 3.20 until 2005, with the highest reaching 3.27 in 2001. The overall average for the entire period was 3.01 and the standard deviation was about 0.70. (I am grateful for this and other Case data to Dr. Julie Petek, Director of Degree Audit and Data Services.)

It is hard to know what to make of this. On the one hand, we could start at the lowest point in the grade curve and say that grades have risen by half a letter grade from 1982 to 2005. Or we could start at 1975 and say that grades are fluctuating in the range 2.70-3.30, or about half a standard deviation around the mean of 3.0.

What does the data say nationwide? Henry Rosovsky and Matthew Hartley, writing in a monograph for the American Academy of Arts and Sciences are convinced that inflation has occurred. For evidence of grades increasing, they point to various nationwide surveys that show that average grades rose by 0.43 from 1960 to 1974; that in 1993 the number of grades of A- or higher was 26%, compared to 7% in 1969; and the number of C's dropped from 25% to 9% in that same period; and that averages rose from 3.07 in the mid 1980s to 3.34 in the mid 1990s.

In this last result, it is interesting to note in another study that grades rose on average only at selective liberal arts colleges and research universities, while they declined at general liberal arts colleges and comprehensive colleges and universities, and in the humanities and social sciences.

The Rosovsky-Hartley monograph has been criticized because the research it depends on used surveys, and it is not clear how reliable the responses to such surveys are, depending as they do on self-reporting.

A 2003 ERIC Digest of the literature on this topic finds results that cast doubt on the basic question of whether average grades have even risen. For example, "Clifford Adelman, a senior research analyst with the U.S. Department of Education, reviewed student transcripts from more than 3,000 colleges and universities and reported in 1995 that student grades have actually declined slightly over the last 20 years." (my emphasis). His study of 16.5 million graduates from 1999-2000 also found that 14.5% of these students received As while 33.5% received grades of C or lower.

What is significant about the Adelman study is that he used actual student transcripts, not surveys, and thus seems to me to be more reliable.

It seems from this data and other studies that average grades have not increased across the board, but it is plausible that they have increased at selective liberal arts colleges and research universities. The Rosovsky-Hartley monograph says that "In 1966, 22 percent of all grades given to Harvard undergraduates were in the A range. By 1996 that percentage had risen to 46 percent and in that same year 82 percent of Harvard seniors graduated with academic honors. In 1973, 30.7 percent of all grades at Princeton were in the A range and by 1997 that percentage had risen to 42.5 percent."

Even though it has not been conclusively established, suppose for the sake of argument that we concede that selective liberal arts colleges and research universities (such as Case) have seen a rise in average grades. Is this automatically evidence of grade inflation? Or are there more benign causes, such as that we are getting better-prepared and more able students now, or that our teaching methods have improved? Another important issue is whether Case's experience of rising grades is part of a national trend or an exception.

To be continued. . .

POST SCRIPT: Where's the balance?

Over at Hullabaloo, Tristero catches the Washington Post in a blatant act of bias in favor of atheistic science. The Post article says:

Scientists said yesterday they have found the best evidence yet supporting the theory that about 13.7 billion years ago, the universe suddenly expanded from the size of a marble to the size of the cosmos in less than a trillionth of a second.

Tristero points out that the article completely fails to mention the controversy around this question, that there is another side to this story, that the big bang is "only a theory" since no one was there to actually observe this event and prove that it happened.

And not a word of balance from the other side, as if the sincere faith of millions of Americans in a Christian God didn't matter at all to the Post's editors.

I just hate it when the media reports carefully vetted scientific data as fact and not as just one of many valid points of view. I'm not asking for them to ignore the opinions of these so-called scientists, but they really should report the fact there's a lot of controversy about whether this kind of evidence is valid. Like, were you there, huh, Mr. Hotshot Washington Post?

February 23, 2006

The state of literacy in the US

The government's National Center for Education Statistics (NCES) is an invaluable source of information about the state of education in the US. Among other things, it periodically measures the state of literacy and determines what percentage of the population falls into four categories: below basic, basic, intermediate, and proficient. These levels, along with sample abilities and tasks for each, are defined below:

Below Basic indicates no more than the most simple and concrete literacy skills.

Key abilities
• locating easily identifiable information in short, commonplace prose texts
• locating easily identifiable information and following written instructions in simple documents (e.g., charts or forms)
• locating numbers and using them to perform simple quantitative operations (primarily addition) when the mathematical information is very concrete and familiar

Sample tasks typical of level
• searching a short, simple text to find out what a patient is allowed to drink before a medical test
• signing a form
• adding the amounts on a bank deposit slip

Basic indicates skills necessary to perform simple and everyday literacy activities.

Key abilities
• reading and understanding information in short, commonplace prose texts
• reading and understanding information in simple documents
• locating easily identifiable quantitative information and using it to solve simple, one-step problems when the arithmetic operation is specified or easily inferred

Sample tasks typical of level
• finding in a pamphlet for prospective jurors an explanation of how people were selected for the jury pool
• using a television guide to find out what programs are on at a specific time
• comparing the ticket prices for two events

Intermediate indicates skills necessary to perform moderately challenging literacy activities.

Key abilities
• reading and understanding moderately dense, less commonplace prose texts as well as summarizing, making simple inferences, determining cause and effect, and recognizing the author’s purpose
• locating information in dense, complex documents and making simple inferences about the information
• locating less familiar quantitative information and using it to solve problems when the arithmetic operation is not specified or easily inferred

Sample tasks typical of level
• consulting reference materials to determine which foods contain a particular vitamin
• identifying a specific location on a map
• calculating the total cost of ordering specific office supplies from a catalog

Proficient indicates skills necessary to perform more complex and challenging literacy activities.

Key abilities
• reading lengthy, complex, abstract prose texts as well as synthesizing information and making complex inferences
• integrating, synthesizing, and analyzing multiple pieces of information located in complex documents
• locating more abstract quantitative information and using it to solve multistep problems when the arithmetic operations are not easily inferred and the problems are more complex

Sample tasks typical of level
• comparing viewpoints in two editorials
• interpreting a table about blood pressure, age, and physical activity
• computing and comparing the cost per ounce of food items

The 2003 results can be found on page 4 of the NCES document.

For prose literacy, the population breaks down as 14% below basic, 29% basic, 44% intermediate, and 13% proficient.

For documents literacy, 12% are below basic, 22% basic, 53% intermediate, and 13% proficient.

For quantitative literacy, 22% are below basic, 33% basic, 33% intermediate, and 13% proficient.

It does not surprise me that literacy levels drop for quantitative skills. What does surprise me is that the bars are set so low. Maybe I am living in a dream world (after all, I do work in a university!), but it seems to me that the tasks required at the proficient level are the minimal ones needed to function effectively in modern society, if one is to have any real sense of what is going on around one.

For example, last week I was preparing my tax returns and it seemed to me that doing them requires the proficient level (at least as far as prose and document literacy go), since there was a lot of 'if-then' reasoning involved. And my taxes are fairly simple since my finances are straightforward, as is the case for most people whose income comes mainly in the form of salary or wages.

To think that only 13% reach the proficient level in all three categories is troubling. How do the rest even do their taxes, let alone make sense of the complexities of the modern world?

December 20, 2005

Wikipedia as good as the Encyclopedia Britannica?

In my seminar courses, students are expected to research and write papers on topics related to science. Invariably, many of them will submit papers that cite Wikipedia as a source for some assertion. I tell them that Wikipedia is not a credible source for authoritative information and should never be used when submitting any paper.

The reason for this is that Wikipedia is an openly edited encyclopedia where absolutely anyone can edit and update entries, and the submissions are largely anonymous. Since there is no identifiable and authoritative person behind the information, there is no way to judge its credibility. This contrasts with the Encyclopedia Britannica, which solicits articles from experts in their fields; the resulting articles are then peer-reviewed and vetted by editors to ensure quality in both content and writing.

So my message to students has been quite simple: no to Wikipedia and yes to the Encyclopedia Britannica.

My anti-Wikipedia stance received some support from the recent disclosure of a hoax by an author who wrote a scurrilous biography of someone that contained palpable untruths. The person whose 'biography' was faked discovered its existence and was justifiably incensed, and his actions subsequently led to the unmasking of the hoaxer.

But then along comes another study that compared the accuracy of entries in Wikipedia and the Encyclopedia Britannica and found them to be comparable. Aaron Shaffer has a nice entry on this that compares the two and finds that on some measures, Wikipedia may be even better.

So should I change my advice to students and allow Wikipedia? The answer is no. As long as the articles are anonymous, they remain a no-no for academic publications. Academia has no use for anonymous information. Much of our work is based on trusting the work of our peers. The assumption is that someone who has a responsible position in an academic institution has too much at stake to willfully mislead or even be sloppy in their work. Signing their name and giving their institutional affiliation means that the institution now also has a stake in the information being correct.

Having said all that, I must add that I like Wikipedia and am impressed with the whole concept and with the quality of the information that it provides. I often use it myself to learn about things quickly. It is an interesting example of 'the wisdom of crowds,' how when a large enough number of people are actively involved in something, the resulting quality of the finished product can be quite high. It is a highly intriguing experiment in information democracy.

So my advice to students is to use it to get a quick overview of something and to get started on learning about it, but then to go to some authored source for substantiation and citation. Because although Wikipedia may be right most of the time, in academic discourse, who said something is sometimes as important as what was said.

September 01, 2005

The problem with grades and other summary evaluations

In previous postings (see here and here), I discussed why college rankings vary so much depending on who does the survey. One of the reasons is that different criteria are used to arrive at the rankings, making it difficult to arrive at apples-to-apples comparisons. In this posting, I will discuss why I think that rankings may actually be harmful, even if the measures used to arrive at them are good.

The main problem with rankings is that they require a single summary score, obtained by combining scores from a variety of individual measures, and people seem to focus exclusively on that final score while paying little attention to the individual measures that went into it.

This is a general problem. For example, in course evaluations by students of their teachers, there are usually many questions that ask students to evaluate their teachers on important and specific issues, for example, whether the teacher encourages discussions, is respectful to students, etc.

But there is usually also a question that asks students to give an overall evaluation of the teacher, and when such a question exists, the people who read the survey results (students, teachers, and department chairs) tend to focus almost exclusively on this summary score and pay little attention to the other questions. But it is the other questions that provide useful feedback on what kinds of actions need to be taken to improve. For example, a poor score on "encouraging students to discuss" tells a teacher where to look to make improvements. But an overall evaluation of "good" or "poor" for teaching does not tell the teacher anything useful on which to base specific actions.

Teachers face the same problems with course grades. To arrive at a grade for a student, a teacher will make judgments about writing, participation, content knowledge, etc. using a variety of measures. Each of those measures gives useful feedback to the students on their strengths and weaknesses. But as soon as you combine them into a single course grade using a weighted average, then people tend to look only at the grade, even though that really does not tell you anything useful about what a student's capabilities are. But teachers are required to give grades so we cannot avoid this.
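To make the point concrete, here is a small illustrative Python sketch. The component scores and weights are invented for the example (they are not any actual grading scheme): two students with quite different strengths end up with identical course grades once the weighted average is taken.

# Hypothetical component scores (out of 100) for two students
students = {
    "Student A": {"writing": 95, "participation": 60, "content": 75},
    "Student B": {"writing": 60, "participation": 95, "content": 75},
}
# Invented weights; writing and participation happen to weigh the same
weights = {"writing": 0.3, "participation": 0.3, "content": 0.4}

for name, scores in students.items():
    grade = sum(weights[part] * scores[part] for part in weights)
    print(f"{name}: course grade {grade:.1f}")

Both students come out at 76.5, and the summary grade erases exactly the information (strong writer versus strong participant) that the individual measures were designed to capture.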

I often hear faculty complain that they give extensive and detailed feedback on students' written work, only to see students take a quick look at the grade for the paper and then put it away in their folders. Faculty wonder if students ever read the comments. I too give students a lot of feedback on their writing and have been considering the following idea to deal with this issue. Instead of writing the final grade on the paper itself, I am toying with the idea of omitting that last step and asking the students to estimate the grade I gave the paper based on their reading of my comments. I am hoping that this will make them examine their own writing more carefully in the light of the feedback they get from others. Then, once they have told me what grade they think they got and why, I'll tell them their grade. I am willing to change it if they make a good case for a change.

I am a little worried that this process seems somewhat artificial, but perhaps that is because it is not yet common practice, and anything new always feels a little strange. I am going to try it this semester.

Back to college rankings: they can be harmful for another reason, namely that the goals of a school might not mesh with the way the scores are weighted. For example, the US News & World Report rankings take into account incoming students' scores on tests like the SAT and ACT. But a school that feels such scores do not measure anything meaningful about student qualities (and a good case can be made for this view) might wish to look at other things it values, like creativity, ingenuity, citizenship, writing, problem solving, etc. Such a school is doomed to sink in the USN&WR rankings, even though it might provide a great college experience for its students.

I am a great believer that getting useful feedback, in whatever area of activity, is an excellent springboard for improving one's performance and capabilities. In order to do so, one needs criteria, and targeted and valid measures of achievement. But all that useful information can be completely undermined when one takes that last step and combines these various measures in order to get a single score for ranking or overall summary purposes.

August 31, 2005

The problem with rankings

In a previous post, I spoke about how the college rankings put out by the magazine Washington Monthly differed considerably from those put out by US News & World Report.

There is a fundamental problem involved in ranking things in some order. In order to do so, it becomes necessary to reduce all the quality measures used to a single number so that they can be compared along a single scale.

This raises three questions that have to be answered. What criteria are to be used? How can the selected criteria be translated into quantifiable measures? How are the different measures to be weighted in order to arrive at the final number?

These questions rarely have unique answers, and there is seldom consensus on how to answer them; the two college rankings mentioned above are an example of disagreement over just the first question alone.

The Washington Monthly said that they felt that "Universities should be engines of social mobility, they should produce the academic minds and scientific research that advance knowledge and drive economic growth; and they should inculcate and encourage an ethic of service," and they devised measures accordingly.

US News & World Report mainly looks instead at the resources that universities have and their prestige among their peers. For example, I think that 25% of their final score is based on the "peer assessment score," which is how people at comparable institutions rate the universities. Such a measure is going to guarantee a high ranking for those universities that are already well known and regarded. The ratings also look at the scores of entering students, graduation and retention rates, the size of the endowment, the amount of money the schools have, the amount that alumni give to the schools, etc. All these things are also related to the perception of prestige (high-scoring students are likely to apply to high-prestige institutions, and are more likely to graduate, get well-paying jobs, and earn more money, and so forth). There is very little that an institution can do in the short term to change any of these things, which is why the USN&WR ratings tend to be quite stable from year to year.
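How much the final order depends on the weighting is easy to demonstrate. In the Python sketch below, the schools, measures, scores, and weights are all invented for illustration (the two weightings only loosely evoke the contrasting emphases of the two magazines); the same underlying data produces opposite rankings.

# Invented, normalized (0-1) scores on two measures for three hypothetical schools
schools = {
    "School X": {"prestige": 0.9, "mobility": 0.3},
    "School Y": {"prestige": 0.5, "mobility": 0.8},
    "School Z": {"prestige": 0.7, "mobility": 0.6},
}

def rank(weights):
    # Order schools by the weighted sum of their measure scores, best first
    score = lambda s: sum(weights[m] * schools[s][m] for m in weights)
    return sorted(schools, key=score, reverse=True)

print(rank({"prestige": 0.75, "mobility": 0.25}))   # prestige-heavy weighting
print(rank({"prestige": 0.25, "mobility": 0.75}))   # mobility-heavy weighting

The first weighting yields X, Z, Y and the second yields Y, Z, X: the ranking reverses with no change in the schools themselves, only in the weighting decision that the ranker has made for us.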

The problem with both sets of ratings is that they do not really measure how well students are taught or how well they learned and grew intellectually, socially, and emotionally. In other words, neither survey tells us how much and what kind of growth the students experience during their school years. To me, that is a really important thing to know about a school.

There is one survey that I think does give useful information about some of these things and that is the NSSE, which stands for National Survey of Student Engagement. This is a research-based study that looks at how much students experience good educational practices during their college years. It does this by surveying students in their first and final years of school. Many schools (including Case) do these surveys in their first and fourth years and they provide each school with important information on their strengths and weaknesses in various areas. The results of the surveys are provided confidentially to schools for internal diagnostic purposes and are not compiled into a single overall school score for ranking purposes.

Should NSSE also produce a single quality score to enable schools to be compared? In a future posting, I will argue why such rankings may actually do more harm than good, even if the measures used to arrive at them are valid.

August 23, 2005

The college rankings game

I was walking around the campus yesterday and it was wonderful. The day was cool and sunny and the campus was green and inviting, reinforcing my feeling that over the last fifteen years Case has transformed itself from a landscape dominated by ugly buildings and surface parking lots into one of the most attractive urban campuses in the nation. This is especially so this year, with the new dorms that have opened up (I went on the tour last week and was really impressed by their spaciousness and tastefulness) and the new playing fields.

But the best thing was to see all the new and returning students wandering around, many with their parents. Summer is a nice time to be here but nothing beats the sense of liveliness and eager anticipation that I associate with the beginning of a new school year. And to top it all, we have a large incoming class (last I heard, around 1180) and the SAGES program going full throttle. I am eager to get back in the classroom again.

I got back to my office and discovered that the magazine Washington Monthly has announced that it has devised a new method for ranking colleges, using a different set of criteria from those used by the better-known US News & World Report. As you may know, the latter magazine revealed its latest rankings just a couple of days ago, and Case dropped from 35 last year to 37 this year. This is the season for rankings to come out, and the Princeton Review releases its rankings today.

Washington Monthly explains that its criteria are based on what they perceived should be the function of universities: "Universities should be engines of social mobility, they should produce the academic minds and scientific research that advance knowledge and drive economic growth; and they should inculcate and encourage an ethic of service." The accompanying article explains how these criteria were translated into quantifiable measures for each school.

Since these criteria seemed worthwhile, I decided to check out the rankings. Of course, the first thing I looked for was Case's ranking and was pleasantly surprised that Case ranked at #24. When you compare private universities alone, Case came out at #12 compared with #29 for US News & World Report. Case came ahead of a lot of private universities who regularly rank above us in the other ratings, such as Georgetown, Washington University in St. Louis, Carnegie Mellon, Princeton, and Rochester.

It seems like it is the "engines of social mobility" and "ethic of service" criteria that caused a lot of shifting of rankings. The former criterion was measured using the number of Pell grants, and this helped the top-tier state universities rise in the rankings since they offer more poor people the chance for education and advancement. The latter criterion was measured by "whether a school devotes a significant part of its federal work study funding to placing students in community service jobs (as the original work study law intended); the percentage of students enrolled in ROTC; and the percentage of graduates currently enrolled in the Peace Corps." As a result, a lot of state universities rose and private universities dropped. Harvard, for example, was #75 on the service criterion.

So what is one to make of this variability in rankings from magazine to magazine? Does this mean that we should not take them seriously? Not quite. The measures used are useful pieces of information. The fundamental problem arises when multifaceted measures, each possibly worthwhile in itself, are combined to produce a single score for ranking purposes.

I’ll explore this question in subsequent postings.

POST SCRIPT

The British newspaper The Independent finally tallies up the official lies told about the killing of Jean Charles de Menezes in a London tube station, which I have been writing about. Here is the key section:

What police said - and what really happened

The police claim: A man of "Asian appearance", behaving suspiciously, is shot dead by police on a Tube train in Stockwell.
The truth: The dead man, Jean Charles de Menezes, 27, was Brazilian.

The police claim: His shooting was "directly linked" to the investigation into the London bombings.
The truth: Mr de Menezes was an electrician and had nothing to do with the London bombings.

The police claim: Witnesses described him running into the Tube station, vaulting the barriers.
The truth: He walked into the station and picked up a free newspaper before entering with a travel pass. He made his way to the platform. He started to run only when the train arrived.

The police claim: Witnesses said he was wearing an "unseasonable" heavy coat, and Scotland Yard said his clothing had "added to suspicions".
The truth: Photographs of the body show Mr de Menezes wearing a blue denim jacket.

The police claim: "As I understand the situation the man was challenged and refused to obey police instructions" - Sir Ian Blair.
The truth: There was no police challenge.

The police claim: Mr de Menezes ran on to the Tube train, tripped and was shot five times by police as he lay on the floor.
The truth: CCTV footage is said to show Mr de Menezes pausing, looking left and right, and sitting on a seat facing the platform. A police witness says Mr de Menezes stood up when the police arrived. The policeman then pinned his arms to his sides and pushed him back in the seat. Mr de Menezes was then shot 10 times - three of the bullets missed.

August 02, 2005

Harry Potter's school life and mine (safe to read - no spoilers!)

One of the appealing things for me personally about the Potter books is the similarity to my own education, which results in waves of nostalgia sweeping over me as I read the stories. I went to a single-sex private school in Sri Lanka that was modeled, like Hogwarts, on the British boarding school, although about half the students (including me) commuted from home. We were called 'day-scholars' which, looking back now, seems like a quaint but dignified label when compared to the more accurate 'commuters.'

As in Hogwarts, we had teachers (some of whom we liked and others whom we disliked), who mostly taught in a didactic style, and we did have punishments like detention, writing lines, and even canings. In my own school, only the principal and vice principals could officially cane students, though some teachers still resorted to painful raps on the knuckles with rulers or even slaps across the face. Our chemistry teacher, who was an exceedingly kind and gentle man, nevertheless could be provoked to fits of violent rage which completely transformed him for a short time into a raging monster, during which he would lash out with the rubber hoses that were readily available in the laboratories, sometimes raising welts on an offending student's arm. The rage would subside as quickly as it was triggered and the teacher would be immediately overcome with remorse, apologizing profusely and begging for forgiveness, which we always agreed to because we liked him. We were fascinated by his Jekyll-and-Hyde transformations.

We also had the system of 'houses', which involved the separation of students into separate groups (such as Gryffindor, Slytherin, Hufflepuff, and Ravenclaw), each of which had a master in charge. The boarded students (or 'boarders') even had separate dormitories based on the houses. These houses were set in competition with each other, earning points for various achievements. The points were totaled at the end of the year, with a trophy going to the winning house, giving it bragging rights for a year.

The houses were a good way of encouraging team spirit and intramural competition, and provided opportunities for students who were not good enough to be in the school teams (or 'varsity' teams as they are known here) to still take part in a competitive program with their fellow students. I think that this system helped to increase participation of students in extracurricular activities because most students took seriously their responsibilities to help their house do well. The downside was that the competition could sometimes be too fierce, leading to churlish and unsportsmanlike behavior. The intramural quidditch games that take place at Hogwarts were mirrored in the cricket, rugby, and hockey matches at my school.

We also had the 'prefect' system, which must sound strange to American readers. (Hermione is a prefect in book 6 and I too was a prefect during my last two years in school.) A prefect was essentially a student who was given authority over his fellow students. A prefect was selected by the master in charge of each house and appointed by the school principal. Very few students were prefects. We had special privileges that others did not, such as being allowed to leave school premises during the day and a special lounge reserved exclusively for our use. We had the power to enforce rules during the school day, at special functions, and at athletic events, and could issue punishments such as detentions to 'evil doers.' In earlier times, prefects at my school were also allowed to use corporal punishments (such as caning misbehaving students), but that was taken away before my time as the use of corporal punishments became more restricted.

At that time, we saw it as a great privilege and honor to be selected as a prefect. It was viewed as recognizing and building leadership qualities. Looking back now, it does not seem to be such an unadulterated good thing. I sometimes wonder whether the house and prefect system was not also a cheap means of extending the reach of the school administration by creating a free labor force of rule enforcers. The house system and the prefect system may also have been a means of enhancing teacher and administration control over students by weakening overall student cohesion, another manifestation of the 'divide and rule' philosophy that the British used so successfully to maintain control over their colonies but which often resulted in ethnic strife and civil wars when they left.

But at other times I think that I am reading too much into this, and seeing too many dark undercurrents in well meaning, if perhaps misguided, attempts at encouraging student participation and developing student leadership. Perhaps I should lighten up.

POST SCRIPT

I find William Faulkner difficult to read and understand, and struggled through The Sound and the Fury. But I found the winning essay in the 2005 FAUX FAULKNER contest hilarious. It is by Sam Apple and is called The Administration and the Fury: If William Faulkner were writing on the Bush White House. You can read it here.

August 01, 2005

Harry Potter's school life (safe to read - no spoilers!)

I just finished reading the latest episode of the Harry Potter saga. I cannot claim to be a rabid fan since I have read only book 2 (Chamber of Secrets) and book 6 (Half-Blood Prince), although I have seen all three film versions. They have all been enjoyable.

Reading these books reminds me of my own school days and of much of the British schoolboy literature I read as a child, especially the Billy Bunter series and the Tom Merry series, both written by the same author Frank Richards. (These books were produced at such a prodigious rate that there were suspicions that 'Frank Richards' was the pseudonym of a whole stable of authors just churning out the stuff.)

There was a rigid formula to these books, the main features of which the Potter series largely adheres to. The schools were all boarding schools, and the stories started with students arriving at the beginning of the academic year and having various adventures that fortuitously ended just at the end of the school year. (There was a complementary series of children's books by Enid Blyton which took place during the summer, with a group of friends arriving at their home town from various boarding schools, and having an adventure that ended just in time for them to go their separate ways the next academic year.)

The big difference between Harry Potter and the earlier Billy Bunter and Tom Merry series is that although the context of a British boarding school is the same, the Potter books are far better written, with complex plots and characters developed realistically, dealing with important issues of good and evil, and real human emotions. The books I read as a child had stereotypical characters (the smart student, the bully, the figure of fun, the lisping aristocrat, the athlete, the sarcastic one, etc.) who all behaved in highly predictable ways. Those characters were two-dimensional and never changed, never grew or matured. This was reassuring in some ways because you knew exactly what you were getting with the books, but you cannot enjoy them as an adult the way you can with Potter.

The earlier books and schools were also single sex and we young boys only read the books about boys' schools, while girls only read equivalent books dealing with girls' boarding schools. The only members of the opposite sex that appeared in the books were siblings who made cameo appearances. For all we knew, the books written for the boys may have been identical to those written for the girls with just the genders (and sports) of the characters switched, such was the rigid separation between what boys and girls read when we were growing up. There was no romance whatsoever in any of the story lines. Hogwarts, on the other hand, is co-ed, a major difference.

Another similarity between Potter and the earlier books is that the educational practices in all the schools are pretty conventional. The classes are run in an authoritarian way. As someone pointed out, Hogwarts seems a lot like a trade school, with students learning very specific skills involving potions, hexes, and the like, mostly by rote memory and repetitive practice, similar to the way the earlier books had students learning Latin and Greek. There does not really seem to be a theory of magic or even any interest in developing one. Some magic works, others don't, with no serious attempts to discover why. There is little or no questioning of the teachers or class discussions, or inquiry-oriented teaching.

Rowling is mining a very rich vein of British school literature. As we will see in the next posting, the world she creates is probably very familiar to anyone (like me) who grew up in an English-language school anywhere in the British colonies. What she has done is added magic (and good writing) to a tried and true formula. But since that tradition of boarding school-based fiction is not present in the US, it is interesting that she has managed to strike such a chord in readers here as well.

POST SCRIPT

An anonymous commenter to an earlier post gave a very useful link to the various shades of meaning attached to atheism and definitions of atheism and agnosticism.

July 27, 2005

Simplifying difficult texts - 2

To illustrate the problems of simplifying original texts, we can look at examples from Shakespeare and the Bible. I came across a site that seeks to make Shakespeare's plays easier to understand by re-writing them:

Here is the original text from HAMLET Act III, Scene i, lines 57-91

To be, or not to be? That is the question—
Whether ’tis nobler in the mind to suffer
The slings and arrows of outrageous fortune,
Or to take arms against a sea of troubles,
And, by opposing, end them?
….
Thus conscience does make cowards of us all,
And thus the native hue of resolution
Is sicklied o’er with the pale cast of thought,
And enterprises of great pith and moment
With this regard their currents turn awry,
And lose the name of action.

Here is the simplified text:

The question is: is it better to be alive or dead? Is it nobler to put up with all the nasty things that luck throws your way, or to fight against all those troubles by simply putting an end to them once and for all?
….
Fear of death makes us all cowards, and our natural boldness becomes weak with too much thinking. Actions that should be carried out at once get misdirected, and stop being actions at all.

Do the two passages have the same meaning? They convey different senses to me.

Or take the famous passage from Ecclesiastes 9:11 of the Bible. Here is the familiar King James Version:

I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.

And here is the simplified modern language of the New Living Translation:

I have observed something else in this world of ours. The fastest runner doesn't always win the race, and the strongest warrior doesn't always win the battle. The wise are often poor, and the skillful are not necessarily wealthy. And those who are educated don't always lead successful lives. It is all decided by chance, by being at the right place at the right time.

Again, does the simplified passage capture the meaning of the original?

I am not criticizing the quality of the simplifications, although there may be better ones around. If you asked me what Shakespeare's passages mean, I probably would have come out with a more confused meaning than what was given above. But the point is that it is in the process of struggling to understand the author's original meaning that we make individual sense of the passage. I think that the best we can hope for is a shared consensus of the meaning, and we can never hope to exactly enter into the author's mind.

This problem is always present when the US Supreme Court tries to rule on the constitutionality of present-day issues using a document written over two hundred years ago. People who call themselves "strict constructionists" say that the constitution should be interpreted according to the text and the intent of the framers. But how can you glean intent? The text of the document, by itself, is not sufficient, because words can never capture exact meanings. Literary theorist and legal scholar Stanley Fish has an interesting article that is worth reading. In it he says:

It follows that any conclusion you reach about the intention behind a text can always be challenged by someone else who marshals different evidence for an alternative intention. Thus interpretations of the Constitution, no matter how well established or long settled, are inherently susceptible to correction and can always (but not inevitably) be upset by new arguments persuasively made in the right venues by skilled advocates.

This does not mean, however, that interpreting the Constitution is a free-form activity in which anything goes. The activism that cannot be eliminated from interpretation is not an activism without constraint. It is constrained by the knowledge of what its object is - the specifying of authorial intention. An activism that abandons that constraint and just works the text over until it yields a meaning chosen in advance is not a form of interpretation at all, but a form of rewriting.

This is why I am so much a fan of collaborative learning and discussions to tease out meaning. I think you get more out of having a group of people read the original (difficult) text and then argue about what it means, than out of reading a simplified text alone, however 'clear' the latter might be.

Here is a Zen koan:

Hyakujo wished to send a monk to open a new monastery. He told his pupils that whoever answered a question most ably would be appointed. Placing a water vase on the ground, he asked: "Who can say what this is without calling its name?" The chief monk said: "No one can call it a wooden shoe." Isan, the cooking monk, tipped over the vase with his foot and went out. Hyakujo smiled and said: "The chief monk loses." And Isan became the master of the new monastery.

What is the message this koan is trying to convey? The words are simple but the ideas are deep and captured succinctly. I think that it illustrates the point I am making here and I can try and tell you what it means to me, using a lot more words than in the original. But what does it mean to you?

July 26, 2005

Simplifying difficult texts

Some time ago, Aaron Shaffer in his blog expressed his disappointment with the texts he was reading in his philosophy class, particularly the fact that the writers seemed to not take the trouble to be concise, with individual sentences running as long as paragraphs. He felt that this poor writing diminished them in his eyes, since the ability to express one's ideas briefly and concisely demonstrates intellect.

I have been thinking about his comment for some time. I too, on occasion, try to read some philosophy and tend to find it heavy going. The somewhat dense and obscure style of some branches of the arts and humanities (especially the post-modernist philosophers and the area known as cultural studies) led to a notable hoax pulled by the physicist Alan Sokal, who deliberately wrote a meaningless paper disguised with dense language and the jargon common to the field of cultural studies. His article Transgressing the Boundaries: Toward a Transformative Hermeneutics of Quantum Gravity was published in the journal Social Text on the same day that he published an article exposing his hoax. (I will write about the Sokal hoax in a later posting. As is usually the case, the issue was more complicated than it might first appear, and raises serious ethical issues.)

Of course, physicists are not in a good position to throw stones at philosophers because it has long been the case that physics papers have stopped being intelligible to anyone other than those in the same sub-sub-field. But the reason for this is that scientists long ago made the transition from writing for the general public in the form of books, to writing for fellow scientists, using the form of the short paper. Once the link to the public was broken, there was no necessity to try to make oneself intelligible since other scientists know your jargon. Some argue that scientists have carried this too far, which is why the public generally has such a poor idea of what scientists actually do.

But philosophers are still, by and large, writing for a general audience, so why is their writing so hard to understand? Part of the answer is that philosophers are dealing with a difficult subject, very abstract, and this requires very skilful writing to make clear to the non-specialist. Bertrand Russell was pretty good at this but he was an exception.

Some people have tried to tackle this problem by rewriting philosophical works to make them easier to understand. Some time ago, I received an email from a professor of philosophy about his project to simplify the works of philosophers by rewriting them, to remove redundancies, simplify language, etc. But this raises the issue: Can you rewrite someone else's work without introducing distortions?

If we look at the path that ideas take, we can start with an idea in an author's brain. The author's meaning is translated into words, then the words are read by the reader and their meaning recreated in the reader's brain. Ideally we would like the process:

author's meaning ---> written words ---> reader's meaning

to occur with no loss of precision. I think that this ideal cannot be attained because it is intrinsically impossible for words to exactly capture ideas. At best they come close and make a good approximation. The reason that an author may think he/she has expressed an idea exactly is because of the implicit meanings we individually assign to words, in addition to the explicit and agreed upon meanings that we all share.

The reader also uses implicit meanings of words in reconstructing the ideas, but there is no guarantee that the reader's implicit meanings are the same as the writer's. Hence we end up with distortions. The author, if conscientious, tries to find the words and phrases that minimize the amount of implicit meaning and best capture the idea, but this cannot be done with 100% accuracy. The more you try to replace implicit meanings with words, the wordier the writing gets.

So when someone tries to "clarify" the ideas of another author, that introduces a third filter of implicit meanings, and possibly greater distortions. This does not mean that it is not a worthwhile exercise. A good translator might be able to infer the original meaning of the author better than the novice reader can, and render those ideas in a form that makes it easier for the novice reader to understand. But there will always be some element of ambiguity that is intrinsic to the original work. And there is always the danger that the "simplified" work introduces new ideas that the original author did not intend.

In some areas, revisionist writings completely replace the original. For example, in teaching science, we almost never use the original papers of (say) Newton and Einstein. We use textbooks instead that explain the ideas originated by them. The reason for this may be that in modern science, the community works with consensus meanings. The original idea is usually elaborated on and expanded by many others before it becomes the paradigm that the community accepts. This paradigm then represents a kind of consensus scientific view of the field, and people can set about presenting the ideas in a simplified form suitable for novices, which is how textbooks originate. We just give a nod to the originator of the idea, but the idea has ceased to be his or hers alone. When we talk of "Newton's theory" or "Einstein's theory", what we are referring to is not usually exactly what those people may have intended.

But in other areas (such as philosophy) there is no consensus paradigm, and hence no consensus belief structure. So we keep going back to the original sources, trying to tease out what the author intended. While in physics we never refer to someone using quantum mechanics as an "Einsteinian" or a "Bohrian" (two people who had competing interpretations of quantum mechanics) but simply refer to the current consensus view, in other fields such as philosophy it is quite common to refer to someone as a "Kantian" or a "Lockean", and this implies adherence to that person's original views.

I'll write more about this tomorrow.

July 13, 2005

Should professors reveal their views?

During the last academic year, UCITE organized a faculty seminar on whether, and how much, of their own views professors should reveal to the students in their classes.

One faculty member recalled one of her own teachers admiringly. She said that he had guided the discussions in her college classes very skillfully, and in such a way that no one knew what his own views were on the (often controversial) topics they discussed. She felt that his avoidance of revealing his own views led to a greater willingness on the part of students to express their own, since they were not agreeing or disagreeing with an authority figure. She felt that his model was one that others should follow.

Underlying this model is the belief that students may fear that going against the views of the professor might result in them being penalized or that agreeing with the professor might be viewed as an attempt at ingratiation to get better grades.

I am not convinced by this argument, both on a practical level and on principle, but am open to persuasion.

As a purely practical matter, I am not sure how many of us have the skills to pull off what this admired professor did. It seems to me that it would be enormously difficult to spend a whole semester discussing things with a group of people without revealing one's own position on the topics. It is hard to keep aloof from the discussion if one is intensely interested in the topic. As readers of this blog know, I have opinions on a lot of things and if such topics come up for discussion, I am not sure that I have the ability to successfully conceal my views from students. So many of us will betray ourselves, by word or tone or nuances, despite our best efforts at concealment.

But I am also not convinced that this is a good idea even in principle, and I'd like to put out my concerns and get some feedback, since I know that some of the readers of this blog are either currently students or have recently been students.

One concern about hiding my own views is precisely that the act of hiding means that I am behaving artificially. After all, I assume that students know that academics tend to have strong views on things, and they will assume that I am no exception. Those students who speak their minds irrespective of the instructor's views won't care whether I reveal my views or not, or whether they agree with me or not. But for those students for whom my views are pertinent, isn't it better for them to know exactly where I stand so that they can tailor their comments appropriately and accurately, rather than trying to play guessing games and risk being wrong?

Another concern that I have arises from my personal view that the purpose of discussions is not to argue or change people's views on anything but for people to better understand why they believe whatever they believe. And one of the best ways to achieve such understanding is to hear the basis for other people's beliefs. By probing with questions the reasoning of other people, and by having others ask you questions about your own beliefs, all of the participants in a discussion obtain deeper understanding. In the course of such discussions, some of our views might change but that is an incidental byproduct of discussions, not the goal.

Seen in this light, I see my role as a teacher as modeling this kind of behavior, and this requires me to reveal my views, to demonstrate how I use evidence and argument to arrive at my conclusions. I feel (hope?) that students benefit from hearing the views of someone who has perhaps, simply by virtue of being older, thought about these things for a longer time than they have, even if they do not agree with my conclusions. To play the role of just a discussion monitor and not share my views seems to defeat one of the purposes of my being in the classroom.

The fly in the ointment (as always) is the issue of grades. I (of course) think that I will not think negatively of someone who holds views opposed to mine and it will not affect their grades. But that is easy for me to say since I am not the one being graded. Students may not be that sanguine about my objectivity, and worry about how I view them if they should disagree with me.

When I raised this topic briefly with my own class last year, they all seemed to come down in favor of professors NOT hiding their personal views. But I am curious as to what readers of this blog think.

Do you think professors should reveal their views or keep them hidden?

POST SCRIPT 1

The website Crooks and Liars has posted a funny video clip from the Daily Show that addresses how high levels of fear are generated in America, a topic that I blogged about earlier.

This article by John Nichols compares the British response to the tragedy with the way the American media tried to frame it.

POST SCRIPT 2

Also, for those of you struggling to keep up with the complicated set of issues involved with the Valerie Plame-Joseph Wilson-Robert Novak-Judith Miller-Matthew Cooper-confidential journalistic sources issue, there is an excellent article by Robert Kuttner that (I hope) clears things up.

May 05, 2005

David Horowitz and the art of the cheap shot

Oddly enough, just after posting on two consecutive days about David Horowitz's cheap shots against academics, yesterday I received the latest (May 6, 2005) issue of The Chronicle of Higher Education, which featured a long cover story on him. (For someone who is constantly whining about not getting enough attention from academia, Horowitz seems to be extraordinarily successful in getting publications such as this to cover him and his ideas. See Michael Berube's blog for a response.)

Anyway, the Chronicle article has a lot of information about him and it also provides some interesting background information on his funding. So I thought that today I would use that information to try my hand at manufacturing a cheap shot, an art I can learn from the master, David Horowitz himself.

Recall that in my previous postings (see here and here), I showed how he distorts and misrepresents academic life, saying things like: "Shiftless, lazy good-for-nothings? Try the richly paid leftist professors securely ensconced in their irrelevant ivory towers" and again "You teach on average two courses and spend six hours a week in class. You work eight months out of the year and have four months paid vacation. And every seven years you get ten months paid vacation."

Well, the Chronicle uncovered the fact that "Mr. Horowitz received an annual salary of $310,167 in 2003. He declines to give his current income, but in addition to his salary, Mr. Horowitz receives about $5,000 for each of the 30 to 40 campus speeches he gives each year." Horowitz says that college Republicans always invite him. Other student groups never do. "My kids have to scrounge up the money off campus." He drives a 2004 model Lincoln Town Car.

Despite earning the kind of money that most people (including academics) can only dream about, Horowitz still whines. The Chronicle article says

If he were liberal, [Horowitz] contends, he could be an editor at the [New York] Times or a department chairman at Harvard University. And his life story would have already been told on the big screen. Radical Son: A Generational Odyssey, his autobiography, has been out for eight years. "Someone would have made a film out of it if I was a leftist," he says bitterly.
"He claims he would make more money as a liberal, too, "at least three times," what he earns now."

That's right, he claims he would have been earning about a million dollars per year if he were liberal. This is a man who is seriously delusional and needs professional help fast.

And there's more. His Center for the Study of Popular Culture receives millions of dollars from various right wing foundations. The Chronicle article says that: "The center itself is located on the fourth floor of an office building in downtown Los Angeles, but Mr. Horowitz prefers to work from home." Horowitz is quoted as saying: "I love my work space," and "I sit at my desk with my laptop. I listen to music. I take the dogs for a walk. Like most writers, I live in my head."

So here's my attempt at a cheap shot, to show how bits of accurate information can be rearranged for effect. Drum roll, please.

"Shiftless, lazy good-for-nothings? Try the richly paid right wing David Horowitz. He plays these gullible right wing foundations for suckers, taking millions from them in order to pay himself a fat salary just to stay at home, listen to music, and take his dogs for walks, when he is not out driving his fancy expensive cars. The only thing that gets him out of his house is if he is given the opportunity to pocket $5,000 for one hour's work delivering the same old tired speeches, extracting this money from impoverished campus student organizations, who have to struggle desperately to pay the high fees he charges them to support his luxurious lifestyle."

Ok, I'll admit that my cheap shot is not that great and needs considerable refining. But in my defense, I haven't had the years of experience doing this kind of thing that Horowitz has. And I don't aim to either.

May 03, 2005

Why David Horowitz attacks academia - part 2

I have been puzzled by the vehemence of Horowitz's attacks on the academic life. After all, his accusations of faculty laziness are contradicted by actual studies. Jerry A. Jacobs (of the University of Pennsylvania), in his Presidential Address to the Eastern Sociological Society in February 2003 (published in Sociological Forum, vol. 19, #1, February 2004), points out that college faculty work an average of nearly 55 hours per week, while professionals in other fields and managers work about nine hours per week less. His study also found that professors report feeling constantly under stress from work-related pressures.

Of course each profession has its share of people like Wally (the character in the Dilbert comic strip) who do the minimum amount of work expected of them. I am sure academia has its representatives, though I am hard pressed to think of a single one of my colleagues in my whole academic career who comes anywhere close to the Beetle Bailey-like stereotype that Horowitz alleges is the norm.

I do not expect Horowitz to change his message simply because actual data contradicts him. As Graham Larkin (a professor of Fine Arts at Stanford University) points out in his article David Horowitz’s War on Rational Discourse that appeared in the April 25, 2005 issue of Inside Higher Ed, facts have never been an impediment to his diatribes. Horowitz's strategy is to simply repeat things over and over again, even if they have been refuted. Since he is extremely well paid by a host of wealthy right-wing foundations that support organizations that provide him with platforms to keep him in the public view, his charges gain publicity well out of proportion to their actual merit or even their truth content.

It is easy to dismiss Horowitz as a crackpot who uses inflammatory rhetoric to get publicity. But somehow that seems insufficient to me. There is a vehemence to his attacks on academics that seems to require an explanation beyond simple ignorance, or naïveté about what a university is all about and the extent of faculty work outside the classroom.

It is Michael Berube who, I think, nails the best possible reason for Horowitz's bizarre attacks on college faculty. Berube teaches literature and cultural studies at Penn State and writes with a style and wit that I can only envy. Check out his blog to see what I mean.

In his essay Why Horowitz Hates Professors, Berube writes:

I think we’re finally getting to the real reason David hates professors so much. It has nothing to do with our salaries or our working hours: he hates our freedom. Horowitz knows perfectly well that I can criticize the Cockburns and Churchills to my left and the Beinarts and Elshtains to my right any old time I choose, and that at the end of the day I’ll still have a job – whereas he has to answer to all his many masters, fetching and rolling over whenever they blow that special wingnut whistle that only far-right lackeys can hear. It’s not a very dignified way to live, and surely it takes its toll on a person’s sense of self-respect.

Berube is right. Academics have the freedom, as long as they are not being outright offensive or advocating criminal activity or bringing dishonor to their institutions, to take positions on any subject, generally without fear of retribution from their universities. I can support evolution one day and, if I find some convincing reason to switch my views, I can oppose it the next. I can even switch my views without any reason at all, just for the fun of it, and the only loss I suffer is to my credibility. But people like Horowitz have no such freedom. They have to be very sensitive to what their paymasters want and take exactly that line or they get thrown out on their ear.

Actually, this thesis might explain a lot of the animosity that the Third-Tier Pundit™ class has towards academics. All these commentators (and even reporters for the media) have a good sense of what their employers expect from them. It is the very predictability of their stances that gives them access to the media. If they start taking contrary positions and become ideologically unpredictable, they risk losing their jobs. The Coulters, Malkins, and Goldbergs of the world cannot (for example) go beyond extremely mild criticisms of Bush or the Iraq war (even if they wanted to) because to do so would be career suicide.

It is true that there exists a doctrinaire left whose members operate under similar constraints, but they do not have mainstream access and most people have never heard of them. Most of the well-known people whom the mainstream media consider left wing (such as Paul Krugman) are not as constrained in their views, because no left-wing foundations exist on anything like the scale of the right-wing ones.

But academics (like Krugman) and, more recently, independent bloggers have no such constraints. It is because of this very lack of ideological oversight that universities can create new knowledge. It enables faculty and students to explore new ideas wherever they might lead. We are hired for our knowledge of physics or history or law, not for our ideological bent. But we are also expected to be public citizens and contribute to society, and this enables us to take stands on issues that may not be directly related to our academic research interests.

So is Horowitz's crusade driven by faculty envy, as Berube suggests? It makes sense to me. Because even as college professors complain about the amount of work they have to do, I know very few who would switch out of this life and do something else. This is because the faculty life is, in fact, a great life. Horowitz thinks that we enjoy it because we can goof off. But only a person who hates his or her own job would have such a view of what constitutes an ideal working life. An ideal job is one where what we do as work is what we would do for pleasure. And that is what draws people to teaching.

Those of us in academia think it is a great life despite the workload because it is rewarding to grapple with ideas, it is stimulating to work with students who look at things in fresh ways, it is gratifying to solve a research problem, it is exhilarating to publish articles and papers and books and feel that one is contributing to the store of the world's knowledge.

We love our work and cannot imagine doing anything else. And, best of all, we can say what we honestly think about the important issues of the day. This must drive people like Horowitz crazy, and the result is not pretty.

April 15, 2005

Should college presidents take a stand on evolution?

In response to a previous post, Becky posted an interesting comment that I responded to briefly but which requires a more extended reply. (One of the unexpected pleasures of starting this blog is that it has put me in touch again with former students like Becky, who was in my course about eight years ago and is now doing a PhD in Astronomy. Her own very lively blog is well worth a visit.)

Becky pointed me to an interesting article that was posted on the blog of the editors of Scientific American, entitled Cowardice, Creationism and Science Education: An Open Letter to the Universities.

At a dinner with the presidents of about a dozen private and state universities, John Rennie (one of the editors of Scientific American) and Scott Jaschik (editor of Inside Higher Ed) asked the assembled presidents the following:

Suppose we have a petition here that says, “As university presidents, we affirm that evolution by means of natural selection is a demonstrated fact of science. We also assert that any failure to teach evolution, or to teach ‘intelligent design’ as an alternative theory, harms students’ educational standing.” Who here would not sign, and why?

Rennie continues: "Disappointingly, not one of the presidents in attendance was willing to go on the record as supporting such a petition. When they could finally be drawn out on why, their answers were equally unsatisfying."

He concludes: "Let’s not tiptoe around the truth. University presidents are afraid to speak out in favor of evolution because they know that they will antagonize anti-evolution Christians."

I think he is being too harsh. It may well be that the presidents were trying to duck the issue, knowing full well that they have to deal with a whole slew of constituencies (current students, faculty, alumni, donors, legislators, and so on) and that any stand they take on such an issue is bound to cause them some grief.

But I think that there also exists a principled reason for not taking a stand on issues such as evolution, and I was surprised that none of the college presidents present offered it.

I do not think it is the role of college presidents to take stands on this kind of specific issue. College presidents should not have to take positions on the pressing issues of the day, however clear cut they might seem to us. If they take a stand on the issue of evolution, then they would be expected to take stands on a whole range of other political and social issues and the process would never end. They would be just churning out press releases all day.

Where they should take stands is in support of the basic mission of the university, which is to provide a place for scholars and students to seek, create and disseminate knowledge, in an atmosphere of collegiality, and free from coercion or political pressure. Their goal should be to protect the right of their students and faculty to pursue knowledge in as unfettered an atmosphere as is possible, so that the university's mission can be realized.

Thus they can, and should, be expected to take a stand on those issues that directly affect the health of universities. So for example, taking a stand on Ohio's Senate Bill 24 is fine. Taking a stand on affirmative action in admissions is also fine. Taking a stand on issues of discrimination and harassment in universities is fine. All these issues go to the core of what universities stand for. There may be tactical reasons for not always staking out a public position on some of these, but it would be quite appropriate to do so.

But I cannot see anything special about the evolution/creationist split that requires a college president to articulate a position. While I find it bizarre that 45% of Americans can still, in this day and age (according to a Gallup poll in November 2004), believe that "God created man in present form within the last 10,000 years," I don't see why that should trigger a specific comment from college presidents, any more than the equally disturbing fact that 44% believe that several of the hijackers who attacked the U.S. on September 11 were Iraqis. (Here's a question for a sociological study: Are the two groups of people actually one and the same?)

Taking a stand on specific issues that affect particular scientific or other academic struggles should be left to individual faculty members and students or their representative bodies. What college presidents should do is protect those faculty and students who do take stands on evolution or other similar issues (whichever side they support) from retribution from politicians and interest groups who try to limit the exercise of free inquiry or to prevent members of the academic community from making scholarly judgments.

So I think we should give college presidents a break on this one.

April 06, 2005

Politics in the Universities

There has been a lot of play in the media recently about the so-called liberal tilt of university faculty. Let's see what the actual numbers are. As far as I can tell, the most comprehensive and authoritative data comes from HERI (Higher Education Research Institute) based in the UCLA Graduate School of Education and Information Studies, which has been studying trends in higher education for a long time.

HERI's 2001-2002 report on national norms for college teachers finds that "34 percent of college and university faculty identify as 'middle-of-the-road' politically (down from 40 percent in 1989). Although the percentage of faculty identifying as 'conservative' or 'far right' (18 percent) has changed very little, the percentage identifying as either 'liberal' or 'far left' has grown from 42 percent to 48 percent" since the previous survey in 1989.

It turns out that women faculty are more liberal than men. The report finds that "54 percent of women, compared to only 44 percent of men, identify as politically 'liberal' or 'far left.' In 2001, 21 percent of male professors and 14 percent of female professors defined their political views as either 'conservative' or 'far right.'"

The report continues:

The latest survey involved 55,521 faculty and administrators at 416 colleges and universities nationwide. Of those, questionnaires from 32,840 full-time undergraduate teaching faculty at 358 institutions were used to compute the national norms. The numbers were adjusted statistically to represent the nation's total population of approximately 442,000 college and university faculty.

So those are the numbers. What are we to make of them? Is this imbalance in political leanings a sign of blatant political discrimination in the hiring of university faculty?

(At this point I have to reiterate my own belief that the terms 'liberal', 'conservative', 'Republican', 'Democrat' have ceased to have much meaning in terms of defining coherent political philosophies, but since this discussion and the data are framed in those terms, I have little choice but to use them for this post.)

The conclusion that there is hiring discrimination does not follow automatically. For one thing, the word 'liberal' in university circles does not have the same meaning it has outside. A 'liberal education' is what universities strive to provide for their students, in contrast to a 'vocational education'. To call someone a 'liberally educated person' is not to describe his or her political beliefs but to describe a person with breadth of knowledge and depth of understanding, as opposed to someone who has acquired a fairly specific set of knowledge and skills in order to practice a trade or profession. So the word 'liberal' has a well-defined and valued meaning in universities, and one would expect people to want to identify with it.

Another point is that while it is true that universities have intense political struggles, they are based on parochial academic politics, and those divisions do not parallel national political splits. In academic departments the biggest battles over a new hire are likely to be based on field of study (in physics, it might be whether the department wants to grow the condensed matter field or the astrophysics field, or whether it should be a theoretician or an experimentalist) or rank (whether they want to hire a promising newcomer or an established star), and so forth. Similar battles occur in other departments.

These battles can be quite hard-fought, but leave little room for other considerations based on party affiliation and the like. Those are not considered important. The prestige of a physics department depends on the physics knowledge it produces, not on the ideological spectrum its faculty encompasses. No department is likely to hire an incompetent researcher to a rare and potentially lifetime appointment just on the basis of that person's party political affiliation.

But if national political considerations are not the cause of this difference in political leanings in universities, what could be the cause? I am not aware of any studies that have looked carefully at this causal question. But people have been willing to speculate.

Jennifer Lindholm, associate director of the Higher Education Research Institute's Cooperative Institutional Research Program and lead author of the faculty survey said: "The disproportionately greater shift we see toward liberal political views among women faculty may be attributable to their dissatisfaction with the Republican Party's current position on issues that often impact women's lives more directly such as abortion, welfare and equal rights."

Writing in the New York Times on April 5, columnist and Princeton economist Paul Krugman points out that registered Republicans are almost as rare in the hard sciences and in engineering (where clues as to one's political affiliation are hard to discern) as in the social sciences, suggesting that the explanation lies in more subtle causes.

Krugman postulates that "One answer is self-selection - the same sort of self-selection that leads Republicans to outnumber Democrats four to one in the military. The sort of person who prefers an academic career to the private sector is likely to be somewhat more liberal than average, even in engineering."

But the more serious charge that he levels is that the Republican Party (and by association the conservative movement) is making itself unappealing to academics by taking stands on issues that ignore evidence and that are anti-research. He points to a recent April Fools' Day spoof editorial by Scientific American entitled O.K., We Give Up, in which the magazine apologized for endorsing the theory of evolution just because it's "the unifying concept for all of biology and one of the greatest scientific ideas of all time," saying that "as editors, we had no business being persuaded by mountains of evidence." It also conceded that it had succumbed "to the easy mistake of thinking that scientists understand their fields better than, say, U.S. senators or best-selling novelists do."

Krugman continues:

Scientific American may think that evolution is supported by mountains of evidence, but President Bush declares that "the jury is still out." Senator James Inhofe dismisses the vast body of research supporting the scientific consensus on climate change as a "gigantic hoax." And conservative pundits like George Will write approvingly about Michael Crichton's anti-environmentalist fantasies.
Think of the message this sends: today's Republican Party - increasingly dominated by people who believe truth should be determined by revelation, not research - doesn't respect science, or scholarship in general. It shouldn't be surprising that scholars have returned the favor by losing respect for the Republican Party.

Krugman argues that such an anti-research message is unappealing to any academic (whatever their political stripe), and so it should be no surprise that academics are distancing themselves from it. When Dennis Baxley, a state legislator from Florida who has introduced in that state a bill similar to Ohio's Senate Bill 24, cites professors who teach that evolution is a fact as a prime example of "academic totalitarianism", he should not be surprised that serious academics start giving him a wide berth.

As I said in an earlier post, universities are ultimately reality-based communities, which depend on evidence as an essential part of their knowledge structure. Academics in any field respect the fact that scholars in other fields also use evidence in reaching their conclusions. They may not know another field in any detail, but they tend to respect the way its scholars reach their conclusions and trust that those conclusions can be backed up with evidence if called upon. The fact that their conclusions are evidence-based does not make them infallible, of course; it just means that they are grounded in reality.

Academics also suspect that the people who are upset about biology professors teaching that evolution is a fact are closely aligned with those who think that the Earth is only 6,000 years old and that Adam and Eve are historical figures. They suspect that the current attack on biology teaching is just the precursor to similar attacks on geology, physics, anthropology, archeology, and everything else that challenges a particular religious revelatory interpretation of the world.

Krugman argues that it should not be surprising that overtly linking such a world-view to a political movement should result in that movement losing ground in universities, even though it might be politically advantageous.

As I said, I don't know of any studies that have examined the causal reasons for this seeming ideological imbalance, but Krugman makes a point that is worth considering seriously.

March 28, 2005

What makes us change our minds?

In the previous post, I described the three kinds of challenges teachers face. Today I want to discuss how teachers might deal with each case.

On the surface, it might seem that the first kind of challenge (where students do not have much prior experience, explicit or implicit, with the material being taught and don’t have strong feelings about it either way) is the easiest one. After all, if students have no strong beliefs or prior knowledge about what is being taught, then they should be able to accept the new knowledge more easily.

That is true, but the ease of acceptance also has its downside. The very act of not caring means that the new knowledge goes in easily but is also liable to be forgotten easily once the course is over. In other words, it might have little lasting impact. Since the student has little prior knowledge in that area, there is little in the brain to anchor the new knowledge to. And if the student does not care about it one way or the other, then no effort will be made by the student to really connect to the material. So the student might learn this material by mostly memorizing it, reproduce it on the exams, and forget it a few weeks later.

The research on the brain indicates that lasting learning occurs when students tie new knowledge to things they already know, when they integrate it with existing material. So teachers of even highly technical topics need to find ways to connect the material with students’ prior knowledge. They have to know their students: what interests them, what concerns them, what they care about. This is why good teachers tie their material in some way to stories or topics that students know and care about, to items in the news, or to controversies. Such strategies tap into the existing knowledge structures in the brain (the neural networks) and connect the new material to them, so that it is more likely to ‘stick.’

The second kind of challenge is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. A teacher who does not take these existing beliefs into account when designing teaching strategies is likely to be wasting her time. Because these beliefs are strongly, though unconsciously, held, they are not easily dislodged or modified.

The task for the teacher in this case is to make students aware of their existing knowledge structures and their implications for understanding situations. A teacher needs to create situations (say experiments or cases) and encourage students to explore the consequences of their prior beliefs and see what happens when those beliefs are confronted by new experiences. This has to be done repeatedly, in newer and richer contexts, so that students realize for themselves the existence and inadequacy of their prior knowledge structures and become more accepting of the new knowledge structures and theories.

In the third case, students consciously reject the new ideas because they are aware that those ideas conflict with views they value more (for whatever reason). In such cases, there is no point trying to force or browbeat them into accepting the new ideas.

Does this mean that such people’s ideas never change? Obviously not. People do change their views on matters that they may have once thought were rock-solid. In my own case, I know that I now believe things that are diametrically opposed to things that I once thought were true, and I am sure that my experience is very common.

But the interesting thing is that although I know that my views have changed, I cannot tell you when they changed or why. It is not as if there was an epiphany where you slap your forehead and exclaim “How could I have been so stupid? Of course I was wrong and the new view is right!” Rather, the process seems more like being on an ocean liner that is turning around. The process is so gentle that you are not aware that it is even happening, but at some point you realize that you are facing in a different direction. There may be a moment of realization that you now believe something that you did not before, but that moment is just an explicit acknowledgment of something that you had already tacitly accepted.

What causes the change could be many factors – something you read, a news item, a discussion with a friend, some major public event – whose implications you may not be immediately aware of. But over time these little things lodge in your mind, and as your mind tries to integrate them into a coherent framework, your views start to shift. For me personally, I enjoy discussions of deep ideas with people I like and respect. Even if they do not have any expertise in this area, discussions with such people tend to clarify one’s ideas.

I can see that process happening to me right now with the ideas about the brain. I used to think that the brain was quite plastic, that any of us could be anything given the right environment. I am not so sure now. The work of Chomsky on linguistics, the research on how people learn, and other bits and pieces of knowledge I have read have persuaded me that it is not at all clear that the perfectly-plastic-brain idea can be sustained.

On the other hand, I am not convinced that the socio-biological views of E. O. Wilson and, more recently, Steven Pinker, who seem to argue that much of our brains, attitudes, and values are biologically determined by evolutionary adaptation, are correct either. That seems to me to be too pat and too much like Kipling’s Just-So Stories, where the fictional characters accepted the present state of affairs as ‘normal’ and biologically determined, and concocted fanciful tales to ‘explain’ how it came about. I am always skeptical of theories that try to make the status quo seem ‘natural’ and just. It seems very convenient for those who benefit from that status quo.

It seems reasonable that some structures of the brain, especially the basic ones that enable it to interpret the input from the five senses, and perhaps even learn language, must be pre-existing. But I am not convinced that the more sweeping claims, such as that men are better than women at math or that women are more nurturing than men or that our behaviors can be explained by the desire to maximize the spread of our own genes, are biologically determined.

So I am currently in limbo as regards the nature of the brain, mulling things over. At some point I might arrive at some kind of unified and coherent belief structure. And after I do so, I may well wonder if I ever believed anything else. Such are the tricks the brain can play on you, to make you think that what you currently believe is what is correct and what you always believed.

March 25, 2005

The purpose of teaching

I have been teaching for many years and encountered many wonderful students. I remember in particular two students who were in my modern physics courses that dealt with quantum mechanics, relativity, and cosmology.

Doug was an excellent student, demonstrating a wonderful understanding of all the topics we discussed in class. But across the top of his almost perfect final examination paper, I was amused to see that he had written, “I still don’t believe in relativity!”

The other student was Jamal, and he was not as direct as Doug. He came into my office a few years after the course was over (and just before he was about to graduate) to say goodbye. We chatted awhile, I wished him well, and then as he was about to leave he turned to me and said hesitantly, in his characteristically shy way: “Do you remember that stuff you taught us about how the universe originated in the Big Bang about 15 billion years ago? Well, I don’t really believe all that.” After a pause he went on, “It kind of conflicts with my religious beliefs.” He looked apprehensively at me, perhaps to see if I might be offended or angry or think less of him. But I simply smiled and let it pass. It did not bother me at all.

Why was I not upset that these two students had, after having two semester-long courses with me, still not accepted the fundamental ideas that I had been teaching? The answer is simple. The goal of my teaching is not to change what my students believe. It is to have them understand what practitioners in the field believe. And those are two very different teaching goals.

As I said, I have taught for many years. And it seems to me that teachers encounter three kinds of situations with students.

One is where students do not have much prior experience (either explicitly or implicitly) with the material being taught and don’t have strong feelings about it either way. This is usually the case with technical or highly specialized areas (such as learning the symptoms of some rare disease or applying the laws of quantum mechanics to the hydrogen atom). In such cases, students have little trouble accepting what is taught.

The second type of situation is where students’ life experiences have resulted in strongly held beliefs about a particular knowledge structure, even though the student may not always be consciously aware of having such beliefs. The physics education literature is full of examples of how our life experiences conspire to create in people an Aristotelian understanding of mechanics. This makes it hard for them to accept Newtonian mechanics. Note that this difficulty exists even though the students have no particular attachment to Aristotle’s views on mechanics and may not have the faintest idea what they are. Overcoming this kind of implicit belief structure is not easy. Doug was an example of someone who had got over the first hurdle, from Aristotelian to Newtonian mechanics, but was finding the next transition, to Einsteinian relativistic ideas, much harder to make.

The third kind of situation is where the student has strong and explicit beliefs about something. These kinds of beliefs, as in the case of Jamal, come from religion or politics or parents or other major influences in their lives. You cannot force such students to change their views and any instructor who tries to do so is foolish. If students think that you are trying to force them to a particular point of view, they are very good at telling you what they think you want to hear, while retaining their beliefs. In fact, trying to force or bully students to accept your point of view, apart from being highly unethical teaching practice, is a sure way of reinforcing the strength of their original views.

So Doug’s and Jamal’s rejection of my ideas did not bother me and I was actually pleased that they felt comfortable telling me so. They had every right to believe whatever they wanted to believe. But what I had a right to expect was that they had understood what I was trying to teach and could use those ideas to make arguments within those frameworks.

For example, if I had given an exam problem whose solution required the student to demonstrate an understanding of relativistic physics, and Doug had refused to answer the question because he did not believe in relativity, or had answered it using his own private theories of physics, I would have had to mark him down.

Similarly, if I had asked Jamal to calculate the age of the universe using the cosmological theories we had discussed in class, and he had instead said that the universe was 6,000 years old because that is what the Bible said, then I would have had to mark him down too. He is free to believe what he wants, but the point of the course is to learn how the physics community interprets the world, and to be able to use that information.
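
As an aside, for readers curious about what such a calculation involves, here is a minimal sketch. The simplest estimate of the age of the universe is the Hubble time, the inverse of the Hubble constant. The value of the Hubble constant used below (about 70 km/s/Mpc) is an illustrative round number, not necessarily the one we used in class:

```latex
% Hubble time: invert the Hubble constant, converting Mpc to km.
% 1 Mpc = 3.09 x 10^19 km; 1 year = 3.16 x 10^7 s.
t_H = \frac{1}{H_0}
    = \frac{3.09 \times 10^{19}\ \text{km/Mpc}}{70\ \text{km/s/Mpc}}
    \approx 4.4 \times 10^{17}\ \text{s}
    \approx 1.4 \times 10^{10}\ \text{years}
```

The point is that a student is expected to be able to carry out this kind of reasoning within the framework taught, whatever he or she privately believes; more detailed cosmological models refine the estimate but do not change its order of magnitude.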

Understanding this distinction is important because it is this type of misunderstanding of the purpose of education that leads to things like Senate Bill 24, which seems to assume that students are like sheep who can be induced to believe almost anything the instructor wants them to, and thus require legal protection. Anyone who has taught for any length of time and has listened closely to students will know that this is ridiculous. It is not that students are uninfluenced by teaching and never change their minds, but that the process is far more complex and subtle than it is usually portrayed. (This is a topic I will come back to in a later posting.)

My own advice to students is: “Listen carefully and courteously to what knowledgeable people have to say, learn what the community of scholars thinks about an issue, and be able to understand and use that information when necessary. Weigh the arguments for and against any issue, but ultimately stand up for what you believe and, even more importantly, know why you believe it. Don’t ever feel forced to accept something just because some ‘expert’ (whether teacher, preacher, political leader, pundit, or media talking head) tells you it is true. Believe things only when they make sense to you and you are good and ready for them.”

Can ethical behavior be legislated?

If there is one underlying idea that drives the effort to pass Ohio’s Senate Bill 24, it seems to be the idea that college faculty cannot be trusted to behave ethically in their dealings with students, in what they teach and how they assess and grade.

College faculty are probably no better or worse than other people in their ethics. But in my experience, both university administrators and faculty know that it is in their interest to have everyone behave ethically. This bill ignores the fact that remedies are already available, within the universities and in the courts, for the most egregious violations of ethics, and instead tries to micromanage ethical behavior by detailing what can and cannot be read, taught, discussed, and examined in each course.

I am not convinced that people can be forced to behave ethically. The presence of rules can prevent the more obvious or overt forms of unethical behavior, but it cannot completely eliminate them. For example, we know that there are laws on the books, and official university policies, that prohibit discrimination against people based on their gender, ethnicity, or religion. We also know that we have to make reasonable accommodations for people with disabilities.

But do we really believe that the existence of such laws and policies has eliminated discrimination? People who want to can always find ways within the laws to discriminate against people. In fact, the creation of lots of rules might work against more ethical behavior because it shifts the burden of proof. Now someone can say that as long as they are following the rules, they are behaving ethically, although they may be violating the spirit of ethical behavior.

People who value ethical behavior will, if left to themselves, of their own accord go beyond the letter of the law. Putting a lot of rules and regulations around them is likely to create resentment and hostility and a rule-following mentality.

For example, there are lots of things that instructors can do to help or hinder students that cannot be governed by rules. How an instructor responds when a student asks a question in class or asks for assistance outside of class can have a huge impact on a student’s attitude and learning. The amount of encouragement an instructor gives, the level of guidance the instructor provides, even the letters of recommendation that they write, are very important for students, but such things cannot be legislated.

The same thing applies to students. An instructor who puts in a lot of rules designed to ‘make students learn’ is, in my opinion, doing something counterproductive. When confronted with a lot of rules and requirements, most students will simply do what is asked and no more. What an instructor should do is try to create the conditions that make students want to learn and then give them the resources to do so. Learning is an inherently voluntary act, and you cannot force people to learn any more than you can force them to act ethically.

There will be the rare student who will abuse this freedom, just as there will be the rare professor who abuses the freedom given to him or her. But they have to be treated as special cases and dealt with accordingly. Putting in a lot of rules to take care of such isolated cases results in the learning experience being spoiled for everyone else.

In my own experience most, if not all, students react very positively to being entrusted to take charge of their own learning. Our goal in universities should be to create students who are self-directed and ethical learners, people who enjoy learning even when no one is looking over their shoulders, and to encourage faculty to trust students and be ethical in their dealings with them.

How can people learn to achieve this higher level of self-direction if they are always viewed with suspicion and constrained by detailed rules? What we should be aiming for are fewer rules, not more.

March 22, 2005

What should we teach?

I tend to be one of those ‘glass-half-full’ kind of people. Maybe it is because of my fundamental sense of identity as a teacher. I see most things, even things that I do not agree with, as possible ‘teachable moments’ that can be used to reach a deeper understanding of issues. This is why, even though I think that so-called intelligent design (ID) theory is not science, discussing why this is so can lead to a deeper understanding of the nature of science.

The same is true of the attempts to legislate a so-called “academic bill of rights” for students, to supposedly protect them from alleged abuse by college professors, which in Ohio is taking the form of Senate Bill 24, currently pending in committee. While I think this is a really bad idea, articulating why can lead to fruitful discussions on what education should be like.

Last Thursday I was on a panel that met to discuss the implications for universities if such a bill were to be enacted. (Thanks to Veronica of the Case ACLU for organizing it.) A mix of faculty and students met over the inevitable pizza to discuss the issue.

As I said in my opening comments to the group, it is my belief that it is in just such informal gatherings, where faculty and students discuss issues of mutual interest, that real learning occurs. We should have a lot more such gatherings and fewer structured courses in college. But since formal courses and grades are a seemingly unchangeable component of the current educational structure, what we should try to do is replicate as much as possible this kind of informal atmosphere in our formal courses.

This means that we should, as far as possible, move away from highly detailed syllabi and course requirements, and allow for more flexibility, so that the direction each course takes can be driven by the shared interests of students and faculty while still maintaining the integrity of the overall curriculum. Of course, it is only in small-enrollment courses (say with fifteen students or fewer) that achieving this kind of consensus becomes feasible, and in my own small-enrollment SAGES course I have been moving in this direction and will keep doing so.

With large enrollment courses, however, many of the course and curricular decisions have to be made even before the course begins, in order to manage the logistical issues. But even there we should try to build in room for as much flexibility as possible.

I argued in a previous posting that things like Senate Bill 24 will move things in the opposite direction, in effect writing curricula and mandating what should be in syllabi and exams, and the mind boggles as to where this can lead. For example, section A of the bill says that “curricula and reading lists in the humanities and social studies shall respect all human knowledge in these areas and provide students with dissenting sources and viewpoints.” This immediately raises problems of interpretation and enforcement.

For example, when discussing history, can the instructor assume that the concentration camps of World War II are an established fact, or is he or she obliged to also provide readings by Holocaust deniers and use class time to discuss their ideas? If the instructor does not do so, does a student who does not believe that the Holocaust occurred have grounds for complaint?

Also, Marxist economics and social theory are not taught much in US universities, although they have had a major influence worldwide. Should instructors be forced to include more of them and to analyze each topic in the light of what these theories say? If an economics course ignores Marxist theory, does a student have grounds for complaint? And even the terms “Marxist economics” or “capitalist economics” are open to many interpretations, with diverging schools of thought. Which schools of thought are worthy of inclusion?

If a student does complain in either of the above situations, who should be the judge of whether the instructor acted appropriately or not? Who gets to decide what is worthy and not worthy of inclusion? It is not hard to see that this kind of thing can lead to a bureaucratic nightmare.

What this bill does is infantilize faculty and students. It assumes that faculty cannot be trusted to exercise their trained judgment on what should and should not be allowed in curricula, and that students are not capable of judging when their professors are doing their job well. This bill also underestimates students’ ability to hold on to their beliefs in the face of opposing views, a topic I will discuss further in future postings.

Other panelists addressed the political and legal implications. I learned from Professor Durschlag some very interesting information about how the US Supreme Court has in the past interpreted the First Amendment's application to university education and the precedents that have been set. I will write about that at a future date, when I get hold of the actual ruling. It involves a trek to the law school library.

POST SCRIPT

There is an interesting post and discussion going on at Research in Progress about the Lawrence Summers controversy over the representation of women in academia and the professions, and the connection to Steven Pinker's work and his talk at Case last week. You really should visit.

Update: There is also now a new post on the topic, also well worth reading.

March 21, 2005

Safe Zones

As you enter my office, directly across from the door is a bulletin board and on it is a little sticker. It has the words ‘SAFE ZONE’ in large purple letters over an inverted pink triangle background.

It was given to me by the Spectrum group at Case which, according to its website, seeks to “provide an environment where GLBTQQIA (gay, lesbian, bisexual, transgender, queer, questioning, intersexed, and allied) persons can socialize, learn, and grow.”

The sticker on my bulletin board is meant to be a signal that a student who fits into any of those categories can let me know without fearing any adverse or hostile reaction from me.

I have to say that I feel a little sad whenever my eye falls on that sticker. Have we come to this, that we have to publicly announce zones of safety for people for no other reason than their sexual orientation? Shouldn’t that be something that is taken for granted? The fact that it is not is a sign of how far we are from creating a tolerant society.

I have never quite been able to understand why some people get so upset by other people’s private lives. Yes, I can understand that because of your own religious beliefs or culture or upbringing or whatever there are certain things that you personally might not approve of. But you are always free not to do them. But why should the private lives of other consenting adults, even total strangers, matter to you?

And yet, it seems that many people are concerned about just such things. To me, one of the more disturbing features of last November's election was the adoption of so many anti-gay measures across the nation. In Ohio, Issue 1, which sought to prohibit gay couples from getting some of the benefits that married heterosexual couples take for granted, was adopted by 62% to 38%, an alarmingly large margin.

It seems pretty clear that there are at least two groups who currently run the risk of open discrimination – non-heterosexuals and Arabs/Muslims. It seems to be perfectly acceptable to say disparaging things against either of these two groups without being shamed or called to account.

When it comes to Arabs, for example, Third-Tier Pundit™ Hall of Famer Ann Coulter recently referred in her column to veteran journalist Helen Thomas as “that old Arab.” James Wolcott speculates as to the outrage that would ensue if that kind of language were applied to other groups. And Coulter's fellow traveler on the Third-Tier Pundit™ circuit, Michelle Malkin, with her approval of the internment of ethnic Japanese during World War II and her advocacy of racial, religious, and nationality profiling today, provides another example of this appalling tendency to single out specific groups for discriminatory treatment.

Back to the issue of ‘safe zones’: I am not naïve. I know that people who are not ‘straight’ run the risk of being discriminated against, or much worse, in the broader society, and that they are justified in being cautious about who knows about them. But it is a little disheartening that even in a university there is this fear of intolerance. A university should be different, even though it is populated by the same kinds of people as elsewhere, because the university contains something that does not exist outside in any organized way, something that should act as a uniting force overcoming the friction and divergence that differences can cause.

This unifying force is the love of learning and a respect for academic values that universities are built upon. If we immerse ourselves in that shared love of learning, then we will find that people who are sometimes very different from us can be the very sources of our own intellectual, spiritual, and moral growth.

In a university you will find people who are different in many ways, not just in terms of their sexual orientation. It is such individual differences that make life so interesting and enjoyable, and these same qualities have been the fuel for some of the most creative people who ever lived. Our society, and our universities, should find room for all these people and not seek to strip them of their distinctiveness and make them conform to some idealized ‘norm.’

In other words, we need to make the whole university a safe zone for everyone.

March 15, 2005

Universities as a reality-based community

In a previous posting I described the disturbing phenomenon that so many Americans seemed to be living in a reality-free world. I argued that this was because they were being systematically misled by people who should, and do, know better.

Further support for my somewhat cynical view comes from an article by former Wall Street Journal reporter Ron Suskind that appeared in the October 17, 2004 New York Times Magazine and that deserves to be better known because of the light it sheds on the extent to which the current administration is ideologically driven. His article has this chilling anecdote:

"In the summer of 2002, after I had written an article in Esquire that the White House didn't like about Bush's former communications director, Karen Hughes, I had a meeting with a senior adviser to Bush. He expressed the White House's displeasure, and then he told me something that at the time I didn't fully comprehend - but which I now believe gets to the very heart of the Bush presidency.

"The aide said that guys like me were 'in what we call the reality-based community,' which he defined as people who 'believe that solutions emerge from your judicious study of discernible reality.' I nodded and murmured something about enlightenment principles and empiricism. He cut me off. 'That's not the way the world really works anymore,' he continued. 'We're an empire now, and when we act, we create our own reality. And while you're studying that reality - judiciously, as you will - we'll act again, creating other new realities, which you can study too, and that's how things will sort out. We're history's actors . . . and you, all of you, will be left to just study what we do.'"

What you have on display here is a world-view that is so arrogant that it believes that it has the power to create its own realities.

It is not unusual in the heyday of empires for their leaders to have the feeling that they alone can direct the course of events, that they can overcome the realities they face, and that nothing can stop them from achieving their goals, whatever they may be.

What is perhaps extreme in this case is that this arrogance seems to be causing the leaders to ignore the actual realities and to think that they can create their own version of it. In other words, they believe that what they want to believe actually exists. Now, in some ways, it is always possible to do this. Reality is a complex business, composed of many disparate elements, and it is always possible to pick out those elements that support one's fantasy, ignore the rest, and act accordingly.

But what is happening here is deeper and more disturbing. What this administration is doing is trying to make reality irrelevant by creating an alternate "reality." They do this by quickly and repeatedly and strongly saying the things that they wish the public to believe are true and depending on the media or the Democratic Party to not call them on the lack of support for the assertions. As a result, after a short time, the administration's assertions enter the public consciousness, become the new "reality", and thus become the basis for vacuous 'policy debates' that have nothing to do with the actual situation.

We saw this happen in the run-up to the war with Iraq and we are seeing it again with the recent killing in Lebanon of Rafik Hariri. Using a combination of innuendo and bombast, the administration has managed to make people think that Syria is the culprit even though, until today, no evidence in support of this claim has been presented and there even exists some counterevidence. On the other hand, Robert Fisk reports today that the UN investigation team is due to make a report that will allege that there may have been a cover-up of the investigation by Lebanese and Syrian authorities, so that the situation is still murky.

What most reality-based people realize is that while forcing your own version of reality on events can win you short-term political victories, it is a prescription for long-term disaster, because eventually the contradiction between the 'virtual reality' and reality becomes too stark for your actions to remain viable. The "judicious study of discernible reality," sneered at by the senior Bush advisor, is the way to arrive at reasoned judgments that have a chance of producing policies that make sense.

In many ways, universities have to be reality-based. The work of universities rests on empirical bases, on data, on evidence. This does not mean that they restrict themselves to describing just what is. Speculative ideas are the life-blood of academia because that is how new knowledge is created. Making bold speculations and pushing the limits of theories is part of the job of universities.

But such efforts must always rest on an empirical basis because otherwise they cease to be credible. You can build on reality, but you can't totally depart from it. Academics know that their credibility rests on their ability to balance speculation and theorizing with empirical data. For example, a physicist who proposes theories that do not have a basis in data would be ridiculed.

But no such constraints seem to restrain the current political leaders. At one time, the media might have played the role of injecting reality into the public discussion, by comparing official statements with the facts on the ground and providing historical context. But now that the press has largely abdicated that role (see the previous postings on The questions not asked part I and part II) in favor of either acting as a mouthpiece for the fantasies of political leaders or debating tactical points while not questioning the core fantasies, it is up to the universities to fill that void.

This is why efforts like Ohio's Senate Bill 24, which seek to restrict what university instructors can and cannot say, are so dangerous. They seek to bring universities, too, under political control, to suffocate one of the few remaining viable reality-based institutions. While opponents decry universities as being too "liberal", what really makes universities "dangerous" is that they are fundamentally reality-based institutions that cannot be easily co-opted into accepting fantasies as reality.

It seems ironic that universities, long derided as ivory towers occupied by pointy-headed intellectuals out of touch with the "real world", may in fact need to be the force that brings reality back into public life.

POST SCRIPT 1

On Thursday (May 17) in the Guilford Parlor, from 11:30 am to 1:00 pm, there will be a forum on Ohio's Senate Bill 24 (the so-called academic bill of rights). I will be on the panel along with Professor Mel Durschlag (Law), Professor Jonathan Sadowsky (History), and Professor Joe White (Political Science).

POST SCRIPT 2

Update on a previous posting:

I received a call yesterday (March 14) from a person associated with Students for Academic Freedom informing me that my op-ed had triggered the release of more detailed information on their website.

Although the student referred to had not in fact given this testimony at the Colorado Senate hearings, as had been alleged earlier, the level of detail (which had not been released until now) provided on the SAF website is sufficient to remove this story from the category of urban legends, since it does give some names, places, and dates. But a judgment on whether this constitutes academic bullying will have to await more details on what actually transpired between professor and student. My contact at SAF says that the incident is still under investigation and that confidentiality prevents the release of more information.

Update on the update (3/15/05): It gets curiouser and curiouser.

The blog Canadian Cynic reports that new information on this case has come out and that Horowitz is now backtracking on almost all of the key charges that were originally made. Canadian Cynic highlights Horowitz's statements that "Some Of Our Facts Were Wrong; Our Point Was Right" and "I consider this an important matter and will get to the bottom of it even if it should mean withdrawing the claim."

See the article on the website Inside Higher Ed, which seems to be the most authoritative source of information on this case.

March 09, 2005

The purpose of college

Why go to college?

For some, college is just a stage in the educational ladder after high school and before entering the working world or going to graduate school. In this view, college is primarily the place where you obtain an important credential that is the pre-requisite for securing well-paying jobs. This is not an insignificant consideration.

Others might see college as the place where you both broaden and deepen your knowledge in a range of subjects and develop higher-order skills such as critical thinking, writing, and research.

All these things are undoubtedly valuable and worth pursuing. But for me, the primary purpose of college is that it is the place where you start to lay the foundations for a personal philosophy of life.

What I mean by this is that at least in college we need to start asking ourselves the question: "Why do I get up in the morning?" For some, the answer might be "Why not? What other option is there?" For others, it might just be an unquestioned habit. For yet others, it might be that they have particular ambitions in life that they want to achieve. And for still others, it might be because other people depend on them to do various things.

But while all these considerations undoubtedly play a part for all of us, the question that I am addressing goes somewhat beyond that and asks what we think of as our role in the universe. What is it that gives our lives meaning? What should be the basis of our relationships with our family and friends and society? What is our obligation to all those to whom we are tied together by a common humanity? What should be our relationship with nature and the environment?

All of us think about these things from time to time. But I suspect that these various areas of our lives remain somewhat separate. By 'developing a personal philosophy of life', I mean the attempt to pull together all these threads and weave a coherent tapestry where each part supports and strengthens the other.

I think that the university is a wonderful place to start doing this because it has a unique combination of circumstances that can, at least in principle, enable this difficult task to be pursued. It has libraries, it has scholars, it has courses of study that can enable one to explore deeply into areas of knowledge. It provides easy access to the wisdom of the past and to adventures towards the future. But most importantly, it has people (students and staff and faculty) of diverse backgrounds, ages, ethnicities, nationalities, gender, etc.

But I wonder if we fully take advantage of this opportunity, or whether the day-to-day concerns of courses, homework, research, teaching, and studying prevent us from periodically stepping back and trying to see the big picture. In fact, it looks like the search for broader goals for a college education is declining alarmingly. In 1969, 71% of students said they felt it essential that college help them in "formulating the values and goals of my life," and 76% said that "learning to get along with people" was an essential goal of their college experience.

But by 1993, those percentages had dropped to 50% and 47% respectively, falling from the top-ranked items to the bottom, displaced by an emphasis on training, skills, and knowledge in specialized fields. (Source: When Hope and Fear Collide by Arthur S. Levine and Jeannette S. Cureton, 1998, table 6.1, page 117.)

In my mind, this is an alarming trend and needs to be reversed.

One thing that events like the tsunami do, even for those not directly affected by it, is to bring us up short, to realize the fragility of life and the importance of making the most out of our time here. It reminds us that there are big questions that we need to ask and try to answer, and we cannot keep avoiding them.

This kind of thoughtful introspection mostly occurs outside formal classes, in the private discussions that we have in informal settings, in dorms, lounges, parks, offices, and coffee shops. But how often does it happen? And how can we create a university atmosphere that is conducive to making people realize the importance of having such discussions?

The meaning that we attach to life will depend on a host of individualized factors, such as our personal histories, what we value most, and what we are willing to give up. And we may never actually create a fully formed personal philosophy of life. The philosophy we do develop will most likely keep changing with time as our life experiences change us.

But the attempt to find out what our inner core is so that we act in life in ways that are consistent with it is something that I think college is perfectly suited for. I only hope that most people take advantage of it.

March 04, 2005

Urban legends in academia?

Did you hear the story about the college professor who asked his class to write a mid-term essay on “Why George Bush is a war criminal,” and then gave an F grade to a student who had been offended by the assignment and had instead turned in one on “Why Saddam Hussein is a war criminal”?

I wrote about this in an op-ed piece that appeared in today’s (March 4, 2005) Plain Dealer.

The site will ask you to fill in your zip code, year of birth, and gender for some kind of demographic survey. It takes about 10 seconds.

Update on 3/14/05

I received a call today from a person associated with Students for Academic Freedom informing me that this op-ed had prompted the release of more detailed information about the case on their website.

Although the student referred to had not in fact given this testimony at the Colorado Senate hearings, as had been alleged earlier, the level of detail now provided on the SAF website (including some names, places, and dates) is sufficient to remove this story from the category of urban legends. But a judgment on whether this constitutes academic bullying will have to await the release of the facts about what actually transpired between professor and student. My contact at SAF says that the incident is still under investigation.

Update on the update (3/15/05): It gets curiouser and curiouser.

The blog Canadian Cynic reports that new information on this case has come out and that Horowitz is now backtracking on almost all of the key charges that were originally made. Canadian Cynic highlights Horowitz's statements that "Some Of Our Facts Were Wrong; Our Point Was Right" and "I consider this an important matter and will get to the bottom of it even if it should mean withdrawing the claim."

See the article on the website Inside Higher Ed, which seems to be the most authoritative source of information on this case.

March 02, 2005

Putting thought police in the classroom

Most of you would have heard by now about the bill pending in the Ohio legislature (Senate Bill 24) to “establish the academic bill of rights for higher education.”

The bill is both silly and misguided. It mixes motherhood and apple pie language (“The institution shall provide its students with a learning environment in which the students have access to a broad range of serious scholarly opinion pertaining to the subjects they study.”) with language that is practically begging students with even minor grievances to complain to higher authorities.

In a previous posting, I spoke about how lack of trust leads to poor learning conditions and that we need to recreate the conditions under which trust can flourish. This bill goes in the wrong direction because it effectively creates a kind of ‘thought police’ mentality, where any controversial word or idea in class can end up causing a legal battle.

Let me give you an example. The bill says “curricula and reading lists in the humanities and social studies shall respect all human knowledge in these areas and provide students with dissenting sources and viewpoints.”

As an instructor, how would I respect “all” the knowledge in the area? What do we even mean by the word “knowledge”? How do we even separate knowledge in the humanities and social sciences from that in the sciences? What constitutes “dissenting viewpoints”? And how far should “dissenting” be taken? If a particular point of view is not mentioned by the instructor, is that grounds for complaint?

Take another example.

“Students shall be graded solely on the basis of their reasoned answers and appropriate knowledge of the subjects and disciplines they study and shall not be discriminated against on the basis of their political, ideological, or religious beliefs.”

Grading is an art, not a science. It is, at some level, a holistic judgment made by an instructor. To be sure, the instructor has a deep ethical obligation to the profession to assign the grade with as much competence and impartiality as he or she can muster. But even granting that, a letter grade or a point allocation for an assignment is not something that can be completely objectified. Give the same essay or problem to two different teachers and they will likely arrive at different grades, even if it were “graded solely on the basis of their reasoned answers and appropriate knowledge.” And this can occur irrespective of how agreeable or disagreeable the instructor finds the student’s views. So if a student complains about a grade, how can this be adjudicated?

As I said in a previous posting, the reason we currently have so many rules in our classrooms is that we seem to have forgotten the purpose of courses, and have lost that sense of trust that is so vital to creating a proper learning atmosphere.

This bill, rather than increasing trust in the classroom, will decrease it, because as soon as there is legislation prescribing what can and cannot be done in the classroom, teaching and grading disputes will inevitably end up in the courtroom. And in order to avoid that tedious and expensive process, universities will start instituting detailed lists of rules about what can and cannot be done in the classroom, and teachers will start teaching and assessing defensively, simply to avoid the chance of litigation.

Is this what we want or need?

POST SCRIPT

Tomorrow (Thursday, March 3) from 7:00-9:00 pm in Thwing Ballroom, Case’s Hindu Students Association is hosting an inter-religious dialogue on how to reconcile a belief in God in light of major disasters like the recent tsunami.

There will be a panel of religious scholars representing all the major religious traditions (drawn from the faculty of the Religion department at Case and elsewhere) and plenty of time for discussions. I will be the moderator of the discussion.

The event is free and open to the public and donations will be accepted for tsunami relief efforts.

February 28, 2005

The importance of trust in the classroom

The more I teach, the more I feel that there is an inverse correlation between the quality of learning that occurs and the number of rules that govern the classroom. At its best, teaching involves trust between students and teacher, and among fellow students. The assumption should be that we are all there to learn and that we will help each other learn.

To be sure, the teacher has a responsibility to the students and the institution he or she works for to ensure that learning is occurring and that the unavoidable grades that have gained a stranglehold in our educational world are assigned fairly.

But apart from this minimal expectation, I feel that there should be no other rules, except those that are created collectively by the entire class in order that things run smoothly. It is for this reason that my courses are becoming progressively rule-free over time. This is also why I oppose efforts to treat course syllabi as quasi-legal contracts and to mandate what they should and should not contain.

But I know that I am swimming upstream on this one. Many course syllabi are becoming increasingly crammed full of rules and regulations. Why? To my mind, this is a measure of the lack of trust that has developed between student and teachers. Students and faculty don’t really know each other as people. We don’t see ourselves as having come together for an endeavor (learning) which should be enjoyable and from which all of us will benefit and which will form the basis of a lifetime relationship. Instead we seem to see ourselves as temporary acquaintances engaged in a commercial transaction. The faculty member has a product or service (knowledge, grades) that the student ‘purchases’ with money, time, and effort.

A natural consequence of this commerce mentality is the need for rules, just like those in the marketplace. Students seem to feel the need to have rules to protect themselves from arbitrary actions by faculty members who are strangers to them, and faculty feel the need to have rules to protect themselves from complaints by students whom they don’t really know. This dynamic inevitably leads to a spiral of increasing rules since having written rules at all implies a lack of trust, which then results in people testing the limits of the rules, which creates the need for more protective rules, which leads to even greater distrust, and so on.

But the reality is that only a tiny handful of faculty and students might take unfair advantage of one another in the absence of a detailed set of rules. In my work at many universities, I find it hard to recollect cases of faculty members who did not take seriously their ethical obligation to treat students fairly.

This does not mean that faculty members cannot be arrogant, condescending, and unrealistically demanding. We are, after all, human. But it is rare that a teacher will act out of spite against a specific student. And when it does happen, there are mechanisms in universities to try to redress these wrongs, because the other faculty members know that we can only succeed if we as a learning community try to uphold the highest standards.

We don’t have written rules of behavior among friends. We don’t have written rules of behavior among family members. The reason is that the common interests that bring us together are strong enough to make us want to resolve the issues in a manner of friendly give-and-take. Why is it that we do not even try to create a similar situation in class? Surely a common interest in learning is strong enough to serve a similar role?

When I think about what is the one change that I would recommend to dramatically improve education at all levels, I come to the conclusion that we must create a greater sense of trust in the classroom so that we can minimize the number of rules and thus allow the natural enjoyment that true learning provides to emerge.

February 22, 2005

What makes us good at learning some things and not others?

One of the questions that students ask me is why it is that they find some subjects easy and others hard to learn. Students often tell me that they “are good” at one subject (say writing) and “are not good” at another (say physics), with the clear implication that they feel that there is something intrinsic and immutable about them that determines what they are good at. It is as if they see their learning abilities as being mapped onto a multi-dimensional grid in which each axis represents a subject, with their own abilities lying along a continuous scale ranging from ‘awful’ at one extreme to ‘excellent’ at the other. Is this how it is?

This is a really tough question and I don't think there is a definitive answer at this time. Those interested in this topic should register for the free public lecture by Steven Pinker on March 14.

Why are some people drawn to some areas of study and not to others? Why do they find some things difficult and others easy? Is it due to the kind of teaching that one receives or parental influence or some innate quality like genes?

The easiest answer is to blame it on genes, or at least on the hard-wiring of the brain. In other words, we are born the way we are, with gifts in some areas and deficiencies in others. It seems almost impossible to open the newspapers these days without reading that scientists have found the genes that ‘cause’ this or that human characteristic, so it is excusable to jump to genes as the cause of most inexplicable things.

But that is too simple. After all, although the brain comes at birth with some hard-wired structures, it is also quite plastic and the direction in which it grows is also strongly influenced by the experiences it encounters. But it seems that most of the rapid growth and development occurs fairly early in life and so early childhood and adolescent experiences are important in determining future directions.

But what kinds of experiences are the crucial ones for determining future academic success? Now things get more murky and it is hard to say which ones are dominant. We cannot even say that the same factors play the same role for everyone. So for one person, a single teacher's influence could be pivotal. For another, it could be the parent's influence. The influences could also be positive or negative.

So there is no simple answer. But I think that although this is an interesting question, the answer has little practical significance for a particular individual at this stage of their lives in college. You are now what you are. The best strategy is to not dwell on why you are not something else, but to identify your strengths and use them to your advantage.

It is only when you get really deep into a subject (any subject) and start to explore its foundations and learn about its underlying knowledge structure that you start to develop higher-level cognitive skills that will last you all your life. But this only happens if you like the subject, because only then will you willingly expend the intellectual effort to study it in depth. With things that we do not care much about, we tend to skim along the surface, doing just the bare minimum to get by. This is why it is important to identify what you really like to do and go for it.

You should also identify your weaknesses and dislikes and contain them. By “contain” I mean that there is really no reason why, at this stage, you should force yourself to try to like (say) mathematics or physics or Latin or Shakespeare or whatever, and try to excel in them, if you do not absolutely need to. What's the point? What are you trying to prove, and to whom? If it turns out now or later in life that you really need to know something in those areas, the higher-level learning skills you develop by charging ahead in the things you like can be used to learn it then.

I don't think that people have an innate “limit”, in the sense that there is some insurmountable barrier that prevents them from achieving more in any area. I am perfectly confident that some day if you needed or wanted to know something in those areas, you would be able to learn it. The plateau or barrier that students think they have reached is largely determined by their inner sense of “what's the point?”

I think that by the time they reach college, most students have reached the “need to know” stage in life, where they need a good reason to learn something. In earlier K-12 grades, they were in the “just in case” stage where they did not know where they would be going and needed to prepare themselves for any eventuality.

This has important implications for teaching practice. As teachers, we should make it our goal to teach in such a way that students see the deep beauty that lies in our discipline, so that they will like it for its own sake and thus be willing to make the effort. It is not enough to tell them that it is “useful” or “good for them.”

In my own life, I now happily learn about things that I would never have conceived that I would be interested in when I was younger. The time and circumstances have to be right for learning to have its fullest effect. As Edgar says in King Lear: “Ripeness is all.�

(The quote from Shakespeare is a good example of what I mean. If you had told me when I was an undergraduate that I would some day be familiar enough with Shakespeare to quote him comfortably, I would have said you were crazy because I hated his plays at that time. But much later in life, I discovered the pleasures of reading his works.)

So to combine the words from the song by Bobby McFerrin, and the prison camp commander in the film The Bridge on the River Kwai, my own advice is “Don't worry. Be happy in your work.�

Sources:

John D. Bransford, Ann L. Brown, and Rodney R. Cocking, eds., How People Learn, National Academy Press, Washington, D.C., 1999.

James E. Zull, The Art of Changing the Brain, Stylus Publishing, Sterling, VA, 2002.

February 10, 2005

Ossie Davis, stereotype threat, and academic underachievement

Veteran actor Ossie Davis died last Friday. In reading the tributes to him, I was struck by what he had said just a year earlier when he received the Kennedy Center Honors.

“We knew that every time we got a job and every time we were on a stage, America was looking to make judgments about all black folks on the basis of how you looked, how you sounded, how you carried yourself. So, any role you had was a role that was involved in the struggle for black identification. You couldn’t escape it.”

This comment underscores research by Claude Steele and Joshua Aronson on what makes black students underachieve academically. They identified one possible cause as something they named ‘stereotype threat.’ When members of an identifiable group are placed in a testing situation where failure would reinforce a negative stereotype of that group, this places a pressure on them that makes them under-perform. This was true for blacks in any academic situation, for women being tested in mathematics, and even for white men competing academically against Asians, as is illustrated by this cartoon.

[Doonesbury cartoon]

In Davis’ case, we can see that he felt immense pressure to always succeed on stage and never do anything that would reflect negatively on him or his performance. Any action that would be seen as a sign of individual failure if done by a white person would, if done by a black person, be taken as a sign of black people’s incapacity or incompetence.

It is possible that because of this fear, Davis could not afford to take risks in performing and try edgy and unflattering roles, the kinds of things that might have made him an even greater actor. He may have suffered from ‘Sidney Poitier Syndrome’, which I have named after another great actor who seemed to always play characters that were kind, noble, clever, ‘perfect in every way’ as Mary Poppins said.

This was carried to an extreme in the nauseating film “Guess Who’s Coming to Dinner”, in which Poitier played a brilliant, wonderful, flawless human being whose white fiancée’s parents struggle to overcome their prejudices and accept him as their son-in-law.

I wonder how much Davis and Poitier regret that, even at the height of their powers, they were not able to expand their skills and improve their craft by playing unflattering, evil, sinister, or criminal characters, the way white actors like Harvey Keitel or Robert De Niro do. Perhaps they take comfort in knowing that their sacrifices enabled later generations of black actors to do so.

It was not that long ago that Doug Williams faced a similar stereotype threat when he became the first black quarterback to play in a Super Bowl (XXII, in 1988). There was a ridiculous notion floating around then that quarterback was a ‘brains’ position and that perhaps blacks could not handle it. I remember Williams saying that he felt pressure to succeed just to prove that blacks could do it.

Fortunately Williams, like Davis and Poitier, was a gracious man who handled the pressure exceedingly well (340 passing yards, four touchdown passes), and his Washington Redskins handily defeated their Denver opponents. He was even awarded Super Bowl MVP honors.

These days the presence of top-flight black quarterbacks at all levels of the game is taken for granted, and it seems hard to imagine that anyone could have questioned their abilities. But we do not know how many black quarterbacks before Williams, or actors before Davis and Poitier, did not handle the pressure as well as these pioneers and hence had lesser success or even outright failure and did not make it to the heights that they did.

But even though the stereotype threat has been somewhat suppressed in football and acting, it is still alive and well when it comes to schooling, and it is likely to continue to suppress the academic performance of the affected groups until we break free of that kind of thinking.

Sources:

1. Claude M. Steele and Joshua Aronson, “Stereotype Threat and the Intellectual Test Performance of African Americans,” Journal of Personality and Social Psychology 69, no. 5 (1995): 797–811.

2. David J. Lewin, “Subtle Clues Elicit Stereotypes’ Impact on Black Students,” Journal of NIH Research, November 1995, 24–26.

February 02, 2005

High self-esteem does not lead to high student achievement

After wasting space on Michelle Malkin last week, the Plain Dealer redeemed itself on Monday, January 31 with an intriguing op-ed piece by Roy F. Baumeister on the misguided attempts to cure various social ills by boosting the self-esteem of the people responsible for those ills. This was based on the theory that low self-esteem people resorted to violence, for example, in order to feel better about themselves. Thus it was believed that if we can raise their self-esteem, they will stop being violent.

A 1996 paper in Psychological Review by Baumeister and co-workers debunked that hypothesis by showing that violent individuals, groups, and even nations actually already think highly of themselves, and resort to violence when they do not receive the inflated respect they feel they are entitled to. Promoting high self-esteem that is unsupported by actual achievements or abilities turns out to be harmful.

Baumeister (who was a Professor of Psychology at Case until just a few years ago) now finds similar results in the research literature on student educational achievement. Inflated self-esteem not only fails to improve academic achievement, it can sometimes even lower it.

These conclusions should be taken very seriously by educators, many of whom have put great stock in raising the self-esteem of under-achieving students as a strategy to boost their performance. The Education Trust reported in a 2001 study that children in high-poverty schools are given few assignments, that even those are of low quality, and that they are then given As for work that would merit Cs and Ds elsewhere, all in a misguided effort to improve their self-esteem.

In my own work with professional-development programs, an earnest and well-meaning teacher once told me of her frustration with attempts to improve students’ self-esteem in her exclusively black school district. After teaching a section of the mathematics course, she would give her students a practice test. She would then grade the tests and hand them back to the students, along with the answer key, and discuss the test. The “real” test, which was exactly the same as the practice test, was then given, with the students being aware beforehand that this was going to be done. The teacher told me that she adopted this strategy so that the students would score well on her tests and thus experience a boost in their self-esteem. Yet she was frustrated that her students still did badly on the test.

It is not hard to understand why the math teacher’s students would not put in even the minimal effort needed to memorize the answers to the practice test and reproduce them on the “real” test. It was because the “real” test was not a real test of anything meaningful. The task was so trivially simple as to be insulting.

This does not apply just to underachieving students at lower grade levels. Just yesterday a faculty member in the School of Engineering here at Case (which has ambitious, hard-working, and high-achieving students) was expressing puzzlement that, although he asked very easy questions in order to get more class participation, no one volunteered to answer them.

But from the point of view of the students, this response is perfectly rational. If the question is obviously easy, then no kudos accrue to a student for answering it correctly. But if you do volunteer an answer and get it wrong, then you appear stupid in the eyes of your peers. So the safest course is to avoid answering.

The research on motivation suggests that students (and people in general) respond best not to praise and blame, but to neutral feedback that gives them a realistic sense of what they can do and what they need to do to improve. They also respond best to moderate levels of challenge. If assignments are too hard, students get frustrated. If they are too easy, there is no sense of achievement in doing them. The task for any teacher is to gauge the right level of challenge, provide appropriate support, and give informative and prescriptive feedback.

Baumeister’s work confirms that trying to raise self-esteem is not the way to go. While high self-esteem does provide some minor benefits (it feels good and supports initiative), he suggests that we might get better results by focusing more on self-control and self-discipline. It is a message that should be taken seriously.

Sources:

1. Roy F. Baumeister, Jennifer D. Campbell, Joachim I. Krueger, and Kathleen D. Vohs, “Does high self-esteem cause better performance, interpersonal success, happiness, or healthier lifestyles?”, Psychological Science in the Public Interest, May 2003, vol. 4, no. 1, 1–44.
2. Roy F. Baumeister, Laura Smart, and Joseph M. Boden, “Relation of Threatened Egotism to Violence and Aggression: The Dark Side of High Self-Esteem”, Psychological Review, 1996, vol. 103, no. 1, 5–33.
3. Kati Haycock, Craig Jerald, and Sandra Huang, “Closing the Gap: Done in a Decade,” Education Trust: Thinking K–16 5, no. 2 (Spring 2001).
4. Kati Haycock, “Closing the Achievement Gap,” Educational Leadership, March 2001, 6–11.

January 28, 2005

Patronizing students

Sometimes it seems to me that there is no half-baked idea that originates anywhere in the known universe that does not quickly find influential adherents anxious to institutionalize it in Ohio.

The dust has barely settled on the push to include Intelligent Design in Ohio's science standards, and already we have Marion state senator Larry A. Mumper introducing Ohio Senate Bill 24 in order to “prohibit instructors at public or private universities from ‘persistently’ discussing controversial issues in class or from using their classes to push political, ideological, religious or anti-religious views.” (Sorry, no link to this quote from the subscriber-only Columbus Dispatch news item by Kathy Lynn Gray on 1/27/2005.)

This is bound to raise the free-speech, academic-freedom debate in all its full-blown glory, and I am not going to revisit that. But one statement by Senator Mumper jumped out at me. He feels that college students need this kind of legal protection because “These are young minds that haven’t had a chance to form their own opinions.”

Such words can only be uttered by someone who has never really listened to adolescents and young adults or tried to persuade them to change their minds. Does he really think that young people have not already formed strong opinions about things?

The education literature is full of research on how people’s minds are resistant to new ideas. Students cling to Aristotelian ideas of motion, and harbor serious misconceptions about the seasons and the phases of the moon, even though they may have been taught the standard views many times in the course of their education.

And this happens in the area of physics, where students do not even have a commitment to retaining their old ideas, or are often unaware of what those ideas are until asked to explicitly articulate them. Imagine how hard it would be to change their minds about politics and religion, which are much closer to the surface of their consciousness.

Many a professor (including myself) has been aghast at discovering that all their careful lectures and arguments have had little impact on what students really believe, even though the students may be highly adept at reproducing the professor’s views on exams.

This kind of comment betrays, at best, naivete about, and at worst contempt for, the ability of college students to think for themselves and to resist indoctrination by their teachers. But this is not going to prevent politicians like Senator Mumper from going ahead with their condescending efforts to “protect” students.

Get ready for the legal battle…