Saturday, December 27, 2008

A stumble for the Republican Party

A recent story about four of the candidates vying for the Republican National Committee chairmanship indicates the troubles that may lie ahead for the GOP in its attempt to rebuild itself after its stunning electoral defeat this year:
Chip Saltsman, a candidate for chairman of the Republican National Committee, sent committee members this month a holiday music CD that included "Barack the Magic Negro," a parody song first aired in 2007 by talk show host Rush Limbaugh.

Created by conservative satirist Paul Shanklin, the song puts new lyrics to the tune of "Puff the Magic Dragon," and it is performed as if black activist Al Sharpton were singing it. Limbaugh played it after the Los Angeles Times ran an opinion piece with the same title, arguing that a vote for Barack Obama could assuage white guilt.

"A guy from the LA paper said it made guilty whites feel good, they'll vote for him and not for me cuz he's not from the hood," the song goes. "Oh, Barack the magic negro lives in DC, the LA Times they called him that because he's black but not authentically."
I'm sure Limbaugh thinks that Democrats who take offense at this ditty are simply proving the age-old truth that liberals have no sense of humor. (You won't get much argument there from Stephen Colbert.) And it is worth considering Clarence Page's defense of the song.

Still, I'm not sure that Republicans have earned the right to be hip and ironic on this subject. I found Sacha Baron Cohen's "Throw the Jew Down the Well" routine very funny, but I wouldn't feel comfortable if an Arab American organization began playing it. Sometimes you're just not in a position to be making certain kinds of jokes.

Another candidate for RNC chair is Katon Dawson, who recently resigned his 12-year membership in a whites-only country club.

When I read about these incidents, I have to think: are the Republicans out of their heads? They just lost a presidential election to a black man who received 96% of the African American vote. They know they cannot go on forever being the party of whites. If current projections are accurate, white Americans will be a minority less than half a century from now. To survive electorally, the Republican Party must make inroads into the African American community.

Unfortunately, a situation like this may drive them toward careless tokenism, as it has in the past. There are currently two African Americans running for RNC chair, Michael Steele and Ken Blackwell. As a Marylander, I am all too familiar with Steele. He may pass the Joe Biden test for black politicians--he is "articulate and clean and a nice-looking guy." The trouble is, he's also a bit of a nitwit.

I will mention one anecdote from his unsuccessful Senate run in 2006. A Roman Catholic, Steele was talking before the Baltimore Jewish Council when he was asked about his views on stem-cell research. His answer became the most notorious remark of his campaign: "You of all folks know what happens when people decide they want to experiment on human beings, when they want to take your life and use it as a tool. I know that as well in my community, out of our experience with slavery, and so I'm very cautious when people say this is the best new thing, this is going to save lives."

Jews did not take kindly to that remark. Somebody, somewhere along the line, should have informed Steele that Jews, unlike Catholics, overwhelmingly support embryonic stem-cell research. I think most Jews would respect any Catholic politician who took a principled stand on the issue. But if there is any sure way for a non-Jew to irritate a Jewish audience, it is by making an inappropriate Holocaust comparison. What was Steele thinking?

Still, I was willing to cut him some slack. Many capable politicians have said dumb things from time to time, and I don't like our gotcha culture where one regrettable remark follows a politician around for the rest of his career. But I was not impressed by how Steele handled the aftermath. He quickly apologized for the remark, but he then proceeded to make an incoherent flip-flop on the issue. He stated that he actually supports embryonic stem-cell research--just so long as it doesn't destroy the embryo.

The rest of his campaign ran along similar lines. Maybe he didn't stand a chance: he was in an uncomfortable position as a conservative in a very liberal state. But Robert Ehrlich, a Republican, managed to win one term as governor with a good triangulation effort that attracted many Democrats. Steele didn't have that finesse. He seemed to lack both vision and leadership.

As for Ken Blackwell, the other black politician running for RNC chair, I know nothing about him. But the name! I mean, can we get any more subtle? I can imagine it now. "Meet our new token black, Mr. Blackwell."

If Republicans can find someone with the stature of Colin Powell or Condi Rice, they might be in good shape. But their credibility problem among blacks cannot be solved just by having blacks in prominent positions.

After all, polls show that African Americans tend to hold some conservative views, yet they continue to vote overwhelmingly Democrat. Though they aren't uncritical of the Democratic Party establishment, as the Clinton-Obama fight earlier this year demonstrated, they see the Republican Party as an old white boys' club. If Republicans want their votes, the first thing they're going to have to do is purge their party of any hint of racism. Since blacks are the future, a party that continues to alienate them will have no future.

Tuesday, December 23, 2008

The virtues of pluralism

Grammar books tell us that the word media should be treated as a plural. You're supposed to say, "The media are covering the story," not "The media is covering the story." Nobody can explain what "a medium" is in this context, and the media people themselves often do not follow this rule, or they follow it inconsistently. I've read articles and even books where in one paragraph it's plural, in the next it's singular, and in the next it's plural again. It's beginning to take on the qualities of a noun like sheep, which can be either singular or plural.

The word is actually one of several Latin-derived plurals that have gradually become singular in English. Another example is data, which is a shibboleth among techies. You're supposed to say "The data are misleading" rather than "The data is misleading," and you'll get a stiff whacking if you get it wrong, even if hardly anyone uses the singular datum.

Of course, there are other examples that not even the pickiest of grammar cops treats as plural. You never hear anyone say "The agenda are ready," even though agenda is as much a Latin plural as media and data. Then there are Italian plurals like opera and spaghetti that English speakers have always treated as singular. If you heard someone say "The spaghetti are cooking," you'd give that person a stiff whacking!

But unlike any of those examples, the plurality of media is more than just a grammar issue. People almost always use the word to suggest that news coverage is distorted, slanted, or subtly manipulated. This connotation does not exist in terms such as "the newspapers" or "the networks." Although by now the plural use of media is mostly a formality, its transformation into a singular noun contributed to the popular image of the news business as a single unified entity conspiring to present the news in a particular way. All the critics left, right, and center talk about the media this way, even when they attach the term to a plural verb just to show what a smartypants they are.

Sunday, December 21, 2008

Religion and influence

There is a school of thought suggesting that atheistic Jews like Marx and Freud were essentially creating surrogate forms of Judaism. I was intrigued to learn that James Joyce's novel A Portrait of the Artist as a Young Man suggests a similar thing about lapsed Catholics. The protagonist Stephen Dedalus rejects the Catholicism of his upbringing, yet as he is explaining his philosophy of art to a friend, the friend coyly observes, "It is a curious thing...how your mind is supersaturated with the religion in which you say you disbelieve."

When I had to do a paper on the novel in college, I seized on this idea. I posed the question of why Stephen abandoned his Catholicism, and my answer was that he found in art what he had been seeking in religion, namely a way to transcend the temptations of the flesh. Stephen's philosophy is that when examining art, a person should separate his impression of a piece from any physical or emotional reaction it may provoke. Superior art, according to Stephen, is created through a detachment between the artist and his work, whereby the artist's personality "refines itself out of existence, impersonalizes itself."

I traced the development of Stephen's philosophy throughout the novel. As a child and teenager, Stephen reacts to artistic pieces--books, poems, and the like--by letting himself be overcome by them. Conflict arises one day when he wanders into a bad part of town where he has his first sexual encounters. He knows he faces eternal torment for his actions, but at first he feels "a cold lucid indifference." Then his school has him attend a three-day retreat led by a priest who gives a harrowing description of what Hell is like. Here is a brief excerpt from the priest's vivid speech:
In earthly prisons the poor captive has at least some liberty of movement, were it only within the four walls of his cell or in the gloomy yard of his prison. Not so in hell. There, by reason of the great number of the damned, the prisoners are heaped together in their awful prison, the walls of which are said to be four thousand miles thick: and the damned are so utterly bound and helpless that, as a blessed saint, Saint Anselm, writes in his book on Similitudes, they are not even able to remove from the eye a worm that gnaws it....

The horror of this strait and dark prison is increased by its awful stench.... Imagine some foul and putrid corpse that has lain rotting and decomposing in the grave, a jellylike mass of liquid corruption. Imagine such a corpse a prey to flames, devoured by the fire of burning brimstone and giving off dense choking fumes of nauseous loathsome decomposition. And then imagine this sickening stench, multiplied a millionfold and a millionfold again from the millions upon millions of fetid carcasses massed together in the reeking darkness, a huge and rotting human fungus. Imagine all this, and you will have some idea of the horror of the stench of hell.
For all its raw emotional power, the speech has an important limitation. The priest describes the speech as being about "death, judgment, hell and heaven," but it is almost entirely about Hell, with scarcely a word about what Heaven will be like. Stephen becomes a fervent worshipper, but soon doubts begin to surface, and he wonders if his new devoutness is driven more by fear than by sincere belief. He has trouble finding a more positive basis for his faith.

What he gains most from that period of atonement is considerable practice at inhibiting his physical reactions. He walks with eyes to the ground; avoids eye contact with women; subjects himself to loud noises and unpleasant smells; and refuses to make himself comfortable in bed. By quelling his receptiveness to sensory experience, however, he undermines the very quality that allowed the priest's speech to influence him in the first place. He ultimately leaves his Catholicism behind when he satisfies his need for a chaste vantage point from which to observe life, without the "chill and order" of the priesthood that first attracted him. And he achieves that purpose through his newfound appreciation of art.

When I looked back on my essay later, I noticed a curious irony. Stephen's philosophy of art was almost the polar opposite of mine. Whenever I'm examining a work of fiction, or film, or music, the first question I ask myself is, "What effect did it have on me?" That question leads me to the most sincere and, hence, most authentic answers. Depersonalizing the process only leads to an artificial response, in my view.

In fact, that's exactly how I approached Joyce's novel. I wasn't sure what my paper was going to be about. But the point in the novel that had the most immediate impact on me was the priest's description of Hell. I knew I had to pivot my reaction to that scene into a larger thesis, and that's exactly what I did.

I wonder if my approach to art has something to do with my Jewish background, just as Stephen's has to do with his Catholic background. Judaism emphasizes the idea of a person being transformed through his actions. That's why Jewish thought is relatively weak on theology. Even Torah study is viewed in this light: you're encouraged more or less to lose yourself in it and then see how it affects you.

Then again, I could be totally off about this theory. I'm generalizing based on two limited examples, my own personal philosophy and that of a fictional character (albeit one based on Joyce himself). But I do believe that it is hard for people to escape their initial influences in life.

Saturday, December 13, 2008

Opinions with results

On August 11, 2008, writing for the ironically named conservative publication The American Thinker, Steven M. Warshawsky proclaimed, "As I wrote last December, '[t]he pundits can talk until they are blue in the face about Obama's charisma and eloquence and cross-racial appeal. The fact of the matter is that Obama has no chance of being elected president in 2008.' I am more convinced of this conclusion than ever."

On October 9, shortly after the second presidential debate, Warshawsky wrote, "I have received numerous emails from Republicans and Democrats alike, asking whether I still think Obama will lose the election. Yes, I do. But what about the polls, they ask? The polls show that Obama is winning. No, they don't, as I will explain." And he did.

Even by October 25, he did not back down: "In a few more weeks, the political environment in this country is likely to become a heckuva lot nastier. For there are real signs pointing to a McCain victory this year, whether or not the mainstream media wants to acknowledge them."

Needless to say, after Election Day he was shocked: "I cannot understand how a man like Obama became president. It contradicts everything I know, or thought I knew, about American history, culture, and politics." But he didn't conclude that his own thinking was at fault. He acted as if the country had pulled a fast one on him. He even made a whole new set of predictions about the damage that Obama and the Democratic Congress would do, and he ended by saying, "I hope that the Democrats will prove me wrong again." Judging from his track record, his hopes will likely be fulfilled!

He is hardly the only commentator to have underestimated Obama. But unlike any mainstream pundit I'm aware of, he continued to predict Obama's demise after primary season, and with firm conviction ("no chance of being elected"). To suggest as late as August that Obama might lose was reasonable, but to suggest that he definitely would lose was insane.

The sheer insularity is striking. Not surprisingly, The American Thinker has also been a repository for wild conspiracy theories about Obama, from the birth certificate business to the claim that William Ayers ghostwrote Dreams from My Father. I have no doubt that the people who believe these things will continue to believe them in the years to come. That's the beauty of having opinions: nothing can shake your belief in them, as long as you choose to consider them true. Making a specific, concrete prediction about the near future is another matter. Once you expose yourself to objective reality, you can't hide from it.

The fallacies in Warshawsky's analysis weren't hard to spot. His most telling statement was, "Why am I so confident that John McCain is going to win the election? In short, because Barack Obama is not an acceptable choice to lead the country." It didn't seem to occur to him that the American public might not share his standards of what is acceptable.

His refusal to believe the polls was also notable. The success of polls at predicting presidential winners has increased dramatically since the days of "Dewey Defeats Truman." In a very close race, there may be uncertainty. But by mid-October this year, McCain was consistently trailing Obama by at least five points, and the electoral map looked even worse for him. It was conceivable that public opinion might change before Election Day, but there was no reason to believe he was already in the lead.

Warshawsky may be an extreme case, but the punditocracy is littered with erroneous forecasts, and the pundits are rarely taken to task for them. I believe that the quality of prognostications can be a gauge of a commentator's analytical skill. We should pay more attention to them.

Let's consider some of the factors that inspire bad political predictions. One is wishful thinking. Another is projecting one's own outlook on the public. A notorious example of the latter was the title of conservative columnist Shelby Steele's early-2008 book A Bound Man: Why We Are Excited About Obama and Why He Can't Win. Steele later apologized for the "stupid, silly subtitle that was slapped on to the book" and claimed it did not represent what the book was arguing. Not having read the book, I'll take his word for it.

Some people will predict a candidate's victory in the hopes of creating a self-fulfilling prophecy. That's why candidates themselves rarely admit they're losing even when it's obvious they are. The final nail in the coffin of Fred Thompson's candidacy may have been when he admitted to the press that he wasn't likely to get the nomination. For a politician in those circumstances to lie is understandable, but we expect more honesty from commentators, who should always tell at least what they believe to be the truth.

None of the aforementioned factors explain why many Democrats doubted Obama would win, even days before the election when he seemed practically unbeatable. The crucial factor here was paranoia, inspired by past defeats. They felt their own party had a knack for "snatching defeat from the jaws of victory," and they believed that Republicans might steal the election again. (To this day, there are liberals who think even the 2004 election was stolen.) They also worried about the Bradley Effect, the alleged phenomenon that public opinion polls overestimate a black candidate's support because some respondents are afraid of revealing racist motivations.

One Democrat whose predictions were spectacularly vindicated was Nate Silver of FiveThirtyEight.com. A baseball statistician by trade, Silver developed a unique method of averaging together the presidential polls to determine the winner. His final estimate had Obama beating McCain by 52.3% to 46.2%. The initial results on Election Night were 52.4% to 46.3%--within a tenth of a percent of Silver's predictions. (Subsequent counting, however, has widened Obama's lead by a whole percentage point, making Silver's estimate less accurate.) He also correctly predicted the winner in every state except Indiana, which Obama won narrowly. It's worth asking whether Silver would have been as accurate in a year that was bad for Democrats. My impression is that he doesn't let his biases interfere with his mathematical estimates.
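
For readers curious what "averaging together the polls" can look like in practice, here is a toy sketch in Python. To be clear, this is not Silver's actual model--he weighted pollsters by their track records, adjusted for house effects, and ran thousands of simulations--and every number below is invented purely for illustration.

# A toy poll average: weight each poll by sample size, discounted by age.
# All figures are invented for illustration; this is not FiveThirtyEight's method.
polls = [
    # (obama_pct, mccain_pct, sample_size, days_before_election)
    (52.0, 44.0, 1200, 3),
    (50.0, 46.0,  900, 5),
    (53.0, 45.0, 1500, 2),
]

def weight(sample_size, days_old, half_life=7.0):
    """Older polls count less; bigger samples count more."""
    return sample_size * 0.5 ** (days_old / half_life)

total = sum(weight(n, d) for _, _, n, d in polls)
obama = sum(o * weight(n, d) for o, _, n, d in polls) / total
mccain = sum(m * weight(n, d) for _, m, n, d in polls) / total
print("Weighted average: Obama %.1f%%, McCain %.1f%% (margin %+.1f)"
      % (obama, mccain, obama - mccain))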

Sometimes good predictions come from unexpected quarters. During the 1996 primary campaign, liberal comedian Al Franken correctly predicted that Dole's running mate would be Jack Kemp. He based his conclusion on a quip by Newt Gingrich that Kemp (a former NFL player) had showered with more blacks than most Republicans have shaken hands with. Franken followed this quote with a list of "Politicians Who Have Showered With Blacks," consisting of former athletes who went into politics.

In 1999, Franken released a book describing a bizarro version of the upcoming 2000 election in which Franken himself becomes the Democratic nominee, with an all-Jewish staff. And guess who his running mate is? Why, Joe Lieberman, of course. His stated reason is that he wants to balance the ticket because "I'm Reform and he's Orthodox."

After these two episodes, I began to pay more attention to Franken's musings. Maybe buried beneath the comedy was some sound insight, I figured. That's why I was puzzled when his 2005 book hinted that the next president would be Barack Obama. At the time, Obama had told the press "unequivocally" that he would not run in 2008. I thought to myself, "I guess Franken is finally wrong about something." Hmmmph. (To be fair, I should mention that the book also predicted that the Republican nominee for 2008 would be Bill Frist.)

What's Franken's secret? Good instincts, or dumb luck? You be the judge. Nobody can know with certainty what the future holds. But I value predictions, because they are opinions with results. They test a person's capacity to think objectively, without letting wishes or fears get in the way. They also tell us something about the quality of the person's reasoning. And they help us weed out the shills.

Saturday, December 06, 2008

The cartoon that made science cool

Stephen King and Stephen Jay Gould each wrote a foreword to a Far Side gallery, and their perspectives on a comic they both praised were so different that the contrast illustrates two distinct approaches people can take toward humor.

King, in his foreword to The Far Side Gallery 2, refused to explain what made Larson's cartoons funny. He wrote that The Far Side was a "uniquely unique" comic that "will make you laugh your butt off," but "I can't tell you why," because "There's no way to explain humor any more than there is a way to explain horror."

Gould, introducing The Far Side Gallery 3, apparently disagreed. His essay, which was twice as long as King's, examined various Larson cartoons to give a sense of why The Far Side was especially popular among scientists. According to Gould, Larson (who never worked as a scientist) understands "the intimate details of our lives and practices." As an example, Gould points to a Far Side cartoon showing astronomers fighting over a telescope, and the caption says "All day long, a tough gang of astrophysicists would monopolize the telescope and intimidate other researchers." Gould claims that this cartoon resonates with scientists because "telescopes are in desperately short supply, and...scientists (particularly Ph.D. students low on the totem pole) often wait months for a few hours of evening viewing (tough darts, and back to 'go,' if it's cloudy)."

Another cartoon showed an "average size" American family consisting of two parents, a whole child, and, literally, a half-child sitting in front of a TV set (alluding to studies claiming that the average American family has 1.5 children). To Gould this illustrates that "language is not a neutral medium of optimal communication, but also a reservoir of illogic, cultural chauvinism, and literally senseless cliché."

Gould also talks about how Larson "places animals into human situations--using the differences to show how less than logical or universal our unquestioned practices can be." One recurring theme he sees in Larson's cartoons is the idea that "Animals have intelligence different from ours; they are not just primitive models of our achievements."

I prefer Gould's perspective to King's. Of course King is right that you can't rationalize humor. Either you laugh, or you don't, and nobody can persuade you to laugh at something you don't find funny. But Larson's cartoons are more than silly diversions. They have a point of view that resonates with people.

Larson's own thoughts on the matter can be found in his 1989 book PreHistory of the Far Side, a behind-the-scenes look at the process he goes through as a cartoonist. Among other things, he shows the doodled cartoons in his sketchbook alongside the final published versions. For example, a sketch of "If dogs drove," in which dogs are seen driving cars while hanging their heads out the window, was made into a cartoon called "When dogs go to work," in which dogs are sitting on a bus hanging their heads out the window, and the driver is also a dog doing the same thing.

Larson claims that he doesn't know where he gets his ideas: "Some cartoons spring forth from just staring stupidly at a blank sheet of paper and thinking about aardvarks or toaster ovens or cemeteries or just about anything" (p. 42). But in the process of developing an idea, he can be pretty analytical. He deals at length with how the subtle features of his cartoons--a character's facial expression or clothing, or something in the background--can have an effect on the humor.

Unlike the slightly conceited Bill Watterson in his annotated Calvin and Hobbes book, Larson is charmingly self-effacing. He kids himself a lot and never seems fully aware that he rules the field when it comes to single-panel cartoons. If you think what he does is easy, I'd like you to name another nationally syndicated single-panel cartoonist who even begins to approach Larson's mastery of the genre. I've seen plenty of comics try, only to come up short.

One such comic is Mother Goose and Grimm. What struck me about Larson's sketchbook was how many of his initial ideas were at about the level of published Grimm cartoons before he improved them. In one of his doodles, for example, two spiders are standing by a web and one of them says, "Nice threads." That's the sort of dumb pun I've seen in Grimm many times. But the final version is almost unrecognizably different. Four spiders are sitting around a little table next to a web that appears to have flowers stitched into it. The caption reads, "You and Fred have such a lovely web, Edna--and I love what you've done with those fly wings."

While he often starts with a simple gag, the cartoons he develops are usually more complex. They typically tell a little story that we are expected to unravel. I think of the one in the pet shop, where we see a piranha on display, and at the far end of the shop is a cat with wooden forelegs. This cartoon has no caption; it doesn't need any. The picture speaks for itself.

As Gould discerned, Larson's most frequent conceit is putting animals into human situations in order to comment on human behavior. In one cartoon, we see a mother chicken feeding a bedridden chicken a bowl of soup and saying, "Quit complaining and eat it!... Number one, chicken soup is good for the flu--and number two, it's nobody we know." The absurdity of the situation is what creates the humor, but we also end up thinking a little about human morality.

Larson admits to being flattered by his popularity among scientists, but he says it has its downside: every time he makes a scientific error, he receives a flurry of letters correcting it. For example, in one cartoon a husband mosquito walks into his house and sighs that he has been "spreading malaria" across the country. Numerous readers pointed out to Larson that only female mosquitos bite. Larson's response: "Of course, it's perfectly acceptable that these creatures wear clothes, live in houses, speak English, etc." (p. 124).

One often overlooked point is that The Far Side appealed not just to scientists but to scientifically informed laymen. Gould's favorite Far Side features a group of Protozoa watching a slide show, one of whom says, "No, wait! That's not Uncle Floyd! Who is that? Criminy, I think it's just an air bubble!" Scientists get the joke, but so does anyone who has ever operated a microscope.

In many cases, Larson takes a familiar biological fact and mixes it up with a stereotypical human situation. In one cartoon, an insect couple are sitting on a sofa in their house. Dad is reading the newspaper, and Mom is yelling to their daughter, who is walking out the door: "Hold it right there, young lady! Before you go out, you take off some of that makeup and wash off that gallon of pheromones!"

Larson did for cartoons what Douglas Adams did for fiction: take scientific ideas and derive humor from them that general audiences could understand. Even nonscientists appreciate Larson's attention to detail. His drawing ability is underrated, perhaps because of the crude, broad style of his faces and bodies. But when he draws animals, even highly anthropomorphized ones, the details are far more authentic than you'd expect from a syndicated cartoonist.

Larson also explains how he came up with some of his worst cartoons. A notorious example is "Cow tools," which features a cow standing by a table on which rest four oddly shaped objects, one of which looks like a saw. He admits the cartoon simply didn't work, but he explains what he intended it to mean, and while his explanation doesn't make the cartoon funny, we do get a sense of his thought processes that usually lead to better cartoons.

I've never thought of The Far Side as subversive, but a few of his cartoons have provoked controversy. Some readers are put off when he shows animals being hurt. None of these cartoons ever bothered me in the least, and I'm an animal lover. I love cats, but I was always amused by the cartoon where two dogs are playing "tethercat." As Larson points out, dogs beating up on cats is just an old cartoon convention: "I could understand the problem if these were kids batting an animal around a pole, but the natural animosity between dogs and cats has always provided fodder for humor in various forms" (p. 158).

On the other hand, when he shows some cartoons of his that his editors refused to publish, I admit he did push the boundaries of good taste. In one of them, a snake is crawling through a baby's crib and has a lump in the middle of its body at exactly the place in the crib where the baby should be. According to Larson, "editors, I'm convinced, have saved my career many times by their decision not to publish certain cartoons" (p. 176).

Other aspects of the book include the somewhat inspirational story of how he got into cartooning; examples of his first comic, Nature's Way; cartoons he based on personal experiences or short stories he wrote; some of the embarrassing mistakes he and his publishers made; and a lengthy gallery of his own favorite Far Sides.

Lastly, I should mention my own personal favorite, which happens to appear in the book. A dinosaur is standing behind a podium, speaking before an audience of dinosaurs. He says, "The picture's pretty bleak, gentlemen.... The world's climates are changing, the mammals are taking over, and we all have a brain about the size of a walnut."

I won't try to analyze it for you.

Sunday, November 16, 2008

Why Obama will probably be a two-termer

Obama's presidency is certain to disappoint, no matter what he does or doesn't do. The expectations for him are not only unusually high, but contradictory. His liberal supporters are happy to finally get an unabashed liberal in the White House, while his conservative supporters hope he will be a pragmatist. If he compromises on rolling back the controversial policies of the Bush years, he will disappoint many supporters, but if he pushes too hard, he risks alienating many other people. He has already made a range of promises that may be hard to implement in the current financial crisis, which may come to define his presidency regardless of how he handles it.

Does this mean he will be voted out of office in 2012? Possibly. But that is less likely to happen than many people realize, no matter how far his star falls. He won by defeating the party in power. So far, our nation has seen fifteen presidents reach the office that way and not die in their first term. Those were (in reverse order) Bush, Clinton, Reagan, Carter, Nixon, Eisenhower, FDR, Wilson, Harrison, Cleveland, Grant, Lincoln, Pierce, Polk, and Jefferson. Of that group, eleven were reelected (though Cleveland lost his first reelection bid due to an Electoral College fluke). The remaining four--Polk, Pierce, Harrison, and Carter--each had unique circumstances stand in their way.

Polk never ran for reelection, due to poor health. Pierce ran but wasn't nominated by his party. Harrison was nominated and went on to lose in the general election, but he hadn't won the popular vote the first time around (due to the aforementioned Electoral College fluke). Carter is the only president in U.S. history to win the popular and electoral vote against the party in power, serve one term, get nominated for reelection, and lose (popularly and electorally)--an indication of how badly his presidency went. Even then, it took a Reagan to defeat him. In the 1980 election, Carter was actually leading in the polls until Reagan gave a strong debate performance a week before Election Day. Astonishing as it may sound, a weaker Republican candidate than Reagan could easily have guaranteed Carter a second term.

Many of Obama's opponents would like to believe he'll be the next Carter. From the evidence of the campaign, it's unlikely. In the 1976 election, Carter managed to shrink a 33-point lead down to a virtual tie, then win in a squeaker. Obama, in contrast, exceeded expectations, winning in an electoral landslide that had seemed like a stretch just months before. Regardless of the job Obama does as president, he has proven himself to be a far more skillful politician than Carter ever was. If Republicans fail to accept that fact, they will continue to lose power. They might not be where they are today if they hadn't already made that mistake.

Wednesday, October 29, 2008

The myth of the myth of the popular vote

In making the case against the Electoral College, I frequently run into an insidious argument which states that the "popular vote" not only doesn't determine the winner but is a meaningless concept in our system. The argument goes as follows. Because of the Electoral College, candidates campaign in some states and not others, and this affects the outcome. Al Gore may have received half a million more votes than George W. Bush, but that was the end result of two campaigns that had been conducted on a state-by-state basis. We have no way of knowing what the nationwide totals would have been in a non-Electoral College system, because the campaigns would have been conducted differently, yielding different results. Therefore, Gore's apparent popular-vote lead doesn't mean anything.

I heard this argument several times from Republicans in 2000. I heard it most recently from Charles M. Kozierok, a blogger and self-described Democrat who is presumably not speaking from partisan bitterness over what happened eight years ago. I find the argument insidious because it attempts to whitewash the damage that a split between the electoral and popular vote does to public confidence in our system. The fallacy of the argument is that it confuses the causes of public opinion with the measurement of public opinion.

The purpose of an election is to ascertain the will of the people. It cannot ever be a perfect measure, since we can't read minds. A flat tire on the way to the polling station can distort the outcome. So can a medical emergency, or bad weather, or problems in the voting machines, or any number of other factors unrelated to people's intentions. Nevertheless, an election is meant to reflect as accurately as possible what the public thinks at a particular moment in time.

Campaigns do affect the outcome, of course. But they have nothing to do with the accuracy of the election in measuring public opinion. Rather, they have an effect on the public opinion itself before it is measured. For example, if a candidate campaigns in Missouri but not Kansas, the election will probably turn out differently than if he were to campaign in both states. But that simply means he has influenced the voters in a particular way, before their views were measured in the polling booths. The combined vote total in both states is still an accurate and meaningful measure of the collective will of Missouri and Kansas voters, whom the candidate helped influence. So too with the collective will of voters in all fifty states (plus DC). It may not determine the winner, but it is independently significant--a "valid metric," to use Kozierok's terminology.

Still, I agree about one thing: the state-by-state campaign strategy used by U.S. presidential candidates is the most obvious consequence of our having an Electoral College. A true split between the popular and electoral vote is in fact quite rare. It has apparently happened four times in our history. What is seldom pointed out, however, is that only once was the split uncontroversial. That was in the 1888 election, when Benjamin Harrison lost the popular vote but won the election, apparently without any controversy over the results.

The other three cases were a different story. In 1876 and 2000, the election ended in a months-long battle over the voting results in Florida, and the man who finally triumphed was widely viewed as an illegitimate president, not because he lost the popular vote but because his triumph in Florida was called into question. Thus, in both cases the popular-electoral split was arguably an illusion.

The 1824 election was the strangest. Andrew Jackson received a plurality of the popular and electoral votes. But because there were four major candidates, he failed to reach an electoral majority, so the election was thrown into the House of Representatives, which made John Quincy Adams president. (Jackson would gain the presidency four years later.) Adams's victory there was partly the work of Speaker of the House Henry Clay, who threw his support behind Adams and whom Adams subsequently appointed Secretary of State, an act that struck many people (including Jackson) as bribery. Constitutionally the outcome was legitimate, but an expression of the public will it was not.

Given the relative rarity of these kinds of situations, the difference between the Electoral College and a direct-vote system might seem more theoretical than practical. But one very tangible difference that shows up in every election is the suppression of third parties. The most striking example was the 1992 election, when Ross Perot, running as an independent, received 19% of the popular vote but not a single electoral vote.

Earlier that year, Perot had led in the polls. What would have happened if he had maintained that lead into November? Probably he would have won enough electoral votes to throw the race into Congress, which would then almost certainly have gone for one of the major-party candidates (probably Bush).

That outcome would not have impressed the Founders, who opposed the idea of a two-party system. Actually, the Founders failed to anticipate many things about the Electoral College. And no wonder. That was a time when Thomas Jefferson referred to Virginia as "my country," and when the term United States was treated as a plural. None of them predicted the gradual weakening of the states and strengthening of the federal government over the course of two centuries. That's why I'm amazed whenever I hear defenders of the Electoral College talk about the "wisdom" of the Founders in creating this system. If you're going to defend it, at least acknowledge that its value is due as much to luck as to foresight.

Friday, October 24, 2008

Secret Catholic terrorist

It's been enlightening reading about past presidential campaigns in this country.

For example, when John Frémont ran for president in 1856, he was rumored to be a secret Catholic. (He was actually Episcopalian.)

When Al Smith, the first actual Catholic nominee, ran in 1928, people said that as soon as he entered office he would extend the Holland Tunnel to the basement of the Vatican.

Nowadays, this issue has receded so much from our national consciousness that no one seems to notice we're on the verge of electing the first Catholic vice president.

It's striking how much has changed, yet how little.

Sunday, September 28, 2008

Party swap

At this year's Republican Convention, Mike Huckabee said, "Abraham Lincoln reminded us that a government that can do everything for us can also take everything from us." I've been trying to figure out what he meant by that. In Lincoln's day, it was the Democratic Party that preached laissez-faire, free trade, and states' rights, while the Republicans advocated increased taxation, protectionism, and an activist federal government. Was Huckabee mythologizing Lincoln as a small-government conservative? Or was he criticizing the massive government expansion that Lincoln in fact engendered? I suspect it was a little of both, because nowadays the party of Lincoln is also the party of neo-Confederates.

I often see Republican politicians walk that tightrope, invoking the mantle of Lincoln without directly praising Lincoln's politics. It's striking that Democrats rarely do this with their presidential godfather, Thomas Jefferson, who, similarly, didn't have much in common with today's Democrats. ("That government is best which governs least.") Huckabee's reference to Lincoln was one of several during the Republican Convention, but the Democratic Convention featured just one reference to Jefferson, and it was in a speech by Jim Leach, a Republican.

I can understand why. Historians of all political stripes consider Lincoln the greatest U.S. president, who kept the nation from splitting apart and oversaw the abolition of slavery, perhaps the most important moral development in our nation's history. When reading about Republicans in the nineteenth century, it is hard not to think of them simply as the good guys and the Democrats as the villains. While the picture was more complicated than that, the Republicans did begin as an anti-slavery party and continued to support the interests of African Americans after the Civil War, even as Democrats were loudly proclaiming the inferiority of the Negro. The Democrats' racism continued well into the twentieth century, with their support for the Jim Crow laws.

Modern-day Republicans like to point out these ugly facts to undermine the Democratic Party's legitimacy on race issues. But the fact remains that the Democrats, to a large extent, were the ones who first embraced the civil rights movement of the 1960s. That a white-supremacist party evolved into a civil-rights party--and, ultimately, became the first party to nominate a black man for president--is one of the more remarkable facts about our nation's political history.

How and why these realignments happened is the subject of Lewis Gould's 2003 book Grand Old Party. Gould argues that certain features of the Republican Party have remained constant even as its philosophy of government, as well as its demographics, changed. Among other things, Republicans always had a close relationship with the business community. That they initially saw no conflict between this relationship and their regulatory views suggests how radically different society was back then.

According to Gould, Teddy Roosevelt's departure from the Republican Party was a seminal event in solidifying the party's conservative philosophy. The other Roosevelt's presidency, on the other hand, represented the beginnings of the Democratic Party's embrace of welfare capitalism. That was when blacks began migrating to the Democrats. Southern whites remained attached to the party and wouldn't start to become agitated until Truman's administration.

It was in the 1960s, especially in the candidacy of Barry Goldwater and in LBJ's passage of key civil-rights legislation, that the white South became solidly Republican, while African Americans became solidly Democrat. It is not entirely fair to blame Goldwater for this process. He was generally supportive of civil rights, and he had helped desegregate the Arizona National Guard. But his opposition to the Civil Rights Act of 1964 had an important symbolic impact.

Both parties eventually reached a consensus on the issue of desegregation. But it is hard to forget what initiated the realignment of the South. One notable Dixiecrat-turned-Republican was Strom Thurmond, famous for the longest filibuster in history to stop the Civil Rights Act of 1957 (which Goldwater supported). On the other hand, ex-Klansman Robert Byrd remained in the Democratic Party. Of course, neither of these men continued to preach racism after the 1960s.

That's what makes the question of "Where did the racists go?" so complicated. Some of them had a genuine change of heart, regardless of which party they ended up in. They all grew old while the younger generation forged its identity in a world more accepting of diversity. But African Americans have not forgotten how the parties developed to their current state, which is why the vast majority of them vote Democrat to this day despite holding some conservative views.

There is still evidence of racism among whites in both parties. A recent study suggested that one-third of white Democrats and independents hold negative views of blacks. Blogger Nate Silver has criticized the survey for both its methodology and its attempts to draw conclusions about the current election, but I have observed throughout this year that many Democrats are beginning to notice the old-fashioned racists still lurking within their own party. It is time to engage in a little reflection and stop placing the blame solely with the other party.

Sunday, September 07, 2008

Tea/No Tea '08

The Republican race this year has begun to remind me of a point in the old Infocom text adventure The Hitchhiker's Guide to the Galaxy.

In that game, you are making your way through a spaceship when you find an object called tea on the ground. You can pick it up by typing "TAKE TEA." There is also an object called no tea which you can pick up: "TAKE NO TEA." But you cannot pick up one while holding the other, since you cannot be simultaneously holding tea and no tea.

The problem is, you have to do just that in order to access the ship's computer. To make this seemingly illogical act possible, you must temporarily become a microscopic entity inside your own brain and remove the Common Sense Particle. Once it is removed, you are free to hold tea and no tea at the same time.

After campaigning on experience and deriding his opponent for a lack of it, John McCain has now selected an inexperienced running mate. She not only lacks foreign policy experience, she has virtually no record of even expressing foreign policy views. In an interview from last month, she didn't even recognize that we have an exit plan for Iraq!

The hypocrisy was so transparent it even caught the attention of many conservatives, including David Frum, Charles Krauthammer, George Will, Ben Stein (who called her "the most peculiar vice-presidential choice there has ever been"), and former McCain strategist Mike Murphy. Others did 180-degree turns on things they had said, prompting a great bit on The Daily Show.

But those who thought McCain could no longer invoke the experience argument were quickly rebuffed by the RNC, which flaunted McCain's experience and Palin's non-Washington status. Somewhere along the line, he removed the Common Sense Particle, figuring he could convince voters to elect tea and no tea at the same time. The odd thing is, he may be right.

While I'm too much of a wimp to make any definitive predictions, I see Democrats falling into a trap. It's the same trap they fell into with Bush in 2000, setting expectations so low that very little was needed to exceed them. You know something's seriously out-of-whack when all a candidate must do to quell many people's doubts about her readiness is capably deliver a speech she didn't write.

I won't go into a detailed refutation of the RNC's attempts to puff up Palin's record while tearing down Obama's. Many other sites have already taken up the task. What's telling is the unstated assumption that her experience must be measured against Obama's. Obama never ran on experience; McCain did. Had Palin been a presidential candidate earlier this year, McCain would almost certainly have assailed her lack of experience, as he in fact did against Romney, Giuliani, and Thompson, all of whom have considerably more experience than Palin.
"We don't have time or opportunity for on-the-job training, and the other candidates for president I don't believe have the qualifications that I do to hit the ground running and immediately address these serious challenges," the four-term Arizona senator and Vietnam veteran told reporters following a speech on the military.

"The country would be safer with me as its leader," McCain added. He said that while he respects his opponents, "this is all about who is best equipped to take on the challenge of radical Islamic extremism."
The selection of a running mate is important not just because of who gets picked, but because it tells us something about how the person at the top of the ticket makes decisions. Obama made a pragmatic if unexciting choice. McCain made a political choice. If experience matters to him as much as he has claimed, what does his selection tell us about his commitment to putting "country first"?

At the convention, Republicans adopted the Lloyd Bentsen strategy. Their message was, "We know Sarah Palin. Sarah Palin is a friend of ours. Senator Obama, you're no Sarah Palin." The trouble is, that first sentence is a lie, and anyone who's been paying attention realizes it.

Thursday, September 04, 2008

Design creating the designer

Given the achievements of theoretical physics in the last century, it can come as a shock to realize the amount of unbridled speculation in the field. The way Paul Davies presents the topic in his book The Goldilocks Enigma occasionally gives it the aura of classical mythology. For example, tell me the following account of the early universe doesn't sound like some primordial battle between a good and bad deity:
Whenever matter and antimatter mingle, they quickly annihilate in a burst of gamma rays.... So that presents a puzzle: how did the big bang make 10^50 tons of matter without also making 10^50 tons of antimatter?.... however it is done, the story of the origin of matter would go something like this. The heat radiation released after the big bang created copious quantities of both matter and antimatter, all mixed together, but containing a slight excess of matter. As the universe cooled, the antimatter would be totally destroyed by virtue of its being in intimate contact with matter, leaving unscathed the small residue of excess matter--about one part in a billion. (p. 105)
The coincidence of my coming upon this book just a few weeks after I wrote my post about pantheism, which covers similar ground, wasn't lost on me. Both deal with the question of why the dead universe around us seems uniquely suited for life. To rephrase the old conundrum, how come anyone's around to hear the tree make a sound?

Physicists over the past few decades have discovered that many of the physical laws of the universe seem "just right" for the development of life. If the numbers had been slightly lower, or slightly higher, life as we know it could not have come to exist. I will mention just a few examples, because the topic is vast and has been covered thoroughly in numerous books:

1. If the neutron were slightly lighter, the proton would be unstable and atoms probably could not have formed. If it were slightly heavier, nuclear fusion would not be possible and stars could not have formed.

2. If gravity were slightly stronger, all stars would be giants with relatively brief lives. If it were slightly weaker, heavy elements necessary for planet formation would not have been produced.

3. If the nuclear resonance level of carbon were any different, stars could not have produced the element in significant quantities.

One must exercise caution when examining these apparent facts. Perhaps a very different sort of life, or lifelike phenomenon, would have emerged under other conditions. Perhaps we aren't exercising our imaginations enough. But examples like these have piled up, and so far they haven't gone away.

Like me, Davies isn't satisfied with the standard copout, "That's just the way the laws are, and if they weren't that way, we wouldn't be here to discuss it." To illustrate the flaw in this argument, he takes off from an idea in Carl Sagan's novel Contact. The number pi consists of decimal digits going on into infinity. The digits are completely arbitrary except for the fact that the number is derived from nature. Let's say you created a computer program displaying the number in binary, where a light pixel would represent one, and a dark pixel zero. Most likely, the screen would be flooded with meaningless "snow." You wouldn't expect to see a coherent image, such as a circle, much less a smiley face. But what if one did appear after just two minutes? Assuming the program wasn't rigged, the only conclusion most scientists would permit would be that it's just a freakish coincidence.
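
The thought experiment is easy enough to try at home. Below is a rough Python sketch (it assumes the mpmath library is available for high-precision arithmetic; the screen dimensions are arbitrary choices of mine, not anything from Sagan or Davies). It prints the first couple thousand binary digits of pi as a grid of light and dark "pixels," and the result, as expected, looks like snow.

# Render the binary expansion of pi as a crude pixel grid: '#' for 1, '.' for 0.
# Requires the mpmath library; the grid size is an arbitrary choice.
from mpmath import mp, floor

WIDTH, HEIGHT = 64, 32
NBITS = WIDTH * HEIGHT
mp.prec = NBITS + 64            # binary working precision, with headroom

def binary_fraction(x, n):
    """Return the first n binary digits of the fractional part of x."""
    frac = x - floor(x)
    bits = []
    for _ in range(n):
        frac *= 2
        bit = int(floor(frac))
        bits.append(bit)
        frac -= bit
    return bits

bits = binary_fraction(mp.pi, NBITS)
for row in range(HEIGHT):
    line = bits[row * WIDTH:(row + 1) * WIDTH]
    print(''.join('#' if b else '.' for b in line))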

Davies explores several possible solutions to this dilemma. (I will deal with only a few of them here.) The most popular is the hypothesis of multiple universes. In an infinity of universes, some are bound to produce life. Those that don't will obviously go unnoticed. We're here simply because our universe is one that happened to have the right set of laws needed for our existence.

The multiverse hypothesis isn't purely ad hoc; it seems to follow from certain versions of the Big Bang theory, as well as from certain versions of quantum mechanics. But there are major problems with the hypothesis, as Davies explains. For starters, it is borderline untestable. It also seems to violate Occam's razor, the principle that theories should be as simple as possible. And it leaves unanswered the question of where the universe-generating mechanism came from.

One bizarre twist on this hypothesis bears mentioning. Davies quotes Oxford philosopher Nick Bostrom as saying, "There is a significant probability that you are living in [a] computer simulation. I mean this literally: if the simulation hypothesis is true, you exist in a virtual reality simulated in a computer built by some advanced civilisation. Your brain, too, is merely a part of that simulation."

That's just a modern variation on an age-old philosophical idea, but Bostrom makes a case for it based on the multiverse hypothesis. See if you can follow this. If countless universes exist, there are likely to be ones containing civilizations that have reached the point of creating simulated universes. Any civilization with that capacity is likely to exercise it numerous times. Therefore, there are likely to be more fake universes than real ones, and so, by the laws of probability, we are more likely to inhabit one of the fakes.
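
To make the arithmetic behind that last step concrete, here is the argument in toy form. The numbers are made up for illustration, and this is my own simplification, not Bostrom's formal treatment:

# If every "real" civilization eventually runs many simulated universes,
# then most observers, counted across all universes, are simulated.
real_civilizations = 1000                  # assumed, for illustration
simulations_per_civilization = 50          # assumed, for illustration

simulated = real_civilizations * simulations_per_civilization
total_worlds = real_civilizations + simulated
print("Chance a random observer lives in a simulation: %.1f%%"
      % (100.0 * simulated / total_worlds))
# About 98%, and it climbs toward 100% as the number of simulations grows.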

Davies has fun with this idea. If we're in a simulation, who is to say we aren't in a simulation-within-a-simulation? "Logically there is no end to this nested sequence.... The real universe could be lost amid an infinite regress of nested fakes. Or it may not even exist at all. Reality might consist of an infinite sequence of simulations, period" (p. 185).

As the saying goes, that way lies madness. Bostrom's argument has many holes, but the most basic is that the conclusion undermines the premises. If we live in a fake universe, how do we know the physical laws we have discovered--on which the multiverse hypothesis rests--accurately describe the reality outside the simulation?

Davies, in any case, prefers a one-universe model, but he still suspects that the seemingly life-friendly laws cannot be due to chance. He proposes that there must be something leading our universe in the direction of producing conscious, thinking beings like ourselves. What that something is, he leaves open, but he puts forward a series of related theories that he maintains are compatible with modern physics.

While I'm not sure I understood all the details, his basic idea is that conscious life itself, in the far future, somehow influences the early universe to produce life in the first place. The laws create life, and life creates the laws, in a sort of circular time-loop with no ultimate origin. I couldn't help thinking of the following Escher drawing:


Davies contrasts this idea with the classic grandfather paradox, where a time traveler kills an ancestor. He says it is more like a time traveler saving the life of a girl who will one day become the time traveler's mother. Davies insists that this scenario, while strange, does not create a paradox.

Actually, the scenario is called an "ontological paradox." One notable example I discussed on this blog is from the movie Somewhere in Time. A man receives an antique watch from an old lady. He later goes back in time and gives it to a young woman, who will become the old lady who gives it to his younger self--and so on, ad infinitum. The paradox is that the watch was never built by anyone, at any time. It just eternally exists, fully formed. Since Davies is as bothered as I am by the question of why anything exists, I would think he'd stay far away from such scenarios, which only compound the question.

Considering the difficulties with all these theories, why not accept the traditional idea of a creator God? Davies thinks this answer presents at least as many difficulties as the others. For one thing, did God choose to create the universe, or was it a necessary act that flowed from His very nature? Either possibility leads to additional questions. Also notice the problem with describing creation as an event in time--God exists outside of time, at least according to traditional religions.

I think that such questions miss the point. God by definition is where rational inquiry ends. To believers, the purpose of belief is to transcend the rational in order to connect with the unfathomable. The usual response by scientists is that God, then, should remain forever outside of scientific discussions. To some extent, I agree. But when scientists are reduced to positing fake worlds within fake worlds, or self-created entities from an unexplainable time-loop, we are justified in wondering if the end has already been reached.

Tuesday, September 02, 2008

Word relativity

In college, I gathered entries from Dictionary of Changes in Meaning by Adrian Room, a book that gives obsolete definitions of common English words and traces their evolution to their current meanings. Here are a few of the examples I collected:

algebra: bone-setting
buxom: obedient
coffin: basket
computer: person who does computations
corpse: a living person's body
friend: lover
garbage: animal food
girl: child of either sex
grammar: the study of Latin
hussy: housewife
jargon: twittering of birds
jest: noble deed
kill: to strike or beat
knight: boy, youth
lair: bed
larva: ghost
lewd: not a member of the clergy
litter: portable couch
nice: foolish
nosy: having a large nose
passenger: traveler on foot
poison: any liquid mixture, not necessarily toxic
sagacious: having a keen sense of smell
silly: blessed
snob: shoemaker
toilet: cloth wrapping
tomboy: boisterous boy

Wednesday, August 27, 2008

Why I love color (but not colorization)

Somebody recently created an online version of Raiders of the Lost Ark, converted to black-and-white and divided into fourteen chapters to resemble the old serials that were a big influence on the film. I'm not surprised by how good it looks. The movie's visual style has always struck me as owing a great debt to black-and-white movies. I think of that moment when a man's shadow appears behind Marion in the bar, and we know it's Indy because of the outline of the fedora.

I thought back to the 1980s when Ted Turner began colorizing numerous classics, causing an uproar among filmmakers and critics alike--including Siskel and Ebert, who described the process as "cultural butchery." Though the technical aspects of colorization improved over time, I agree that the process inherently detracts from a film. But Ebert went further and argued in an essay (titled "Why I Love Black and White" in his Movie Home Companion) that there was something special about black-and-white that color films could never capture.
Black and white movies present the deliberate absence of color. This makes them less realistic than color movies (for the real world is in color). They are more dreamlike, more pure, composed of shapes and forms and movements and light and shadow. Color films can simply be illuminated. Black and white films have to be lighted. With color, you can throw light in everywhere, and the colors will help the viewer determine one shape from another, and the foreground from the background. With black and white, everything would tend toward a shapeless blur if it were not for meticulous attention to light and shadow, which can actually create a world in which the lighting indicates a hierarchy of moral values.
Ebert stopped short of arguing that black-and-white was intrinsically superior. As he put it, "On a properly controlled palette, a color movie can be a thing of wonder." I think his point was simply that black-and-white pays special attention to elements that color ignores, and hence colorization inevitably mars a picture. But he never paused to ask why some viewers prefer color, beyond chalking it up to force of habit.

Unlike Ebert, I grew up in the color era, though I watched many black-and-white movies as a kid. I appreciate black-and-white cinematography for all the reasons he mentions, and I agree that black-and-white films ideally should stay black-and-white. Yet on some level I find color more pleasing to the eye.

I know I'm not alone in this. For most people, I think, the beauty of the world involves the many colors our eyes can perceive. Think of flowers in a garden, or a deep blue sky on a sunny day, or a spectacular display of fireworks at night. Black-and-white objects can also have a simple beauty to them, but I thank God I am not colorblind.

I like color for much the same reason I prefer light over darkness, day over night. In fact, one of the striking things about the black-and-white Raiders was the difficulty in distinguishing night from day. A blue sky would end up looking overcast, and everything just seemed a lot darker than it did in the original film. Granted, it was an amateur's transformation of a color film, but black-and-white movies always make me feel like I'm entering a darker world than the one I inhabit. That's not necessarily a bad thing, but it begins to explain why many viewers today are turned off by black-and-white movies--and why there is a market for colorization in the first place.

It's a little like the dubbing of foreign films: it does hurt their quality, but it also makes them accessible to people who would otherwise not be inclined to watch them. I personally cringe at both dubbing and colorization, but I understand why others feel differently. It's the alteration that ruins it for me; I do not object to the fact that most films since the late 1960s have been shot in color. What the shift signifies, in my view, is not so much technological advancement as social change that has made the symbolism of color resonate more strongly than it did in the past.

Most black-and-white films in the modern period fall into one of the following categories: (1) experimental indie flicks, like Darren Aronofsky's Pi; (2) period pieces, like The Man Who Wasn't There; and (3) films seeking to evoke older cinema, like Young Frankenstein.

Mixing black-and-white with color usually comes off as pretentious, though a few films over the years have made great use of this effect. The most famous is, of course, The Wizard of Oz. A recent example is the ultra-violent, noirish Sin City, where the black-and-white and color blend together so seamlessly that you truly feel you've entered a bizarre alternate universe.

The convention of filming flashbacks in black-and-white was put to great use in 1998's American History X, where the protagonist's days as a racist skinhead are shown in black-and-white and his life after he reforms is shown in color. The symbolism is pointed: the film suggests not only that he stops seeing the world in black and white, but, even more, that he stops seeing the world as divided into blacks and whites.

Earlier that same year, Pleasantville also used black-and-white to suggest both simplicity and racism. The film depicts two teens from the 1990s who get magically transported into a 1950s sitcom, and their modern behavior gradually causes other characters to acquire color. The town is scandalized by this development, and pretty soon we see a shop with the sign "No coloreds allowed." Color here represents all the aspects of modernity that '50s television tried to suppress, from racial integration to sexual liberation.

Despite Ebert's praise of the "dreamlike" qualities of black-and-white, it cannot show the full range of our dreams. A skilled filmmaker can exploit this limitation to great effect, but it's still a limitation, one that can never quite contain the nuances of our modern age. Black-and-white movies will always have their place, but for the most part they are the mark of an earlier, simpler era.

Sunday, August 24, 2008

Thought-provoking quote

Reuven Firestone, in An Introduction to Islam for Jews, writes (pp. 235-6):
More than once [while living in Egypt] I heard people there criticize American culture for its innately violent nature and declare that Americans are an aggressive and brutal people who lack respect for human life. Some Egyptians who made the case pointed to the extraordinary level of violence in American film and television. Some cited the results of American studies published in the Arab press that establish the murder rate in the United States as one of the highest in the world, and off the charts when compared to nations with a similar standard of living and cultural level. As one Egyptian acquaintance told me, "You Americans start wars all over the world, but you never fight for your own soil or on your own land. You exploit the fears and pain of others in order to take over somebody else's natural resources or exploit their labor."

I was shocked the first time I encountered this view because, although I consider myself critical of many aspects of American culture, what I heard is simply not the perception that I have of myself and my fellow Americans. It also gave me pause about many Americans' opinions about Muslims and Arabs, because, in fact, it is common to hear virtually the same critique by Americans leveled against Arabs: "Arab culture (or Islam) is innately violent, and Arabs (Muslims) are an aggressive and brutal people who lack respect for human life."

Wednesday, August 20, 2008

A shaggy-something story

I had my very first bear encounter at Swallow Falls State Park early Tuesday morning. I doubt it will be my last. I never went camping until I was an adult, but I've heard plenty of bear stories from other campers over the years.

The prospect always excited as well as frightened me. I have a childlike fascination with wild animals, but being attacked by a large carnivore is not the way I would like to go. Travel guides claim that black bears are rarely dangerous unless you do something stupid like taunt them. A sign outside the park listing animals in the area described the black bear as "not aggressive" but warned people not to feed one.

I woke up at four in the morning and left the tent to read a book by the light of a propane lantern. After about thirty minutes I decided to go back inside. I was getting cold and had no jacket, and I didn't want to walk all by myself to the shower room until the sky got lighter.

As I stood up and stretched my muscles, I heard movement. I looked into the forest, and about fifteen feet away was an animal walking on all fours. I registered it in my mind as a raccoon, though it seemed too big to be one.

After re-entering the tent, I peered outside and noticed that the "raccoon" had climbed on the picnic table to investigate the remains of our meal from the previous night. My friend briefly woke up, and I told him there was an animal outside. Right as I said that, it went away, probably having heard us talking.

By that time, I knew it was no raccoon. A few weeks earlier I had seen a raccoon on the road near my home, and it was no bigger than a cat. This animal took up at least half the table it was on. As far as I know, raccoons do not stalk campsites waiting for campers to retire for the night so they can steal food. That kind of deliberate, rather intelligent behavior is associated chiefly with bears.

But I couldn't make out its color or markings, and its head shape, though not its body, did remind me of a raccoon's. I had never thought of raccoons as resembling bears, even though I know scientists have had trouble deciding which of the two the giant panda is. (Nowadays, they usually place it in the bear family.) Somehow I doubt a panda made its way to a western Maryland campground.

Only gradually did I realize what I had seen. For some reason, its flat-footed gait and round, bulky frame did not immediately register with me. It looked no bigger than a large dog, and it hardly made a sound the entire time. I think it was a relatively small bear, not fully grown, but I could have misjudged its size in the dim light. It looked so innocuous that I can understand why some campers make the mistake of trying to interact with these animals.

Sunday, August 10, 2008

Trusting the enemy

Linked to at DovBear's blog

After hearing some of my friends repeating the rumor that Barack Obama had a Muslim upbringing which he has concealed, I checked and found it was flatly untrue. But that was several months ago. Only recently did I discover a fact that puts an ironic twist on the whole matter: the man who initiated the rumor, designed to make Obama look frightening to Jews, is himself a rabid anti-Semite.

The Washington Post traces the rumor to a 2004 article by Andy Martin, a politician who had run that year for the same Senate seat Obama ultimately won. The article wasn't as extreme as some of the later forms the rumor took--Martin didn't attempt to tie Obama to "radical" Islam, and he acknowledged that Obama was currently a practicing Christian--but it was the beginning. Nobody knows who started the anonymous chain emails, but Martin does take credit for being the first to argue publicly that Obama was raised Muslim.

As Wikipedia reveals, Martin is quite a character. (What's printed on Wikipedia is not necessarily accurate, but in this case it provides a range of credible sources, including official court transcripts.) Numerous federal and state courts have dubbed him a "vexatious litigant" due to his having filed hundreds of mostly frivolous lawsuits. He was denied a license to practice law in the state of Illinois because of his unprofessional behavior on various occasions. But I haven't gotten to the most interesting part. According to an article in the Chicago Tribune (reprinted on an Illinois Republican Party website):
In his past, Martin also has expressed anti-Semitic views. When he ran for Congress in Connecticut in 1986, the name of his congressional campaign committee included the phrase "to exterminate Jew power in America," Federal Election Commission records show.

In a 1983 personal bankruptcy case, he referred to a federal bankruptcy judge as a "crooked, slimy Jew, who has a history of lying and thieving common to members of his race." In a related court filing in the case, he also expressed sympathy to the perpetrators of the Holocaust.
Though he denies having made those remarks, despite what the public records show, he is still pretty open about his views on Israel. A columnist for a Florida newspaper summed it up (the embedded links are my own, pointing to more recent articles where Martin said these or similar things):
While he touts a two-state solution for the Middle East in his "Andy Martin Peace Plan," says he's close to the peace movement in Israel, and has proposed increased compensation for Holocaust victims, the candidate also called for the Bush administration to attack Israel instead of Iraq. He has compared Ariel Sharon to Adolf Hitler and has written in defense of Hamas suicide bombers.
Here are some highlights from his rambling 1983 affidavit, which makes Mel Gibson's drunken rant seem mild by comparison. (I got the text from Justia.com, a legal site with thousands of court records.)
Although I was not a Jew hater when these cases began, any love for the Jews I may have had has been dissipated by barbaric tortures inflicted on me by the Jews. I can see now that anti-semitism has a real root in the ageless manipulation, chicanery and murder by the Jews.... Jews killed the son of God, and seek to deny the fact, and seek to murder and loot and steal from anyone who opposes their efforts at world domination.... I do not believe I can receive Justice from a pack of Jew thives [sic], judges and lawyers.... "Judge" Krechevsky is not neutral or detached. He is part of a Jew conspiracy to steal my property.... I am able to understand how the Holocaust took place, and with every passing day feel less and less sorry that it did, when Jew survivors are operating as a wolf pack to steal my property.
There's something sadly ironic about the fact that Jews concerned about Obama's relationship with the Jewish community have accepted the words of a real anti-Semite. It may seem strange that he would appeal to a fear of anti-Semitism. My guess is that he enjoys manipulating what he perceives as Jewish power. What's sad is how many of us have taken the bait.

Friday, August 08, 2008

Escaping the cage of language

The following is a transcript of a speech I gave at Toastmasters two days ago. I based it on my blog post "The cage of language," with strong help from Geoffrey Nunberg's article "If it's 'Orwellian,' It's Probably Not." My project assignment was to deliver a keynote address. I presented myself as the keynote speaker to the convention of the Language Guardians Party, who are nominating George Orwell, the first dead Englishman to run for president of the United States.

I am so pleased that you have handed me the opportunity to shoulder the burden of heading this convention so that we can face the issues of our day without knuckling under to the pressure and mouthing empty platitudes just so we can elbow our way into the American electorate.

No wonder politics gives people such a headache.

Language is a wonderful thing, but it is also a sneaky thing that can blind us when we aren't paying attention. Language can be used to express our deepest thoughts and insights, but it can also be used to confuse and distort and conceal. It's vital that we pay close attention to the dead metaphors and clichés that litter our language, because if we don't, they will take control of our thinking.

One person who truly understood this point was our nominee, Mr. Orwell, who explained his views most forcefully in his essay "Politics and the English Language." How many of you have read this essay? It's one of the most widely read essays of the twentieth century, and in many ways one of the least understood.

Mr. Orwell tells us that modern English is in a state of decay because the people who speak and write it have become trite, wordy, and vague. Mr. Orwell argues that this situation poses serious problems for our society.

I have noticed that people have three levels for understanding Mr. Orwell's message, with Level One being the most superficial, and Level Three being the deepest. I hope and believe that everyone in this room can progress to Level Three.

Level One is the idea that all Mr. Orwell was doing was telling us to communicate more effectively. Mr. Orwell says we communicate very poorly today, and he illustrates this by giving his own translation of a famous verse in Ecclesiastes. Here is the original version from the King James Bible:
I returned, and saw under the sun, that the race is not to the swift, nor the battle to the strong, neither yet bread to the wise, nor yet riches to men of understanding, nor yet favour to men of skill; but time and chance happeneth to them all.
Now, here is Mr. Orwell's translation of that verse into modern English:
Objective considerations of contemporary phenomena compel the conclusion that success or failure in competitive activities exhibits no tendency to be commensurate with innate capacity, but that a considerable element of the unpredictable must invariably be taken into account.
Let me ask all of you, does Mr. Orwell's translation make the verse more clear and understandable? [Audience: No!]

Does it make the verse simpler? [Audience: No!]

Does it make the verse more poetic? [Audience: No!]

Is this the kind of communication we want to encourage? [Audience: No!]

Should we aim to improve our English by making it clearer, simpler, and more elegant? [Audience: Yes!]

Then we've got a problem.

Why is it important to communicate clearly? "Well, uh, if we don't communicate clearly, then, uh, people will have trouble understanding us." Alright, why is it important for people to understand us? "Well, uh, if people don't understand us, then we won't be contributing to public understanding."

It's hardly self-evident that we should be clear. People can get very far in this country without being good communicators. In the academic world, it is often to your advantage to be as unclear as possible. Some of our most successful businessmen and entrepreneurs can barely string a sentence together unless it's written in C++.

Understanding the importance of good communication brings us to Level Two of Mr. Orwell's message. We need to be on guard against the people in power who manipulate the language to keep the masses in line. This is actually what most people think of whenever George Orwell's name is mentioned, the way that the power centers of society--the government, the media, the CEOs of major companies--use windy, confusing phrases to conceal their true intentions. I'll give some real-life examples of this Orwellian language: referring to a tax increase as a "revenue enhancement," or referring to deaths of patients in hospitals as "failure to fulfill their wellness potential." I can think of some of my own examples! Blackout: "precipitous circuit conclusion." Falling down a flight of stairs: "unpremeditated diagonal excursion." Forest fire: "vegetative borough ignition."

I've got another question for all of you. If Mr. Orwell were alive today, what would he think is the most Orwellian term of modern times? [Members of the audience give possible answers.]

I'll tell you. If he were alive today, he would say that the most Orwellian term is "Orwellian." Everyone today is always accusing someone else of being Orwellian. "My communication is clear and direct, but you, you're Orwellian." You hear this criticism from the left, from the right, all across the political spectrum. People use the term "Orwellian" so often that it has become exactly the kind of hackneyed, overused expression that Mr. Orwell was warning us against, the kind of expression that people use to mask lazy, conventional thinking.

That brings us to Level Three. You really thought all Mr. Orwell was telling us was to watch out for a bunch of silly euphemisms? If only it were that easy. All the Orwellian terms I've mentioned so far are so obviously ridiculous, most people aren't going to be fooled by them. The truly Orwellian expressions are the ones that pass unnoticed.

For example, the Republican Party officially claims to be "pro-life." Yeah, that's why they support the death penalty. The Democratic Party officially claims to be "pro-choice." Sure, that's why they support gun control. Pro-life and pro-choice are true examples of Orwellian language, yet very few people seem to realize it. That's why these expressions are so effective, because most of the time they pass beneath our radar. As a matter of fact, that very term, "beneath our radar," is itself an Orwellian expression, a vague, hackneyed metaphor that you just don't notice until I point it out to you.

Because we barely notice these expressions, they have the greatest potential to influence our minds without our realizing it. That's why we need to reflect, to look at our own language. The next time you find yourself calling someone else Orwellian, take a look at yourself and ask, "Am I really being clear? Or am I using vague slogans, clichés, and catchphrases? Because if I am, then I'm not thinking independently."

By appreciating all three levels of interpreting Mr. Orwell's message, we will learn to take control of our language before it takes control of us. We will learn to consider how we communicate, not just how others do. No one escapes the cage of language; the best we can do is be conscious of how it surrounds us.

Wednesday, July 30, 2008

God's third party

The most poorly understood philosophy about God is pantheism. To pantheists, God isn't the creator of the universe, God is the universe. To other people, this sounds more like a semantic trick than a coherent philosophy, as if pantheists are simply calling the universe by a different name, without making any unique statements about reality.

Curiously, many self-described pantheists almost seem to agree with that characterization of their beliefs. Pantheism.net, for example, approves of Richard Dawkins's description of pantheism as "sexed-up atheism." According to the website, "our beliefs are entirely naturalistic, and compatible with atheism, humanism and naturalism. Also with those forms of paganism that see magic and the gods as symbols rather than realities."

Hard-nosed skeptics find pantheism infuriating. They don't know how to deal with a system that renders proof irrelevant. Not that all traditional theists claim their beliefs are provable. But the statement "God exists" at least qualifies as a truth claim. The statement "God is the universe," on the other hand, merely redefines God as something which even atheists admit exists. Yet a fellow blogger makes a strong case that pantheism and atheism do in fact differ in their beliefs about reality:
The major difference lies in the appreciation for existence. What is existence really? Is it some random backdrop in which we find ourselves or is it an integral part of who and what we are?

Pantheists are generally philosophical Monists [who believe that] everything is 'one thing' and all comes from the same source. All things within the universe are interconnected....

The ultimate difference lies in what each side considers the basic substance of the universe to be like. The atheist conceives of nothing but subatomic particles whizzing about or random quantum fluctuations while the pantheist imagines a fundamental well-structured ground of being.
In practice, there is a fine line between pantheism and the views of traditional believers. Western forms of mysticism have challenged the simple assertion that "God exists." To the mystics, God is beyond existence in the usual sense. Many of them have come to think of God as the totality of everything, including, but not limited to, the universe. This view is called panentheism. It's pantheism with an extra syllable, which apparently makes all the difference as to whether it's acceptable to mainstream Judaism and Christianity.

The raison d'être of pantheism concerns two interrelated questions about the universe. Why does anything exist? And why is the universe that does exist capable of producing conscious beings--in effect, becoming aware of itself? Theistic philosopher Roger Scruton ponders this second question in a recent essay:
Dawkins writes as though the theory of the selfish gene puts paid once and for all to the idea of a creator God -- we no longer need that hypothesis to explain how we came to be. In a sense that is true. But what about the gene itself: how did that come to be? What about the primordial soup? All these questions are answered, of course, by going one step further down the chain of causation. But at each step we encounter a world with a singular quality: namely that it is a world which, left to itself, will produce conscious beings, able to look for the reason and the meaning of things, and not just for the cause. The astonishing thing about our universe, that it contains consciousness, judgement, the knowledge of right and wrong, and all the other things that make the human condition so singular, is not rendered less astonishing by the hypothesis that this state of affairs emerged over time from other conditions. If true, that merely shows us how astonishing those other conditions were. The gene and the soup cannot be less astonishing than their product.
Since atheists have no answer to the question of why anything exists, all they can do is neutralize it by asking "Who created God?" But the idea of an uncreated Creator as the conscious source of everything raises fewer questions than the idea of an uncreated universe which happens to have the properties needed to become conscious of itself.

It's no wonder so many atheists fall back on the hypothesis of multiple universes, even though an infinity of time and space in which anything can happen is little different in effect from an infinite Creator. Others pretend the question isn't objectively meaningful. "The world exists because it exists," they say, and they go on to suggest that our ability to come up with such questions must have evolved in our primitive ancestors.

That's a major point of divergence from pantheism, which attempts, however imperfectly, to bridge the gap between theism and atheism. Its tenets superficially resemble those of atheism, but it has a greater appreciation for the mystery of existence. The consequence of viewing existence as one interconnected whole, of which conscious beings that can reflect on the matter are an integral part, and not just a byproduct, is subtle but real.

Thursday, July 24, 2008

Apt author

A woman once told Stephen King that she enjoyed his anthology Skeleton Crew but skipped the section where he explained how he came up with the stories. "I'm one of those people," she said, "who don't want to know how the magician does his tricks." Recalling the experience later, King remarked, "I am not a magician and these are not tricks." (Nightmares and Dreamscapes, p. 675)

I suspect many authors would disagree. But it speaks volumes about King's outlook, a key reason he's one of my favorite writers. I went through a phase reading rival fear-meister Dean Koontz before I realized he was all technique and no soul. With King, I'm barely conscious of technique even though his books are more powerful than Koontz's.

These thoughts came back to me recently as I read "Apt Pupil" from Different Seasons, a collection of novellas with a more serious bent than his horror fiction. Three of the novellas have been made into movies, two of them quite excellent: Stand By Me and The Shawshank Redemption. But I never got around to reading "Apt Pupil," partly because of the unfavorable reviews the movie received, and partly because I doubted King could handle a subject as weighty as the Holocaust.

I am pleased to report that "Apt Pupil" brought into focus all the things I admire about King: his vivid imagination, his sharp attention to detail, his perverse sense of humor, and his mastery at crafting a battle of wills between two characters. But I was also impressed that he tackled material this challenging. He not only had to present a believable Nazi, he also had to confront the question of what makes people evil, all the while telling a compelling story about two unsympathetic characters surrounded by idiots.

The story is set in the 1970s. A pampered suburban youth named Todd Bowden discovers that an elderly neighbor of his is an escaped Nazi commandant named Kurt Dussander. Instead of turning him in, Todd blackmails him into recounting his hideous crimes. Todd once did a research paper on the camps and greatly impressed his teachers, who don't realize he is fascinated by the subject for all the wrong reasons.

The story tempts us to ask which character is more evil. Though Dussander has done worse things than almost any human being alive, Todd has ghastly potential. King depicts both characters as lacking in guilt but filled with fear, haunted by the threat of exposure. Dussander, unlike Todd, rationalizes his actions, giving the standard line about having been just following orders. Todd is simply a sneaky bully who puts on a public face of being a nice, well-adjusted kid.

Even I, a grandson of Holocaust survivors, almost found myself rooting for Dussander. He's smarter and more charming than the boy, and since he begins the story as the victim, I had to marvel at the way he maneuvers the situation and turns it to his advantage. It is easy to forget that his cold rationality is in many ways more frightening than Todd's sick perversion. King exploits this deceptive quality of fiction by not letting us get to know any of Dussander's victims until late in the story.

Another question left unanswered is how much Todd's descent into violence is influenced by Dussander. He might have become that way on his own, but we can't be sure. His most obvious internal change surfaces when he privately rationalizes his lack of attraction to his girlfriend by thinking she must be secretly Jewish. (The real reason is that he has violent homoerotic fantasies which take the place of ordinary sexual feelings.) Did he get his anti-Semitism from Dussander, or was it there to begin with? His liberal parents show no signs of prejudice but are trapped in a world of empty platitudes that keep them from seeing what's in front of them. Joseph Reino's book Stephen King (Boston: G.K. Hall & Co., 1988) explains:
King is not saying that benign and "liberating" clichés are inherently wrong or that they cause Todd's inclination toward social misbehavior. Rather, his gothic perspective is that benevolent philosophies, reduced to thoughtless aphorisms and innocuous clichés, are utterly powerless against the boy's adamantine malevolence. (p. 123)
There are political overtones to the story, set at the end of the Vietnam War. Dussander defends himself by accusing America of hypocrisy: "The GI soldiers who kill the innocent are decorated by Presidents, welcomed home from the bayoneting of children and the burning of hospitals with parades and bunting.... Only those who lose are tried as war criminals for following orders and directives" (p. 130). Here and elsewhere, King hints at the idea that Americans tend to have a sense of incomprehension at evils committed by other countries yet fail to see the parallels when the evil is homegrown.

The introspective nature of the story may help explain why the movie (set in the 1980s) didn't work. Ian McKellen gives a fine performance as the aging Nazi, and some of the early scenes are very effective. But the movie quickly becomes artificial, contrived, and tasteless--all the qualities I worried the novella would exhibit.

The problems are various. The process of abridging the plot for screen time makes certain elements seem arbitrary. The racial aspects of Nazism are largely ignored. Most significantly, the film softens the character of Todd, depicting him more as a confused kid who gets in over his head than as an unrelenting psychopath. This change gives the movie a very different ending from the novella's.

I suppose the producers felt that audiences needed to be able to relate to the young protagonist, but it creates an imbalance that obscures the story's message about the nature of evil. The film can't even decide what exactly Todd and Dussander are guilty of doing. There are several confusing scenes that leave us unsure whether the two have been murdering animals or simply imagining doing so.

I had the feeling the filmmakers were interpreting the novella as a typical horror story because it was written by Stephen King. They underestimated the source material, a thoughtful fable with something valuable to say about the world. King applied his talents as an entertainer to a subject requiring more depth, and he would not have succeeded if he were merely a magician doing tricks.

Sunday, July 20, 2008

The chicken-and-egg of language

Steven Pinker is an experimental psychologist involved in research into the human mind, but he is also an unabashed popularizer whose books are full of pop culture references (especially comic strips). Apart from a few tedious sections, The Stuff of Thought: Language As a Window into Human Nature (recommended to me by a fellow blogger who merely read an article about it) is one of his best books. It applies a scientific perspective to a favorite subject of mine, the relationship between language and thought. But it does it with style, exploring a range of Americana from the semantics of Bill Clinton's lies (a topic that has already received far more attention than it deserves) to the grammar of profanity. I find the following hard to read without smiling:
Woody Allen's joke about telling a driver to be fruitful and multiply but not in those words assumes that Fuck you is a second-person imperative.... But Quang makes short work of this theory. For one thing, in a second-person imperative the pronoun has to be yourself, not you--Madonna's hit song was titled "Express Yourself," not "Express You." For another, true imperatives like Close the door can be embedded in a number of other constructions:

I said to close the door.
Don't close the door.
Go close the door.
Close the door or I'll take away your cookies.
Close the door and turn off the light.
Close the door when you leave tonight.


But Fuck you cannot:

*I said to fuck you.
*Don't fuck you.
*Go fuck you.
*Fuck you or I'll take away your cookies.
*Fuck you and turn off the light.
*Fuck you when you leave tonight.
(pp. 362-3)
The book's overarching theme is how the human mind influences the structure of language. Like most linguists, Pinker largely dismisses the notion that the influence goes the other way. That notion is the basis of the controversial Sapir-Whorf hypothesis, which predicts, for example, that if you grew up speaking a language like Hopi, which lacks verb tenses, you would end up with a different perception of time than if you grew up speaking a language like English.

Pinker discusses some of the alleged evidence for this hypothesis before disposing of it. For example, one Mayan language has no words for left and right. The speakers orient themselves using the mountain slope where they live, with the words "upslope" and "downslope" corresponding roughly with south and north, respectively. Researchers found that the speakers have trouble distinguishing left from right but can locate north and south after having been spun around blindfolded while indoors!

Pinker spoils the picture by revealing that another Mayan people with the same aptitudes does have words for left and right. Apparently, since both groups spend most of their lives outdoors, they have a stronger sense of north and south than we do but little use for the concept of left and right. The absence of those words from the language of one group is an effect, not a cause, of the group's traits.

Distinguishing cause and effect is the subject of the book's most fascinating chapter, where Pinker explains how the whole concept of causality, so central to our common experience, is tantalizingly hard to define. We perceive the flow of time as consisting of nothing but causes and effects, and this intuition is deeply entrenched in language. But "the world is not a line of dominoes in which each event causes exactly one event.... The world is a tissue of causes and effects that criss and cross in tangled patterns" (p. 215). The challenge of identifying which causes are most relevant and guessing what would have happened if not for certain events--effectively imagining an alternate universe--underlies everything from scientific knowledge to moral responsibility.

One of his examples is President Garfield's assassin, who argued that "The doctors killed him; I just shot him." The wound was potentially nonfatal, but the doctors were wildly incompetent even by the standards of their day. Did this get the assassin off the hook? The jury didn't think so, and they sent him to the gallows.

A more recent example came in the aftermath of 9/11. Insurance companies had agreed to pay out for each destructive event. But was the destruction of the Twin Towers one event or two? Billions of dollars hinged on the answer.

Questions like these are almost unanswerable because the world, contrary to our perceptions, is a continuum without clear boundaries between things. The tension between discrete objects and continuous stuff shows up in the two categories of nouns, count and mass. Count nouns are words like book, which you can count: you can talk about one book, two books, and so on. Mass nouns are words like jello, which lack that property. You can't talk about one jello or two jellos; there's just jello.

Curiously, some mass nouns, like furniture, refer to material that should be countable. (We get around this problem by talking about "pieces of furniture.") And many nouns can perform both roles: rock is a mass noun in the sentence "The ground is made of rock" and a count noun in the sentence "I'm holding two rocks."

Speakers will occasionally transform a count noun into a mass noun by imagining that something discrete is made up of an amorphous substance. Pinker's example is the distasteful statement "After he backed up, there was cat all over the driveway." His point is that the count/mass distinction doesn't force us into any particular way of thinking, because we can escape that thinking by manipulating the language. But the distinction does reveal how we choose whether to view matter as a collection of objects or as a lump of "stuff."

I've only mentioned a fraction of what the book covers. With each topic, Pinker builds on the thesis that language reflects our minds more than it affects them, and that our minds can see past the constraints language imposes on us. (You might think this undercuts the point I made in my post "The cage of language," but actually I think it reinforces it.) Identifying these constraints helps us understand how we perceive the world and thus gives us a way to transcend those perceptions.